I find that it's heavily dependent on drive speed, so I've leaned into getting current-generation, very fast drives whenever I put together a new computer, and sometimes as a mid-generation upgrade. Since I often do consulting work across random projects, I pretty regularly have to figure out and install things in one monorepo managed with pnpm, another with yarn, etc... so the pain is relatively real. That said, the fastest drive matters as much or more, especially with build steps.
When handling merge/pull requests, I'll often do a clean step (removing node_modules and temp files) before a full install and build to verify everything works. I know not everyone is this diligent, but it can happen several times a day... Automation (usually via Docker) helps a lot, with many things tested through a CI/CD environment; that said, I'm not a fan of waiting too long for that process either... it's too easy to get side-tracked and off-task. I tend to set alarms/timers throughout the day just so I don't miss meetings. I don't want to take a moment to look at HN, and the next thing I know it's a few hours later. Yeah, that's my problem... but others share it.
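
For concreteness, that clean-then-verify step looks roughly like the sketch below. It's only an illustration: the directory names (node_modules, dist) and the pnpm commands are assumptions, stand-ins for whatever a given repo actually uses.

    // clean-verify.ts -- rough sketch of the clean + install + build step.
    // Directory names and pnpm commands are assumptions; adjust per repo.
    import { rmSync } from "node:fs";
    import { execSync } from "node:child_process";

    // Remove installed deps and build output (like `rm -rf`, no error if missing).
    for (const dir of ["node_modules", "dist"]) {
      rmSync(dir, { recursive: true, force: true });
    }

    // Fresh install pinned to the lockfile, then a full build.
    execSync("pnpm install --frozen-lockfile", { stdio: "inherit" });
    execSync("pnpm run build", { stdio: "inherit" });

That sequence, run several times a day, is exactly where drive speed and faster tooling actually get felt.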
So, again, if you can make something that typically takes much longer finish in under 15s, I'm in favor... I went from ESLint to Rome/Biome for similar reasons... I will switch to faster tooling to reduce the risk of going off-task and not getting back.