chr15m | 5 days ago:
The reason upgrades have become routine is that modern software is slop. We should strive for software that does one thing well (or is at least made up of modular parts that each do one thing well) and prize backwards compatibility, so that it does not require constant churn. The sane middle ground between "constant upgrades" and "never upgrade" is to upgrade when an actual vulnerability is found in a dependency. Instead of churn for no reason, you update only with a good reason.
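The "upgrade only with a good reason" policy above can be sketched as a simple filter over a dependency list: hold every pin steady unless a security advisory names it. A minimal sketch; `Dependency`, `Advisory`, and `decide_upgrades` are hypothetical names for illustration, not any real tool's API:

```python
from dataclasses import dataclass

# Hypothetical types for illustration -- real tools (pip-audit, npm audit,
# OSV scanners) have richer models with version ranges and severities.
@dataclass
class Dependency:
    name: str
    pinned_version: str

@dataclass
class Advisory:
    package: str   # affected package name
    fixed_in: str  # first version containing the fix

def decide_upgrades(deps, advisories):
    """Return (name, current, target) only for deps with a known
    vulnerability; everything else stays pinned. Simplified: assumes
    every pinned version predates the fix."""
    vulnerable = {a.package: a.fixed_in for a in advisories}
    return [
        (d.name, d.pinned_version, vulnerable[d.name])
        for d in deps
        if d.name in vulnerable
    ]

deps = [Dependency("leftpad", "1.0.0"), Dependency("tls-lib", "2.3.1")]
advisories = [Advisory(package="tls-lib", fixed_in="2.3.2")]
print(decide_upgrades(deps, advisories))
# [('tls-lib', '2.3.1', '2.3.2')]
```

In practice the advisory feed would come from a scanner such as `pip-audit` or `npm audit`; the point of the sketch is that the upgrade set is driven by advisories, not by the calendar.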
esafak | 5 days ago | parent:
It's not about managing churn; it's about making it easy to refactor by leaving open the option to use newer versions of libraries, or to add dependencies that themselves depend on newer versions of libraries. The way to do that is to continually pay off the tech debt of upgrading your dependencies. Do you only write fire-and-forget utilities? Most people write applications whose specifications constantly evolve.