programmertote 9 hours ago
I know it's impossible in some software stacks and ecosystems. But I live mostly in the data world, so I can usually avoid such issues by aggressively keeping my upstream dependency list lean.

P.S. When I was working at Amazon, I remember that a good number of on-call tickets were about fixing dependencies (most of them about updating an outdated Scala Spark framework, 2.1.x or older, I believe) and patching/updating the OSes on our clusters. What the team should have done (I mentioned this to my manager) was create clusters dynamically (disallow long-lived clusters even if end users prefer them that way) and upgrade the Spark library. Of course, we had a bunch of other annual and quarterly OKRs (and KPIs) to meet, so updating Spark got the lowest priority...
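A minimal sketch of the ephemeral-cluster idea, assuming AWS EMR and boto3 (the original comment doesn't name a specific service; the cluster name, release label, and log path below are hypothetical). The key is `KeepJobFlowAliveWhenNoSteps: False`, which makes EMR tear the cluster down as soon as the submitted step finishes, so there is no long-lived cluster to patch:

```python
# Sketch: build a run_job_flow request for a per-job, auto-terminating
# EMR cluster. All names/paths are placeholders, not real infrastructure.
def ephemeral_spark_cluster_config(job_jar_args):
    return {
        "Name": "nightly-etl",                      # hypothetical job name
        "ReleaseLabel": "emr-6.15.0",               # pins a modern Spark; bump here to upgrade
        "Instances": {
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
                {"InstanceRole": "CORE",   "InstanceType": "m5.xlarge", "InstanceCount": 2},
            ],
            # The crucial bit: cluster dies when its steps complete,
            # instead of lingering and accumulating patch debt.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        "Steps": [{
            "Name": "spark-job",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", *job_jar_args],
            },
        }],
        "LogUri": "s3://my-bucket/emr-logs/",       # hypothetical log path
    }

config = ephemeral_spark_cluster_config(["s3://my-bucket/jobs/etl.jar"])
# In real use you would submit it with:
#   boto3.client("emr").run_job_flow(**config)
```

The design point is that upgrading Spark becomes a one-line change to `ReleaseLabel` in the job definition, rather than an in-place upgrade of a long-lived cluster.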