▲ skybrian 5 hours ago
People don't realize how much software engineering has improved. I remember when most teams didn't use version control, and when we did have it, it was crappy. Go through the Joel Test [1] and think about what it was like at companies where the answer to most of those questions was "no."

[1] https://www.joelonsoftware.com/2000/08/09/the-joel-test-12-s...
▲ Towaway69 5 hours ago
At the same time, systems have become far more complex. Back when version control was crap, there weren't a thousand APIs to integrate and a million package dependencies to manage.

Sure, everything seems to have gotten better, and that's why we now need AIs to understand our code bases: the ones we created with our great version control tooling. Fundamentally we're still monkeys at keyboards; it's just that now there are infinitely many digital monkeys.
▲ nradov 4 hours ago
Version control is useful, but it has nothing to do with software engineering per se. Most software development is craft work which doesn't meet the definition of engineering (and that's usually fine). Conversely, it's possible to do real software engineering without having a modern version control system.
▲ zer00eyz 2 hours ago
> People don't realize how much software engineering has improved.

It has, but we got there by stacking turtles: by building so many layers of abstraction that things no longer make sense. Think about this: hardware -> hypervisor -> VM -> container -> Python/Node/Ruby runtime, all to compile it back down to bytecode to run on a CPU. Some layers exist because of the push/pull between single-user systems (PC) and multi-user ones (mainframe). We exacerbated the problem when "installable software" became a "hard problem" and we wanted to mix in "isolation".

And most of that software is written on another pile of abstractions. Most codebases have disgustingly large dependency trees. People keep talking about how "no one is reviewing all this AI-generated code"... Well, the majority of devs sure as shit aren't reviewing that dependency tree. Just yesterday there was yet another "supply chain attack". How do you protect yourself from such a thing? Stack on more software. You can't really use "sub repositories/modules" in git; it was never built that way because Linus didn't need that. The rest of us really do... so we add something like Artifactory to protect us from the massive pile of stuff that you're dependent on but NOT looking at. It's all just more turtles on more piles.

Lots of corporate devs I know are really bad at reviewing code (open source much less so). The PR code review process in many orgs is to either find the person who rubber-stamps and avoid the people who only bikeshed. I suspect it's because we have spent the last 20 years on the leetcode interview, where memorizing algorithms and answering brain teasers was the filter, not reading, reviewing, debugging, and stepping through code. Our entire industry is "what is the new thing", "next framework" pilled because of this.

You're right that it got better, but we got there by doing all the wrong things, and we're going to have to rip a lot of things apart and "do better".
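For what it's worth, git does ship submodules, and the friction the comment gestures at is that a submodule pins one exact commit of the other repo, and every update is a separate clone-and-commit dance in the superproject. A minimal sketch with two throwaway local repos (all paths here are made-up temp dirs, not real projects):

```shell
set -e
tmp=$(mktemp -d)

# a "dependency" repository with a single commit
git init -q "$tmp/dep"
git -C "$tmp/dep" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "dep v1"

# the main project vendors dep as a submodule, pinned at that commit
# (protocol.file.allow is needed for local-path clones in newer git)
git init -q "$tmp/app"
git -C "$tmp/app" -c protocol.file.allow=always \
    submodule add "$tmp/dep" vendor/dep
git -C "$tmp/app" -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "add dep submodule"

# the pin stays put until someone explicitly updates the submodule
# and commits the new gitlink in the superproject
git -C "$tmp/app" submodule status vendor/dep
```

The pinned-commit model is exactly why teams bolt on extra machinery (Artifactory, lockfiles, vendoring scripts): the submodule itself records a pointer, not a reviewed copy of the dependency's contents.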