torginus 2 days ago
Imo historically there was no connection between the two: demand for programmers increased, while at the same time better tools came along. I remember Bill Gates saying (sometime in the 2000s) that his biggest gripe was that during his decades in the software industry, despite dramatic improvements in computing power and software tools, there had only been a modest increase in productivity.

I started out programming in C for DOS, and once you got used to how things were done, you were just as productive. The stuff frameworks and other tools help with is at most 50% of the job, which means that by Amdahl's law productivity can at most double (see the sketch below). In fact, I'd argue productivity actually went down, comparing my output now vs. back then. I blame this on two factors:

- Distractions: it's so easy to d*ck around on the internet instead of doing what you need to do. I have a ton of my old SVN/CVS repos, and the amount of progress I made was quite respectable, even though I recall being quite lazy.

- Tooling actually got worse in many ways. I used to write programs that ran on the PC; you could debug them with breakpoints, read the logs as plain text, and deployment consisted of zipping up the exe or uploading the firmware to the uC. Nowadays you work with CI/CD, the cloud, and all sorts of infra; debugging consists of adding logging and reading logs, etc. I'm sure I'm not really more productive.
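To make the Amdahl's law point concrete, here's a minimal sketch (illustrative numbers only, and the helper name is my own): if tooling speeds up only the fraction p of the job it actually touches by a factor s, the overall speedup is 1 / ((1 - p) + p / s), so with p = 0.5 the gain is capped at 2x no matter how good the tools get.

    # Illustrative sketch, not a measurement; amdahl_speedup is a throwaway name.
    def amdahl_speedup(p, s):
        # p: fraction of the work the tooling actually helps with
        # s: how much faster that fraction gets
        return 1.0 / ((1.0 - p) + p / s)

    print(amdahl_speedup(0.5, 10))    # ~1.82x overall
    print(amdahl_speedup(0.5, 1e9))   # ~2.0x overall, the hard cap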