▲ | ido 3 days ago |
Historically, improvements in programmer productivity (e.g. via better languages, tooling and hardware) didn't correlate with a decrease in demand for programmers, but quite the opposite.
▲ | scarface_74 3 days ago | parent | next [-] |
This is completely different - said as someone who has been in the industry professionally for 30 years, and as a hobbyist for a decade before that. There are projects I lead now where I would previously have needed at least one or two junior devs to do the grunt work after I had carefully specified the requirements (which I would have to do anyway) and drawn the diagrams - and now ChatGPT can do that work for me. That has never been the case before. I've personally gone from programming in assembly, to C, to higher-level languages, and on the hardware side, from personally managing the build-out of a data center with an entire room dedicated to a SAN holding a whopping 3TB of storage, to being able to do the same with a yaml/HCL file.
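For illustration only, a minimal Terraform-style HCL sketch of the "same with a yaml/HCL file" point. The resource name, availability zone, and volume type are assumptions for the sketch, not details the commenter gave; only the ~3TB figure comes from the comment:

    # Hypothetical: ~3 TB of block storage declared in a few lines of HCL,
    # instead of a room-sized SAN. All names and values are illustrative.
    resource "aws_ebs_volume" "bulk_storage" {
      availability_zone = "us-east-1a"
      size              = 3072   # GiB, roughly the 3TB the SAN held
      type              = "gp3"
    }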
▲ | torginus 2 days ago | parent | prev | next [-] |
Imo historically there was no connection between the two - demand for programmers increased while, at the same time, better tools came along. I remember Bill Gates once said (sometime in the 2000s) that his biggest gripe was that over his decades in the software industry, despite dramatic improvements in computing power and software tools, there had only been a modest increase in productivity.

I started out programming in C for DOS, and once you got used to how things were done, you were just as productive. The stuff frameworks and the like help with is 50% of the job at most, which means that, by Amdahl's law, productivity can at most double (see the worked bound below).

In fact, I'd argue productivity actually went down (comparing my output now vs. back then). I blame this on two factors:

- Distractions: it's so easy to d*ck around on the internet instead of doing what you need to do. I have a ton of my old SVN/CVS repos, and the amount of progress I made was quite respectable, even though I recall being quite lazy.

- Tooling actually got worse in many ways. I used to write programs that ran on the PC: you could debug them with breakpoints, read the logs as txt, and deployment consisted of zipping up the exe or uploading the firmware to the uC. Nowadays you work with CI/CD, the cloud, and all sorts of infra stuff, and debugging consists of adding logging and reading logs. I'm sure I'm not really more productive.
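To spell out the Amdahl's law step from the comment above: if tooling can only speed up a fraction p of the work, and p = 0.5 as claimed, then even an infinite speedup s on that fraction caps the overall gain at 2x. A minimal LaTeX sketch of the standard bound (nothing here is specific to the comment beyond p = 0.5):

    S(s) = \frac{1}{(1 - p) + p/s},
    \qquad \lim_{s \to \infty} S(s) = \frac{1}{1 - p} = \frac{1}{1 - 0.5} = 2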