▲ | Workaccount2 2 days ago
> Because down that path lies skill atrophy.

I wonder how many programmers have assembly-code skill atrophy? Few people will weep over the death of the necessity to use abstract logical syntax to communicate with a computer, just as few people weep over the death of having to type out individual register manipulations.
▲ | cmsj 2 days ago | parent | next
I would say there's a big difference with AI, though. Assembly is just programming. It's a particularly obtuse form of programming in the modern era, but ultimately it rests on the same fundamental concepts you use when writing JavaScript. Do you learn more about what the hardware is doing when using assembly versus JavaScript? Yes. Does that matter for the creation and maintenance of most software? Absolutely not.

AI changes that: you don't need to know any computer science concepts to produce certain classes of program with AI now, and if you can keep prompting it until you get what you want, you may never need to exercise the conceptual parts of programming at all. That's all well and good until you suddenly do need to do some actual programming, but it's been months or years since you last did, and you now suck at it.
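To make the "same fundamental concepts" point concrete, here is the same accumulation loop sketched twice: once in JavaScript, and once (in the comments) as rough, illustrative x86-64 assembly. The register assignments and labels are assumptions chosen for the example, not the output of any particular compiler; the point is only that both levels initialize an accumulator, test a condition, add an element, and step an index.

    // One loop, two levels of abstraction.
    //
    // Rough x86-64 sketch (illustrative only; register choices assumed):
    //   xor  eax, eax             ; total = 0
    //   xor  ecx, ecx             ; i = 0
    // next:
    //   cmp  rcx, rsi             ; i < length ?
    //   jge  done
    //   add  eax, [rdi + rcx*4]   ; total += values[i]
    //   inc  rcx                  ; i++
    //   jmp  next
    // done:                       ; result in eax

    function sum(values) {
      let total = 0;                          // the "register" gets a name
      for (let i = 0; i < values.length; i++) {
        total += values[i];                   // same add, same loop test
      }
      return total;
    }

    console.log(sum([1, 2, 3, 4]));           // 10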
▲ | kmeisthax 2 days ago | parent | prev
Most programmers don't need to develop that skill unless they need more performance or are modifying other people's binaries[0]. You can still do plenty of search-and-learning using higher-level languages, and what you learn at one level can generalize to the others. Even if LLMs make "plain English" programming viable, programmers still need to write, test, and debug lists of instructions.

"Vibe coding" is different: you're telling the AI to write the instructions and acting more like a product manager, except without any of the actual communication skills that a good manager has to develop, and without any of the search and learning I mentioned before.

For that matter, a lot of chatbots don't do learning either. Chatbots can sort of search a problem space, but they only remember the last 20-100k tokens. We don't have a way to encode tokens that fall out of that context window into longer-term weights. Most of their knowledge comes from what they learned from training data - again, cheated off humans, just like humans can now cheat off the AI. This is a recipe for intellectual stagnation.

[0] e.g. for malware analysis or videogame modding
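On the context-window point, here is a minimal sketch of what "only remembering the last 20-100k tokens" means operationally. The countTokens heuristic and the 20,000-token budget are made-up stand-ins (real systems use an actual tokenizer and budgets vary by model); the relevant behavior is that whatever falls outside the budget is simply dropped, not folded back into the model's weights.

    // Hypothetical sketch: keep only the newest messages that fit a token budget.
    // countTokens is a crude stand-in, not a real tokenizer.
    const countTokens = (text) => Math.ceil(text.length / 4);

    function trimToWindow(messages, maxTokens = 20000) {
      const kept = [];
      let used = 0;
      // Walk backwards from the newest message; anything older than the
      // budget allows is discarded, not "learned".
      for (let i = messages.length - 1; i >= 0; i--) {
        const cost = countTokens(messages[i].content);
        if (used + cost > maxTokens) break;
        kept.unshift(messages[i]);
        used += cost;
      }
      return kept;
    }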