ryandvm 4 hours ago
I dunno, man. I've been doing this for 20+ years and I think we're at a really important fork in the road with two possibilities.

The first is that AI is achieving human-level expertise and capability, but since models are now increasingly trained on their own output, they are fighting an uphill battle against model collapse. In that case, perhaps AI just maxes out at "knowing everything," and agentic coding is simply another massive paradigm shift in a long line of technological paradigm shifts: the tooling has changed, but total job market collapse is unlikely.

The other possibility is that we continue to see escalating AI capability with regard to context, information retrieval, and most importantly "cognition" (whatever that means). Maybe we overcome the challenges of model collapse. Maybe we figure out better training methodologies that don't just produce a chatbot version of Stack Overflow + Wikipedia + Reddit. Maybe we actually start seeing AI create and not just recreate.

If it's the latter, then I think engineers who think they are going to stay ahead of AI sound an awful lot like saddle makers who said "pffft, these new cars can only go 5 miles per hour."
golddust-gecko 3 hours ago
100% this. I'll also add another factor: it's become increasingly clear at our company that AI-enabled humans are getting through the backlog of feature ideas much more quickly. That makes the "good ideas" part of the business the rate-limiting step, and good ideas are definitely not increasing with AI, beyond the churn generated by AI itself ("let's bolt on a chat experience or an MCP!"). So maybe the coding assistants don't get a 10x improvement any time soon, but we still see engineering job market contraction because there aren't enough good ideas to turn into code.
bborud 3 hours ago
You are supposing it's a given that AI is achieving human-level expertise and capability. I am not so sure. Right now that's much further from the truth than it might appear at first glance.
koonsolo 22 minutes ago
Do you think the latter can be achieved with the LLM neural network architecture? I highly doubt it. Neural networks are very old tech, and it has taken this long to get us here. I'm sure we'll reach AGI at some point, but looking at the history of AI, I don't see that coming any time soon.
WorldMaker 2 hours ago
> max out at "knowing everything"

LLMs know nothing but are great at giving the illusion that they know stuff. (It's "mansplaining as a service": it is easier to give a confident answer every time, even a wrong one, than to program actual knowledge.) Even your first case seems wildly optimistic. The second case is a lot of "maybes" and "we don't know how but we might figure it out," which seems like a lot to bet an entire farm on, much less an entire industry of farms.

We sure are looking at a shift in the job market, but I don't think it is a fork in the road so much as a Slow/Yield sign. Companies are signalling that they are willing to act on promises and hope to cut labor costs, whether or not the results are real. I don't think anything about current AI can kill the software development industry, but I do think it can make it a lot more miserable, lower wages, and artificially reduce job demand. That has nothing to do with the real capabilities of today's AI and everything to do with the perception being enough of an excuse, for companies that were always looking for one. (Just as ageism has always existed: AI is also just a fresh excuse for companies to carry on aging experience out of their staff, especially people with memories long enough, or well schooled enough, to remember previous AI booms and busts.)

But also, yeah, if some magic breakthrough makes this a real "buggy whip manufacturer moment" and not just the illusion of one, I don't mind being the engineer on that side of it. There's nothing wrong with lamenting the coming death of an industry that employs a lot of good people and tries to make good products. This is HN: you celebrate the failures, learn from them, and then you pivot or try something new. If the evidence tells me to pivot, I will pivot; I'm already debating trying something entirely new. But learning from the failures can also mean asking "what went right?" and acknowledging how many people did a lot of good, hard work despite the outcome.