| ▲ | oxag3n 19 hours ago |
| > We're thinking about AI wrong. And this write-up is not an exception. Why even bother thinking about what AI is, when the Anthropic and OpenAI CEOs openly tell us what they want (quote from a recent Dwarkesh interview) - "Then further down the spectrum, there’s 90% less demand for SWEs, which I think will happen but this is a spectrum." So save the thinking and listen to the intent - replace 90% of SWEs in the near future (6-12 months, according to Amodei). |
|
| ▲ | Galanwe 19 hours ago | parent | next [-] |
I don't think anyone serious believes this. Replacing developers with a less costly alternative is obviously a bullish dream for the market; it has existed for as long as I've worked in the field. First it was supposed to be UML-generated code by "architects", then developers from developing countries, then no-code frameworks, etc. AI will be a tool, no more, no less. Most likely a good one, but there will still need to be people driving it, guiding it, and fixing things for it. All this talk from CEOs is just that, stock market pumping: tech is the most profitable sector and software engineers are costly, so having investors dream about scale + lower costs is good for the stock price. |
| |
▲ | oxag3n 19 hours ago | parent | next [-] | | Ah, don't get me wrong - I don't believe it's possible for LLMs to replace 90%, or any significant share, of SWEs with existing technology. All I'm saying is: why debate what AI is (exoskeleton, co-worker, new life form) when its owners' intent is to create an SWE replacement? If your neighbor is building a nuclear reactor in his shed from a pile of smoke detectors, you don't say "think of this as a science experiment" just because it's impossible; you call the police/NRC because of the intent and the actions. | | |
▲ | xyzsparetimexyz 16 hours ago | parent [-] | | > If your neighbor is building a nuclear reactor in his shed from a pile of smoke detectors, you don't say "think of this as a science experiment" just because it's impossible; you call the police/NRC because of the intent and the actions. Only if you're a snitch loser |
| |
▲ | user3939382 4 hours ago | parent | prev [-] | | If you gave the LLM your carefully written UML, maybe its output would be better lol. That’s what we’re missing: a mashup of the hype-cycle tools. |
|
|
| ▲ | anyonecancode 4 hours ago | parent | prev | next [-] |
If the goal is to reduce the need for SWEs, you don’t need AI for that. I suspect I’m not alone in observing how inefficient companies often are, with devs ending up spending a lot of time on projects of questionable value, something that seems to happen more often the larger the organization. I recall one job where my manager insisted I delegate building a React app for an internal tool to a team of contractors rather than let me focus for two weeks and knock it out myself. The people-management stuff is always the hard part, and AI isn’t going to solve that; I don’t know what my previous manager’s deal was, but AI wouldn’t have fixed it. |
|
| ▲ | jacquesm 19 hours ago | parent | prev | next [-] |
Not without some major breakthrough. What's hilarious is that all these developers building the tools are going to be the first to be without jobs. Their kids will be ecstatic: "Tell me again, dad, so, you had this awesome, well-paying, easy job and you wrecked it? Shut up kid, and tuck in that flap, there is too much wind in our cardboard box." |
| |
▲ | overgard 15 hours ago | parent | next [-] | | Couldn't agree more, and isn't that the bizarre thing? "We have this great, intellectually challenging job where we as workers have leverage. How can we completely ruin that while also screwing up every other white-collar profession?" | | |
▲ | entrox 9 hours ago | parent [-] | | Why is it bizarre? It is inevitable. After all, AI has not ruined the creative professions; it has merely disrupted and transformed them. And yes, I fully understand that this whole comment is snarky, but please bear with me. Let's rewind 4 years to this HN article titled "The AI Art Apocalypse": https://news.ycombinator.com/item?id=32486133 and read some of the comments. > Actually all progress will definitely will have a huge impact on a lot of lives—otherwise it is not progress. By definition it will impact many, by displacing those who were doing it the old way by doing it better and faster. The trouble is when people hold back progress just to prevent the impact. No one should be disagreeing that the impact shouldn't be prevented, but it should not be at the cost of progress. Now it's the software engineers' turn to not hold back progress. Or this one: https://news.ycombinator.com/item?id=34541693 > [...] At the same time, a part of me feels art has no place being motivated by money anyway. Perhaps this change will restore the balance. Artists will need to get real jobs again like the rest of us and fund their art as a side project. Replace "Artists" with "Coders" and imagine a plumber writing that comment. Maybe this one: https://news.ycombinator.com/item?id=34856326 > [...] Artists will still exist, but most likely as hybrid 3d-modellers, AI modelers (Not full programmers, but able to fine-tune models with online guides and setups, can read basic python), and storytellers (like manga artists). It'll be a higher-pay, higher-prestige, higher-skill-requirement job than before. And all those artists who devoted their lives to draw better, find this to be an incredibly brutal adjustment. Again, replace "Artists" with coders and fill in the rest. So, please get in line and adapt. And stop clinging to your "great intellectually challenging job" because you are holding back progress. It can't be that challenging if it can be handled by a machine anyway. | | |
▲ | tovej 7 hours ago | parent [-] | | The premise of those comments, just like the premise in this thread, is ridiculous and fantastical. The only way generative AI has changed the creative arts is that it has made it easier to produce low-quality slop. I would not call that a true transformation; I'd call it cutting costs at the expense of quality. The same is true of software. The difference is that, unlike in art, quality in software has very clear safety and security implications. This gen AI hype is just the crypto hype all over again, but with a sci-fi twist in the narrative. It's a worse form of work, just like crypto was a worse form of money. | | |
▲ | entrox 7 hours ago | parent | next [-] | | I do not disagree; in fact, I'm feeling more and more Butlerian with every passing day. However, it is undeniable that a transformation is taking place -- just not necessarily for the better. | |
▲ | topocite 5 hours ago | parent | prev [-] | | I just don't understand this line of thinking. Gen AI is the opposite of crypto: its use is immediate and obvious, and needs no explanation or philosophizing. If you have never learned anything from gen AI, you are basically showing your hand that you either have zero intellectual curiosity or are delusional about your own abilities. |
|
|
| |
▲ | rXwubXUGAm 5 hours ago | parent | prev | next [-] | | I'm assuming they all have enough equity that if they actually managed to build an AI capable of replacing themselves, they'd be financially set for the rest of their lives. | |
| ▲ | metaltyphoon 19 hours ago | parent | prev | next [-] | | I have a feeling they internally say "not me, I won't be replaced" and just keep moving... | | | |
▲ | arcxi 11 hours ago | parent | prev | next [-] | | Is this the first time workers have directly worked on their own replacement? If so, software developer may go down in history as the dumbest profession ever. | |
| ▲ | moron4hire 16 hours ago | parent | prev [-] | | "Well son, we made a lot of shareholder value." |
|
|
| ▲ | overgard 15 hours ago | parent | prev | next [-] |
The funny thing is, I think these things would work much better if they WEREN'T so insistent on the agentic thing. Like, I find in-IDE AI tools a lot more precise, and I usually move just as fast as with a TUI, with a lot less rework. But Claude is CONSTANTLY pushing me to try to "one shot" a big feature while asking me for as little context as possible. I'd much rather have it work with me than have it wander off and write a thousand lines. It's obviously designed for Anthropic's best interests rather than mine. |
| |
|
| ▲ | IX-103 6 hours ago | parent | prev | next [-] |
Where is this "90% less demand for SWEs" going to come from? Are we going to run out of software to write? Historically, when SWEs became more efficient, we just started making more complicated software (and SWE demand actually increased). |
| |
▲ | elevatortrim 6 hours ago | parent [-] | | That happens in times of bullish markets and growing economies; then we want a lot of SWEs. In times of uncertainty, when things are going south, that changes to "we need as few SWEs as possible" - hence the current narrative, with everyone looking to cut costs. Had GPT-3 emerged 10-20 years ago, the narrative would be "you can now do 100x more thanks to AI". |
|
|
| ▲ | dasil003 10 hours ago | parent | prev [-] |
I sort of agree that the random pontification and bad analogies aren't super useful, but I'm not sure why you would believe the intent of the AI CEOs has more bearing on outcomes than, you know, actual utility over time. I mean, those guys are so far out over their skis in terms of investor expectations that theirs is the last opinion I would take seriously as a best-effort prediction. |