| ▲ | bborud 4 hours ago |
| Multiple times per week I have the same conversation. It goes something like this:
- AI will make developers irrelevant
- Why?
- Because LLMs can write code
- Do you know what I do for a living?
- Yes, write code?
- Yes, about 2-5% of the time. Less now.
- But you said you are a developer?
- I did
- So what do you do 95-98% of the time?
- I understand things and then apply my ability to formulate solutions
- But I can do that!
- So why aren't you?
The developers who still think their job is about writing code will perhaps not have a job in the future. Brutal as it may sound: I'm fine with that. I'm getting old and I value my remaining time on the planet. Business owners who think they can do without developers because they think LLMs replace developers are fine by me too. Natural selection will take care of them in due course. |
|
| ▲ | doug_durham 2 hours ago | parent | next [-] |
| This is a bit of a glib answer. Most of the time is spent coding, which encompasses typing, retyping, and retyping again. It also includes banging your head against the wall while trying to get one of your rewrites to work against an under-documented API. OP's formulation makes SWE sound like a purely noble enterprise like mathematics. It's more like an oil rig worker banging on pieces of metal with large hammers to get the drill string put together. They went in with a plan, but the reality didn't agree, and they are on a tight schedule. |
| |
| ▲ | sleight42 2 minutes ago | parent | next [-] | | This is far more true for junior and perhaps mid-career engineers, unless you're working in an extremely well-defined problem space (* see below). When working as a SWE, the longer I did it (~30 years), the more of my time was spent understanding the problem, the edge cases, how to handle the edge cases, and how to do all of it affordably, on time, and within budget. That's engineering. What you're describing is "writing code". That's lower value than "solving the problem". I imagine a response, "But agile development, etc." Yep. Part of solving the problem often involves creating prototypes to determine the essential viability of the solution. But that's only part of it. Which prototypes do you write? How much time do you allocate to them before accepting it's a dead end (at least for now) and punting on it? That's engineering. Me probably coming across as a dick today? Well, I was diagnosed autistic a year ago, and I'm on extended sabbatical/unemployment due to autistic burnout. And masking is part of how I got the burnout. * Why would someone be paying for that when there is likely someone else already doing it? (Unless you're the rare person who hopes to "disrupt" the competition.) | |
| ▲ | estebank 2 hours ago | parent | prev | next [-] | | Most of the time is spent figuring out what the right thing to do is, not writing the implementation. Sometimes the process of writing the implementation surfaces new considerations about what the right thing is, but still, producing text to feed to a compiler is not the bulk of the work of a software engineer. The work is to unearth requirements and turn them into repeatable software. | | |
| ▲ | powvans 40 minutes ago | parent [-] | | Feels like lately most of the time is spent arguing about or at least worrying about whether or not AI is going to replace all software developers. |
| |
| ▲ | ecocentrik an hour ago | parent | prev | next [-] | | Glib is called for. The amount of information asymmetry that's still on the table as vibe coders and vibe engineers and vibe doctors emerge is staggering. Professional experience is still incredibly valuable. Most software developers might spend more than 6% of their time coding but no Senior Developers are banging their heads for hours over typos. https://www.youtube.com/shorts/xBilK3gT5e0 | | |
| ▲ | pear01 a minute ago | parent [-] | | This is temporary. What is the SKILL.md equivalent going to be in five years? In ten? You don't already see a pattern emerging around solutions to encode that "professional experience" into the tools themselves? These LLMs can already incorporate our entire cultural corpus yet your "professional experience" is the threshold they won't cross? |
| |
| ▲ | pear01 15 minutes ago | parent | prev | next [-] | | Let's also not forget that a lot of the market edge of SWEs comes from knowing how to navigate these parts. The fact that you needed to be reasonably fluent in a language was already a barrier to entry, which meant that in better times new grads could earn six figures at their first job just for putting in that effort. Maybe you will still be needed. That is one question. How well you will be paid and treated when the barrier to entry is now "I can think" is another. As the parent indicates, most people doing software are not doing things akin to pure math. I don't think most SWEs want that lifestyle anyway. It's ok. You shouldn't fight the coming change. Instead, use the time we still have to fight for more equal outcomes (vote for politicians that support UBI, Medicare for all). The longer you delude yourself that you are uniquely needed in an increasingly mechanized world, the worse all our outcomes will be. | |
| ▲ | 2 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | bborud 2 hours ago | parent | prev | next [-] | | Are you, perchance, assuming that since you spend most of your time struggling with actual code, this is so for everyone else? Or are you saying that I'm lying. That I am secretly hammering away at my keyboard while pretending not to? No, writing code hasn't been how I spend most of my time for many decades now. | | |
| ▲ | therealdrag0 2 hours ago | parent [-] | | Are you a staff level engineer that has dozens of other engineers banging away at code projects you help define? | | |
| ▲ | eska an hour ago | parent | next [-] | | Try writing a design doc before you implement something (which people find they need to do for LLMs to work at all anyway). You'll find that you spend much less time actually writing code. Write proper API documentation laying out the assumptions and intent, plus a design and architecture document. You'll find that you spend a lot less time reading code. | | |
| ▲ | dkersten an hour ago | parent [-] | | > which people find they need to do for LLMs to work at all anyway Everything we have to do for AI to function well would help humans function better too. If you take the things done for AI but do them for humans instead, that human will easily be 2x or more as productive, and someone will actually understand the code that gets written. |
| |
| ▲ | bborud an hour ago | parent | prev | next [-] | | It has varied over the years but it isn't actually relevant since I am talking about when I write software. Writing code just isn't what takes time. | | |
| ▲ | QuercusMax an hour ago | parent [-] | | Getting the code into a state where it actually does what you want takes time - but a lot of that is research, testing, experimentation, documentation, etc. Those can be faster with AI assistance but you still need to bang on it enough to make sure it works right. |
| |
| ▲ | kakacik 34 minutes ago | parent | prev [-] | | I am not, yet actual coding is a minuscule part of my workflow. The rest is approximately un-automatable by any LLM: politics, meetings, discussions, brainstorming, organizing testing teams, stakeholders, and so on. This is what big corporations look like, not some SV startups. |
|
| |
| ▲ | logicchains 36 minutes ago | parent | prev [-] | | >OP's formulation makes SWE sound like a purely noble enterprise like mathematics. It's more like an oil rig worker banging on pieces of metal with large hammers to get the drill string put together. Those two formulations represent different developers' approaches to the same task. The former being developers who are much better at planning than the latter. |
|
|
| ▲ | KronisLV 4 hours ago | parent | prev | next [-] |
| > Yes, about 2-5% of the time. There are also those for whom that percentage is higher, let’s say 6-50%. > I understand things and then apply my ability to formulate solutions The AI is coming for that too. You might just be lucky to be in circumstances that value your contributions, or an industry or domain that isn’t well represented in the training data, or problem spaces too complex for AI. Not everyone is, not even the majority of devs. People knocking out Jira tickets and writing CRUD webapps will often end up with their livelihood taken away. Or bosses will just expect more output for same/less pay, with them having to use AI to keep up. |
| |
| ▲ | geodel 2 hours ago | parent | next [-] | | Agree. It is just like 2 totally separate groups are arguing. One is a very tiny slice of specialty/rare industries where code is critical but an overall small part of project costs. I can see that if code/software is 5% of overall cost, even heavy use of AI for the code part is not moving the needle. So people in this group can feel confident in their indispensability. The second group is much larger and peddling CRUD / JS frontends and other copy/paste junk. But as per industry classification they are just part of the same Coder/Developer/IT Engineer group. And their bleak prospects are not some future scenario; it is playing out right now, with tons of them getting laid off. And a whole lot of people with IT degrees and certifications are not finding any jobs in this field. | | |
| ▲ | marcindulak 37 minutes ago | parent | next [-] | | After hearing various similarly sounding opinions about CRUD being easy for LLMs, I started tracking how well LLMs handle a standard CRUD Django app I'm familiar with at https://github.com/marcindulak/learning-api-styles-gen-ai-ex.... So far it appears that LLMs still require constant hand-holding, even for a small educational CRUD app. | |
| ▲ | hjort-e 2 hours ago | parent | prev [-] | | What makes you feel that a complex frontend would be easier for AI than a non-CRUD backend system? | | |
| ▲ | evilduck an hour ago | parent | next [-] | | Hubris. I don't mean this as a snarky jab. It's coming for anything software. I've used AI to accomplish front end development and to reverse engineer proprietary USB hardware dongles in C, then rewrite the C into Rust to get easy desktop GUIs around it. Backend APIs, systems programming, embedded programming: they all seem equally threatened; it's just a matter of time. Front end progress is easy to see in the AI web front ends, but everything else is still easy pickings. | | |
| ▲ | hjort-e 35 minutes ago | parent | next [-] | | I 100% agree it's coming for everything. I'm just curious what the arguments would be for why frontend would be easier. | |
| ▲ | skydhash an hour ago | parent | prev [-] | | > I've used AI to accomplish front end development and reverse engineer proprietary USB hardware dongles in C, then rewriting the C into Rust to get easy desktop GUIs around it. Backend That is not hard. It’s just tedious and very slow to do manually. The hard part would be designing a USB dongle and ensuring that the associated software has good UX. The reason you don’t see kernel devs reverse engineering devices is not that it’s impossible or that it requires expert knowledge. It’s that it’s like counting grains of sand on the beach. |
| |
| ▲ | geodel an hour ago | parent | prev [-] | | It is irrelevant whether a complex frontend would be easy for AI or not. To me the questions are: 1) how many unique complex frontends are needed out of the total frontends that millions of sites out there need, and 2) will there be an increase in the need for such frontend engineers so other displaced folks can land a job there? I think it will be far too few to have any positive impact on IT engineers' overall job prospects. | | |
| ▲ | hjort-e 40 minutes ago | parent [-] | | But that's equally true for any type of system. Frontend isn't inherently easier than other systems, so I was just wondering why you singled it out. To me AI just seems better at backends and database design | | |
| ▲ | geodel 21 minutes ago | parent [-] | | OK, my examples seemed biased against frontend, which was not the intention. The thrust was overall job prospects for people in the software field. It is not that frontend is easy, but it is definitely easy to get into. Considering there are far more frontend developers than, say, C++ systems engineers or database designers, in sheer numbers they will be affected more. | | |
| ▲ | hjort-e 6 minutes ago | parent [-] | | Ah okay, that's fair. In my country boot camps aren't a thing, so frontend devs are rare and good frontend devs even rarer, so I think it depends on where in the world you are. We have an abundance of Java devs here that I fear for more |
|
|
|
|
| |
| ▲ | nitwit005 33 minutes ago | parent | prev | next [-] | | > The AI is coming for that too. Yes, but if/when that happens, it won't just affect software engineers. An AI that can do that can replace any white collar worker. > People knocking out Jira tickets and writing CRUD webapps will end up with their livelihood often taken away. I'm not sure anyone is actually working on those. People talk about spending all day writing CRUD apps here, but if you suggest there are already low code tools to build those, they will promptly tell you it's too complex for that to work. | | |
| ▲ | laughing_man 16 minutes ago | parent [-] | | >Yes, but if/when that happens, it won't just affect software engineers. An AI that can do that can replace any white collar worker. Yes. Yes, that's exactly what we're going to see, and more swiftly than people are generally comfortable with. What are we going to do with all those cubicle dwellers? |
| |
| ▲ | dmazzoni 3 hours ago | parent | prev | next [-] | | There are periods of time where I might spend 80% of my time "coding", meaning I have minimal meetings and other responsibilities. However, even out of that 80% of my time, what fraction is actually spent "writing code"? AI can be an enormous accelerator for the time I'd normally spend writing lines of code by hand, but it doesn't really help with the rest of the work:
- Understanding the problem
- Waiting for the build system and tests to run
- Manually testing the app to make sure it behaves as I'd like
- Reviewing the diff to make sure it's clear
- Uploading the PR and writing a description
- Responding to reviewer feedback
There are times when AI can do the "write the code" portion 10x faster than I could, but if it's production code that actually matters, by the time I actually review the code, I doubt it's more than 2x. | |
| ▲ | coldtea 2 hours ago | parent [-] | | > AI can be an enormous accelerator for the time I'd normally spend writing lines of code by hand, but it doesn't really help with the rest of the work: understanding the problem; waiting for the build system and tests to run; manually testing the app; reviewing the diff; uploading the PR and writing a description; responding to reviewer feedback Which part of those do you think it doesn't help with? | |
| ▲ | malfist an hour ago | parent [-] | | There is no shortcut to understanding. No one can understand things for you |
|
| |
| ▲ | LPisGood 3 hours ago | parent | prev | next [-] | | > The AI is coming for that too. That may be true, I'm not going to say one way or the other, but if AI comes for that then almost all knowledge work is effectively dead, so all that's left would be sales or physical labor. | |
| ▲ | ge96 2 hours ago | parent | next [-] | | I wonder, though: can AI make the next JS framework? I mean that in sincerity; there was the leap from jQuery to React, for example. If an AI only knows jQuery and no one makes React, will React come out of AI? | |
| ▲ | wiseowise 6 minutes ago | parent | next [-] | | It can't. Framework hierarchy is largely based on social structure, rather than pure technical merit. Otherwise React would've been displaced long time ago. | |
| ▲ | scj an hour ago | parent | prev | next [-] | | A thought experiment: When all practical software is only written by AIs, will the AIs use goto? What will the programming language of AIs look like? My bet is something _like_ assembly, but not assembly. That being said, I think humans will still program for fun. Just like we paint portraiture in a world with cameras. | | |
| ▲ | ge96 40 minutes ago | parent [-] | | Yeah that's my thing for my hardware projects, I'm not going to reach for an LLM to do it, I want to write the code myself/be present. For something new I would consider using LLM to generate something, like a computer vision implementation or something I don't already know. The end result I would know how it works, just for POC. |
| |
| ▲ | ASalazarMX 2 hours ago | parent | prev | next [-] | | News: "AGI refuses to make another JS framework, rages on the follies of misguided developers and their wasteful JS crutches" Developer community: Wow, we truly have become obsolete now! | |
| ▲ | notpachet 33 minutes ago | parent | next [-] | | In a shocking twist, it turns out that Mootools is the agents' preferred framework | |
| ▲ | ge96 2 hours ago | parent | prev [-] | | Who will be the disrupters when there is nothing to disrupt |
| |
| ▲ | smrq 2 hours ago | parent | prev [-] | | People didn't leap from jQuery to React. It's a lot easier to imagine an AI looking at jQuery and [insert any server side MVC framework] and inventing Backbone. |
| |
| ▲ | BurningFrog 2 hours ago | parent | prev [-] | | The history of the last 250 years is inventing new professions as old ones are automated away. I expect that to continue. | | |
| ▲ | coldtea 2 hours ago | parent | next [-] | | The history of the last 250 years was moving from agriculture to industrial work to service work. Now the last frontier is starting to be overtaken by automation too. (And in all of those transitions, millions were left behind without work or with far worse prospects. The people who took the new jobs were often a different group, not people who knew the old jobs and were already in their 30s and 40s.) And what would be the new professions that uniquely require humans, when even thinking and creative jobs are eaten by AI? Would there be a boom of demand for dancers and chefs, especially as millions lose their service jobs? |
| ▲ | charlie90 8 minutes ago | parent | prev | next [-] | | Like doordashing and pokemon card reselling. | | | |
| ▲ | nitwit005 31 minutes ago | parent | prev | next [-] | | Given some sort of machine with human capabilities, there would be no reason to assign that profession to a human, excepting perhaps cost. | |
| ▲ | georgemcbay an hour ago | parent | prev [-] | | > The history of the last 250 years is inventing new professions as old ones are automated away. Even if this still holds true ("past performance is no guarantee of future results") the part about it that people handwave away without thinking about or addressing is how awful the transitional period can be. The industrial revolution worked out well for the human labor force in the long term, but there were multiple generations of people who suffered through a horrendous transition (one that was only alleviated by the rise of a strong labor movement that may not be replicable in the age of AI, given how it is likely to shift the leverage of labor vs. capital). If you want to lean on history as an indication that massive sudden productivity changes will make things better for humanity in the long run, then fine, but then you have to acknowledge that (based on that same history) the transition could still be absolutely chaotic and awful for the lifespan of anyone who is currently alive. |
|
| |
| ▲ | tjwebbnorfolk 2 hours ago | parent | prev | next [-] | | >> I understand things and then apply my ability to formulate solutions > The AI is coming for that too. If this is true, then you'd have to conclude that AI is coming for everything. I'm still not convinced by that. But I am convinced that the part of software development that involves typing code manually into an IDE all day is likely gone forever. | | |
| ▲ | itsafarqueue 2 hours ago | parent | next [-] | | > If this is true, then you'd have to conclude that AI is coming for everything. Now you’re getting it | |
| ▲ | flatline an hour ago | parent | prev [-] | | It really doesn't have to come for everything to feel like it's taking everything. If it eliminates 10% of white collar jobs over the next decade, the impact will be felt everywhere. |
| |
| ▲ | wiseowise 9 minutes ago | parent | prev | next [-] | | I swear to God, it's like the majority of IT workers are spineless worms without an iota of self-respect. What are the reasons for this? Is it because most are nerds who were bullied in school, or some weird form of elitism-inferiority complex? Only in tech do I see people hating and not taking any pride in their work. Ask a lawyer or a doctor and they'll fight you to the death if you threaten their status. But here it is not only acceptable, but also for some reason encouraged, to feel inferior. You're earning a shitton of money? Well, you're just a lucky insect who taps on a keyboard while X does the real job; about time AI put you on the street. Disgusting. If some AI reckoning does come, I hope it hits you first. |
| ▲ | PunchyHamster 3 hours ago | parent | prev | next [-] | | > The AI is coming for that too. Current AI tech giants prove over and over and over again that this is not the case | | |
| ▲ | cromka 2 hours ago | parent [-] | | We've literally just started, what "over and over" do you refer to? | | |
| ▲ | malfist an hour ago | parent | next [-] | | I've been told for the past four years that AI is coming for my job. And that's just not true. It's no closer to that than it was 4 years ago. | |
| ▲ | laughing_man 2 minutes ago | parent | next [-] | | I'm not sure how anyone would know if it's closer or not. There's been a lot of progress in LLMs over the last four years. | |
| ▲ | KronisLV 22 minutes ago | parent | prev | next [-] | | > Its no closer to that than it was 4 years ago. There are people and companies out there releasing entire vibe coded projects and for some upwards of 80% of the code they develop is AI-assisted/generated. Since around the end of 2025 and models like Opus 4.6, the SOTA has gotten good enough to work agentically on all sorts of dev tasks with pretty good degrees of success (harnesses and how you use them still matters, ofc). | | |
| ▲ | wiseowise 5 minutes ago | parent [-] | | > There are people and companies out there releasing entire vibe coded projects and for some upwards of 80% of the code they develop is AI-assisted/generated. And how much revenue do they generate? |
| |
| ▲ | Danox 27 minutes ago | parent | prev | next [-] | | It is the lament of every generation of humans to think that they are the pinnacle of everything that has come before. We are just at the start of the so-called AI era; many very smart people coming up still haven't really gotten their hands on all of the material available from a hardware and software standpoint. We are still at the early stages. I am very optimistic. I just wish I were younger, junior high or high school age, to take advantage of it with my current resources. Damn… The oldest lament in the books. |
| ▲ | kakacik 27 minutes ago | parent | prev [-] | | It feels like it's just around the corner. But when you turn the 20th corner and it's still behind the next one, maybe things are a bit different than they seem / than clueless emotions make us believe. Long term it's bleak, but short/medium term, not so much: if I get fired it won't be an LLM replacing me but rather company politics, budget changes, etc. Which was the only real (very real) risk for the past 15 years too, consistently. But it helps to not work for a US company. |
| |
| ▲ | hansmayer an hour ago | parent | prev | next [-] | | > We've literally just started 5+ years in the software world is like 30 years in others... So... given the lacking use-cases and the humongous amounts of capital already wasted on chatbots... it's more like "we" are closer to closing curtains than to "just started"... |
| ▲ | ASalazarMX 2 hours ago | parent | prev | next [-] | | Hype cycles. AI has made developers obsolete like a dozen times in the last couple of years, at least according to its developers. |
| ▲ | luckystarr 2 hours ago | parent | prev [-] | | Discovery of the best solution in a problem space is not generative but only verificative. Meaning: the LLM can see if a solution is better than another, but it can't generate the best one from the start. If you trust it, you'll get sub-par solutions. This is definitely an agent problem instead of an LLM problem. Anybody got something explorative like this working? | | |
| ▲ | coldtea 2 hours ago | parent [-] | | So? Hundreds of millions of office and development jobs aren't about developing "optimal solutions" to begin with. |
|
|
| |
| ▲ | no_op an hour ago | parent | prev | next [-] | | Even if AI advances continue, for quite a while there's likely still going to be the 'Steve Jobs' role. That is, even if AI coding agents can, in the future, replace entire teams of SWEs, competently making all implementation decisions with no guidance from a tech-savvy human, the best software will likely still involve a human deciding what should be built and being very picky about how, exactly, it should externally behave. I don't know if it makes sense to call that person an SWE, and some people currently employed as SWEs either won't be good at this or aren't interested in doing it. But the existing pool of SWEs is probably the largest concentration of people who'll end up doing this job, because it's the largest concentration of people who've thought a lot about, and developed taste with respect to, how software should work. | | |
| ▲ | laughing_man a few seconds ago | parent | next [-] | | How many Steve Jobs types do we need as a percentage of people developing software? |
| ▲ | bmiedlar 8 minutes ago | parent | prev [-] | | This matches what I'm seeing. I've been building software for a long time, and I'm building more now with AI than I ever could with a traditional team. But the helpful throughput comes from knowing what to build and what tradeoffs matter. The AI doesn't have that. It's a force multiplier on experience, not a replacement for it. |
| |
| ▲ | Aperocky 2 hours ago | parent | prev | next [-] | | > The AI is coming for that too. That's where we fundamentally disagree. Yes, AI is coming for solution formulation, absolutely, but not all of it, because it is actually a statistical machine with a context limit. Until the day LLMs are not statistical machines with a context limit, this will hold. Someone needs to make something that has intent and purpose, and evidently not by adding another 10T to the LLM parameter count. | |
| ▲ | bel8 2 hours ago | parent | next [-] | | > because it is actually a statistical machine with context limit. So are humans. Machines have surpassed humans by magnitudes in many capabilities already (how many billion multiplications can you do per second?) And I argue that current LLMs have surpassed many of my capabilities already. For example GPT/Opus can understand and document some ancient legacy project I never saw before in minutes. I would take a week+ to do the same and my report would probably have more mistakes and oversights than the one generated by the LLM. | | |
| ▲ | KalMann 24 minutes ago | parent | next [-] | | > So are humans. AI advocates are _way_ too confident about the nature of human cognition. Questions that have been debated by philosophers and cognitive scientists for decades are now "obvious" according to you people, though you never provide any argument to support your statements. |
| ▲ | Aperocky an hour ago | parent | prev [-] | | We are not pre-trained using the summary of all human knowledge over all of history. Yet we make certain decisions with much more ease. We are much more limited, but we fundamentally work differently. Hence adding more parameters, as certain companies are doing, isn't necessarily going to help. We need to rethink how LLMs work, or how they work in tandem with something that's completely different. I think it's doable, I just don't believe it's the LLM, and I don't think anyone now knows what it is. | | |
| ▲ | bel8 an hour ago | parent [-] | | > We are not pre-trained using the summary of all human knowledge over all of history. But we are? That's our education system. The only reason school doesn't try to shove more information in our brains is because we hit bandwidth limits. | | |
| ▲ | KalMann 21 minutes ago | parent [-] | | > But we are? That's our education system. That is not what the education system does. That's an obvious distortion of reality. LLMs train over billions of documents to statistically predict the next word in order to gain an understanding of language. They do this statistical processing to mimic humans' natural language learning ability. And there has been continued evidence of the limitations of this approach to accurately mimic the totality of human cognition. |
|
|
| |
| ▲ | itsafarqueue 2 hours ago | parent | prev | next [-] | | Yours is a “God of the gaps” argument. You will remain technically correct (the best kind of correct!) long after the statistical machine has subsumed your practical argument, context limit and all. | | |
| ▲ | Aperocky an hour ago | parent [-] | | I fall into the "pessimistic heavy user" camp, I burn thousands of $ worth of SOTA tokens monthly but it just makes me more acutely aware of the limitation and amount of work I need to do to work around them and what kind of decision that I should reserve to myself instead of trusting the LLMs to do. |
| |
| ▲ | coldtea 2 hours ago | parent | prev [-] | | >but not all of it, because it is actually a statistical machine with context limit. And the human mind is not? | | |
| ▲ | KalMann 16 minutes ago | parent | next [-] | | I can give you the exact mathematical formula used to statistically optimize the output of a neural network from input examples. Can you do the same for the brain? | |
| ▲ | nothinkjustai 2 hours ago | parent | prev [-] | | It’s not. |
|
| |
| ▲ | bborud 3 hours ago | parent | prev | next [-] | | > The AI is coming for that too. To some degree yes, in practice, not so much. In practice you have to be in the world, talk to people, know how to talk to people, know how to listen, and be able to understand the difference between what people say and what they actually need. Not want. Need. This was something I learnt in my very first job in the 1980s. I worked for someone who did industrial automation beyond PLCs and suchlike. He spent 6 months working in the company. On the factory floor, in the logistics department, in procurement, in accounting and even shadowing the board. Then he delivered a proposal for how to restructure the parts of the company, change manufacturing processes, and show how logistics and procurement could be optimized if you saw them as two parts in a bigger dance. He redesigned the company so that it could a) be automated, and b) leverage automation to increase the efficiency of several parts of the business. THEN, he started planning how to write the software (this was the 80s after all), and then we started implementing it. Now think about what went into this. For instance we changed a lot of what happened on the factory floor. Because my boss had actually worked it. So he knew what pain points existed. Pain points even the factory workers didn't know how to address because they didn't know that they could be addressed. I was naive. I thought this was how everyone approached "software projects". People generally don't. But it did teach me that the job isn't writing code. It is reasoning about complex systems that often are not even known to those who are parts of it. And this is for _boring_ software that requires very little creativity and mostly zero novelty. Now imagine how you do novel things. > People knocking out Jira tickets and writing CRUD webapps will end up with their livelihood often taken away. 
> Or bosses will just expect more output for same/less pay, with them having to use AI to keep up. You make it sound like it is a bad thing that certain tasks become easier. I spent a lot of time writing CRUD stuff. Because the things I really want to work on depend on them. I don't enjoy what is essentially boilerplate. Who does? If you can do the same job in 1/20 the time, then how is this a bad thing? It is only a bad thing if writing CRUD webapps is the limit of your ability. We don't argue for banning excavators because it puts people with shovels out of work. We find more meaningful things for them to do and become more productive. New classes of work become low-skilled jobs. If you have been doing software for a while, you are probably doing some subset of this. But these things are hard to articulate. It is hard to articulate because it is not something we think about. Like walking: easy for us to do, hard to program a robot to do it. | | |
| ▲ | coldtea 2 hours ago | parent | next [-] | | >To some degree yes, in practice, not so much. In practice you have to be in the world, talk to people, know how to talk to people, know how to listen, and be able to understand the difference between what people say and what they actually need. Not want. Need. 1 person needs to do that. The other 100 aren't doing that currently to begin with; they're doing the AI-automatable work. | |
| ▲ | SoftTalker 3 hours ago | parent | prev [-] | | > To some degree yes, in practice, not so much. We used to say that (not long ago, even) about the code-writing part. Why do we believe that LLMs are going to stop there? Why do we think they won't soon be able to talk to people, listen, and determine what they need? I think it's mostly a cope. We have robots walking just fine now, by the way. | | |
| ▲ | sarchertech 3 hours ago | parent | next [-] | | If they can do those things they can effectively replace any white collar job. That's about 45% of the workforce. Societies tend to collapse around 25-30% unemployment. Imagine 45% of higher than average paying jobs gone. If that happens we'll either figure out a new economic system, or society will collapse. Also, saying robots are walking "just fine" is misleading for any definition of "just fine" that means anywhere near as good as a human. | | |
| ▲ | ryandrake 2 hours ago | parent | next [-] | | Look at how the billionaires are talking about AI: Their clear, unambiguous goal is basically to replace all white collar "knowledge" jobs. And there's currently nothing regulatory that's stopping them--they just need to wait for the state of the art to improve. Once AI is "good enough," if it ever is, they won't even think twice about 45% unemployment. What are we unemployed workers going to do about it? There's no effective labor organization left. Workers have basically no political power or seat at the table. We're not going to get violent--the police/military are already owned by the billionaire class. We're just going to eventually become economically irrelevant and die off. | | |
| ▲ | geodel 2 hours ago | parent | next [-] | | > We're just going to eventually become economically irrelevant and die off. As harsh as it may sound, it seems rather likely to me. It is not like s/w engineers have helped struggling workers in other sectors, other than offering sanctimonious "Learn to code" advice. So software folks can't expect any solidarity or help from others. | |
| ▲ | kiba 2 hours ago | parent | prev | next [-] | | The fundamental issue isn't unemployment due to automation, but the fact that society cannot benefit from unemployment. It should be something for us to celebrate, because it means greater freedom for humans to pursue something else rather than spending time doing drudgery. | | |
| ▲ | shinryuu an hour ago | parent [-] | | Put it another way, the issue is that resources are not shared more equitably. This is especially egregious considering that LLMs are trained on all human knowledge. We've all been contributing to this enterprise, and what we may end up getting in return is unemployment. |
| |
| ▲ | monknomo 2 hours ago | parent | prev | next [-] | | 45% of folks sitting on their hands are going to have the free time to talk, and this group of people are skilled at organization. Are you planning on throwing your hands up and passively accepting whatever comes your way? | | |
| ▲ | rootusrootus an hour ago | parent [-] | | And at least in the US they have >45% of all the small arms weaponry. There is no bunker strong enough nor private army big enough if 100M people come for you. | | |
| ▲ | ryandrake 39 minutes ago | parent [-] | | They're probably betting that the technology they will need to defend their bunkers, think autonomous kill-bots or whatever, will emerge before people start to riot. Or they're planning to build an Elysium-like colony in the ocean or space, to keep the billionaire class far from danger. |
|
| |
| ▲ | rootusrootus an hour ago | parent | prev | next [-] | | I get that it is popular to hate billionaires these days, but realistically, they did not get to be billionaires by being stupid. It runs directly counter to their own interests to induce anything like 45% unemployment. They will get poorer, the world they live in right along with the rest of us will get noticeably shittier, etc. More likely they figure out what to do with a bunch of idle talent. Or the coming generation of trillionaires will. | |
| ▲ | 2 hours ago | parent | prev [-] | | [deleted] |
| |
| ▲ | BurningFrog 2 hours ago | parent | prev [-] | | It's important (and calming) to understand that since the Industrial Revolution started ~250 years ago, we've automated away most jobs several times over, while employment levels have stayed pretty constant. "Automating half the jobs" is the same as "double productivity per worker". When the doubling happens in 5 years rather than 50, it might be more disruptive, but I'm convinced we're on the verge of huge improvements in human standard of living! | | |
| ▲ | wartywhoa23 2 hours ago | parent [-] | | What in the current state of world affairs outside of IT do you think is indicative of that potential for huge improvements in human standard of living? |
|
| |
| ▲ | bborud 3 hours ago | parent | prev | next [-] | | We never noticed how easy the code writing part had already become because it happened slowly. Through mechanical means, through the ability to re-use code, and through code generation. Heck, even long before LLMs about 10% to 30% of my code was already automatically generated. By tooling, by IDLs and by my editor just being able to infer what my most likely input would be. > We have robots walking just fine now, by the way. I don't think you got the point I was trying to make. | | |
| ▲ | SoftTalker 2 hours ago | parent [-] | | True, but I guess I see a distinction between scaffolded/templated boilerplate or autocomplete and actual application logic. People have generated boilerplate from templates for ages, as you say. RoR is maybe a pretty good example, but there wasn't even early-days AI involved in doing that. |
| |
| ▲ | phkahler 2 hours ago | parent | prev | next [-] | | >> We used to say that (not long ago, even) about the code-writing part. Why do we believe that LLMs are going to stop there? Why do we think they won't soon be able to talk to people, listen, and determine what they need? Because they are currently "generative AI" meaning... autocomplete. They generate stuff but fall down at thinking and problem solving. There is talk of "reasoning models" but I think that's just clever meta-programming with LLMs. I can't say AI won't take that next step, but I think it will take another breakthrough on the order of transformers or attention. Companies are currently too busy exploiting the local maxima of LLMs. | | |
| ▲ | rootusrootus an hour ago | parent [-] | | > Companies are currently too busy exploiting the local maxima of LLMs I get the feeling we can already spot the next AI Winter. Which is okay, we need a breather, and the current technology is useful enough on its own. |
| |
| ▲ | terseus 3 hours ago | parent | prev [-] | | > Why do we believe that LLMs are going to stop there? Why do you believe they won't? I think it's reasonable to assume that we will hit a ceiling that current models will not be able to break. > We have robots walking just fine now, by the way. Walking and reasoning are unrelated abilities. | | |
| ▲ | SoftTalker 3 hours ago | parent [-] | | Walking was given as an example of "hard to program a robot to do it" by GP. Well, now we have robots that can walk. What evidence is there that LLMs have hit a ceiling at being able to do things like talk to users or stakeholders to elicit requirements? Using LLMs to help with design and architecture decisions is already a pretty common example that people give. |
|
|
| |
| ▲ | vga1 3 hours ago | parent | prev | next [-] | | >bosses The AI is coming for those too. | | |
| ▲ | snozolli 40 minutes ago | parent [-] | | Something like five to ten years ago, when AI hype was starting to hit media, one of the claims was that AI would come for middle-management first. Since middle-management can generally be described as collecting information from underlings and reporting information to upper management, their work was supposed to be easy to automate with AI. As far as I can tell, this hasn't proven to be true at all, and we software engineers proudly wrote ourselves out of work by constantly publishing our source code and discussing it openly. |
| |
| ▲ | at-fates-hands an hour ago | parent | prev | next [-] | | >> Or bosses will just expect more output for same/less pay, with them having to use AI to keep up. Anecdotal evidence to support this. I work with both dev and design teams. Upper management has already gone through several layoffs and offshoring of the two dev teams I work with. The devs they did keep were exactly what you said. The capable ones who reliably closed their Jira tickets. Never missed a deadline for building their features or components. And now? Their work has tripled, and the only help they get from management? "Start to figure out how to leverage AI, we're going to be in a hiring freeze for the next 10 months." The double whammy of losing onshore team members and getting no help from management to fix the problem it just created, beyond being told to figure out how to use AI to keep up, is pretty staggering. I would echo what one of the devs told me: "If this is the new 'AI era' then you can count me right the fuck out of it." | |
| ▲ | oblio 3 hours ago | parent | prev | next [-] | | >> I understand things and then apply my ability to formulate solutions > The AI is coming for that too. In that case all [1] non manual work is doomed, until robotics has an LLM moment. [1] With the exception of all fields protected by politics or nepotism. | | |
| ▲ | rootusrootus an hour ago | parent [-] | | > all non manual work is doomed All work in general. Knowledge workers can still do manual work, and will compete to do so when there is no option to continue what they do today. |
| |
| ▲ | thisisit 2 hours ago | parent | prev [-] | | A lot of people don't seem to get that it is easier to go from terrible to average but much harder to go from average to good. I am sure the AI bros are the same people who were convinced consumer-grade fully automated driving was going to happen "by end of the year" for the last 7 years. | |
| ▲ | Peanuts99 21 minutes ago | parent | next [-] | | I agree with the statement and think a lot of people miss this, but I also wonder how many people probably don't care for good, they only care for 'good enough'. | |
| ▲ | lostmsu 2 hours ago | parent | prev [-] | | No, I never believed in the fully automated driving tale by Tesla, but as the LLMs improve my personal estimate for the date of human-level AGI is rapidly moving to "present". Before GPT-2 I had it somewhere in 2100, at GPT-2 I thought maybe by 2060 if we are lucky. Now I think it is 2035 or maybe even sooner. | |
| ▲ | rootusrootus an hour ago | parent [-] | | I like to see the optimism, even if I don't share it. I think it's incredible hubris that humans think we are about to reinvent our own level of intelligence, just because we made a machine that talks pretty. |
|
|
|
|
| ▲ | 2 minutes ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | brandensilva an hour ago | parent | prev | next [-] |
I remember being that kid in high school who ran math and logical problems hard, which made me very technical and taught me to push through painful mental challenges on the regular. Not many of my graduating class went on to become engineers, for a reason: it isn't easy work by any means, and I'm guessing it's quite draining for people who don't use their brains like we do. So while AI will change the industry, I don't see any reputable company firing the smartest ones in the room for junior-level intelligence. Even with it advancing, someone has to be responsible for when it screws up, which we know it will. |
|
| ▲ | hateful 3 hours ago | parent | prev | next [-] |
| Not sure where I first heard this, but I say it to my team all the time: "Programming is thinking, not typing" |
| |
| ▲ | strbean 2 hours ago | parent | next [-] | | I know an accomplished CS professor, ACM fellow, cited in Knuth's TAOCP (as well as being an easter egg!), who still hunt-and-pecks. In fact, hunt-and-pecks incredibly slowly. Seeing him type really reinforced this idea. | | | |
| ▲ | the_hoffa 2 hours ago | parent | prev | next [-] | | I've always told my Jr Engineers to "think twice, code once". If I gave them a task and they immediately started typing it out, I would tell them to stop typing and ask them to explain to me what they were doing; they'd often just spit out what they thought the code should do, and I'd often point out edge cases they missed and would have missed had they just spit out code and a PR, wasting everyone's time. I would also insulate them from upper management to give them time to actually think (e.g. I wouldn't be coding so they could think then code). To your point and to the GP's point, and one point I keep raising with LLM's: "typing is not where my time sinks are" | |
| ▲ | CodeMage 2 hours ago | parent | prev [-] | | That's very true, which is why I find it insulting that so many AI proponents use the word "typing" to refer to writing code. It carries an implication that if you enjoy writing code by hand, you enjoy a mindless activity. |
|
|
| ▲ | czhu12 an hour ago | parent | prev | next [-] |
Isn't the long term trend just that we don't need as many engineers, not that there will be no more software engineers? There's another, different loop I keep seeing, which is: - Company A lays off engineers citing AI efficiencies
- People say it's because of over-hiring during 2020
- Company B lays off engineers citing AI efficiencies
- People say it's because it was never a good business
- Company C lays off engineers citing AI efficiencies
- People say it's because there's a recession
I guess to cite a counter example, unemployment is still super low, software jobs are still holding up, but the bear case is that eventually 5% of people will be able to do what people do today, and the demand for software won't grow at the same pace. |
| |
| ▲ | Xirdus a few seconds ago | parent [-] | | If company A is Amazon, company B is Ubisoft, and company C is Oracle, then I think it's very likely there isn't any pattern or "loop" here and it's just legitimately 3 different companies in 3 different situations doing layoffs for 3 different reasons but all 3 reaching for the same PR playbook. "We're leveraging AI to increase productivity" is the new "we're streamlining our business and focusing on our core products". |
|
|
| ▲ | sefrost 3 hours ago | parent | prev | next [-] |
| Only 5% of your time is spent writing code? That sounds like a low estimate for most software engineers I work with. May I ask if you could estimate how you spend the other 95% of the time? |
| |
| ▲ | Enginerrrd 34 minutes ago | parent | next [-] | | It sounds plausible to me since this is pretty on par with most other engineering disciplines. I’m a civil engineer. My responsibility is ultimately mostly to produce a constructable plan set. I spend far less than 5% of my time drafting or modeling. | |
| ▲ | hatthew an hour ago | parent | prev | next [-] | | In no particular order - Meetings
- Reading papers
- Understanding legacy code
- Reading internal news
- Ad hoc chats with coworkers
- Writing docs
- Editing configs
- Thinking about solutions
- Slacking off
- Analyzing results
- Testing code
- Reviewing PRs
- Understanding others' ongoing projects
| | |
| ▲ | PizzaBorsch 38 minutes ago | parent [-] | | AI can do everything you listed except chats with coworkers and slacking off. I just don't think you've utilized the most recent versions of codex or claude. |
| |
| ▲ | FatherOfCurses 17 minutes ago | parent | prev | next [-] | | Sneering at "kids these days" | |
| ▲ | varispeed an hour ago | parent | prev | next [-] | | The least experienced developer writes the most code. Juniors would spend the whole day in the IDE, typing, testing, typing, etc.
Senior developers will go to a park for a few hours, think, then come back and spend an hour or less typing code that just works, or write nothing at all, maybe even delete code.
Instead they might update documents, or ask for clarification about edge cases they found or errors in planning that were not considered. | | |
| ▲ | nomel an hour ago | parent [-] | | Since software is in every industry of man, I think you'll need to mention which industry this perspective is coming from. This is definitely NOT the case in certain industries. |
| |
| ▲ | davidw 3 hours ago | parent | prev | next [-] | | Commenting on Hacker News? | | |
| ▲ | wartywhoa23 an hour ago | parent | next [-] | | For those who claim to be developers who code no more than 5% of their time and resort to arguments like "we're already not writing machine code by hand for 50 years, how is AI different from a higher level language?", it's not commenting, it's shilling for the AI corpocracy on HN. | |
| ▲ | icedchai 2 hours ago | parent | prev [-] | | In all seriousness, communication consumes a lot of time. Meetings, emails, Slack messages, pestering stakeholders and other developers... | |
| ▲ | hjort-e 2 hours ago | parent [-] | | If you spend 95% of your time on that stuff, you better be working on like critical infrastructure where nothing can go wrong, otherwise you are in an incredibly dysfunctional company. | | |
| ▲ | icedchai an hour ago | parent [-] | | I agree it would be absurd for it to take 95% of your time.
I have, however, seen that it takes a lot more time than one would think. I did some contracting work for a severely dysfunctional, meeting-heavy organization and it was about 2 hours of meetings for every hour of real technical work! | |
| ▲ | hjort-e an hour ago | parent | next [-] | | Ah yes, agreed. If it's more than 90% it just signals to me that a developer's skills are probably being wasted too much on business/coordination stuff. But I guess if we mean actual time tapping your keyboard making code, then it's true some days for senior+ devs, but definitely not technical work overall. | |
| ▲ | fragmede an hour ago | parent | prev | next [-] | | So about 26 hours of meetings to 13 hours of "real technical work" per week, but that's 33%, not 5%. | |
| ▲ | skydhash an hour ago | parent | prev [-] | | Even when it’s not dysfunctional, you spend a lot of time on communication and reading stuff other people wrote (including code). It’s very rare to work in isolation. | | |
| ▲ | hjort-e an hour ago | parent [-] | | I guess it depends on what you feel coding is. To me it's the architecture planning and reading other people's code, not just writing code. If we say it's just typing, then 95% is not absurd, no. | | |
| ▲ | skydhash 42 minutes ago | parent [-] | | > it depends on what you feel coding is. To me it's the architecture planning and reading other people code, not just writing code And that would be where we disagree. I don’t read code to look at code. When I’m reading code, I’m looking for the contracts to follow when interacting with a system. It would be nice if it were documented, but more often than not you have to rely on code. It’s very rare that I plan with a technical mindset. Yes I use the jargon, but it’s all about the business needs. Which again create contracts. Same with writing code. Code is like English for me. If I don’t have a clear idea on what to write, I stop and do research (or ask someone). But when I do, it’s as straightforward as writing a sentence. | | |
| ▲ | hjort-e 22 minutes ago | parent [-] | | Huh? So you don't research if something is technically feasible before you promise your stakeholders a delivery time/price estimate? We all do the same stuff; the disagreement would just be what you feel coding is and whether you think technical work is the same thing or a superset. If you as a software dev aren't hands-on with planning or working more than 5% of your time, you are basically a PO with a programming hobby |
|
|
|
|
|
|
| |
| ▲ | mxksisksm 3 hours ago | parent | prev [-] | | [dead] |
|
|
| ▲ | AlexCoventry 2 hours ago | parent | prev | next [-] |
| You don't think AI is going to be able to understand things and apply their ability to formulate solutions better than you, in the near future? |
| |
| ▲ | koonsolo 25 minutes ago | parent [-] | | In 2000 I learned about this old technology called "neural networks". AI really depends on long winters and rare breakthroughs. Deep neural networks were the most recent breakthrough. The iterations you currently see are just adding more storage, but the fundamental neural network structure doesn't change. I'm confident AGI will not be achieved by the LLM architecture, and when the next AI breakthrough comes is anyone's guess. But if you take history into account, it will take a while. |
|
|
| ▲ | dev_l1x_be an hour ago | parent | prev | next [-] |
And most of the time the statistical aspect of LLMs results in a less creative solution that is more expensive to run and harder to maintain. LLMs at this stage are good at scaffolding, generating the boilerplate you do not want to write, and gluing things together quickly. They just make engineers faster. |
|
| ▲ | dawnerd 24 minutes ago | parent | prev | next [-] |
The problem is people think AI can replace the 95-98% that isn't code too. That's where we end up with massive unusable codebases that no one understands. |
|
| ▲ | timedude an hour ago | parent | prev | next [-] |
> Business owners who think they can do without developers because they think LLMs replace developers are fine by me too. Natural selection will take care of them in due course. Thing is, natural selection will take care of you at the same time. Because you'll also come to rely on products they make, or services they offer, either directly or indirectly. So eventually you, too, will suffer the consequences of the enshloppification. |
|
| ▲ | hyperjeff 2 hours ago | parent | prev | next [-] |
You're a "developer", I guess, but not a coder (anymore), which is what your interlocutors are probably asking about. You've migrated to a middle manager job, not something they probably can just start doing competently. Essentially you're agreeing with their initial sentiment, that coders will be made irrelevant. |
| |
| ▲ | onethought 2 hours ago | parent [-] | | I think it's more nuanced. Even a "coder" spends the majority of their time not coding. |
|
|
| ▲ | dakiol an hour ago | parent | prev | next [-] |
That doesn't hold, because the goal for executives is to increase revenue and the main sales pitch of Anthropic et al. is to pay for agents instead of paying for engineers. That means 80% of the workforce is out no matter what. Whether or not one belongs to the remaining 20% is a different story, but obviously not all of us will be there. > I understand things and then apply my ability to formulate solutions AI is coming for that too. Don't be naive. |
| |
| ▲ | varispeed an hour ago | parent [-] | | It will be interesting for governments using workers as proxy for taxing corporations. |
|
|
| ▲ | rpdillon an hour ago | parent | prev | next [-] |
| This is exactly it. The speed of light has not changed: we're limited by our ability to understand the system, and make decisions about what to do next. AI will speed that up, but the core work is the understanding and decision-making. Saying otherwise is sort of like reducing the task of writing a novel to typing. |
|
| ▲ | xhevahir 34 minutes ago | parent | prev | next [-] |
| The "apply my ability" is doing a lot of work, so to speak, in the above exchange. Work that might eventually well be automated away. |
|
| ▲ | madduci 2 hours ago | parent | prev | next [-] |
Because that splits the field into "developers" and "software engineers". And software engineering isn't going to disappear anytime soon |
| |
| ▲ | hellojesus 2 hours ago | parent | next [-] | | Weird. I call myself a developer because I don't have an engineering degree from an ABET-accredited engineering program. I recognize, in some capacity, that this isn't the norm and in the US "professional engineer" is protected and not simply "engineer", but it feels akin to stolen valor to me. | |
| ▲ | borski 2 hours ago | parent | next [-] | | If there were a license in the US for it, I’d agree with you. But as is, if you are “doing” engineering, you’re an engineer. If you are a licensed engineer of some kind, you’d state that outright. The equivalent of stolen valor would be claiming to be a licensed software engineer; except there is no such license so it would also be fraud, misrepresentation, etc. (I know this is different elsewhere) | | |
| ▲ | VonGallifrey 2 hours ago | parent [-] | | > If there were a license in the US for it, I’d agree with you. Yeah, that is basically the thing in my country. You can't call yourself an engineer without passing a test, but I can't take it because there isn't one for software engineering. Same thing for freelancing. Freelance jobs are defined in a list, and other jobs cannot benefit from the simplified tax rules that freelancers enjoy, but that list was written before software development was a thing. |
| |
| ▲ | traderj0e 2 hours ago | parent | prev | next [-] | | I call myself a computer programmer unless someone is asking for my official job title (software engineer) | |
| ▲ | bilbo0s 2 hours ago | parent | prev [-] | | I'm a software dev in the US and I never call myself "engineer" in that capacity. Always "programmer" or "developer". I agree. Engineers have to clear a much higher bar. Even though my career was spent in medical diagnostic software where we had to get 510k clearance, I was still keenly aware that this was a fundamentally different activity from actual engineering. | | |
| ▲ | whstl 2 hours ago | parent [-] | | I'm an electrical engineer that moved to software engineering and there's a lot of commonalities between what I do now and what I did previously as an electrical engineer. The bar might seem high, but that's the only way I know how to work, honestly. On the other hand, with the modern division of labour in a lot of companies and with the rhetoric I see here in HN and in other places: a lot of developers are indeed not even close to being engineers. |
|
|
|
|
| ▲ | fnordpiglet an hour ago | parent | prev | next [-] |
Something missed is that computer science was a highly theory-driven discipline where people were taught how to think critically about solving complex problems. Industry complained they weren't teaching enough programming skills, so they dumbed down the thinking part and emphasized the vocational part. Now the vocational part is virtually useless, and the grounding of theory applied to complex problems is suddenly really relevant again. Schools will take time to retool their programs and teaching staff, and two if not three generations of graduates will have entered into a work environment that doesn't need what they learned. As someone 35 years into my career I agree this is the most exciting part of my career. I love programming and I do it all the time, but I do it by reading code, course-correcting, and explaining how to think about the problems and herding cats - just like working with a team of 100 engineers. But the engineers I'm working with now by and large listen, don't snipe me on perf reviews, aren't hallucinating intent based on hallway conversations with someone else, etc. This team of AI engineers I have can explain to me their work, mistakes, drift, etc. without ego, and if it's not always 100% correct, it's at least not maliciously so. It understands me no matter how complex the domain I reach into; in fact it understands the domain better than I do, so instead of spending a few months convincing people with little knowledge or experience that X is a good idea, I can actually discuss X and explore if it's a good idea or not and make a better informed decision. I've learned more in these discussions than I've learned in decades of convincing overly egoistic juniors and managers to listen to me about something I'm an industry authority on. However I see very clearly we will need very few of the team of 100 human engineers I can leave behind in my work. Some of them will be there in a decade, but maybe less than 1:10. 
This is going to be a more brutal time than the Dotcom bust for CS grads, and I don't think it will ever improve. Mostly because we simply won't need the "my parents told me this makes money" people; only the passionate folks remain. But even then, we face a situation where the value of any software developed is very low because so much software is being developed. It's going to turn into YouTube, where the software that is paid for is a very small share of the quantity of software developed. We already see this in the last few months with the rate of GitHub projects created. If the value of any software created is low, the compensation of the creator will be low unless they're a very rare talent. |
|
| ▲ | jchonphoenix an hour ago | parent | prev | next [-] |
| You miss the major factor in your compensation: pricing pressure due to supply/demand. By removing all the junior engineers, you've fundamentally changed the market forces longer term and most people expect that to negatively impact you in the supply demand curve regardless of whether or not the statements you've made above are true, which they most likely are for senior engineers. |
| |
| ▲ | fragmede an hour ago | parent [-] | | In removing junior developers, leaving only senior developers, wouldn't that reduce supply, making the price go up, not down? It's been a while since Econ 101 for me though. |
|
|
| ▲ | m463 an hour ago | parent | prev | next [-] |
| - Compilers will make developers irrelevant
...
- Compilers can write assembly language code
- Compilers have -O3 now
etc...Maybe we should rejoice. I remember dreading writing documentation, and now I would happily hand that off to AI. |
| |
| ▲ | geodel 35 minutes ago | parent [-] | | It is indeed exciting (for you at least). The problem for most people is not that AI is spewing out code and reading documentation while developers do more interesting things. It is that companies are handing over the job of those developers to AI itself. So those ex-developers are free to do the most interesting things in the world, with the small catch of no longer having a nice, steady paycheck every month. |
|
|
| ▲ | an hour ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | doctorpangloss 11 minutes ago | parent | prev | next [-] |
In my community almost all problems are political. "Problem solving ability" matters if you are in HFT, but everything else? Math can't tell you the best way to use land, educate a kid, what to pay for healthcare and how, how to prioritize biotech research, set a minimum wage, decide congressional maps, all sorts of stuff that I actually pay for or care a lot about. In fact I think you are totally misinterpreting what people are saying to you, you are 200% wrong: the 2-3% of your time spent coding was the valuable part, and your so-called problem solving ability rarely solved any real problems. |
|
| ▲ | 2 hours ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | jstummbillig 25 minutes ago | parent | prev | next [-] |
| > Multiple times per week I have the same conversation. Really? I mean, good on you if it's true and you like the attention, but that sounds like an implausible amount of interest in someone and their relatively mundane profession. |
|
| ▲ | ryandvm 3 hours ago | parent | prev | next [-] |
| I dunno, man. I've been doing this for 20+ years and I think we're at a really important fork in the road where there are two possibilities. The first is that AI is achieving human-level expertise and capability, but since models are now increasingly trained on their own output, they are fighting an uphill battle against model collapse. In that case, perhaps AI is going to just sort of max out at "knowing everything", and maybe agentic coding is just another massive paradigm shift in a long line of technological paradigm shifts: the tooling has changed, but total job market collapse is unlikely. The other possibility is that we're going to continue to see escalating AI capability with regard to context, information retrieval, and most importantly "cognition" (whatever that means). Maybe we overcome the challenges of model collapse. Maybe we figure out better methodologies for training that don't end up just producing a chatbot version of Stack Overflow + Wikipedia + Reddit. Maybe we actually start seeing AI create and not just recreate. If it's the latter, then I think engineers who think they are going to stay ahead of AI sound an awful lot like saddle makers who said "pffft, these new cars can only go 5 miles per hour." |
| |
| ▲ | golddust-gecko 3 hours ago | parent | next [-] | | 100% this. I'll also add another factor: it's become increasingly clear at our company that AI-enabled humans are getting to the bottom of the backlog of feature ideas much quicker. This makes the 'good ideas' part of the business the rate limiting step. And those are definitely not increasing with AI, beyond that generated by the AI churn itself ("let's bolt on a chat experience or an MCP!") So maybe the coding assistants don't get a 10x improvement any time soon, but we see engineering job market contraction because there aren't really enough good ideas to turn into code. | | |
| ▲ | hibikir 2 hours ago | parent [-] | | Yes, but as the price of getting work done goes down, a lot of companies that were priced out of custom software before can now hire devs, as the value that hiring a few can provide just goes up. Fewer people per product, absolutely. No more teams of 10 or 20 working on the same thing. But there's so much out there that doesn't get done at all because you'd never be able to afford it. Simple marginal thinking: when you lower the price of something, it finds more use cases. A rich person might not take even more flights because they are cheaper, but more people will consider flying when they wouldn't have at the old prices. |
| |
| ▲ | bborud 3 hours ago | parent | prev | next [-] | | You are supposing that AI achieving human-level expertise and capability is a given. I am not so sure. Right now that's much further from the truth than one might think at first glance. | |
| ▲ | koonsolo 21 minutes ago | parent | prev | next [-] | | Do you think the latter can be achieved with the LLM neural network architecture? I highly doubt it. Neural networks are very old tech, and it took us that long to get us here. I'm sure we'll reach AGI at some point, but looking at AI history, I don't see that coming any time soon. | |
| ▲ | WorldMaker 2 hours ago | parent | prev [-] | | > max out at "knowing everything" LLMs know nothing but are great at giving the illusion that they know stuff. (It's "mansplaining as a service"; it is easier to give confident answers every time, even if they are wrong, than to program actual knowledge.) Even your first case seems wildly optimistic. The second case is a lot of "maybes" and "we don't know how but we might figure it out" that seems like a lot to bet an entire farm on, much less an entire industry of farms. We sure are looking at a shift in the job market, but I don't think it is a fork in the road so much as a Slow/Yield sign. Companies are signalling they are willing to take promises and hope as reasons to cut labor costs, whether or not the results are real. I don't think anything about current AI can kill the software development industry, but I sure do think it can make it a lot more miserable, lower wages, and artificially reduce job demand. I don't think this has anything to do with the real capabilities of today's AI and everything to do with the perception being enough of an excuse, and companies were always looking for that excuse. (Just as ageism has always existed. AI is also just a fresh excuse for companies to carry on aging experience out of their staff, especially people with long enough memories to remember previous AI booms and busts.) But also, yeah, if some magic breakthrough makes this a real "buggy whip manufacturer moment" and not just an illusion of one, I don't mind being the engineer on that side of it. There's nothing wrong with lamenting the coming death of an industry that employs a lot of good people and tries to make good products. This is HN: you celebrate the failures, learn from them, and then you pivot or try something new. If the evidence tells me to pivot then I will pivot (I'm already debating trying something entirely new), but learning from the failures can also mean respecting "what went right?" and acknowledging how many people did a lot of good, hard work despite the outcome. | | |
| ▲ | anon84873628 39 minutes ago | parent [-] | | I'm skeptical of LLM "reasoning" but they sure as hell know a lot. That's what the embeddings are: a giant semantic relationship between concepts. | | |
| ▲ | koonsolo 19 minutes ago | parent [-] | | I agree with you, but a big drawback is that the accuracy or confidence of their output can't be estimated. So they surely know a lot, but you are never sure if the info is correct or not. |
|
|
|
|
| ▲ | bdangubic 26 minutes ago | parent | prev | next [-] |
| > Yes, about 2-5% of the time. Less now. I spent the second half of my 30-year career fixing organizations and processes where this was the case. So many things are wrong in places where this is true (or, alternatively, you need a different job title :) ) |
|
| ▲ | bluegatty 27 minutes ago | parent | prev | next [-] |
| This is maybe a bit myopic. Dude, look what happened in software over the last 2 years. Now project out another 10. I totally agree with you 'as of now, in the current paradigm'. But that could very well change. |
|
| ▲ | boring-human 2 hours ago | parent | prev | next [-] |
| The true argument is about quantity - of people, not code. All qualitative arguments are missing the point. |
|
| ▲ | atoav 2 hours ago | parent | prev | next [-] |
| Saying being a programmer is about writing code is a bit like saying being an artist is about drawing lines on a canvas. Yeah, technically drawing lines on canvases may be a very important part of being a painter, but it is hardly the core of what makes or breaks great art. |
|
| ▲ | insane_dreamer 3 hours ago | parent | prev | next [-] |
| What you described are senior developers and system architects. Junior developers spend most of their time writing code (when they're not forced to attend pointless standups, because Agile/blah/blah) > The developers who still think their job is about writing code will perhaps not have a job in the future. So you're saying the same thing everyone else is saying. SWEs won't go away, but they will be greatly reduced, because those whose job is about writing code -- junior devs -- will be replaced. (How will Sr Devs in the future be created? That's the question, isn't it.) |
| |
| ▲ | vineyardmike 2 hours ago | parent [-] | | > How will Sr Devs in the future be created? As an extreme example, maybe we’ll see long-running internships and training like doctors experience. Doctors don’t start their careers until after ~12 years of prep and training. Pragmatically, software development has a lot of examples of teenagers making apps and college students building software companies. In the 12 years that training takes, low-knowledge workers could be continuously vibe-coding replacements for most of the commercial software products they’d be hired to build. So I doubt we’ll treat software development as a rarefied high-skill job. |
|
|
| ▲ | izacus 2 hours ago | parent | prev | next [-] |
| Note that even if you know the job is about understanding things, the manager who'll boot you and leave you without income probably doesn't. They'll just collect political brownie points for saving money by replacing you with AI. |
|
| ▲ | coldtea 2 hours ago | parent | prev | next [-] |
| > - I understand things and then apply my ability to formulate solutions
- Well, AI can do part of that too, maybe more of it soon.
- ...
- Besides, you don't need 10 guys in a team to do that. A couple of them will do, then AI will do the coding. What will happen to the rest?
- ...
|
|
| ▲ | foldr 2 hours ago | parent | prev | next [-] |
| I think the future is pretty up in the air in this respect, but my guess is that AI will just lead to another shift in the set of knowledge that a 'real programmer' is expected to have. I'm old enough to remember when people would make fun of web developers for 'programming' using HTML and JavaScript. And of course, back in the day, you couldn't be a real programmer unless you wrote assembly language. In a few years' time, being able to write (as opposed to read) source code in any specific programming language will probably become a niche skill. The next generation will be able to read Python to about the same extent that I can read x86 assembly. Perceptions of what knowledge counts as 'low level' are constantly shifting. These days, if you write C, you're a low-level, close to the metal programmer. In the 70s, a lot of people made fun of Unix for being implemented in a high-level programming language (i.e. C) rather than assembly. |
|
| ▲ | keybored 3 hours ago | parent | prev | next [-] |
| Pure wage workers should consider dropping the attitude about how tech progress will just make their inferiors in the same line of work be out of a job (hrmph good riddance etc.). Because this pseudo-progress could creep up on them as well. Then you won’t have this just world of the deserving workers at all. Just formerly deserving workers and idiot billionaires like Musk (while the robots do all of the work). |
|
| ▲ | surgical_fire 3 hours ago | parent | prev [-] |
| I normally say that I have zero concerns regarding AI in terms of employment. At most I am concerned with learning the best practices of AI usage to stay on top of things. Its ability to write code is alright. Sometimes it impresses me, sometimes it leaves me underwhelmed. It certainly can't be left to do things autonomously if you are responsible for its output. It's a moderately useful tool, but hellishly expensive when not being subsidized by imbeciles who dream of it undermining labor. A fool and his money should be parted anyway. What I am really concerned about is the incoming economic disaster being brewed. I suspect things will get very ugly pretty soon. |