| ▲ | samiv 6 hours ago |
| As a principal engineer I feel completely let down. I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Any idiot can now prompt their way to the same software. I feel depressed and very unmotivated and expect to retire soon. Talk about a rug pull! My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM. |
|
| ▲ | lovelearning 3 hours ago | parent | next [-] |
| > Any idiot can now prompt their way to the same software. I must say I find this idea, and this wording, elitist in a negative way. I don't see any fundamental problem with democratization of abilities and removal of gatekeeping. Chances are, you were able to accumulate your expert knowledge only because: - book writing and authorship was democratized away from the church and academia - web content publication and production were democratized away from academia and corporations - OSes/software/software libraries were all democratized away from corporations through open-source projects - computer hardware was democratized away from corporations and universities Each of the above must have cost some gatekeepers some revenue and opportunities. You were not really an idiot just because you benefited from any of them. Analogously, when someone else benefits at some cost to you, that doesn't make them an idiot either. |
| |
| ▲ | OneMorePerson an hour ago | parent | next [-] | | This is technically true in a lot of ways, but it's also an intellectual response that doesn't engage with what the comment was expressing. It's legitimately very frustrating to have something you enjoy democratized and to feel like things are changing. It would be like if you put in all this time to get fit and skilled on mountain bikes, and there was a whole community of people, quiet nature, yada yada, and then suddenly they just changed the rules and anyone with a dirt bike could go on the same trails. It's double damage for anyone who isn't close to retirement and built their career and invested time (i.e. opportunity cost) into something that might become a lot less valuable, and then they are fearful for future economic issues. I enjoy using LLMs and have stopped writing code, but I also don't pretend that change isn't painful. | | |
| ▲ | lovelearning an hour ago | parent [-] | | The change is indeed painful to many of us, including me. I, too, am a software engineer. LLMs and vibe coding create some insecurity in my mind as well. However, our personal emotions need not turn into disparaging others' use of the same skills for their satisfaction / welfare / security. Additionally, our personal emotions need not color the objective analysis of a social phenomenon. Those two principles are the rationales behind my reply. | | |
| ▲ | OneMorePerson 35 minutes ago | parent [-] | | I appreciate that rationale, I also see the importance of those two principles and I think there's a lot of value there. I suppose I see "any idiot" as a more general phrase, like "idiot proof", not directly meaning that anyone who uses an LLM is an idiot. However I can also see how it would be seen as disparaging. Also, while there are a lot of examples of people entrenching into a certain behavior or status and causing problems, I also think society is a bit harsh on people who struggle with change. For people who are less predisposed to be OK with change, it feels like a lot of the time the response is "just deal with it and don't be selfish, this new XYZ is better for society overall". Society is pretty much made up of personal emotions on some level. I don't think we should go around attacking people, but very few things can be considered truly objective in the world of societal analysis. |
|
| |
| ▲ | card_zero 2 hours ago | parent | prev | next [-] | | So you put these all in the same category: gaining knowledge, gaining abilities, and just obtaining things. I gatekeep my bike, I keep it behind a gate. If you break the gate open and democratize my bike, you're an idiot. | | |
| ▲ | ipdashc 2 hours ago | parent | next [-] | | I'm not sure how you're getting that from their post? None of the four things mentioned (book publishing, web publishing, open-source software, computer hardware) involve stealing someone's property, he's saying that the ability to produce those things widened and the cost went down massively, so more people were able to gain access to them. Nobody stole your bike, but the bike patents expired and a bunch of bike factories popped up, so now everyone can get a cheap bike. | |
| ▲ | WillPostForFood an hour ago | parent | prev [-] | | It is more like: You gatekeep your bike, you keep it behind a gate, you don't let anyone else ride it. Your neighbor got a nicer bike for Christmas, rode it by your house, and now you are sad because you aren't the special kid with the bike any more, you are just a regular kid like your neighbor. |
| |
| ▲ | michaelhoney 3 hours ago | parent | prev | next [-] | | This is a good response. Progress has always been resisted by incumbents | |
| ▲ | slopinthebag an hour ago | parent | prev | next [-] | | People actually value the effort and dedication required to master a craft. Imagine we invent a drug that allows everyone to achieve olympic level athletic performance, would you say that it "democratises" sports? No, that would be ridiculous. | | |
| ▲ | lovelearning 20 minutes ago | parent [-] | | It does technically democratize the exhilarating experiences of that level of performance. Likely also democratizes negative aspects like injuries, extreme dieting, jealousy, neglecting relationships. That said, if we zoom out and review such paradigm shifts over history, we find that they usually result in some new social contracts and value systems. Both good expert writers and poor novice writers have been able to publish non-fiction books for a few centuries now. But society still doesn't perceive them as the same at all. A value system still prevails, and merit is estimated primarily from the writing itself. This is regardless of any other qualifications/disqualifications of authors based on education / experience / nationality / profession etc. At the individual level too, just because book publishing is easy doesn't mean most people want to spend their time doing that. After some initial excitement, people will go do whatever their main interests are. Some may integrate these democratized skills into their main interests. In my opinion, this historical pattern will turn out to be true with the superdrug as well as vibe coding. Some new value will be seen in the swimming or running itself - maybe technique or additional training over and above the drug's benefits. Some new value will be discovered in the code itself - maybe conceptual clarity, algorithmic novelty, structural cleanliness, readability, succinctness, etc. Those values will become the new foundations for future gatekeeping. |
| |
| ▲ | ares623 an hour ago | parent | prev | next [-] | | how are 2-3 centralized providers of this new technology "democratization"? | |
| ▲ | lovelearning an hour ago | parent | next [-] | | It's _relatively_ democratic when compared to these counterfactual gatekeeping scenarios: - What if these centralized providers had restricted their LLMs to a small set of corporations / nations / qualified individuals? - What if Google that invented the core transformer architecture had kept the research paper to themselves instead of openly publishing it? - What if the universities / corporations, who had worked on concepts like the attention mechanism so essential for Google's paper, had instead gatekept it to themselves? - What if the base models, recipes, datasets, and frameworks for training our own LLMs had never been open-sourced and published by Meta/Alibaba/DeepSeek/Mistral/many more? | |
| ▲ | satvikpendem an hour ago | parent | prev [-] | | There are lots of open weight models |
| |
| ▲ | anonnon 21 minutes ago | parent | prev [-] | | > elitist in a negative way. It's funny you say that, because I've seen plenty of the reverse elitism from "AI bros" on HN, saying things like: > Now that I no longer write code, I can focus on the engineering or > In my experience, it's the mediocre developers that are more attached to the physical act of writing code, instead of focusing on the engineering As if getting further and further away from the instructions that the CPU or GPU actually execute is more, not less, a form of engineering, instead of something else, maybe respectable in its own way, but still different, like architecture. It's akin to someone claiming that they're not only still a legitimate novelist for using ChatGPT, or a legitimate illustrator for using Stable Diffusion, but that delegating the actual details of the arrangement of words into sentences, or of the layers and shapes of pigment in an image, actually makes them more of a novelist or artist than those who don't. | | |
|
|
| ▲ | atonse 5 hours ago | parent | prev | next [-] |
| > My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM. I've been a tech lead for years and have written business critical code many times. I don't ever want to go back to writing code. I am feeling supremely empowered to go 100x faster. My contribution is still judgement, taste, architecture, etc. And the models will keep getting better. And as a result, I'll want to (and be able to) do even more. I also absolutely LOVE that non-programmers have access to this stuff now too. I am always in favor of tools that democratize abilities. Any "idiot" can build their own software tailored to how their brains think, without having to assemble gobs of money to hire expensive software people. Most of them were never going to hire a programmer anyway. Those ideas would've died in their heads. |
| |
| ▲ | suzzer99 2 hours ago | parent | next [-] | | > I also absolutely LOVE that non-programmers have access to this stuff now too. I am always in favor of tools that democratize abilities. Here's the other edge of that sword. A couple back-end devs in my department vibe-coded up a standard AI-Tailwind front-end of their vision of revamping our entire platform at once, which is completely at odds with the modular approach that most of the team wants to take, and would involve building out a whole system based around one concrete app and 4 vaporware future maybe-apps. And of course the higher-ups are like “But this is halfway done! With AI we can build things in 2 weeks that used to take six months! Let’s just build everything now!” Nevermind that we don’t even have the requirements now, and nailing those down is the hardest part of the whole project. But the higher-ups never live through that grind. | | |
| ▲ | ministryofwarp 6 minutes ago | parent [-] | | It reemphasizes the question of importance. Would a user accept their data needing an AI implementation of a ("manual") migration and their flow completely changing? Does reliability to existing users even matter in the company's plans? If it isn't a product that needs to solve problems reliably over time, then it was kind of silly to use a DBA that cost twice the backend engineer and only handled the data niche. We progressed from there or regressed from there depending on why we are developing software. |
| |
| ▲ | samiv 4 hours ago | parent | prev | next [-] | | What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this? One thing is for sure: LLMs will bring down the cost of software per some unit and increase the volume. But... cost = revenue. What is a cost to one party is a revenue to another party. The revenue is what pays salaries. So when software costs go down, the revenues will go down too. When revenues go down, layoffs will happen, salary cuts will happen. This is not fictional. Markets already reacted to this and many software service companies took a hit. | | |
| ▲ | atonse 3 hours ago | parent | next [-] | | I don't have an answer for this, and won't pretend to. But my take on this is that accountability will still be a purely human factor. It still is. I recently let go of a contractor who was hired to run our projects as a Scrum/PM, and his tickets were so bad (there were tickets with 3 words in them; one ticket in the current sprint was blocked by a ticket deep in the backlog; basic stuff). When I confronted him about them, he said the AI generated them. So I told him that: 1. That's not an excuse; his job is to verify what it generated and ensure it's still good. 2. That actually makes it look WORSE: not only did he do nearly 0 work, he didn't even check the most basic outputs. And I'm not anti-AI; I expressly said that we should absolutely use AI tools to accelerate our work. But that's not what happened here. So you won't get to say (at least I think for another few years) "my AI was at fault" – you are ultimately responsible, not your tools. So people will still want to delegate those things down the chain. But ultimately they'll have to delegate to fewer people. | | |
| ▲ | jgilias an hour ago | parent | next [-] | | In general I agree. But it’s somehow very unlikely for the AI to generate a three word ticket. That’s what humans do. AI might generate an overly verbose and specific ticket instead. | |
| ▲ | eisa01 an hour ago | parent | prev [-] | | What drives that behavior is what I like to call human slop :) |
| |
| ▲ | post-it 2 hours ago | parent | prev | next [-] | | If AI completely erases the profession of software developer, I'll find something else to do. Like I can't in good faith ever oppose a technology just because it's going to make my job redundant, that would be insane. | | |
| ▲ | rapnie 2 hours ago | parent | next [-] | | Take that to its extreme. Suppose there was a technology that you do not own that would make everyone's job redundant. Everyone out of a job. There is no need for education, for skills to be mastered, for expertise. Would it still be insane to complain? | | | |
| ▲ | ipaddr 2 hours ago | parent | prev [-] | | There may not be a job for you in an office setting. What would you do? | | |
| ▲ | satvikpendem an hour ago | parent [-] | | That's when the problem shifts from individual to systemic, and only systemic solutions fix systemic problems. |
|
| |
| ▲ | linsomniac 3 hours ago | parent | prev | next [-] | | >What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this? I'm assuming that the software factory of the future is going to need Millwrights https://en.wikipedia.org/wiki/Millwright But, builders are builders. These tools turn ideas into things, a builder's dream. | |
| ▲ | codebolt 3 hours ago | parent | prev | next [-] | | Any given system will still need people around to steer the AI and ensure the thing gets built and maintained responsibly. I'm working on a small team of in-house devs at a financial company, and not worried about my future at all. As an IC I'm providing more value than ever, and the backlog of potential projects is still basically endless- why would anyone want to fire me? | | |
| ▲ | kristiandupont an hour ago | parent | next [-] | | Why would it need people to steer the AI? I can easily see a future where companies that don't rely on the physical world (like manufacturing) are completely autonomous, just machines making money for their owner. | | |
| ▲ | codebolt an hour ago | parent [-] | | It's easy to imagine but there's still a vast amount of innovation and development that has to happen before something like that becomes realistic. At that point the whole system of capitalism would need to be reconsidered. Not going to happen in the foreseeable future. |
| |
| ▲ | anonnon 17 minutes ago | parent | prev [-] | | > why would anyone want to fire me? Because they can hire some "prompt engineer" to "steer the AI" for $30-50k instead of $150-$250k. |
| |
| ▲ | jmalicki 2 hours ago | parent | prev | next [-] | | "One thing is for sure LLMs will bring down the cost of software per some unit and increase the volume. But..cost = revenue." That is Karl Marx's labor theory of value, which has been completely disproven. You don't charge what it costs to build something, you charge the maximum the customer is willing to pay. | |
| ▲ | rps93 4 hours ago | parent | prev [-] | | Just sold a house/moved out after being laid off in mid-January from a govt IT contractor(there for 8 great years and mostly remote). I started my UX Research, Design and Front End Web Design coding career in 2009, but now I think it's almost a stupid go nowhere vanishing career, thanks to AI. I think much like you that AI is and will just continue to destroy the economy! At least I got to sell a house and make a profit--stash it away for when the big AI market crash happens (hopefully not a 2030 great depression tho). As then it's a down market and buying stocks, bitcoin and houses is always cheaper. |
| |
| ▲ | kazinator 3 hours ago | parent | prev [-] | | The models will not keep getting better. We have passed "peak LLM" already, by my estimate. Some of the parlour tricks that are wrapped around the models will make some incremental improvements, but the underlying models are done. More data, more parameters, are no longer going to do anything. AI will have to take a different direction. |
|
|
| ▲ | bri3d 4 hours ago | parent | prev | next [-] |
This is really interesting to me; I have the opposite belief. My worry is that any idiot can prompt themselves to _bad_ software, and the differentiator is in having the right experience to prompt to _good_ software (which I believe is also possible!). As a very seasoned engineer, I don't feel personally rugpulled by LLM-generated code in any way; I feel that it's a huge force multiplier for me. Where my concern about LLM-generated software comes in is much more existential: how do we train people who know the difference between bad software and good software in the future? What I've seen is a pattern where experienced engineers are excellent at steering AI to make themselves multiples more effective, and junior engineers are replacing their previous sloppy output with ten times their previous sloppy output. For short-sighted management, this is all desirable since the sloppy output looks nice in the short term, and overall, many organizations strategically think they are pointed in the right direction doing this and are happy to downsize blaming "AI." And, for places where this never really mattered (like "make my small business landing page,") this is a complete upheaval, without a doubt. My concern is basically: what will we do long term to get people from one end to another without the organic learning process that comes from having sloppy output curated and improved with a human touch by more senior engineers, and without an economic structure which allows "junior" engineers to subsidize themselves with low-end work while they learn? I worry greatly that in 5-10 years many organizations will end up with 10x larger balls of "legacy" garbage and 10x fewer knowledgeable people to fix it.
For an experienced engineer I actually think this is a great career outlook and I can't understand the rug-pull take at all; I think that today's strong and experienced engineer will command a high amount of money and prestige in five years as the bottom drops out of software. From a "global outcomes" perspective this seems terrible, though, and I'm not quite sure what the solution is. |
| |
| ▲ | kristiandupont an hour ago | parent | next [-] | | >For short-sighted management, this is all desirable since the sloppy output looks nice in the short term It was a sobering moment for me when I sat down to look back at the places I have worked over my career of 20-odd years. The correlation between high-quality code and economic performance was not just non-existent, it was almost negative. As in: whenever I have worked at a place where engineering felt like a true priority, tech debt was well managed, principles followed, that place was not making any money. I am not saying that this is a general rule; of course there are many places that perform well and have solid engineering. But what I am saying is that this short-sighted management might not be acting as irrationally as we prefer to think. | |
| ▲ | socalgal2 3 hours ago | parent | prev | next [-] | | My guesses are 1. We'll train the LLMs not to make sloppy code. 2. We'll come up with better techniques to make guardrails to help Making up examples: * right now, lots of people code with no tests. LLMs do better with tests. So, training LLMs to make new and better tests. * right now, many things are left untested because it's work to build the infrastructure to test them. Now we have LLMs to help us build that infrastructure so we can use it to make better tests for LLMs. * ...? | | |
| ▲ | jgilias an hour ago | parent [-] | | * better languages and formal verification. If an LLM codes in Rust, there’s a class of bugs that just can’t happen. I imagine we can develop languages with built-in guardrails that would’ve been too tedious for humans to use. |
| |
| ▲ | joeevans1000 2 hours ago | parent | prev [-] | | Good software, bad software, and working software. |
|
|
| ▲ | jv22222 3 hours ago | parent | prev | next [-] |
> Any idiot can now prompt their way to the same software. It may look the same, but it isn't the same. In fact, if you took the time to truly learn how to do pure agentic coding (not vibe coding) you would realize that as a principal engineer you have an advantage over engineers with less experience. The more war stories, the more generalist experience, the more you can help shape the LLM to make really good code while retaining control of every line. This is an unprecedented opportunity for experienced devs to use their hard-won experience to level themselves up to the equivalence of a full team of Google devs. |
| |
| ▲ | kazinator 3 hours ago | parent [-] | | > while retaining control of every line What I want when I'm coding, especially on open source side projects, is to retain copyright licensing over every line (cleanly, without lying about anything). Whoops! | | |
| ▲ | jv22222 3 hours ago | parent | next [-] | | Hmm. TIL: The real exposure isn't Anthropic or OpenAI claiming your code, it's you unknowingly distributing someone else's GPL code because the model silently reproduced it, with essentially zero recourse against the model owner. | |
| ▲ | satvikpendem an hour ago | parent | prev [-] | | I wonder why people still believe in intellectual property, it's a concept that has long since lived past its usefulness, especially technologically. | | |
| ▲ | bitwize an hour ago | parent [-] | | Because IP democratizes returns on the creative process. | | |
| ▲ | satvikpendem an hour ago | parent [-] | | Maybe it used to but with companies like Disney lengthening copyright times way beyond the original intention, or corporations patenting absurd things, it seems to be more of a way to entrench power than any sort of democratization. I'm glad generative AI seem to be bypassing all this and actually democratizing returns on the creative process, by flagrantly violating the concept of IP. |
|
|
|
|
|
| ▲ | JKCalhoun 4 hours ago | parent | prev | next [-] |
I echo another reply here: if anything, my experience coding feels even more valuable now. It was never about writing the code—anyone can do that, students in college, junior engineers… Experience is being able to recognize crap code when you see it, recognizing blind alleys long before days or weeks are invested heading down them. Creating an elegant API, a well-structured (and well-organized) framework… Keeping it as simple as possible so that it just gets the job done. Designing the code-base in a way that anticipates expansion… I've never felt the least bit threatened by LLMs. Now if management sees it differently and experienced engineers are losing their jobs to LLMs, that's a tragedy. (Myself, I just retired a few years ago, so I confess to no longer having a dog in this race.) |
| |
| ▲ | mk89 4 hours ago | parent | next [-] | | Sorry for the dumb question but how could you feel threatened by LLMs if you retired just a few years ago? Considering the hype started somewhere in 2022-2023. | | |
| ▲ | JKCalhoun 2 hours ago | parent | next [-] | | You're right; as I say, I no longer have skin in the game. Retired, I have continued to code, and have used Claude to vibe code a number of projects—initially I did so out of curiosity as to how good LLMs are, and then to handle things like SwiftUI that I am hesitant to have to learn. It's true then that I am not in a position of employment where I have to consider a performance review, pleasing my boss or impressing my coworkers. I don't doubt that would color my perception. But speaking as someone who has used LLMs to code, while they impress me, again, I don't feel the threat. As others have pointed out in past threads here on HN, on blogs, LLMs feel like junior engineers. To be sure they have a lot of "facts" but they seem to lack… (thinking of a good word) insight? Foresight? And this too is how I have felt as I was aging out of my career and watched clever, junior engineers come on board. The newness, like Swift, was easy for them. (They no doubt have rushed headlong into SwiftUI and have mastered it.) Never, though, did I feel threatened by them. The career itself, I have found, does in fact care little for "grey beards". I felt by age 50 I was being kind of… disregarded by the younger engineers. (It was too bad, I thought, because I had hoped that on my way out of the profession I might act more as mentor than coder. C'est la vie!) But for all the new engineers' energy and eagerness, I was comfortable instead with my own sense of confidence and clarity that came from just having been around the block a few times. Feel free to disregard my thoughts on LLMs and the degree to which they are threatening the industry. They may well be an existential threat. But, with junior engineers as also a kind of foil, I can only say that I still feel there is value in my experience and I don't disparage it. | |
| ▲ | latenightcoding 3 hours ago | parent | prev [-] | | and they only got really good like last December. |
| |
| ▲ | mmasu 2 hours ago | parent | prev [-] | | how would you suggest someone who just started their career moves ahead to build that “taste” for lean and elegant solutions? I am onboarding fresh grads onto my team and I see a tendency towards blindly implementing LLM generated code. I always tell people they are responsible for the code they push, so they should always research every line of code, their imported frameworks and generated solutions. They should be able to explain their choices (or the LLM’s). But I still fail to see how I can help people become this “new” brand of developer. Would be very happy to hear your thoughts or how other people are planning to tackle this. Thanks! | | |
| ▲ | JKCalhoun 2 hours ago | parent [-] | | My "taste" (like perhaps all other "tastes") comes from experience. Cliche, I know. When you have had to tackle dozens of frameworks/libraries/APIs over the years, you get to where you find you like this one, dislike that one. Get/Set, Get/Set… The symmetry is good… Calling convention is to pass a dictionary: all the params are keys. Extensible, sure, but not very self-documenting, kind of baroque? An API that is almost entirely callbacks. Hard to wrap your head around, but seems to be pretty flexible… How better to write a parser API anyway? (You get the idea.) And as you design apps/frameworks yourself, then have to go through several cycles of adding features, refactoring, you start to think differently about structuring apps/frameworks that make the inevitable future work easier. Perhaps you break the features of a monolithic app into libraries/services… None of this is novel; it's just that doing enough of it (putting in the sweat and hours, screwing up a number of times) is where "taste" (insight?) comes from. It's no different from anything else. Perhaps the best way to accelerate the above though is to give a junior dev ownership of an app (or if that is too big of a bite, then a piece of a thing). "We need an image cache," you say to them. And then it's theirs. They whiteboard it, they prototype it, they write it, they fix the bugs, they maintain it, they extend it. If they have to rewrite it a few times over the course of its lifetime (until it moves into maintenance mode), that's fine. It's exactly how they'll learn. But it takes time. | | |
| ▲ | tstrimple 27 minutes ago | parent [-] | | This answer probably feels unsatisfying and I agree. But some things actually need repetition and ongoing effort. One of my favorite quotes is from Ira Glass about this very topic. > Nobody tells this to people who are beginners, and I really wish somebody had told this to me. > All of us who do creative work, we get into it because we have good taste. But it's like there is this gap. For the first couple years that you're making stuff, what you're making isn't so good. It’s not that great. It’s trying to be good, it has ambition to be good, but it’s not that good. > But your taste, the thing that got you into the game, is still killer. And your taste is good enough that you can tell that what you're making is kind of a disappointment to you. A lot of people never get past that phase. They quit. > Everybody I know who does interesting, creative work they went through years where they had really good taste and they could tell that what they were making wasn't as good as they wanted it to be. They knew it fell short. Everybody goes through that. > And if you are just starting out or if you are still in this phase, you gotta know its normal and the most important thing you can do is do a lot of work. Do a huge volume of work. Put yourself on a deadline so that every week or every month you know you're going to finish one story. It is only by going through a volume of work that you're going to catch up and close that gap. And the work you're making will be as good as your ambitions. > I took longer to figure out how to do this than anyone I’ve ever met. It takes awhile. It’s gonna take you a while. It’s normal to take a while. You just have to fight your way through that. > —Ira Glass |
|
|
|
|
| ▲ | elevation 6 hours ago | parent | prev | next [-] |
I’m with you here. I grew up without a mentor and my understanding of software stalled at certain points. When I couldn’t get a particular OS API to work, Google and Stack Overflow didn’t exist, and I had no one around me to ask. I wrote programs for years by just working around it. After decades writing software I have done my best to be a mentor to those new to the field. My specialty is the ability to help people understand the technology they’re using; I’ve helped juniors understand and fix linker errors, engineers understand ARP poisoning, high school kids debug their robots. I’ve really enjoyed giving back. But today, pretty much anyone down to a middle schooler could type their problems into ChatGPT and get a more direct answer than I would be able to give. No one particularly needs mentorship as long as they know how to use an LLM correctly. |
| |
| ▲ | atonse 5 hours ago | parent | next [-] | | Today every single software engineer has an extremely smart and experienced mentor available to them 24/7. They don't have to meet them for coffee once a month to ask basic questions. That said, I still feel strongly about mentorship though. It's just that you can spend your quality time with the busy person on higher-level things, like relationship building, rather than more basic questions. | | |
| ▲ | Ronsenshi 4 hours ago | parent [-] | | How would this affect future generations of ... well, anyone, when they have 24/7 access to an extremely smart mentor who will find a solution to pretty much any problem they might face? Can't just offload all the hard things to the AI and let your brain waste away. There's a reason the brain is equated to a muscle - you have to actively use it to grow it (not physically in size, obviously). | | |
| ▲ | atonse 4 hours ago | parent | next [-] | | I agree with you about using our brains. I honestly have no idea. But I can tell you that, just like with most things in life, this is yet another area where we increasingly get to do just the things we WANT to do (thinking about code or features and having them appear, pixel pushing, smoothing out the actual UX, porting to faster languages) and skip the drudgery most people never wanted (writing tests, formatting code, refactoring manually, updating documentation, manually moving tickets around like a caveman). Or, to use a non-tech example, spending hours fixing Word document formatting. So we're getting more spoiled. For example, kids have never waited for a table at a restaurant for more than 20 mins (which most people used to do all the time before abundant food delivery and reservation systems). Not that we ever enjoyed it, but learning to be bored, learning to not just get instant gratification, is something that's happening all over in life. Now it's happening even with work. So I honestly don't know how it'll affect society. | |
| ▲ | ipaddr 2 hours ago | parent | prev [-] | | Just because you have every instruction manual doesn't mean you can follow and perform the steps, have the time to, or can adapt to a real-world situation. |
|
| |
| ▲ | socalgal2 2 hours ago | parent | prev | next [-] | | I have this feeling as well. At one point I thought when I got older it might be nice to teach - Steve Wozniak apparently does. But, it doesn't feel like I can really add much. Students have infinite teachers on youtube, and now they have Gemini/Claude/ChatGPT which are amazing. Sure, today, maybe, I could see myself as mostly a chaperone in some class to once in a while help a student out with some issue but that possibility seems like it will be gone in 1 to 2 years. | |
| ▲ | simonw 4 hours ago | parent | prev [-] | | "No one particularly needs mentorship as long as they know how to use an LLM correctly." The "as long as they know how..." is doing a lot of work there. I expect developers with mentors who help give them the grounding they need to ask questions will get there a whole lot faster than developers without. |
|
|
| ▲ | YZF 4 hours ago | parent | prev | next [-] |
| I consider myself very good at writing software. I built and shipped many projects. I built systems from zero. Embedded, distributed, SaaS, you name it. I'm having a lot of fun with AI. Any idiot can't prompt their way to the same software I can write. Not yet anyways. |
|
| ▲ | seanmcdirmid 2 hours ago | parent | prev | next [-] |
| > My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM. My experience is the opposite. Those with a passion for the field and the ability to dig deeply into systems are really excited right now (literally all that power just waiting to be guided to do good...and oh does it need guidance!). Those who were just going through the motions and punching a clock are pretty unmotivated and getting ready to exit. Sometimes I dream about being laid off from my FAANG job so I have some time to use this power in more interesting ways than I'm doing at work (although I already get to use it in fairly interesting ways in my job). |
|
| ▲ | ilc 5 hours ago | parent | prev | next [-] |
| As a Principal SWE who has done his fair share of big stuff, I'm excited to work with AI. Why? Because it magnifies the thing I do well: make technical decisions. Coding is ONE place I do that, but architecture, debugging, etc. all use that same skill: making good technical decisions. And if you can make good choices, AI is a MEGA force multiplier. You just have to be willing to let go of the reins a hair. |
| |
| ▲ | herdymerzbow 3 hours ago | parent [-] | | As a self-teaching beginner* this is where I find AI a bit limiting. When I ask ChatGPT questions about code it is always quick to offer up a solution, but it often provides inappropriate responses that don't take into account the full context of a project/task. While it understands what good structure and architecture are, it's missing the awareness to apply good design and architecture to the questions I ask, and I don't have the experience or skill set to ask those questions. It often suggests solutions (I tend to ask it for suggestions rather than full code, so I can work it out myself) that may have drawbacks I only discover down the line. Any suggestions to overcome this deficit in design experience? My best guess is to read some texts on code design, or alternatively get a job at a place where I can learn design in practice. Mainly learning JavaScript and web app development at the moment. *Who has had a career in a previous field, and doesn't necessarily think that learning programming will lead to another career (and is okay with that). |
|
|
| ▲ | visarga an hour ago | parent | prev | next [-] |
| > I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Any idiot can now prompt their way to the same software. Do you like the craft of programming more than the outcomes? Now you are in a better position than ever to achieve things. |
|
| ▲ | BatFastard 6 hours ago | parent | prev | next [-] |
| IMHO any idiot can create a piece of crap.
It takes experience to create good software.
Use your experience, Luke! Now you have a team of programmers to create whatever you fancy! It's been great for me, but I have only been programming C++ for 36 years. |
|
| ▲ | Ronsenshi 4 hours ago | parent | prev | next [-] |
| Same here, although hopefully I won't be retiring soon. What's missing from this is that iconic phrase that all the AI fans love to use: "I'm just having fun!" This AI craze reminds me of a friend. He was always artistic, but because of the way life goes he never really had the opportunity to actively pursue art and drawing skills. When AI first came out, and specifically MidJourney, he was super excited about it and used it a lot to make tons and tons of pictures for everything his mind could think of. However, after a while this excitement waned and he realized that he hadn't actually learned anything at all. At that point he decided to find some time to practice drawing, to be able to make things himself with his own skills, not via some chip on the other side of the world, and he has greatly improved in the past couple of years. So, AI can certainly help create all the "fun!!!" projects for people who just want to see the end result, but in the end would they actually learn anything? |
| |
| ▲ | pizza 4 hours ago | parent [-] | | I mean. Sounds like the guy had existing long term goals, needed to overcome an activation threshold, and used AI as a catalyst to just get started. Seems like, behaviorally, AI was pivotal for him to learn things, even if the things he learned came from elsewhere / his own effort. | | |
| ▲ | Ronsenshi 3 hours ago | parent [-] | | I suppose, yes, AI was like a kickstart. But the point is - he didn't just stick to AI, he realized that in terms of skill and fulfillment it's a no-go direction. Because you neither learn anything, nor create anything yourself. | | |
| ▲ | bitwize an hour ago | parent [-] | | I feel the same way. But this is a new economy now, software is cheap, and regarding the skill and fulfillment you derive writing it yourself, to quote Chris Farley: "that and a nickel will get you a nice hot cup of JACK SQUAT!!!" |
|
|
|
|
| ▲ | ipaddr 2 hours ago | parent | prev | next [-] |
| I find fun in using opencode and Claude to create projects but I can't find the energy to run the project or read the code. Watching this program do stuff is more enjoyable than using or looking at the stuff produced. But it doesn't produce code that looks or is designed the way I would normally. And it can't do the difficult or novel things. |
|
| ▲ | pelcg 4 hours ago | parent | prev | next [-] |
| > As a principal engineer I feel completely let down. I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Any idiot can now prompt their way to the same software. I feel depressed and very unmotivated and expect to retire soon. Talk about a rug pull! Really? The vibe coders are running into a dark forest with a bunch of lobsters (OpenClaw), getting lost and confused in their own tech debt, and you're saying they can prompt their way to the same software? Someone just ended up wiping their entire production database with Claude, and you believe your experience counts for nothing to companies that need stable infrastructure and predictability? Cognitive debt is a real thing, and being unable to read or fix broken code is going to be an increasing problem which experienced engineers can solve. Do not fall for the AI agent hype. |
| |
| ▲ | Ronsenshi 4 hours ago | parent [-] | | > Do not fall for the AI agent hype. Problem is, it's the people in higher positions who should be aware of that, except they don't care. All they would see is how much more profit the company can make if it reduces its workforce. Plenty of engineers do realize that AI is not some magical solution to everything - but the money and hype tend to overshadow cooler heads on HN. |
|
|
| ▲ | oulu2006 2 hours ago | parent | prev | next [-] |
| I don't find the same. Like you, I'm a principal/CTO-level engineer, and there's a world of difference between simplistic prompt/vibe coding and building a properly architected/performant/maintainable system with agentic coding. |
|
| ▲ | dilap 3 hours ago | parent | prev | next [-] |
| I thought this was parody until the last sentence. |
|
| ▲ | michaelhoney 3 hours ago | parent | prev | next [-] |
| I think it’s important for you to understand that there were always way more people who loved programming than were able to work professionally as high-level coders. Sure, if you spent most of your working life writing code, you’d be very proficient. But for many, many others, they haven’t been able to spend the time developing those muscles. Modern LLMs really are a joyful experience for people who enjoy software creation but haven’t had the 10,000 hours. |
|
| ▲ | JSR_FDED 4 hours ago | parent | prev | next [-] |
| I urge you to actually try these tools. You will very quickly realize you have nothing to worry about. In the hands of a knowledgeable engineer these tools can save a lot of drudge work because you have the experience to spot when they’re going off the rails. Now imagine someone who doesn’t have the experience, and is not able to correct where necessary. Do you really think that’s going to end well? |
| |
| ▲ | slopinthebag 39 minutes ago | parent [-] | | Yeah, even just now I had to go and correct some issues with LLM output that I only knew were an issue because I have extensive experience with that domain. If I didn't have that I would not have caught it, and it would have been a major issue down the line. LLMs remove much of the drudgery of programming that we unfortunately sort of did to ourselves collectively. |
|
|
| ▲ | LPisGood 5 hours ago | parent | prev | next [-] |
| On the bright side, working in tech between 2006 and 2026 means you should be extremely wealthy and able to retire comfortably. |
| |
| ▲ | niorad 16 minutes ago | parent | next [-] | | In SV, probably. As a lead FE dev with 14 yoe in Munich I'm at 85k€; that's not even enough to pay off a loan for a house around here. | |
| ▲ | bcrosby95 4 hours ago | parent | prev [-] | | Uh if you worked for a top company or something. Most tech workers have made relatively ordinary salaries the last 20 years. |
|
|
| ▲ | therealdrag0 4 hours ago | parent | prev | next [-] |
| No offense, but you sound more like a "principal coder", not a principal engineer. At least in many domains and orgs, most principal engineers are already spending most of their time not coding. But -engineering- still takes up much or most of their time. I felt what you describe. But it lasted like a week in December. Otherwise there's still tons of stuff to build, and my teams need me to design the systems and review their designs. And their prompt machine is not replacing my good sense. There's plenty of engineering to do, even if the coding writes itself. |
| |
| ▲ | dionian 4 hours ago | parent [-] | | I make documentation and diagrams for myself rather than writing code much of the time |
|
|
| ▲ | bitfilped an hour ago | parent | prev | next [-] |
| I'm not sure why you feel devalued or let down, LLM code is a joke and will be a thing of the past after everyone has had their production environment trashed for the nth time by "AI." |
|
| ▲ | nurettin 2 hours ago | parent | prev | next [-] |
| Completely the opposite experience here! I am a tech lead with decades of experience with various programming languages. When it comes to producing code with an LLM, most noobs get stuck producing spaghetti and rolling over. It is so bad that I have to go prompt-fix their randomly generated architecture, de-duplicate, vectorize, and simplify. If they lack domain knowledge on top of being a noob, it is a complete disaster. I saw LLM code pick a bad default (0) for a denominator and then "fix" that by replacing it with epsilon. It isn't the end, it is a new beginning. And I'm excited. |
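| The denominator bug described above can be sketched roughly like this (a hypothetical minimal illustration of the anti-pattern, not code from the actual incident; all names made up):

```python
# Hypothetical sketch of the anti-pattern: an LLM picks a bad default (0)
# for a denominator, then "fixes" the resulting ZeroDivisionError by adding
# an epsilon instead of questioning the default itself.

EPS = 1e-9

def average_llm_style(total: float, count: float = 0.0) -> float:
    # The generated "fix": division no longer crashes, but a forgotten
    # count now silently yields an astronomically large nonsense value.
    return total / (count + EPS)

def average_reviewed(total: float, count: float) -> float:
    # The reviewed version: no default, and invalid input fails loudly
    # at the call site instead of propagating garbage downstream.
    if count <= 0:
        raise ValueError("count must be positive")
    return total / count
```

Called with a forgotten `count`, the first version returns roughly `total * 1e9` instead of raising, which is exactly the kind of silently wrong result that takes domain experience to spot. |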
|
| ▲ | bcrosby95 4 hours ago | parent | prev | next [-] |
| Really? I love LLMs because I can't stand the process of taking the model in my brain and putting it in a file. Flow State is so hard for me to hit these days. So now I spec it out, feed it to an LLM, and monitor it while having a cup of tea. If it goes off the rails (it usually does) I redirect it. Way better than banging it out by hand. |
| |
| ▲ | bitfilped an hour ago | parent [-] | | It's only going to get harder to achieve if you keep letting your skills and reasoning abilities rot from LLM reliance. |
|
|
| ▲ | jorl17 23 minutes ago | parent | prev | next [-] |
| In my experience, the truly best in class have gone from being 10x engineers to being 100x engineers, assuming they embrace AI. It's incredible to watch. I wouldn't say I'm a 10x-er, but I'm comfortable enough with my abilities nowadays to say I am definitely "above average", and I feel beyond empowered. When I joined college 15 years ago, I felt like I was always 10 steps ahead of everyone else, and in recent years that feeling had sort of faded. Well, I've got that feeling back! So much of the world around me feels frozen in place, whereas I am enjoying programming perhaps as much as when I learned it as a little kid. I didn't know I MISSED this feeling, but I truly did! Everything in my daily life (be it coding or creating user stories — who has time to use a mouse when you can MCP to JIRA/notion/whatever?) is happening at an amazing speed and with provable higher levels of quality (more tests, better end-user and client satisfaction, more projects/leads closed, faster development times, less bug reports, etc.). I barely write lines of code, and I barely type (often just dictate to MacWhisper). I completely understand different people like different things. Had you asked me 5 years ago I probably would have told you I would be miserable if I stopped "writing" code, but apparently what I love is the problem solving, not the code churning. I'm not trying to claim my feelings are right, and other people are "wrong" for "feeling upset". What is "right" or "wrong" in matters of feelings? Perhaps little more than projection or a need for validation. There is no "right" or "wrong" about this! If I now look at average-to-low-tier-engineers, I think they are a mixed bag with AI on their hands. Sometimes they go faster and actually produce code as good as or better than before. Often, though, they lack the experience, "taste" or "a priori knowledge" to properly guide LLMs, so they churn lots of poorly designed code. I'd say they are not a net-positive. 
But Opus 4.6 is definitely turning the tide here, making it less likely that average engineers do as much damage as before (e.g. with a Sonnet-level model) On top of this divide within the "programming realm", there's another clear thing happening: software has finally entered the DIY era. Previously, anyone could already code, but...not really. It would be very difficult for random people to hack something quickly. I know we've had the terms "Script kiddies" for a long time, but realistically you couldn't just wire your own solution to things like you can with several physical objects. In the physical world, you grab your hammer and your tools and you build your DIY solutions — as a hobby or out of necessity. For software...this hadn't really been the case....until now! Yes, we've had no-code solutions, but they don't compare. I know 65 year olds who have never even written a line of code that are now living the life by creating small apps to improve their daily lives or just for the fun of it. It's inspiring to see, and it excites me tremendously for the future. Computers have always meant endless possibilities, but now so many more people can create with computers! To me it's a golden age for experimentation and innovation! I could say the same about music, and art creation. So many people I know and love have been creating art. They can finally express themselves in a way they couldn't before. They can produce music and pictures that bring tears to my eyes. They aren't slop (though there is an abundance of slop out there — it's a problem), they are beautiful. There is something to be said about the ethical implications of these systems, and how artists (and programmers, to a point?) are getting ripped off, but that's an entirely different topic. It's an important topic, but it does not negate that this is a brand new world of brand new artists, brand new possibilities, and brand new challenges. Change is never easy — often not even fair. |
|
| ▲ | bitwize 4 hours ago | parent | prev | next [-] |
| What I keep hearing is that the people who weren't very good at writing software are the ones reluctant to embrace LLMs because they are too emotionally attached to "coding" as a discipline rather than design and architecture, which are where the interesting and actually difficult work is done. |
| |
| ▲ | Ronsenshi 4 hours ago | parent [-] | | Really? To me it seems that quite the opposite is true - people who were never very good at writing code are excited about LLMs because suddenly they can pretend to be architects without understanding what's happening in the codebase. Same as with AI-art, where people without much drawing skills were excited about being able to make "art". | | |
| ▲ | sawmurai 3 hours ago | parent [-] | | Perhaps you are both right. People who see coding as a means to an end enjoy LLMs while people who saw it as the most enjoyable part don’t. |
|
|
|
| ▲ | rendall an hour ago | parent | prev | next [-] |
| I know that your post has lots of comments, but I'd like to weigh in kindly too. > I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Listen to the comments that say that experience is more valuable than ever. > Any idiot can now prompt their way to the same software. No, they cannot. You and an LLM can build something together far more powerful and sophisticated than you ever could have dreamt, and you can do it because of your decades of experience. A newbie cannot recognize the patterns of a project gone bad without that experience. > I feel depressed and very unmotivated and expect to retire soon. Welcome to the industry. :) It happens. Why not take a break? Work on a side project, something you love to do. > My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM. Once upon a time painters and illustrators were not "artists", but archivists and documenters. They were hired to record what something looked like, and they were largely evaluated on that metric alone. When photography took that role, painters and illustrators had to re-evaluate their social role, and they became artists and interpreters. Impressionism, surrealism, and conceptualism are examples of art movements that, in my interpretation, were still grappling with that shift decades, nearly a century, later. Today, we SWEs are grappling with a very similar shift. People using LLMs to create software are not poor coders any more (or less) than photographers were poor painters. Painters and illustrators became very valuable after the invention of photography, arguably more valuable socially than before. |
|
| ▲ | adampunk 4 hours ago | parent | prev [-] |
| Why did you leave this as a comment on someone talking about how happy they were about their own experience? |