| ▲ | GalaxyNova 5 hours ago |
> I don’t read code anymore

Never thought this would be something people actually take seriously. It really makes me wonder if in 2-3 years there will be so much technical debt that we'll have to throw away entire pieces of software.
| ▲ | subsection1h 3 hours ago | parent | next [-] |
> Never thought this would be something people actually take seriously

The author of the article has a bachelor's degree in economics[1], worked as a product manager (not a dev), and only started using GitHub[2] in 2025 when they were laid off[3].

[1] https://www.linkedin.com/in/benshoemaker000/
| ▲ | sho_hn 4 hours ago | parent | prev | next [-] |
> Never thought this would be something people actually take seriously.

You have to remember that the number of software developers swelled massively in the last 20 years, and many of these folks are bootcamp-educated web/app dev types, not John Carmack. Statistically, under pre-AI circumstances, they started too late and for the wrong reasons to become very skilled in the craft by middle age (of course there are many wonderful exceptions; one of my best developers is someone who worked in a retail store for 15 years before pivoting).

AI tools are now available to everyone, not just the developers who were already proficient at writing code. When you take in the excitement, you always have to consider what it does for the average developer and those below average: a chance to redefine yourself, be among the first doing a new thing, skip over many years of skill-building and, as many of them would put it, focus on results.

It's totally obvious why many leap at this, and it's probably even what they should do, individually. But it's a selfish concern, not a care for the practice as such. It also results in a lot of performative blog posting. Still, if it were you, you might well do the same to get ahead in life. There are only so many opportunities to get in on something on the ground floor.

I feel a lot of senior developers don't take the demographics of our community of practice into account when they try to understand the reception of AI tools.
| ▲ | sixdimensional 5 hours ago | parent | prev | next [-] |
Half serious, but is that really so different from many apps written by humans?

I've worked on "legacy systems" written 30 to 45 years ago (or more) and still running today (things like green-screen apps written in Pick/BASIC, COBOL, etc.). Some of them were written once and had subsystems replaced, but some of it is original code.

In systems written in the last, say, 10 to 20 years, I've seen drastic rates of change, sometimes full rewrites every few years. This seemed to go hand in hand with the rise of agile development (neither condemning nor approving of it), where rapid rates of change were expected, and often the tech the system was written in was changing rapidly too. In hardware engineering, I also personally saw a huge move to more frequent design and implementation refreshes to prevent obsolescence issues (some might call this "planned obsolescence", but it is done for valid reasons as well).

I think not reading the code anymore TODAY may be a bit premature, but I don't think it's impossible that someday, in the nearer rather than farther future, we might be at a point where generative systems have more predictability and maybe even get certified for the safety, etc. of the generated code, leading to truly not reading the code.

I'm not sure it's a good future, or that it's tomorrow, but it might not be beyond the next 20-year timeframe either; it might be sooner.
| ▲ | Aeolun 4 hours ago | parent | prev | next [-] |
> 2 - 3 years there will be so much technical debt that we'll have to throw away entire pieces of software.

That happens just as often without AI. Maybe the people that like it all have experience with trashing multiple sets of products over the course of their lives?
| ▲ | binsquare 5 hours ago | parent | prev | next [-] |
Reading and understanding code is more important than writing it, imo.
| ▲ | strken 3 hours ago | parent | prev | next [-] |
I'm torn between running away to be an electrician or just waiting three years until everyone realises they need engineers who can still read. Sometimes it feels like pre-AI education is going to be like low-background steel for skilled employees.
| ▲ | joriJordan 4 hours ago | parent | prev | next [-] |
Remember, though, this forum is full of people who treat code as objects when it's just state in a machine. We have been throwing away entire pieces of software forever. Where's Novell? Who runs 90s Linux kernels in prod? Code isn't a bridge or a car; preservation isn't meaningful. If we aren't shutting the DCs off, we're still burning the resources regardless of whether we save old code or not.

Most coders are so many layers of abstraction above the hardware at this point that they may as well consider themselves syntax artists as much as programmers, and think of GitHub as DeviantArt for syntax fetishists.

I'm working on a model of /home to experiment with booting Linux to models. I can see a future where the Python on my screen "runs" without an interpreter, because the model is capable of correctly generating the appropriate output without one. Code is an ethno-object; it only exists socially. It's not essential to computer operations. At the hardware level it's arithmetical operations against memory states.

I'm also working on my own "geometric primitives" models that know how to draw GUIs, 3D world primitives, and text; think "boot to Blender". Rather than store data in strings, I'll just scaffold out vectors to a running "desktop metaphor". It's just electromagnetic geometry, delta sync between memory and display: https://iopscience.iop.org/article/10.1088/1742-6596/2987/1/...
| ▲ | j_bizzle 2 hours ago | parent | prev | next [-] |
The coincidental timing between the rapid increase in the number of emergency fixes coming out on major software platforms and the proud announcements of the amount of code being produced by AI at the same companies is remarkable. I think 2-3 years is generous.

Don't get me wrong, I've definitely found huge productivity increases in using various LLM workflows in both development and operational work. But removing a human from the loop entirely at this point feels reckless, bordering on negligent.
| ▲ | bloomca 2 hours ago | parent | prev | next [-] |
If the models don't get to the point where they can make correct fixes on their own, then yeah, everything will be falling apart. There is just no other way around increasing entropy.

The only way to harness it is to somehow package code-producing LLMs into an abstraction and then somehow validate the output. Until we achieve that, imo it doesn't matter how closely people watch the output; things will keep getting worse.
| ▲ | straydusk 3 hours ago | parent | prev | next [-] |
I actually think this is fair to wonder about. My overall stance is that it's better to lean into the models & the tools around them improving. Even in the last 3-4 months, the tools have come an incredible distance.

I bet some AI-generated code will need to be thrown away. But that's true of all code. The real questions to me are: are the velocity gains worth it? Will the models be so much better in a year that they can fix those problems themselves, or rewrite it? I feel like time will answer that.
| ▲ | Computer0 5 hours ago | parent | prev | next [-] |
I have wondered the same, but for the projects I am completely "hands off" on, the model improvements have overcome this issue time and time again.
| ▲ | rustyhancock 4 hours ago | parent | prev | next [-] |
In 2-3 years from now, if coding AI continues to improve at this pace, I reckon people will rewrite entire projects. I can't imagine not reading the code I'm responsible for, any more than I could imagine not looking out the windscreen in a self-driving Tesla. But if so many people are already there, and mostly highly skilled programmers, imagine in 2 years' time with people who've never programmed!
| ▲ | well_ackshually 3 hours ago | parent | prev | next [-] |
Also take something into account: absolutely _none_ of the vibe-coding influencer bros make anything more complicated than a single-feature webapp that's already been implemented 50 times. They've never built anything complicated either, or maintained something for more than a few years with all the warts that entails.

Literally, from his bio on his website:

> For 12 years, I led data and analytics at Indeed - creating company-wide success metrics used in board meetings, scaling SMB products 6x, managing organizations of 70+ people.

He's a manager that made graphs in Power BI. They're not here because they want to build things; they're here to shit a product out and make money. By the time Claude has stopped being able to pipe together ffmpeg commands or glue together 3 JS libraries, they've moved on to another project, and whoever bought it is a sucker. It's not that much different from the companies of the 2000s promising a 5th-generation language with a UI builder that would fix everything.

And then, as a very last warning: the author of this piece sells AI consulting services. It's in his interest to make you believe everything he has to say about AI, because by God are there going to be suckers buying his time at indecently high prices to get shit advice. This sucker is most likely your boss, by the way.
Yes, and you can rebuild them for free | |||||||||||||||||||||||||||||
| ▲ | RA_Fisher 5 hours ago | parent | prev | next [-] |
Claude, Codex and Gemini can read code much faster than we can. I still read snippets, but mostly I have them read the code.
| ▲ | Hamuko 4 hours ago | parent | prev | next [-] |
I've seen software written and architected by Claude, and I'd say it's already ready to be thrown out. Security sucks, performance will probably suck, maintainability definitely sucks, and UX really fucking sucks.
| ▲ | ekidd 4 hours ago | parent | prev | next [-] |
I have a wide range of Claude Code-based setups, including one with an integrated issue tracker and parallel swarms. And for anything really serious? Opus 4.5 struggles to maintain a large-scale, clean architecture, and the resulting software is often really buggy.

Conclusion: if you want quality in anything big in February 2026, you still need to read the code.
| ▲ | cdfuller 4 hours ago | parent | prev [-] |
With LLMs advancing so rapidly, I think all the AI slop code written today will be easily digestible by the LLMs a few generations down the line. I think there will be a lot of improvements in making user intent clearer. Combined with larger context windows, refactoring even a bad codebase won't be a challenge.