The next two years of software engineering (addyosmani.com)
92 points by napolux 9 hours ago | 55 comments
babblingfish 6 hours ago | parent | next [-]

My experience hasn't been that LLMs automate coding, just that they speed it up. It's like I know what I want the solution to be, I'll describe it to the LLM, usually one specific code block at a time, and then build it up block by block. When I read Hacker News, people are talking like it's doing much more than that. It doesn't feel like an automation tool to me at all. It just helps me do what I was going to do anyway, but without having to look up library function calls and language-specific syntax.

Aurornis 6 hours ago | parent | next [-]

> My experience hasn't been that LLMs automate coding, just that they speed it up.

This is how basically everyone I know actually uses LLMs.

The whole story about vibecoding and LLMs replacing engineers has become a huge distraction from the really useful discussions to be had. It’s almost impossible to discuss LLMs on HN because everyone is busy attacking the vibecoding strawman all the time.

kylecazar 5 hours ago | parent [-]

Half strawman -- a mudman, perhaps. Because we're seeing proper experts with credentials jump on the 'shit, AI can do all of this for me' realization blog post train.

eaurouge 5 hours ago | parent [-]

So another strawman?

jvans 5 hours ago | parent | prev | next [-]

I notice a huge difference between working on large systems with lots of microservices and building small apps or tools for myself. The large-system work is what you describe, but for small apps and tools I resonate with the automate-coding crowd.

I've built a few things end to end where I can verify the tool or app does what I want and I haven't seen a single line of the code the LLM wrote. It was a creepy feeling the first time it happened but it's not a workflow I can really use in a lot of my day to day work.

antonymoose 6 hours ago | parent | prev | next [-]

It's a better Google for me. Instead of searching AWS or StackOverflow, it hallucinates good-enough output that I can refactor into something usable.

petesergeant 5 hours ago | parent | prev [-]

I'm doing both. For production code that I care about, I'm reading every line the LLM writes, correcting it a lot, and chatting with an observer LLM that's checking the work the first LLM and I are writing. It's speeding stuff up, and it also reduces the friction of starting on things. Definitely a time saver.

Then I have some non-trivial side projects where I don’t really care about the code quality, and I’m just letting it run. If I dare look at the code, there’s a bunch of repetition. It rarely gets stuff right the first time, but that’s fine, because it’ll correct it when I tell it it doesn’t work right. Probably full of security holes, code is nasty, but it doesn’t matter for the use-cases I want. I have produced pieces of software here that are actively making my life better, and it’s been mostly unsupervised.

ch4s3 5 hours ago | parent | prev | next [-]

> junior developer employment drops by about 9-10% within six quarters, while senior employment barely budges. Big tech hired 50% fewer fresh graduates over the past three years.

This study showing a 9-10% drop is odd [1] and I'm not sure about their identification criteria.

> We identify GenAI adoption by detecting job postings that explicitly seek workers to implement or integrate GenAI technologies into firm workflows.

Based on that MIT study it seems like 90+% of these projects fail. So we could easily be seeing an effect where firms posting these GenAI roles are burning money on the projects in a way that displaces investment in headcount.

The point about "BigTech" hiring 50% fewer grads is almost orthogonal. All of these companies are shifting hiring towards things where new grads are unlikely to add value: building data centers and frontier work.

Moreover, the TCJA of 2017 changed the rules so that software development costs no longer count as immediately deductible R&D write-offs (I'm oversimplifying), starting in 2022. This surely has more of an effect than whatever the "GenAI integrator role" postings correlate with.

[1] https://download.ssrn.com/2025/11/6/5425555.pdf

wefzyn 4 hours ago | parent | next [-]

AI became very popular suddenly. This is something that wasn't in anyone's budget. I believe the cost savings from hiring freezes and layoffs are meant to pay for AI projects and infrastructure.

ch4s3 4 hours ago | parent [-]

Right, so you shift budget away from other things. The "study" looked at AI integration job listings. You have to budget for those.

garbawarb 5 hours ago | parent | prev [-]

Hiring was booming until about 2020 though.

ch4s3 4 hours ago | parent [-]

The TCJA change (of 2017) went into effect in 2022, I should have been more clear.

garbawarb 4 hours ago | parent [-]

I didn't know that but that makes perfect sense. A lot of layoffs and outsourcing coincided with that. Are there any signs it'll be reintroduced?

ch4s3 4 hours ago | parent [-]

It was reintroduced late last year.

hncoder12345 43 minutes ago | parent | prev | next [-]

Sometimes I wonder if I made the wrong choice with software development. Even after getting to a senior role, according to this article, you're still expected to get more education and work on side projects outside of work. Am I supposed to want to code all the time? When can I pursue hobbies, a social life, etc.?

johnfn 32 minutes ago | parent | next [-]

To put it very directly: if you are OK with being good but not exceptional at your job, this is totally fine. If you want to be exceptional, you will probably need to put in the extra work. Not everyone is OK with this tradeoff, and it's totally fine to "just" be good and care more about having outside hobbies, a social life, etc.

jedberg 34 minutes ago | parent | prev [-]

It's funny you should ask this. When I started out, 30 years ago, here were the answers you'd get from most people:

> Am I supposed to want to code all the time?

Yes.

> When can I pursue hobbies,

Your hobby should be coding fun apps for yourself

> a social life, etc.

Your social life should be hanging out with other engineers talking about engineering things.

And the most successful people I know basically did exactly that.

I'm not saying y'all should be doing that now, I'm just saying, that is in fact how it used to be.

stack_framer 2 hours ago | parent | prev | next [-]

Funny that he mentions people not pivoting away from COBOL. My neighbors work for a bank, programming in COBOL every day. When I moved in and met them 14 years ago, I wondered how much longer they would be able to keep that up.

They're still doing it.

austin-cheney 8 hours ago | parent | prev | next [-]

I have been telling people that, titles aside, senior developers were the people not afraid to write original code. I don’t see LLMs changing this. I only envision people wishing LLMs would change this.

HarHarVeryFunny 7 hours ago | parent | next [-]

I disagree.

1) Senior developers are more likely to know how to approach a variety of tasks, including complex ones, in ways that work, and are more likely to (maybe almost subconsciously) stick to these proven design patterns rather than reinvent the wheel in some novel way. Even if the task itself is somewhat novel, they will break it down in familiar ways into familiar subtasks/patterns. For sure, if a task does require some thinking outside the box, or a novel approach, then the senior developer might have better intuition on what to consider.

The major caveat to this is that I'm an old-school developer, who started professionally in the early 80's, a time when you basically had to invent everything from scratch, so certainly there is no mental block to having to do so, and I'm aware there is at least a generation of developers that grew up with Stack Overflow and have much more of a mindset of building stuff using cut and paste, and less having to sit down and write much complex/novel code themselves.

2) I think the real distinction between senior and junior programmers, the one that will carry over into the AI era, is that senior developers have had enough experience, at increasing levels of complexity, that they know how to architect and work on large complex projects where a more junior developer will flounder. In the AI coding world, at least for the time being, until something closer to AGI is achieved (which could be 10-20 years away), you still need to be able to plan and architect the project if you want to achieve a result where the outcome isn't just some random "I let the AI choose everything" experiment.

austin-cheney 7 hours ago | parent | next [-]

I completely agree with your second point. For your first point my experience tells me the people least afraid to write original code are the people least oppositional to reinventing wheels.

The distinguishing behavior is not about the quantity of effort involved but the total cost after considering dependency management, maintenance time, and execution time. The people who reinvent wheels do so because they want to learn, and because they want the same kind of work to take less effort in the future.

BoiledCabbage 6 hours ago | parent | prev [-]

> in the early 80's, a time when you basically had to invent everything from scratch, so certainly there is no mental block to having to do so, and I'm aware there is at least a generation of developers that grew up with Stack Overflow and have much more of a mindset of building stuff using cut and paste, and less having to sit down and write much complex/novel code themselves.

I think this is really underappreciated and was big in driving how a lot of people felt about LLMs. I found it even more notable on a site named Hacker News. There is an older generation for whom computing was new, the 80s through the 90s probably being the prime of that era (for people still in the industry). There was constantly a new platform, language, technology, or concept to learn. And nobody knew any best practices, nobody knew how anything "should work". Nobody knew what anything was capable of. It was all trying things and figuring them out. It was way more trailblazing / exploring new territory, the birth of the internet being one of the last examples of this from that era.

The past 10-15 years of software development have been the opposite. Just about everything was evolutionary, rarely revolutionary: optimizing things for scale, improving libraries, or porting successful ideas from one domain to another. A lot of shifting around deck chairs on things that were fundamentally the same. Just about every new "advance" in front-end technology was this. Something hailed as groundbreaking really took little exploration, mostly solution-space optimization. There was almost always a clear path. Someone always had an answer on Stack Overflow; you were never "on your own". A generation+ grew up in that environment and it felt normal to them.

LLMs came about and completely broke that. And people who remembered when tech was new and full of potential, and nobody knew how to use it, loved that: here is a new alien technology and I get to figure out what makes it tick, how it works, how to use it. And on the flip side, people who were used to there being a happy path, or a manual to tell you when you were doing it wrong, got really frustrated at there being no direction, feeling perpetually lost and it not working the way they wanted.

I found it especially ironic, on a site named Hacker News, how few people seemed to have a hacker mindset when it came to LLMs. So much was "I tried something, it didn't work, so I gave up", or "I just kept telling it to work and it didn't, so I gave up". Explore; pretend you're in a sci-fi movie. Does it work better on Wednesdays? Does it work better if you stand on your head? Does it work differently if you speak pig Latin? Think sideways. What behavior can you find that makes you go "hmm, that's interesting..."?

Now I think there has been a shift very recently, with people getting more comfortable with the tech, but I was still surprised at how little of a hacker mindset I saw on Hacker News when it came to LLMs.

LLMs have reset the playing field from a well-manicured lawn to an unexplored wilderness. Figure out the new territory.

Terr_ 5 hours ago | parent | next [-]

To me, the "hacker" distinction is not about novelty, but understanding.

Bashing kludgy things together until they work was always part of the job, but that wasn't the motivational payoff. Even if the result was crappy, knowing why it was crappy and how it could've been better was key.

LLMs promise an unremitting drudgery of the "mess around until it works" part, facing problems that don't really have a cause (except in a stochastic sense) and which can't be reliably fixed and prevented going forward.

The social/managerial stuff that may emerge around "good enough" and velocity is a whole 'nother layer.

layer8 4 hours ago | parent | prev | next [-]

No, the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers. Case in point, you can’t really cleverly “hack” LLMs. It’s more a roll of the dice that you try to affect using hit-or-miss incantations.

hooverd 2 hours ago | parent | prev [-]

an unexplored wilderness that you pour casino chips into (unless you're doing local model stuff yea yea)

CSSer 8 hours ago | parent | prev [-]

I almost think what a lot of people are coming to grips with is how much code is unoriginal. The ones who've adjusted the fastest were humble to begin with. I don't want to claim the title, but I can certainly claim the imposter syndrome! If anything, LLMs validated something I always suspected: the amount of truly unique, relevant-to-success code in a given project is often very small. More often than not, it's not grouped together either. Most of the time it's tailored to a given functionality. For example, a perfectly accurate Haversine distance is slower than an optimized one with tradeoffs. LLMs have not yet become adept at identifying the need for those tradeoffs in context well or consistently, so you end up with generic code that works but isn't great. Can the LLM adjust if you explicitly instruct it to? Sure, sometimes! Sometimes it gets caught in a thought loop too. Other times you have to roll up your sleeves and do the work like you said, which often still involves traditional research or thinking.
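
To make the Haversine point concrete, here is a minimal sketch of that tradeoff; the function names and the flat-earth approximation below are illustrative, not something from the comment:

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius (itself an approximation)

    def haversine_km(lat1, lon1, lat2, lon2):
        # "Perfectly accurate" on a spherical Earth: great-circle distance.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    def equirectangular_km(lat1, lon1, lat2, lon2):
        # The cheaper tradeoff: treat the local patch as flat. Fine for ranking
        # nearby results, increasingly wrong near the poles or over long spans.
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return EARTH_RADIUS_KM * math.hypot(x, y)

    # London to Paris: the two agree to within a few hundred metres here,
    # but the approximation does far less trigonometry per call.
    print(haversine_km(51.5074, -0.1278, 48.8566, 2.3522))
    print(equirectangular_km(51.5074, -0.1278, 48.8566, 2.3522))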

osigurdson 4 hours ago | parent | prev | next [-]

>> The skillset is shifting from implementing algorithms to knowing how to ask the AI the right questions and verify its output.

The question is: how much faster is verification-only than writing the code by hand? You gain a lot of understanding when you write the code yourself, and understanding is a prerequisite for verification. The idea seems to be that a quick review ("LGTM") is all that should be needed. That's fine as long as you understand the tradeoffs you are making.

With today's AI you either trade speed for correctness or you have to accept a more modest (and highly project specific) productivity boost.

mellosouls 8 hours ago | parent | prev | next [-]

On the junior developer question:

A humble way for devs to look at this is that in the new LLM era we are all juniors now.

A new entrant with a good attitude, curiosity, and an interest in learning the traditional "meta" of coding (version control, specs, testing, etc.), plus a cutting-edge, first-rate grasp of using LLMs to assist their craft (as recommended in the article), will likely be more useful in a couple of years than a "senior" dragging their heels or dismissing LLMs as hype.

We aren't in coding Kansas anymore, junior and senior will not be so easily mapped to legacy development roles.

tommica 34 minutes ago | parent | prev | next [-]

One thing that fucks with juniors is the expectation of paying for subscriptions to AI models. If you need to know how the AI tools work, you have to learn them with your own money.

Not everyone can afford that, and then we've turned a field that was so proud of needing just a computer and internet access to teach yourself into one that requires a subscription service.

ares623 7 minutes ago | parent [-]

If the AI gets so good then they shouldn’t need to pre-learn.

Eong 5 hours ago | parent | prev | next [-]

Love the article. I struggled with my new identity and so had to write https://edtw.in/high-agency-engineering/ for myself, but I also came to the realisation that the industry is shifting too, especially for junior engineers.

Curious how the specialist vs. generalist theme plays out: who is going to feel it *first* as AI gets better over time?

PraddyChippzz 7 hours ago | parent | prev | next [-]

The points mentioned in the article regarding the things to focus on are spot on.

bradleyjg 5 hours ago | parent | prev | next [-]

The bottom up and top down don’t seem to match.

Where is all the new and improved software output we’d expect to see?

streetcat1 3 hours ago | parent | prev | next [-]

For some reason the article misses two important points:

1) The AI code maintenance question: who will maintain the AI-generated code?
2) The true cost of AI: once the VC/PE money runs out and companies charge the full cost, what happens to vibe coding at that point?

NitpickLawyer an hour ago | parent [-]

I think this post is a great example of a different point made in this thread. People confuse vibe-coding with LLM-assisted coding all the time (no shade to you, OP). There is an implied bias that all LLM code is bad, unmaintainable, incomprehensible. That's not necessarily the case.

1) Either you, the person owning the code, or you + LLMs, or just the LLMs in the future. All of them can work, and they can work better with a bit of prep work.

The latest models are very good at following instructions. So instead of "write a service that does X", you can use the tools to ask for specifics (e.g. write a modular service that uses concept A and concept B to do Y; it should use x y z tech stack; it should follow this ruleset and these conventions; before testing, run these linters and formatters; fix every env error before testing; etc.).

That's the main difference between vibe-coding and llm-assisted coding. You get to decide what you ask for. And you get to set the acceptance criteria. The key po9int that non-practitioners always miss is that once a capability becomes available to these models, you can layer them on top of previous capabilities and get a better end result. Higher instruction adherence -> better specs -> longer context -> better results -> better testing -> better overall loop.

2) You are confusing the fact that some labs subsidise inference costs (in exchange for access to data, usage metrics, etc.) with the true cost of inference for a given model size. You can already get a good indication of what the cost is today for any given model size. Third-party inference shops exist today, and they are not subsidising the costs (they have no reason to). You can do the math as well and figure out an average cost per token for a given capability. And those open models are out, they're not gonna change, and you can get the same capability tomorrow or in 10 years (and likely at lower cost, since hardware improves, the inference stack improves, etc.).
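
As a back-of-the-envelope sketch of that math (every price and token count below is a placeholder assumption, not a real quote from any provider):

    # Rough cost-per-task estimate for third-party (non-subsidised) inference.
    # All numbers are illustrative placeholders, not actual provider pricing.
    PRICE_PER_1M_INPUT_USD = 0.50   # hypothetical rate per million input tokens
    PRICE_PER_1M_OUTPUT_USD = 2.00  # hypothetical rate per million output tokens

    def cost_per_task(input_tokens: int, output_tokens: int) -> float:
        # Average cost of one coding task: context read in plus code written out.
        return (input_tokens * PRICE_PER_1M_INPUT_USD
                + output_tokens * PRICE_PER_1M_OUTPUT_USD) / 1_000_000

    # e.g. a task that reads ~40k tokens of context and produces ~5k tokens
    print(f"${cost_per_task(40_000, 5_000):.3f} per task")  # -> $0.030 per task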

ahmetomer 8 hours ago | parent | prev | next [-]

> Junior developers: Make yourself AI-proficient and versatile. Demonstrate that one junior plus AI can match a small team’s output. Use AI coding agents (Cursor/Antigravity/Claude Code/Gemini CLI) to build bigger features, but understand and explain every line if not most. Focus on skills AI can’t easily replace: communication, problem decomposition, domain knowledge. Look at adjacent roles (QA, DevRel, data analytics) as entry points. Build a portfolio, especially projects integrating AI APIs. Consider apprenticeships, internships, contracting, or open source. Don’t be “just another new grad who needs training”; be an immediately useful engineer who learns quickly.

If I were starting out today, this is basically the only advice I would listen to. There will indeed be a vacuum in the next few years because of the drastic drop in junior hiring today.

ares623 4 minutes ago | parent [-]

What. That’s written in a way that’s like “men writing women”. Not putting themselves in the shoes of a junior who has no context or almost no opportunities.

wakawaka28 7 hours ago | parent | prev | next [-]

The outlook on CS credentials is wrong. You'll never be worse off than someone without those credentials, all other things being equal. Buried in this text is an assumption that the relatively studious people who get degrees are going to fall behind the non-degreed, because the ones who didn't go to school will out-study them. What is really going to happen, generally, is that the non-degreed will continue to not study, and they will lean on AI to avoid studying even the few things they might otherwise have needed to study to squeak by in industry.

doug_durham 8 hours ago | parent | prev | next [-]

The author has a bizarre idea of what a computer science degree is about. Why would it teach cloud computing or dev ops? The idea is you learn those on your own.

happytoexplain 7 hours ago | parent | next [-]

If that's "the idea", then clearly we need a more holistic, useful degree to replace CS as "the" software degree.

kibwen 6 hours ago | parent | next [-]

Despite what completely uninformed people may think, the field "computer science" is not about software development. It's a branch of mathematics. If you want an education in software development, those are offered by trade schools.

AnimalMuppet 6 hours ago | parent [-]

What I want is for universities to offer a degree in Software Engineering. That's a different field from Computer Science.

You say that belongs in a trade school? I might agree, if you think trade schools and not universities should teach electrical engineering, mechanical engineering, and chemical engineering.

But if chemical engineering belongs at a university, so does software engineering.

collingreen 5 hours ago | parent | next [-]

Plenty of schools offer software engineering degrees alongside computer science, including mine ~20 years ago.

The bigger problem when I was there was undergrads (me very much included) not understanding the difference at all when signing up.

none2585 5 hours ago | parent | prev | next [-]

Saying this as a software engineer who has a degree in electrical engineering: software "engineering" is definitely not the same as other engineering disciplines and definitely belongs in a trade school.

xboxnolifes 4 hours ago | parent | prev | next [-]

Many do. Though, the one I'm familiar with is basically a CS-lite degree with software specific project design and management courses.

Glad I did CS, since SE looked like it consisted of mostly group projects writing 40 pages of UML charts before implementing a CRUD app.

pkaye 5 hours ago | parent | prev | next [-]

My university had Electrical Engineering, Computer Engineering, Software Engineering, and Computer Science degrees (in addition to all the other standard ones).

mxkopy 6 hours ago | parent | prev [-]

Last I checked ASU does, and I’m certain many other universities do too.

throwaway7783 6 hours ago | parent | prev | next [-]

The degree is (should be) about CS fundamentals and not today's hotness. Maybe a "trades" diploma in CS could teach today's hotness.

wrs 7 hours ago | parent | prev | next [-]

Cloud computing is not some new fundamental area of computer science. It’s just virtual CPUs with networks and storage. My CS degree from 1987 is still working just fine in the cloud, because we learned about CPUs, virtualization, networks, and storage. They’re all a lot bigger and faster, with different APIs, but so what?

Devops isn’t even a thing, it’s just a philosophy for doing ops. Ops is mostly state management, observability, and designing resilient systems, and we learned about those too in 1987. Admittedly there has been a lot of progress in distributed systems theory since then, but a CS degree is still where you’ll find it.

School is typically the only time in your life that you’ll have the luxury of focusing on learning the fundamentals full time. After that, it’s a lot slower and has to be fit into the gaps.

wakawaka28 7 hours ago | parent | prev [-]

There has to be a balance of practical skills and theory in a useful degree, and most CS curricula are built that way. It should not be all about random hot tech because that always changes. You can easily learn tech from tutorials, because the tech is simple compared to theory. Theory is also important to be able to judge the merits of different technology and software designs.

tibbar 8 hours ago | parent | prev [-]

Why is this necessarily true?

sys_64738 7 hours ago | parent [-]

A CS degree is there to teach you concepts and fundamentals that are the foundation of everything computing related. It doesn't generally chase after the latest fads.

tibbar 5 hours ago | parent [-]

Sure, but we need to update our definitions of concepts/fundamentals. A lot of this stuff has its own established theory and has been a core primitive for software engineering for many years.

For example, the primitives of cloud computing are largely explained by papers published by Amazon, Google, and others in the mid-'00s (Dynamo, Bigtable, etc.). If you want to explore massively parallel computation or container orchestration, it would be natural to do that using a public cloud, although of course many of the platform-specific details are incidental.

Part of the story here is that the scale of computing has expanded enormously. The DB class I took in grad school was missing lots of interesting puzzle pieces around replication, consistency, storage formats, etc. There was a heavy focus on relational algebra and normal forms, which is just... far from a complete treatment of the necessary topics.

We need to extend our curricula beyond the theory required to execute binaries on individual desktops.

michaelsalim 3 hours ago | parent [-]

I do agree that the scale has expanded a lot. But this is true of any other field too. Does that mean you need to learn everything? At some point it becomes infeasible.

Take doctors, for example: you learn a bit of everything, but then if you want to specialise, you choose one.

gassi 6 hours ago | parent | prev [-]

> Addy Osmani is a Software Engineer at Google working on Google Cloud and Gemini

Ah, there it is.