fzeroracer 2 days ago

> To do so, Mr. Giorgi has his own timesaving helper: an A.I. coding assistant. He taps a few keys and the software tool suggests the rest of the line of code. It can also recommend changes, fetch data, identify bugs and run basic tests. Even though the A.I. makes some mistakes, it saves him up to an hour many days.

> Still, nearly two-thirds of software developers are already using A.I. coding tools, according to a survey by Evans Data, a research firm.

> So far, the A.I. agents appear to improve the daily productivity of developers in actual business settings between 10 percent and 30 percent, according to studies. At KPMG, an accounting and consulting firm, developers using GitHub Copilot are saving 4.5 hours a week on average and report that the quality of their code has improved, based on a survey by the firm.

We're in for a really dire future where not only are the worst engineers you can imagine shoveling out more garbage code, but assessing it for problems or issues is also becoming much more difficult.

JumpCrisscross 2 days ago | parent | next [-]

> We're in for a really dire future where not only are the worst engineers you can imagine shoveling out more garbage code, but assessing it for problems or issues is also becoming much more difficult

It will probably still be more productive. IDEs, Stack Exchange... each of these prompted the same fears and realised some of them. But the benefits of having more code, quicker and cheaper, even if more flawed, outweighed those of quality. In the same way, the benefits of having more clothes and kitchenware and even medicine, quicker and cheaper, outweighed those of the high-quality bespoke wares that preceded them. (Where it doesn't, and where someone can pay, we have artisans.)

In the meantime, there should be an obsolescence premium [1] that materialises for coders who can clean up the gloop. (Provided, of course, that young and cheap coders of the DOGE variety stop being produced.)

[1] https://www.sciencedirect.com/science/article/abs/pii/S01651...

xigoi a day ago | parent | next [-]

You say “more code” as if it were a good thing. On the contrary, I want my codebases to have as little code as possible to achieve the given task.

fzeroracer 2 days ago | parent | prev [-]

The problem with 'more code, quicker and cheaper' is that when you fall below a baseline of quality, it actually ends up costing you and your business significantly. Companies learned this the hard way during the outsourcing booms, and the use of AI amplifies this problem tenfold, much like it's doing with spam.

JumpCrisscross 2 days ago | parent | next [-]

> Companies learned this the hard way during the outsourcing booms

Sure. Then we learned how to do it and in which contexts. Since WFH trained executives state-side how to communicate asynchronously and over Zoom, I've seen engineering offshoring done quite productively, possibly for the first time in my career.

fzeroracer 2 days ago | parent [-]

I've seen renewed attempts at engineering offshoring, and it causes problems because the core part of software engineering isn't the actual coding; it's communication, design, and understanding.

If you offshore your coding, how do you know they're not using AI as well? How do you verify code quality? And what do you do when you're left cleaning up a mess? This isn't really a problem of the quality of the engineers, but of the inherent relationship between a company and the firms it contracts.

JumpCrisscross 2 days ago | parent [-]

> If you offshore your coding, how do you know they're not using AI as well? How do you verify code quality?

All of these apply to remote-only teams. We know they work. (Note: I'm commenting more on offshoring than outsourcing. Outsourcing is fraught with issues. Offshoring once was, but is increasingly becoming viable, particularly with WFH and AI.)

xyzzy123 a day ago | parent [-]

IMHO the primary problem with offshoring versus using employees was always the principal-agent problem and incentives.

There was never a strong incentive for your "partner" to actually finish the job; they made more money from overruns, fixing defects, sunk costs, requirements churn, overstaffing, giving you juniors while billing for seniors, and so on.

There were also issues with cultural differences, communication, and a lack of understanding of the "customer", but those were minor and resolvable compared to the core problem.

slothtrop 2 days ago | parent | prev [-]

Growing pains. Reviewing carefully is still less work.

nyarlathotep_ a day ago | parent | prev | next [-]

> At KPMG, an accounting and consulting firm, developers using GitHub Copilot are saving 4.5 hours a week on average and report that the quality of their code has improved, based on a survey by the firm.

I don't have any specific experience with KPMG, but considering the other "big name" firms' work I've encountered, there's, uh, lots of room for improvement.

sirsinsalot 2 days ago | parent | prev | next [-]

I made good money cleaning up after the 2000s outsourcing boom.

It was lucrative cleaning up shit code from Romania and India.

I'm hoping enough people churn out enough hot garbage that needs fixing now that I can jack up my day rate.

I remember when the West would have no coders because Indian coders were cheaper.

I remember when nocode solutions would replace programmers.

I remember.

cootsnuck 2 days ago | parent | next [-]

What did/do you call the services you offer? I sincerely love debugging (fixing tech of any kind, digital or analog). I never thought I could offer services just fixing things instead of building from scratch...

ethagnawl 2 days ago | parent | prev [-]

If you missed this recent post, I think you'll appreciate it: https://defragzone.substack.com/p/techs-dumbest-mistake-why-...

> Now, let’s talk about the real winners in all this: the programmers who saw the chaos coming and refused to play along. The ones who didn’t take FAANG jobs but instead went deep into systems programming, AI interpretability, or high-performance computing. These are the people who actually understand technology at a level no AI can replicate.

> And guess what? They’re about to become very expensive. Companies will soon realize that AI can’t replace experienced engineers. But by then, there will be fewer of them. Many will have started their own businesses, some will be deeply entrenched in niche fields, and others will simply be too busy (or too rich) to care about your failing software department.

> Want to hire them back? Hope you have deep pockets and a good amount of luck. The few serious programmers left will charge rates that make executives cry. And even if you do manage to hire them, they won’t stick around to play corporate politics or deal with useless middle managers. They’ll fix your broken systems, invoice you an eye-watering amount, and walk away.

rybosworld 2 days ago | parent [-]

This entire article reads like hopium. And it seems predicated on the false belief that companies are going to try to replace their entire workforce with AI overnight:

> "Imagine a company that fires its software engineers, replaces them with AI-generated code, and then sits back"

It should go without saying this is not even possible at the moment. Will it be possible one day? Yes, probably. And when that day comes, the fantasies this author has dreamed up will be irrelevant.

I've said it before and I'll say it again: It shocks me that a forum filled with tech professionals is so blindly biased against AI that they refuse to acknowledge what changes are coming.

All of these conversations boil down to: "The LLMs of today couldn't replace me." That's probably true for most folks.

What's also true is that ChatGPT was released less than three years ago, and we've seen it go from a novelty with no real use to something that can write actually decent research papers and that gets better at coding by the month.

"B-b-but there's no guarantee it will continue to improve!" is one of the silliest trains of thought a computer scientist could hold.

vunderba a day ago | parent | next [-]

Couple things:

1. The "LLMs are still in their infancy" argument is frequently trotted out, but let's be clear: GPTs were introduced back in 2018, so SEVEN years ago.

2. "It shocks me that a forum filled with tech professionals is so blindly biased against AI that they refuse to acknowledge what changes are coming." This feels like a corollary to the Baader-Meinhof phenomenon. I don't think you can extrapolate that a few dozen loudly dissenting voices are necessarily representative of majority opinion.

3. I would like to see a citation for ChatGPT writing actually "decent research papers".

4. If AIs get to the point of acting in a completely autonomous fashion and replacing software engineers, then there's no reason to believe that they won't also obliterate 90% of other white-collar jobs (including other STEM), so at that point we're looking at needing to completely re-evaluate our economic system, possibly with UBI, etc.

throw234234234 a day ago | parent [-]

I actually think, sadly, that it can replace "just software engineers," at least in the short and medium term. Not because it can't do other careers (if employed effectively), but because software engineering is what the labs are actively targeting, it's where they have domain knowledge, and it's a public, open profession amenable to RL. There are millions of pieces of code online, plenty of public job briefs with success criteria defined in them, and so on. They will throw every research and ML trick at it just to displace SWEs, because that's what they really, really want to do; IMO this is particularly true for OpenAI. Other professions, once they see the bargaining power of SWEs fall and be destroyed, will resist integration of AI from the big corps and see a much slower disruption; that is the most rational thing to do to preserve your enterprise, especially given that most intellectual economic fields are at best oligopolies at the large end.

OpenAI just released their "SWE-Lancer" benchmark, which basically shows their intent: to replace the economic value of software development, from coding to engineering-manager tasks. Pretty much nothing is safe. I don't recommend people enter the industry anymore; it's just anxiety you don't need (i.e. companies collectively worth trillions are trying to destroy your economic value).

It's not just that it will take away "good, high economic mobility" jobs; my disappointment is more that this effort would be better spent in many other domains (medicine, building, robotics), which would at least see some benefit from the disruption. But no, it's all about SWEs. It must be what their VCs want from them, and/or what keeps the fear/hype train going most effectively.

itsoktocry 19 hours ago | parent | prev | next [-]

> I've said it before and I'll say it again: It shocks me that a forum filled with tech professionals is so blindly biased against AI that they refuse to acknowledge what changes are coming.

I have the exact same reaction reading this stuff on HN. It's hilarious, scary and sad, all at the same time.

The speed at which these tools have improved has completely convinced me that these things are going to take over. But I don't fear it; I'm excited about it. And I don't write code for code's sake; I use code to solve problems.

dmix 2 days ago | parent | prev [-]

Even if we don't write code, software engineers (or technically minded people) will be able to coordinate hundreds of AI bots better than the average person and manage the systems. If there's a day when programming is not as valuable, I'm pretty confident I can find some way to be useful in the future economy.

And if it's real AGI, not the airplanes-flying-themselves-with-a-pilot kind of thing, then we will probably have to rethink employment anyway.

rybosworld a day ago | parent [-]

> Even if we don't write code, software engineers (or technically minded people) will be able to coordinate hundreds of AI bots better than the average person and manage the systems.

There might be a period when product management and QA are still done by people, but I think that period will be transitory and short.

I think software engineers in general are grossly underestimating the probability that they will be replaced. On a 20-30 year timeline, that probability might be close to 100%. It will probably also be gradual, and those who are displaced (starting with the least experienced and moving to the most) will not be able to find similar employment.

We are all more or less opting into this without a fight.

dmix a day ago | parent [-]

> We are all more or less opting into this without a fight.

I don’t fear my dev job becoming less valuable as large chunks of it become automated. Like I said, if it happens, I will figure out a way to be useful to society in other ways, even if it’s inventing things for AI to do or controlling AI to do the job of a bunch of devs. I will adapt, like everyone in history has, and make myself valuable in new ways.

“Nothing is static. Everything is evolving. Everything is falling apart.”

slothtrop 2 days ago | parent | prev | next [-]

The increase in productivity means you need fewer inexperienced and/or bad engineers on a project. On the other hand, they may be retained to go after bolder, more numerous targets.

deadbabe 2 days ago | parent | prev [-]

I don’t think that future will happen, because eventually someone will realize there is a competitive advantage in building a truly good product with people who actually know what they’re doing. When other companies catch on, they will start doing the same, and the bad prompt-kiddie engineers will be gone.

cootsnuck 2 days ago | parent [-]

> because eventually someone will realize there is a competitive advantage in building a truly good product with people who actually know what they’re doing

Doesn't / Shouldn't that competitive advantage inherently exist already? But don't we still see a small group of big players that put out broken / mediocre / harmful tech dominating the market?

The incentives are to make the line keep going up – nothing else. Which is how we get search engines that don't find things, social media that's anti-social, users that are products, etc.

I'm not at all hopeful that an already entrenched company that lays off 50% of its workers and replaces them with an AI-slop-o-matic will lose to a smaller company putting out well-made, principled tech. At least not unless the smaller company can leverage other differentiating factors.

(I say all of this as someone that is excited about the possibilities AI can now afford us. It's just that the possibilities I'm excited about are more about augmentation, accessibility, and simplicity rather than replacement or obsolescence.)