samiv 4 hours ago

What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this?

One thing is for sure: LLMs will bring down the cost of software per unit and increase the volume.

But... cost = revenue. What is a cost to one party is revenue to another party. That revenue is what pays salaries.

So when software costs go down, the revenues will go down too. When revenues go down, layoffs will happen and salary cuts will happen.

This is not fictional. Markets have already reacted to this, and many software service companies took a hit.

atonse 3 hours ago | parent | next [-]

I don't have an answer for this, and won't pretend to.

But my take on this is that accountability will still be a purely human factor. It still is. I recently let go of a contractor who was hired to run our projects as a Scrum/PM, and his tickets were so bad (some tickets had three words in them; one ticket in the current sprint was blocked by a ticket deep in the backlog; basic stuff). When I confronted him about them, he said the AI generated them.

So I told him that:

1. That's not an excuse; his job is to verify what it generated and ensure it's still good.

2. That actually makes it look WORSE: not only did he do nearly zero work, he didn't even check the most basic outputs. And I'm not anti-AI; I expressly said that we should absolutely use AI tools to accelerate our work. But that's not what happened here.

So you won't get to say (at least, I think, for another few years) "my AI was at fault" – you are ultimately responsible, not your tools. So people will still want to delegate those things down the chain. But ultimately they'll have to delegate to fewer people.

jgilias an hour ago | parent | next [-]

In general I agree. But it's very unlikely for an AI to generate a three-word ticket. That's what humans do. An AI might generate an overly verbose and specific ticket instead.

eisa01 an hour ago | parent | prev [-]

What drives that behavior is what I like to call human slop :)

post-it 2 hours ago | parent | prev | next [-]

If AI completely erases the profession of software developer, I'll find something else to do. I can't in good conscience oppose a technology just because it's going to make my job redundant; that would be insane.

rapnie 2 hours ago | parent | next [-]

Take that to its extreme. Suppose there were a technology, one that you do not own, that would make everyone's job redundant. Everyone out of a job. No need for education, for skills to be mastered, for expertise. Would it still be insane to complain?

mchaver 25 minutes ago | parent | next [-]

Then society needs to collectively decide how to allocate resources. Uh oh!

satvikpendem an hour ago | parent | prev [-]

There are bigger issues if everyone is out of a job.

ipaddr 2 hours ago | parent | prev [-]

There may not be a job for you in an office setting. What would you do?

satvikpendem an hour ago | parent [-]

That's when the problem shifts from individual to systemic, and only systemic solutions fix systemic problems.

linsomniac 3 hours ago | parent | prev | next [-]

> What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this?

I'm assuming that the software factory of the future is going to need Millwrights https://en.wikipedia.org/wiki/Millwright

But builders are builders. These tools turn ideas into things: a builder's dream.

codebolt 3 hours ago | parent | prev | next [-]

Any given system will still need people around to steer the AI and ensure the thing gets built and maintained responsibly. I'm working on a small team of in-house devs at a financial company, and I'm not worried about my future at all. As an IC I'm providing more value than ever, and the backlog of potential projects is still basically endless, so why would anyone want to fire me?

kristiandupont an hour ago | parent | next [-]

Why would it need people to steer the AI? I can easily see a future where companies that don't rely on the physical world (unlike, say, manufacturing) are completely autonomous: just machines making money for their owner.

codebolt an hour ago | parent [-]

It's easy to imagine, but there's still a vast amount of innovation and development that has to happen before something like that becomes realistic. At that point the whole system of capitalism would need to be reconsidered. It's not going to happen in the foreseeable future.

anonnon 18 minutes ago | parent | prev [-]

> why would anyone want to fire me?

Because they can hire some "prompt engineer" to "steer the AI" for $30-$50k instead of $150-$250k.

jmalicki 2 hours ago | parent | prev | next [-]

"One thing is for sure: LLMs will bring down the cost of software per unit and increase the volume.

But... cost = revenue."

That is Karl Marx's labor theory of value, which has been completely disproven.

You don't charge what it cost to build something; you charge the maximum the customer is willing to pay.

rps93 4 hours ago | parent | prev [-]

Just sold a house and moved out after being laid off in mid-January from a govt IT contractor (there for 8 great years, mostly remote). I started my UX research, design, and front-end web development career in 2009, but now I think it's almost a stupid, go-nowhere, vanishing career, thanks to AI.

I think much like you that AI is and will just continue to destroy the economy! At least I got to sell a house and make a profit, and stash it away for when the big AI market crash happens (hopefully not a 2030 great depression tho). In a down market, buying stocks, bitcoin, and houses is always cheaper.