bcjdjsndon 2 hours ago

> They want to replace workers

A simple question none of the AI doomsayers can answer: who buys anything when nobody has a job because robots do everything?

tux3 an hour ago | parent | next [-]

The true AI doomsayers believe in some sort of technological singularity, which means a point after which things become so strange that the world is radically transformed.

Things like "jobs" and "careers" are so integral to society that we can't really imagine what society would be like in a world where people don't have any clear purpose. That's why you won't get a definitive answer. The whole idea of a singularity is that people don't have the faintest clue what day to day life would look like after.

We often choose to believe that a singularity can't happen, because we don't know what that even means. We can't answer the simple question. So it definitely better not happen; that would be very inconvenient.

garciasn an hour ago | parent | next [-]

I’m always amazed that when I tell people I intend to retire in my 50s, they tell me that I can’t possibly mean that and actively wonder how I could possibly fill my time. It’s as if we could not possibly function as humans without meaningless shifting of tangible/intangibles from one place to another.

Society is so hellbent on the idea that we need our job to be our identity that people lack the imagination for any other reality.

It’s ridiculous.

ryanackley an hour ago | parent | next [-]

Sure, working sucks, but have you tried not working? I say this from lived experience, because I've gone for stretches of not working (intentionally). It can be challenging to find a sense of fulfillment. I know it seems counterintuitive, but if you do succeed in your dream of retiring in your 50s, I think you'll understand what I mean when you get there.

JohnFen 41 minutes ago | parent | next [-]

I think this varies wildly from person to person. I've also intentionally gone long stretches without working and those are the times when I've had a dramatically increased sense of purpose and fulfillment. Working for others reduces those things for me.

I'm in the age group where a lot of the people around me have retired. Some of them have fared very poorly, some have straight-up blossomed.

ryanackley 33 minutes ago | parent [-]

Ok but one of the great things about retiring when everyone else does is you have a community. If you stop working when you're young, everyone else in your network is probably still working.

I'm not against early retirement. One of my points was that, in general, it's harder to find fulfillment as a working age adult outside of work. Not impossible, just more challenging.

mrhottakes 15 minutes ago | parent [-]

It's harder for -you- to find fulfillment outside of work. That's not a true statement for most people, or "in general".

mrhottakes 16 minutes ago | parent | prev | next [-]

I think you need to do better at not working. It's great actually.

bachmeier 37 minutes ago | parent | prev [-]

Sorry, but your comment isn't really responding to OP's main point.

> It can be challenging to find a sense of fulfillment.

If you actually get fulfillment from work, then great, continue to work. The critical thing that drives people to retire earlier than the average person is that their work doesn't give them a sense of fulfillment. It's literally just a way to fill out the day. Some people do have things that are more fulfilling than letting an employer tell them how to spend their day.

ryanackley 30 minutes ago | parent [-]

Yes, it was responding to it. One of my points was that this has nothing to do with society's expectations; it comes from people's lived experiences and observations.

You seem to think I'm advocating for working your entire life. I'm just trying to share my lived experience so please take it easy.

There is some bitterness that's coming across in your response.

bachmeier an hour ago | parent | prev [-]

It is indeed ridiculous. People say they're going to let someone else tell them what to do with their time, energy, and calendar, even if they hate doing it. The only explanation I have is that they've been letting the wrong people program them.

woeirua an hour ago | parent | prev | next [-]

I believe that AI will continue to progress. I believe that we’re going to see a fast takeoff.

That said, some people are now discussing a “societal singularity” wherein society breaks before the actual emergence of AGI. I believe this is the trajectory we are on. The question is what happens to the unemployed. Democracies will not tolerate mass permanent unemployment, as we’ve seen over and over again.

UBI is a scam; many middle-class folks would be worse off under UBI than they are under the current system. They will fight to defend the economic status quo.

In the end, I think capitalism is incompatible with the emergence of AGI, and I think an aligned ASI will smash the capitalist system simply out of pure egalitarianism. (Note: I was previously a proponent of capitalism.) I think many people will die trying to defend capitalism. We’re at the beginning of the AI wars.

nervousvarun 41 minutes ago | parent [-]

My sentiments are fairly similar.

In the US, at least, the middle class was already being hunted to extinction, and that seems set to continue. This is just accelerant on an already burning fire.

visarga an hour ago | parent | prev [-]

It can't happen. For one, if it did happen it would mean all domains reach singularity at once, but we know the capability curve is jagged. Each domain advances at its own speed.

Second, the more progress you make, the harder further progress gets, exponentially harder. Newton could advance physics by observing an apple fall; today we need space telescopes and billion-dollar particle accelerators. The more tech advances, the harder it gets. Will AGI be so "super" as to cancel out the exponentials?

And third, AI progress is tied to the learning signal, and we have exhausted the available data. In the last 1-2 years we have started using verified synthetic data (RLVR), but exponential difficulty is a barrier, and other domains don't have the built-in verifiability of math and code, so progress there will be slower. Testing a vaccine for safety takes 6 months for 1 bit of information; that is how slow and expensive the signal can get in some domains. AI can't get the learning signal it needs across all domains fast enough.

thomascgalvin an hour ago | parent | prev | next [-]

You're asking a question that only applies to rational actors.

Corporations exist for one purpose: to get as much money as possible. Side concerns, which range from "not destroying the environment" to "not destroying the economy," are objectively not their goal, nor do they consider them their responsibility. Those are things "someone else" should worry about.

AI destroying all jobs is similar to a nuclear arms race; these companies don't want to eliminate everyone's ability to buy things, but they don't want to be the only entity without that ability, so ...

bluecheese452 an hour ago | parent [-]

That is mostly true but a bit of a simplification. They exist to do what the people who have power want them to do which is not always strictly profit maximization.

A CEO may realize RTO will decrease profits but do it anyway because it increases the power delta between him and the workers.

nervousvarun an hour ago | parent [-]

"not always strictly profit maximization."

Maybe in the short-term but public companies with shareholders won't allow this in any sort of long-term way right?

bluecheese452 an hour ago | parent [-]

Not allow it? They insist upon it!

The controlling votes are all part of the same social class. They would gladly give up a small amount of profit to keep the distance between them and the workers as large as possible.

nervousvarun 37 minutes ago | parent [-]

To the extent it doesn't negatively impact the stock price, sure, but you would agree the CEO and any power trip he's on is ultimately beholden to that, right?

bluecheese452 21 minutes ago | parent [-]

If he goes against what they want absolutely. If he introduced a 4 day work week for example he would be in big trouble.

dijit an hour ago | parent | prev | next [-]

Nobody can answer that?

There are jobs AI can't easily come for... not always nice ones, but either too physically fiddly or too cheap to bother automating.

But jobs go "extinct" all the time. My ancestors going back generations were sugarhouse labourers. That job's gone, but the lineage isn't: we just do different things now.

The pattern seems pretty consistent: raise the floor (dishwashers, CNC machines, laundry), and people tend to climb to higher levels of abstraction. The real question is who captures those productivity gains; and historically, it isn't the workers.

Shoes are the classic example. Automation made them cheaper and accessible to everyone. Then, once the market was captured, mid-tier became the ceiling and anything above it got expensive again. Nobody won except the owners.

isx726552 an hour ago | parent | prev | next [-]

Why would the doomsayers be the ones who need to answer that? That’s kind of their point! It’s the AI boosters who need to answer that, and so far it’s just a big collective shrug + silence.

Krssst an hour ago | parent | prev | next [-]

There will still be jobs. Manual jobs, the kind that break our backs and have us breathe various stuff we shouldn't (dust, fumes). Robots are difficult, and maybe not so economically viable when everyone is desperate for any job at any cost.

pelotron an hour ago | parent | prev | next [-]

We shouldn't be surprised people have a negative view of AI when Altman et al. have stated on stage that the goal is to replace everyone.

bcjdjsndon 41 minutes ago | parent [-]

Because it's not even logically possible, let alone practically possible.

mrhottakes 14 minutes ago | parent [-]

Maybe the Altmans and Amodeis should stop saying otherwise all the time then.

variadix an hour ago | parent | prev | next [-]

The consumer economy only exists to extract value from common people and funnel it up the wealth ladder. If robots and AI take over all the production, you don’t need a consumer economy, the robots produce and their output directly goes to the top. The rest of us are left to starve.

general1465 7 minutes ago | parent [-]

Eh, no? If everything goes to the top, then economies of scale just don't work and the whole automation push is pointless, because you could make the goods in an artisanal way instead.

ryanackley an hour ago | parent | prev | next [-]

It's bizarre that some of the doomsayers are AI stakeholders. It's like they don't realize that most people don't have net worth in the 7-8 figures.

I console myself with the fact that without a functioning economy, AI will implode since capital will dry up. Then all of the investment in data centers, R&D, etc. will never be recovered. Then we'll be back to rational thinking? Maybe?

mrhottakes an hour ago | parent | next [-]

They realize it, and they don't care.

dlev_pika an hour ago | parent | prev [-]

Yeah, but it doesn’t implode all at once - it’s not distributed evenly.

Something like over half of US consumption is done by the top 10%, or some insane number like that. This leads me to believe that a lot more people will eat shit before enough feel real pain.

gypsy_boots an hour ago | parent | prev | next [-]

Take any econ 101 course, and you'll realize that this isn't a factor in the capitalist system. Capitalism is simply concerned with maximizing profit, and in this case, returning shareholder value. It's just simply not in the purview of the system to think about what happens when you completely get rid of your labor force.

Envisioned another way, the future of labor might look the way it did for laborers over 100 years ago, before major industries unionized; making 'Amazon-bucks' that can only be redeemed at the 'Amazon company store'.

general1465 5 minutes ago | parent | prev [-]

No wages -> nobody buys goods and services -> companies go bankrupt -> no taxable income from consumers or companies -> states go bankrupt.

You can either go through UBI and pretend that capitalism is still a thing, or just nationalize everything and go with communism.

Henry Ford understood this problem with his Model T and realized that if he wants to sell it, he also needs to pay workers to be able to afford it.

pugworthy 41 minutes ago | parent | prev | next [-]

I saw a talk by Brian Merchant (https://www.bloodinthemachine.com/) a while back where he talked a lot about the Luddites and their revolts against automation. He's definitely not a fan of AI, but it was very interesting to hear the comparisons of AI resistance now to Luddite resistance to automation in the 1800s.

There was unfortunately no Q&A at the lecture, and the one question I would have asked him is this: What if the Luddites had gotten their way? What do you imagine our society and world would be like right now?

It's not meant to be a trick question or a "gotcha" question. Society would indeed have been different. Maybe it would be a wonderful Star Trek utopia and we'd have found a win-win for everyone. Or maybe we'd just not be nearly as technically advanced a society as we are now.

hamdingers an hour ago | parent | prev | next [-]

Fully automated luxury space fascism doesn't really need buyers. A risk of high automation/post-scarcity is that abundance exists but remains under the control of people who are not interested in justice or equality or freedom. Lots of people feel that describes the leadership of most AI tech companies.

If they don't need your labor, and they don't need you as a customer, and they don't care about you as a person... where does that leave you?

[to be clear, I think post-scarcity, even in knowledge work, is a lot further off than most ai-doomsayers or ai-worshipers who take statements from people like Altman and Musk at face value]

general1465 a minute ago | parent [-]

But that also assumes the peasants have no agency and won't trigger another French Revolution.

lbrito an hour ago | parent | prev | next [-]

The answer to every question: Agents, of course! With GPU-collateralized credit or some other idiocy.

actionfromafar an hour ago | parent | prev | next [-]

The robots will tell you what to do, you will own nothing, and you will be happy. I think that is the plan?

bluecheese452 an hour ago | parent | prev [-]

If you have a magic robot that builds everything you want you don’t need anyone to buy anything.

Jfc, this site is the worst. Use your words instead of drive-by downvoting.

pelotron 40 minutes ago | parent | next [-]

Then all you have to do is magically convince the owners of the magic robot to give all their products away for free.

bluecheese452 27 minutes ago | parent [-]

They already have way more than they could ever use and still try to take more.

lorecore an hour ago | parent | prev [-]

Where do the raw materials for the thing it's building (or the robot itself for that matter) come from?

bluecheese452 an hour ago | parent [-]

From the earth. Maybe, in the future, from space.

lorecore an hour ago | parent [-]

Finite natural resources are, by their very nature, limited.

bluecheese452 an hour ago | parent [-]

Yes, finite things are finite. Glad we cleared that up.

lorecore an hour ago | parent [-]

Finite means not free. Who will pay?

bluecheese452 30 minutes ago | parent [-]

If your magic robot is magic enough, you own anything you want. You pay with the oldest currency in the world: might.

lorecore 9 minutes ago | parent [-]

I don’t think pillaging the earth’s natural resources (even more than we already do) is going to go down well.