cogman10 9 hours ago

Assuming AGI or something close to it hits, it'll likely be a nightmare for the average person.

The model in the paper talks about removing labor bottlenecks, which will be great for the capitalists who own the compute. Assuming we don't have a social safety net in place, that will ultimately mean a lot of stuff gets cheaper for the wealthy while nobody else can afford it.

What would a middle-class job even be? Almost all knowledge work would be obliterated. Engineering, science, maybe even art. All evaporated.

The paper suggests that people will shift into markets like manual labor. But what do we do when those jobs have no real bottleneck, as the paper describes? There's only so many people needed for care work or picking berries. And right now it seems we have enough, since current salaries are pitiful in all the sectors the article mentions. What pressure would actually make wages increase? Surely not the fact that those are the only jobs left for normal people who weren't born to a family that owns a datacenter.

And it isn't like datacenter jobs are going to replace the army of jobs AGI would displace. You need very few people to operate a datacenter.

That's why it'll likely be hell for most people. If you don't actually own resources, you'll be left out in the cold. Even if you do have decent resources, you'll be in a world where AGI is set to extract every last bit of them from you in ways we can't currently imagine. For example, it would know everything about you, including that you're willing to spend $11 for a widget while someone else is only willing to spend $10, and price accordingly.

This will ultimately force the question of "what do we do with the unemployed" and I worry the answer is already "well, they should have worked harder. Sucks to suck".

visarga 8 hours ago | parent | next [-]

I think you jump from AGI to "human not needed" too abruptly. First of all, AGI might be smarter than you, but you have to live with the consequences of using it, so you can't be removed from the loop. We need accountability, and AI can't provide it; it has no skin in the game. We need to act well in a local context, and AI sits in the datacenter, not on the ground with us. Humans have to bring that context to the AI.

Hinton predicted 9 years ago that radiologists would lose their jobs, yet today they are better paid and more numerous. Maybe AGI will make humans more valuable instead of less. There might be complementarities and mutual reinforcement between us and AGI.

cogman10 8 hours ago | parent [-]

> I think you jump from AGI to "human not needed" too abruptly.

No, I'm really just looking at what this paper proposes the future of labor will be and expanding on it. I'm not saying AGI will mean "humans not needed"; I'm saying AGI will mean "fewer humans are needed," and in some cases that could be significantly fewer. If you've listened to any CEO gush over AI, you know that's exactly what they want to do.

> Hinton predicted 9 years ago that radiologists will lose their jobs, yet today they are better paid and more. Maybe AGI will make humans more valuable instead of less. There might be complementarities and mutual reinforcement between us and AGI.

Medicine is a tricky place for AI to integrate, yet it is already integrating there. In particular, basically every health insurer is moving towards using AI to auto-deny claims. That is a concrete case where companies are happy to live with the consequences, even though they are pretty dire for the people they impact.

And, not for nothing, 10 years is a pretty short time to completely eliminate an industry. The more we pay radiologists, the more likely you'll start seeing hospitals decide that "maybe we should just start moving low-risk scans to AI." Or you might start seeing cheap remote radiologists willing to take on the risk of getting things wrong.

xienze 9 hours ago | parent | prev | next [-]

It’s very simple, you see. No one who doesn’t want to work will have to. There will be infinite UBI for everyone paid for by <waves hands> all those people who feel obligated to work for some reason.

cogman10 9 hours ago | parent | next [-]

For UBI to ever possibly be enough to fund things, it would have to be taken from the owners of compute. Those will be the people with the vast majority of the resources.

And given today's political environment, I see that as particularly unlikely to ever happen. Everyone offers "UBI" as a solution, but I've yet to see even a hint of it being tried. In fact, the opposite seems to be happening in the US, with SSI (about the closest thing we have to UBI) being slowly defunded and made less accessible.

testing22321 8 hours ago | parent | prev | next [-]

Australia has had that for decades. Anyone who doesn't want to work gets free money from the government, forever. Works fine.

danaris 8 hours ago | parent | prev [-]

You are very clearly being sarcastic, but the only reason UBI like this wouldn't work in the scenario being described is that the already-mega-wealthy owners of the AGI hardware and software would decide they want to keep all the money instead of actually making the promise of automation real for all of humanity.

And to be clear, there is no conceivable universe in which that extra money would make their lives better in any meaningful way.

They could support high taxes on the money they earn through the AGI to fund a UBI that would support literally everyone, because their products would be doing literally all the work necessary to maintain a civilized society (barring some in-person tasks that are hard to hand over to even a very smart robot) without any human actually needing to work. They could do so without making themselves poor, or even the least bit less comfortable.

The reason they would choose not to is that they're corrupt, selfish "rugged individualists" who care more about their dollar-denominated high scores than about literally any other human being on the planet. And we know this because that's the case with the people in the closest analogous positions today.

Theodores 8 hours ago | parent | prev [-]

In the 1980s, at my particular school in the UK, we were being primed for a society where computers and machines would do so much work that we would only have to work for ten hours a week or so. Therefore, in the curriculum, we had lots of classes on leisure things, as in hobbies.

This world never came to be. Instead we got Graeber's 'bullshit jobs': ever more specialised roles, as per the capitalist division-of-labour idea, and lots of supposedly important jobs that aren't important at all.

We had a glimpse of a 'leisure society' during the pandemic, when only key workers were needed to do anything useful and everyone else was on government furlough.

In theory, AGI offers the prospect of a leisure society of sorts. In practice, I don't think it will deliver one. Going back to the unusual curriculum at the school I happened to go to, a key skill was critical thinking. As well as doing macrame, football, cookery, pottery, art and whatnot during what would otherwise be teaching time, we also had plenty of courses on philosophy and much else that could be bracketed as literature. The idea was that we were not being brought up to be compliant serfs for the capitalist machine; instead we were expected to have agency and to be able to think for ourselves.

The problem with AGI is that we are bypassing our brains to some extent, at the cost of ever mastering the art of critical thinking. Nobody has to solve a problem by themselves; they can ask their phone to do it for them. I could be wrong, but I don't see evidence that AGI is making people smarter or more capable.

We have already outsourced our ability to recall information to search engines. General knowledge used to be something you had or you didn't, and people earned respect for being able to remember and recall vast amounts of information. This information came from books that you either had or had to access in a library. Nowadays, whatever it is, a search engine has got you covered. Nobody has to be a walking gazetteer or dictionary.

Being able to recall information rather than look everything up came with risks, mostly because it was easy to be wrong, or 'almost right', which can be worse. But it was, and is, the bedrock of critical thinking.

Clearly the utopian vision of a leisure society never happened in the form some envisaged in the 1980s. With AGI, I don't see any talk of a leisure society where we only have to put in ten-hour working weeks. That isn't being proposed at all. If it were, AGI would not be a nightmare for the average person.