jfalcon 6 hours ago

>someone raised the question of “what would be the role of humans in an AI-first society”.

Norbert Wiener, considered to be the father of Cybernetics, wrote a book back in the 1950's entitled "The Human Use of Human Beings" that brings up these questions in the early days of digital electronics and control systems. In it, he brings up things like:

- 'Robots enslaving humans to do jobs better suited to robots, due to a lack of humans in the feedback loop, which leads to fascist machines.'

- 'An economy without human interaction could lead to entropic decay as machines lack biological drive for anti-entropic organization.'

- 'Automation will lead to immediate devaluation of human labor that is routine. Society needs to decouple a person's "worth" from their "utility as a tool".'

The human purpose is not to compete but to safeguard the teleology (purpose) of the system.

9wzYQbTYsAIc 6 hours ago | parent | next [-]

Seems like a good time to enshrine human rights and the social safety net by ratifying the ICESCR (https://en.wikipedia.org/wiki/International_Covenant_on_Econ...) and giving human rights the teeth they need.

I used Anthropic to analyze the situation; it did a halfway decent job:

https://unratified.org/why/

https://news.ycombinator.com/item?id=47263664

WarmWash 6 hours ago | parent | prev | next [-]

>- 'Automation will lead to immediate devaluation of human labor that is routine. Society needs to decouple a person's "worth" from their "utility as a tool".'

I have this vision that, in the absence of the ability for people to form social hierarchies on the back of their economic value to society, there will be an AI-fueled class hierarchy based on people's general social ability. So rather than money determining your neighborhood, your ability to not be violent or crazy does.

energy123 5 hours ago | parent | next [-]

If we have post-scarcity due to AI, everything becomes so uncertain. Why would we still have violent and crazy people? Surely the ASI could figure it out and fix whatever is going on in their brains. It's so fuzzy after that event horizon I have no confidence in any predictions.

jononor 21 minutes ago | parent | next [-]

There are easy fixes to get rid of violent and crazy people. Why would a powerful ASI bother with fixing them? A rabid dog just gets put down by humans. Why would we expect anything better of our overlords?

storus 3 hours ago | parent | prev | next [-]

Why are some people able to bear suffering whereas others go bonkers? Or what if the only source of happiness of some of those crazy people is domination of other people and exclusivity of social hierarchies? How would AI fix that?

bryanrasmussen 3 hours ago | parent [-]

>Why are some people able to bear suffering whereas others go bonkers?

Well at least in some cases the scale of suffering between the bonkers and the ones bearing it might be significantly different.

erikerikson 5 hours ago | parent | prev | next [-]

This seems to suggest a single dimensional evaluation. The complexity of social compatibility is high and the potential capacity to evaluate could also be greater.

ithkuil 3 hours ago | parent | prev [-]

I'm terrified at the idea that society will select the crazies and the violent instead. I wonder why I think that

WarmWash 3 hours ago | parent [-]

My real personal "doom" theory is that AI will, err, remove 99.99% of humans, pretty much everyone except for the top 100,000 based on whatever fractally complex metric scheme it deems important.

Then those 100,000 get a utopia, the AI gets everything else, and ultimately the humans are just nice pets.

argee 5 hours ago | parent | prev | next [-]

> 'An economy without human interaction could lead to entropic decay as machines lack biological drive for anti-entropic organization.'

Not quite the point the quote makes, but it reminded me of the short SF story "Exhalation".

https://www.lightspeedmagazine.com/fiction/exhalation/

jay_kyburz 3 hours ago | parent | prev [-]

I think it's important to remember that humans are not that far removed from the native animals that we share the earth with. Civilization is just a thin layer of rules we use to try and keep the peace between us.

Just being born doesn't entitle somebody to food and shelter, you have to go out and find it. You have to work.

A magpie is not provided food and shelter, it has to hunt, fight for territory, and build its nest.

Humans don't have some inalienable "worth". But if you can work, you might choose to trade that labor for some food and shelter.

AI is not going to change that. We might think the AI owners have a moral obligation to feed people who can't find work, but there is no guarantee this will happen.

Also, for the short term at least, we need to stop talking about AI like it's a thing, and talk instead about the companies that build and own the AI. Why would Google build an AI that can do everyone's job, then turn around and start building farms to feed us for free?

Do we perhaps imagine our governments are going to start building super-automated farms to feed us? How are they going to pay Google for the AI with no tax income?