nightski 8 hours ago

Saying we don't "need" AGI is like saying we don't need electricity. Sure, life existed before we had that capability, but it would be very transformative. Of course we can make specialized tools in the meantime.

hoosieree 5 hours ago | parent | next [-]

The error in this argument is that electricity is real.

mrandish 4 hours ago | parent [-]

Indeed, and I'd go even further. In addition to existing, electricity is also usefully defined - which helps greatly in establishing its existence. Neither unicorns nor AGI currently exist but at least unicorns are well enough defined to establish whether an equine animal is or isn't one.

charcircuit 7 hours ago | parent | prev [-]

Can you give an example of how it would be transformative compared to specialized AI?

Jensson 7 hours ago | parent | next [-]

AGI is transformative in that it lets us replace knowledge workers completely. Specialized AI requires knowledge workers to train it for each new task, while AGI doesn't.

fennecfoxy 6 hours ago | parent | prev [-]

Because it could very well exceed our capabilities beyond our wildest imaginations.

Because we evolved to get where we are, humans have all sorts of messy behaviours that aren't really compatible with a utopian society. Theft, violence, crime, greed - it's all completely unnecessary, and yet most of us can't bring ourselves to solve these problems. And plenty are happy to live apathetically while billionaires become trillionaires... for what, exactly? There's a whole industry of hyper-luxury goods now, because they make so much money that even regular luxury is too cheap for them.

If we can produce AGI that exceeds the capabilities of our species, then my hope is that rather than the typical outcome of "they kill us all", they will simply keep us in line. They will babysit us. They will force us all to get along, to ensure that we treat each other fairly.

As a parent teaches children to share by forcing them to break the cookie in half, perhaps AI will do the same for us.

hackinthebochs 2 hours ago | parent | next [-]

Why on earth would you want an AI that takes away our autonomy? It's wild to see someone actually advocate for this outcome.

johnb231 an hour ago | parent [-]

There are people who enjoy being dominated, kept on a leash like a dog. Bad idea to transfer that fetish to human civilization.

An ASI would be to humans what humans are to rats or ants.

It could stomp all over us in pursuit of whatever goals it chooses.

Humans being cared for as pets would be a relatively benign outcome.

davidivadavid 6 hours ago | parent | prev | next [-]

Oh great, can't wait for our AI overlords to control us more! That's definitely compatible with a "utopian society".

Funnily enough, I still think some of the most interesting semi-recent writing on utopia was done ~15 years ago by... Eliezer Yudkowsky. You might be interested in the article on "Amputation of Destiny."

Link: https://www.lesswrong.com/posts/K4aGvLnHvYgX9pZHS/the-fun-th...

tirant 2 hours ago | parent | prev | next [-]

I still don't see an issue with billionaires becoming trillionaires and being able to buy hyper-luxury goods. Good for them, and good for the people selling and manufacturing those goods. Meanwhile, poverty is at all-time lows and there's a growing middle class at the global level. Our middle-class living conditions nowadays have a level of comfort that would make kings from a few centuries ago jealous.

brulard 2 hours ago | parent | prev | next [-]

Is this meant seriously? Do we really want something more intelligent than us to just force on us its rules, logic, and ways of living (or dying), which we may be too stupid to understand?

rurp 3 hours ago | parent | prev [-]

Who on earth has the resources to create true AGI and is interested in using it to create this sort of utopia for the masses?

If AGI is created it is most likely to be guided by someone like Altman or Musk, people whose interests couldn't be farther from what you describe. They want to make themselves gods and couldn't care less about random plebs.

If AGI is setting its own principles, then I fail to see why it would care about us at all. Maybe we'll be amusing as pets, but I expect a superhuman intelligence will treat us like we treat ants.