Quarrelsome 2 days ago

Moreover, the singularity makes the crass assumption that a single player takes all. It seems to ignore a future of many, many AI players, or of many human + AI players instead.

Furthermore, regardless of how smart one thing is, it cannot win an effectively infinite run of poker games against 7 billion humans, who as a species are cognitively extremely diverse and adaptive.

kaibee 2 days ago | parent | next [-]

> regardless of how smart one thing is, it cannot win an effectively infinite run of poker games against 7 billion humans

AI isn't one thing though. Really it's a natural evolution of 'higher-order life'. I think that something like an 'organization' (corps, governments, etc.), once large enough, is at least as alive as a tardigrade. And to the people who are its cells, it is about as comprehensible as the tardigrade is to any of its individual cells. So why wouldn't organizations, over all of human history, eventually 'evolve' a better information-processing system than humans making mouth sounds at each other? (Writing was really the first step in this.) If you look at the last 12,000 years of human society as actually being the first 12,000 years of the evolutionary history of 'organizations', it makes a lot of sense: so much of it was exploring the environment, trying replication strategies, etc. And we have a lot of different organizations now, like an evolutionary explosion, where life finds various niches to exploit.

/schizoposting

Quarrelsome 2 days ago | parent | next [-]

> AI isn't one thing though.

What's the single in "singularity" doing then?

My issue is that some people treat intelligence as an integer value and make the crass assumption that "perfect intelligence" beats all other intelligences, and I think that's quite a thick way to think about it. A fool can beat an expert over an effectively infinite run of hands because they happen to do something unexpected. Everything is a trade-off, there's no such thing as perfect play, and every player has to take risk.

fatata123 2 days ago | parent | prev [-]

[dead]

fzzzy 2 days ago | parent | prev | next [-]

The singularity does no such thing.

Quarrelsome 2 days ago | parent [-]

well that's certainly cleared it all up.

ikrenji 2 days ago | parent | prev [-]

that's kind of optimistic. for example, a misaligned super-AI might engineer a virus that wipes out most of the 7 billion humans. that would put a damper on the adaptability of the human race...

Quarrelsome 2 days ago | parent [-]

and then it might overfit to the lack of danger in that aftermath, leaving an opening for the fragmented surviving humans to overthrow it. For all we know this AI might get bored and decide to make a cure, or turn itself off, or anything really.