EA-3167 2 days ago

The degree to which "AGI" appears to be a quasi-religious fixation cannot be overstated. At the extreme end you have the likes of the stranger parts of the LessWrong crowd, the Zizians, and frankly some people here. Even when you step back from those extremes, though, there's a tremendous amount of intellectualizing of what appears to be primarily a set of hopes and fears.

Well, that and it turns out that for a LOT of people "it talks like me" creates an inescapable impression that "it is thinking, and it's thinking like me". Issues such as the absolutely hysterical power and water demands, or the need for billions of dollars' worth of GPUs... these are ignored or minimized.

Then again, we already have a model for this fervor: cryptocurrency and "The Blockchain" created a similar kind of money-fueled hysteria. People here would have laughed in your face if you suggested that everything imaginable wouldn't soon simply run "on the chain". It was "obvious" that "fiat" was on the way out, that only crypto represented true freedom.

tl;dr The line between the hucksters and their victims really blurs when social media is involved, and hovering around all of this is a smaller group of True Believers who really think they're building God.

marssaxman 2 days ago | parent | next [-]

> for a LOT of people "it talks like me" creates an inescapable impression that "It is thinking

That really does seem to be true - even intelligent, educated people who one might expect to know better will fall for it (Blake Lemoine, famously). I suspect that childhood exposure to ELIZA, followed by teenage experimentation with Markov chains and syntax tree generators, has largely immunized me against this illusion.

Of course the folks raising billions of dollars for AI startups have a vested interest in falling for it as hard as possible, or at least appearing to, and persuading everyone else to follow along.

rsynnott a day ago | parent | next [-]

One interesting thing that's shown up in polling (of laypeople) fairly consistently: people tend to become less impressed with LLMs, and to dislike them more, as exposure grows.

To some extent ChatGPT was a magic trick; at first glance it really does kind of look like it's talking to you. On repeated exposure, the cracks start to show.

Vecr a day ago | parent | prev [-]

It didn't immunize Eliezer Yudkowsky, and he wrote Markov chain fictional characters. Everyone who looked up AI enough times knew about ELIZA.

EA-3167 a day ago | parent [-]

He constructed an echo chamber with himself at the center and really lost himself in it; the power of people telling you that you're a visionary and a prophet can't be overstated. Ironically, it followed a very familiar pattern, one described by a LessWrong term: the "Affective Death Spiral".

And now we have AI death cults.

rsynnott a day ago | parent | prev [-]

It’s really kind of fascinating; some people really seem to feel the need to just wholesale recreate religion. This is particularly visible with the more extreme LessWrong stuff: Roko’s Basilisk and all that.