marssaxman 2 days ago

Has anyone ever presented any solid theoretical reason we should expect language models to yield general intelligence?

So far as I have seen, people have run straight from "wow, these language models are more useful than we expected and there are probably lots more applications waiting for us" to "the AI problem is solved and the apocalypse is around the corner" with no explanation for how, in practical terms, that is actually supposed to happen.

It seems far more likely to me that the advances will pause, the gains will be consolidated, time will pass, and future breakthroughs will be required.

garymarcus a day ago

100% - there has not been any solid theoretical argument whatsoever (beyond some confusions about scaling that we can now see were incorrect).

sharemywin 19 hours ago

I don't think the reasoning models are LLMs. They have an LLM as a component, but they add another layer that has learned (via reinforcement learning) how to prompt the LLM, for lack of a better way to describe it.
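Roughly, I mean something like the sketch below. To be clear, this is a toy, not any lab's actual architecture: llm() and score() are made-up stand-ins, and in a real system the outer controller's prompting policy would be learned with RL rather than hard-coded.

    # Toy sketch: an outer "controller" loop repeatedly prompts a
    # base LLM, scores the drafts, and keeps the best one.
    # llm() and score() are stand-ins for a real model and a learned
    # reward model / verifier.

    def llm(prompt: str) -> str:
        # Stand-in for a call to a base language model.
        return f"draft answer to: {prompt}"

    def score(answer: str) -> float:
        # Stand-in for a learned verifier; placeholder heuristic here.
        return float(len(answer))

    def reasoning_loop(question: str, steps: int = 3) -> str:
        best, best_score = "", float("-inf")
        context = question
        for _ in range(steps):
            draft = llm(f"Think step by step about: {context}")
            s = score(draft)
            if s > best_score:
                best, best_score = draft, s
            # Feed the previous attempt back in, as a crude form of
            # iterative refinement.
            context = f"{question}\nPrevious attempt: {draft}"
        return best

    print(reasoning_loop("Why is the sky blue?"))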

4b11b4 a day ago

Not with current architecture

EA-3167 2 days ago

The degree to which "AGI" appears to be a quasi-religious fixation cannot be overstated. At the extreme end you have the likes of the stranger LessWrong crowd, the Zizians, and frankly some people here. Even when you withdraw from those extremes, though, there's a tremendous amount of intellectualizing of what appears to be primarily a set of hopes and fears.

Well, that, and it turns out that for a LOT of people "it talks like me" creates an inescapable impression that "it is thinking, and it's thinking like me." Issues such as the staggering power and water demands, or the need for billions of dollars' worth of GPUs... these are ignored or minimized.

Then again, we already have a model for this fervor: cryptocurrency and "The Blockchain" created a similar kind of money-fueled hysteria. People here would have laughed in your face if you had suggested that everything imaginable wouldn't soon run "on the chain." It was "obvious" that "fiat" was on the way out and that only crypto represented true freedom.

tl;dr The line between the hucksters and their victims really blurs when social media is involved, and hovering around all of this is a smaller group of True Believers who really think they're building God.

marssaxman 2 days ago

> for a LOT of people "it talks like me" creates an inescapable impression that "It is thinking"

That really does seem to be true - even intelligent, educated people who one might expect to know better will fall for it (Blake Lemoine, famously). I suspect that childhood exposure to ELIZA, followed by teenage experimentation with Markov chains and syntax tree generators, has largely immunized me against this illusion.
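If you never went through that phase: a word-level Markov chain generator is about a dozen lines of Python, which is exactly why playing with one is such an effective vaccine. A minimal sketch, toy corpus and all:

    import random
    from collections import defaultdict

    def build_chain(text):
        # Map each word to the list of words observed to follow it.
        words = text.split()
        chain = defaultdict(list)
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
        return chain

    def generate(chain, start, length=20):
        # Walk the chain, picking a random observed successor each step.
        out = [start]
        for _ in range(length - 1):
            followers = chain.get(out[-1])
            if not followers:
                break
            out.append(random.choice(followers))
        return " ".join(out)

    corpus = "the cat sat on the mat and the dog sat on the log"
    print(generate(build_chain(corpus), "the"))

The output is locally plausible and globally meaningless, which is the whole lesson in miniature.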

Of course the folks raising billions of dollars for AI startups have a vested interest in falling for it as hard as possible, or at least appearing to, and persuading everyone else to follow along.

rsynnott a day ago

One interesting thing has shown up fairly consistently in polling (of laypeople): people tend to become less impressed with LLMs, and to dislike them more, as exposure grows.

To some extent, ChatGPT was a magic trick; at first glance it really kind of looks like it's talking to you. On repeated exposure, the cracks start to show.

Vecr a day ago

It didn't immunize Eliezer Yudkowsky, and he wrote Markov-chain versions of fictional characters. Everyone who looked up AI enough times knew about ELIZA.

EA-3167 a day ago

He constructed an echo chamber with himself at the center and really lost himself in it; the power of people telling you that you're a visionary and a prophet can't be overstated. Ironically, it followed a very familiar pattern, one described by a LessWrong term: "Affective Death Spiral".

And now we have AI death cults.

rsynnott a day ago

It's really kind of fascinating; some people seem to feel the need to just wholesale recreate religion. This is particularly visible with the more extreme LessWrong stuff: Roko's Basilisk and all that.