crystal_revenge 3 days ago

The magical thinking around LLMs is getting bizarre now.

LLMs are not “intelligent” in any meaningful biological sense.

Watch a spider modify its web to adapt to changing conditions and you’ll realize just how far we have to go.

LLMs sometimes echo our own reasoning back at us in a way that sounds intelligent and is often useful, but don't mistake this for "intelligence".

bubblyworld 3 days ago | parent | next [-]

Watch a coding agent adapt my software to changing requirements and you'll realise just how far spiders have to go.

Just kidding. Personally I don't think intelligence is a meaningful concept without context (or an environment in biology). Not much point comparing behaviours born in completely different contexts.

tim333 2 days ago | parent | prev | next [-]

They pass human intelligence tests like exams and IQ tests.

If I ask chatgpt how to get rid of spiders I'm probably going to get further than the spiders would scheming to get rid of chatgpt.

habinero 2 days ago | parent [-]

And Clever Hans could pass a math exam.

"Some tests can be cheesed by a statistical model" is much less sexy and clickable than "my computer is sentient", but it's what's actually going on lol

tim333 a day ago | parent [-]

Some fibs and goalpost shifting there. Hans couldn't, and sentience wasn't mentioned.

danenania 3 days ago | parent | prev [-]

The idea that biological intelligence is impossible to replicate by other means would seem to imply that there’s something magical about biology.

crystal_revenge 3 days ago | parent | next [-]

I'm nowhere implying that it's impossible to replicate, just that LLMs have almost nothing to do with replicating intelligence. They aren't doing any of the things even simple life forms are doing.

danenania 2 days ago | parent [-]

They lack many abilities of simple life forms, but they can also do things like complex abstract reasoning, which only humans and LLMs can do.

habinero 2 days ago | parent [-]

They don't reason. They can generate an illusion of it through a statistical model.

You don't gotta work hard to break the illusion, either.

People really really really want to believe this thing and I do not understand why. I wish I did lol

danenania 2 days ago | parent [-]

Ok, it’s an “illusion” of reasoning. Doesn’t change that it got a gold medal in the IMO or helped me fix a race condition the other day.

charcircuit 3 days ago | parent | prev | next [-]

There very well could be something magical about it.

danenania 3 days ago | parent [-]

It’s fine to think that—many clearly do.

But it would be more honest and productive imo if people would just say outright when they don’t think AGI is possible (or that AI can never be “real intelligence”) for religious reasons, rather than pretending there’s a rational basis.

gls2ro 3 days ago | parent | next [-]

AGI is not possible because we don't yet have a clear and commonly agreed definition of intelligence, and more importantly we don't have a definition of consciousness, nor can we clearly define the link (if there is one) between the two.

Until we have that, AGI is just a magic word.

Once we have those two clear definitions, it will mean we understand them, and then we can work toward AGI.

Mikhail_Edoshin 3 days ago | parent | prev | next [-]

When you try to solve a problem the goal or the reason to reject the current solution are often vague and hard to put in words. Irrational. For example, for many years the fifth postulate of Euclid was a source of mathematical discontent because of a vague feeling that it was way too complex compared to the other four. Such irrationality is a necessary step in human thought.

danenania 2 days ago | parent [-]

Yes, that’s fair. I’m not saying there’s no value to irrational hunches (or emotions, or spirituality). Just that you should be transparent when that’s the basis for your beliefs.

habinero 2 days ago | parent | prev | next [-]

That's not a good way to think about it.

Plenty of things could theoretically exist that aren't possible and likely will never be possible.

Like, sure, a Dyson sphere would solve our energy needs. We can't build one now and we almost certainly never will lol

"AGI" is theoretically feasible, sure. Our brains are just matter. But they're also an insanely complex and complicated system that came out of a billion years of evolution.

A little rinky dink statistical model doesn't even scratch the surface of it, and I don't understand why people think it does.

danenania 2 days ago | parent [-]

> But they're also an insanely complex and complicated system that came out of a billion years of evolution.

As are birds, yet we can still build airplanes.

player1234 a day ago | parent [-]

We know the laws of aerodynamics; what are the known laws of intelligence and consciousness that you are replicating through other means with LLMs?

Weak ass gotcha, hang your head in shame, call your mom and tell her what a fraud you are.

danenania a day ago | parent [-]

Sorry you got triggered. I know it can be an emotional topic for some people. I'll try to explain in a simple way.

We clearly are replicating at least some significant aspects of human intelligence via LLMs, despite biological complexity. So we obviously don't need a 100% complete understanding of the corresponding biology to build things which achieve similar goals.

In other words, we can (conceivably) figure out how intelligence works and how to produce it independently of figuring out exactly how the human brain produces intelligence, just like we learned the laws of aerodynamics well enough to build airplanes independently of understanding everything about the biology of birds.

Whether we will achieve this or not to the point of AGI is a separate engineering question. I'm only pointing out how flawed these lines of argument are.

hitarpetar 2 days ago | parent | prev [-]

rationalism has become the new religion. Roko's basilisk is a ghost story and the quest for AGI is today's quest for the philosopher's stone. and people believe this shit because they can articulate a "rational basis"
