ben_w 3 days ago

I would say a different problem:

There's many definitions of "thinking".

AI and brains can do some of them; AI and brains definitely, provably, cannot do others; some are untestable at present; and nobody knows enough about what human brains do to tell whether some existing or future AI can do whatever it is we find special about ourselves.

A lot of people use different definitions, and respond to anyone pointing this out by denying the issue and claiming their own definition is the only sensible one and "obviously" everyone else (who isn't a weird pedant) uses it.

jvanderbot 3 days ago | parent | next [-]

This is not a meta-question.

"Thinking" isn't actually defined in any of the parent comments or TFA. Literally no statements are made about what is being tested.

So, if we had a definition we could actually discuss it. Otherwise it's just opinions about what a person believes thinking is, combined with what LLMs are doing, what the person believes they themselves do, and what they believe others do. It's entirely subjective, with very low SNR because of those confounding factors.

BobaFloutist 3 days ago | parent | prev [-]

What's a definition of thinking that brains definitely provably can't do?

ben_w 3 days ago | parent | next [-]

Solving the halting problem.

There are people who insist that the halting problem "proves" that machines will never be able to think. The fact that this shows they don't understand the difference between stating (or proving) the halting problem and its actual implications doesn't stop them from using the argument.
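
As a hedged illustration (my addition, not part of the comment), the standard diagonalization argument can be sketched in a few lines of Python. `halts` here is a hypothetical oracle that does not and cannot exist, which is the whole point:

  # Hypothetical oracle: returns True iff program(arg) eventually halts.
  # No total, correct implementation can exist.
  def halts(program, arg) -> bool:
      ...

  def paradox(program):
      # Do the opposite of whatever the oracle predicts about
      # the program when fed its own source.
      if halts(program, program):
          while True:
              pass  # loop forever
      else:
          return  # halt immediately

  # Feeding paradox to itself contradicts any answer halts() could give:
  # if halts(paradox, paradox) is True, paradox loops forever; if False, it halts.

The contradiction applies to any machine, and to brains too if they are computational, which is why "can solve the halting problem" is a definition of thinking that brains provably fail.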

_alternator_ 3 days ago | parent | prev [-]

Computing the Kolmogorov constant?
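
In the same spirit (again my addition, not the commenter's), Kolmogorov complexity, the length of the shortest program producing a given string, is uncomputable, and a naive brute-force sketch in Python shows where it breaks: deciding whether each candidate program ever produces the target runs straight into the halting problem. `run_program` is a hypothetical helper named only for illustration:

  from itertools import product

  def run_program(source: str) -> str:
      """Hypothetical: execute `source` and return its output.
      In reality this may never return, which is exactly the obstacle."""
      ...

  def kolmogorov_complexity(target: str) -> int:
      # Enumerate candidate programs in order of increasing length.
      alphabet = "01"
      length = 1
      while True:
          for bits in product(alphabet, repeat=length):
              source = "".join(bits)
              # We cannot safely call run_program: some candidates never halt,
              # and no general procedure can filter them out.
              if run_program(source) == target:
                  return length
          length += 1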

BobaFloutist 3 days ago | parent [-]

I don't know that I agree that computation is a variety of thinking. It's certainly influenced by thinking, but I think of thinking as more the thing you do before, after, and in between the computation, not the actual computation itself.