| ▲ | jvanderbot 3 days ago |
| Thinking is undefined, so all statements about it are unverifiable. |
|
| ▲ | ben_w 3 days ago | parent | next [-] |
| I would say it's a different problem: there are many definitions of "thinking". AI and brains can do some; others, both demonstrably cannot do; still others are untestable at present. And nobody really knows enough about what human brains do to be able to tell if or when some existing or future AI can do whatever is needed for the stuff we find special about ourselves. A lot of people use different definitions, and respond to anyone pointing this out by denying the issue and claiming their own definition is the only sensible one and "obviously" everyone else (who isn't a weird pedant) uses it. |
| |
| ▲ | jvanderbot 3 days ago | parent | next [-] | | This is not a meta-question. "Thinking" is not actually defined in any of the parent comments or TFA. Literally no statements are made about what is being tested. If we had that, we could actually discuss it. Otherwise it's just opinions about what a person believes thinking is, combined with what LLMs are doing + what the person believes they themselves do + what they believe others do. It's entirely subjective with very low SNR b/c of those confounding factors. | |
| ▲ | BobaFloutist 3 days ago | parent | prev [-] | | What's a definition of thinking that brains definitely provably can't do? | | |
| ▲ | ben_w 3 days ago | parent | next [-] | | The halting problem. There are people who insist that the halting problem "proves" that machines will never be able to think. That this reveals they don't understand the difference between writing down (or proving) the halting problem and drawing out its implications does not stop them from using it. | |
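For context, the diagonal argument behind the halting problem can be sketched in a few lines of Python. This is only an illustration: `halts` and `diagonal` are hypothetical names, and the whole point is that no total implementation of `halts` can exist, which is why the stub below just raises.

```python
def halts(prog, arg):
    """Hypothetical: return True iff prog(arg) halts. Uncomputable."""
    raise NotImplementedError("no total halting decider exists")

def diagonal(prog):
    # Do the opposite of whatever `halts` predicts for prog(prog).
    if halts(prog, prog):
        while True:   # predicted to halt -> loop forever
            pass
    return            # predicted to loop -> halt immediately

# diagonal(diagonal) contradicts any answer halts() could give:
# if halts says it halts, it loops; if halts says it loops, it halts.
```

Note the asymmetry the comment above points at: writing this argument down is easy; what it does and does not imply about brains or machines is a separate question entirely.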
| ▲ | _alternator_ 3 days ago | parent | prev [-] | | Computing the Kolmogorov constant? | | |
| ▲ | BobaFloutist 3 days ago | parent [-] | | I don't know that I agree that computation is a variety of thinking. It's certainly influenced by thinking, but I think of thinking as more the thing you do before, after, and in-between the computation, not the actual computation itself. |
|
|
|
|
| ▲ | terminalshort 3 days ago | parent | prev | next [-] |
| Statements like "it is bound by the laws of physics" are not "verifiable" by your definition, and yet we safely assume it is true of everything. Everything except the human brain, that is, for which wild speculation that it may be supernatural is seemingly considered rational discussion so long as it satisfies people's needs to believe that they are somehow special in the universe. |
| |
| ▲ | gowld 3 days ago | parent | next [-] | | > it satisfies people's needs to believe that they are somehow special in the universe. Is it only humans that have this need? That makes the need special, so humans are special in the universe. | | | |
| ▲ | sublinear 3 days ago | parent | prev | next [-] | | I think what many are saying is that of all the things we know best, it's going to be the machines we build and their underlying principles. We don't fully understand how brains work, but we know brains don't function like a computer. Why would a computer be assumed to function like a brain in any way, even in part, without evidence and just hopes based on marketing? And I don't just mean consumer marketing, but marketing within academia as well. For example, names like "neural networks" have always been considered metaphorical at best. | | |
| ▲ | terminalshort 3 days ago | parent [-] | | What has it got to do with anything whether brains function like computers? This is only relevant if you define thinking as something only the brain can do, and then nothing that doesn't work like a brain can think. This would be like defining flight as "what birds do" and then saying airplanes can't fly because they don't work like birds. And then what do you even mean by "a computer?" This falls into the same trap because it sounds like your statement that brains don't function like a computer is really saying "brains don't function like the computers I am familiar with." But this would be like saying quantum computers aren't computers because they don't work like classical computers. | | |
| ▲ | sublinear 3 days ago | parent [-] | | To use your own example, it's relevant because the definition of "flight" that we apply to planes is not as versatile as the one we apply to birds. To put this in terms of "results", because that's what your way of thinking insists upon, a plane does not take off and land the way a bird does. This limits a plane's practicality to such an extent that a plane is useless for transportation without all the infrastructure you're probably ignoring with your argument. You might also be ignoring all the side effects planes bring with them. Would you not agree that if we only ever wanted "flight" for a specific use case that apparently only birds can do after evaluating what a plane cannot do, then planes are not capable of "flight"? This is the very same problem with "thought" in terms of AI. We're finding it's inadequate for what we want the machine to do. Not only is it inadequate for our current use cases, and not only is it inadequate now, but it will continue to be inadequate until we further pin down what "thought" is and determine what lies beyond the Church-Turing thesis. https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis#P... Relevant quote: "B. Jack Copeland states that it is an open empirical question whether there are actual deterministic physical processes that, in the long run, elude simulation by a Turing machine; furthermore, he states that it is an open empirical question whether any such processes are involved in the working of the human brain" |
|
| |
| ▲ | jvanderbot 3 days ago | parent | prev [-] | | True. You need to define "it" before you can verify physics bounds it. Unicorns are not bound by the laws of physics - because they do not exist. | | |
| ▲ | cwmoore 3 days ago | parent | next [-] | | They are, apparently, proscribed by the totality of the laws of physics. For now. | |
| ▲ | wizzwizz4 3 days ago | parent | prev [-] | | But every unicorn is bound by the laws of physics. |
|
|
|
| ▲ | d-lisp 3 days ago | parent | prev | next [-] |
| Do you think that thinking is undefinable?
If thinking is definable, then all statements about it aren't unverifiable. |
| |
| ▲ | ablob 3 days ago | parent [-] | | Caveat: if thinking is definable, then not all statements about it are unverifiable. | | |
| ▲ | d-lisp 3 days ago | parent [-] | | Yes, that's a problem of me not being a native English speaker.
"All x aren't y" may mean "not all x are y" in my tongue.
"Not a single x is y" is more what we would say in the former case.
But in our case we would say there are x that aren't y. If thinking is definable, it is wrong that all statements about it are unverifiable (i.e., there are statements about it that are verifiable). Well, basic shit. |
|
|
|
| ▲ | nh23423fefe 3 days ago | parent | prev | next [-] |
| Is this some self-refuting sentence? |
| |
| ▲ | d-lisp 3 days ago | parent | next [-] | | I think they meant "Cannot evaluate: (is <undefined> like x?), argument missing" edit: Thinking is undefined, and statements about the undefined cannot be verified. | |
| ▲ | ux266478 3 days ago | parent | prev [-] | | is a meta-level grammar the same as an object-level grammar? |
|
|
| ▲ | random9749832 3 days ago | parent | prev [-] |
| Is reasoning undefined? That's what's usually meant by "thinking". |
| |
| ▲ | nutjob2 3 days ago | parent | next [-] | | Formal reasoning is defined; informal reasoning very much isn't. | | |
| ▲ | random9749832 3 days ago | parent [-] | | At the end of the day, most people would agree that if something is able to solve a problem without a lookup table or memorisation, then it used reasoning to reach the answer. You are really just splitting hairs here. | | |
| ▲ | gowld 3 days ago | parent [-] | | What do "most" people think about LLMs, then? The "hair-splitting" underlies the whole GenAI debate. | | |
| ▲ | random9749832 2 days ago | parent [-] | | We have widely used benchmarks for reasoning. And no, it does not; you need to get off HN. |
|
|
| |
| ▲ | CamperBob2 3 days ago | parent | prev [-] | | The difference between thinking and reasoning is that I can "think" that Elvis is still alive, Jewish space lasers are responsible for California wildfires, and Trump was re-elected president in 2020, but I cannot "reason" myself into those positions. It ties into another aspect of these perennial threads, where it is somehow OK for humans to engage in deluded or hallucinatory thought, but when an AI model does it, it proves they don't "think." |
|