▲ burnte 2 days ago
> Whatever thinking fundamentally is, it also has an equivalence as a mathematical transformation of data.

Do not confuse the mathematical description of physical processes with the world being made of math.

> You're assuming the conclusion by saying that the two mathematical transformations of data are not isomorphic.

Correct, they're not isomorphic. One is simple math that runs on electrified sand; the other is an unknown process that developed independently over a billion years. Nothing we're doing with AI today is even close to real thought. There are countless trivial proofs that make the rounds as memes, like miscounting the R's in "strawberry", being unable to count, etc.
▲ naasking 2 days ago | parent
> Do not confuse the mathematical description of physical processes as the world being made of math.

Again, this doesn't apply to information. A simulation of a computation really is equivalent to that computation.

> One is simple math that runs on electrified sand, and one is an unknown process that developed independently across a billion years.

Right, so you admit it's an unknown process, which means you cannot conclude that it is different from what LLMs are doing.

> There are a billion trivial proofs that make the rounds as memes, like one R in strawberry, or being unable to count, etc.

No, none of these are definitive proofs that they are not thinking. An LLM's "perceptions" are tokens; the strawberry question is basically asking it to figure out something below its perceptual range. That has literally nothing to do with whether the way it processes information is or is not thinking.
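To make the "perceptual range" point concrete, here's a minimal sketch of a subword tokenizer. The vocabulary and the greedy longest-match scheme are hypothetical simplifications (real models learn BPE merges), but they illustrate the mechanism: the model receives token IDs for whole chunks like "straw" and "berry", never the individual letters, so counting R's requires recalled knowledge rather than direct inspection.

```python
# Toy illustration (hypothetical vocabulary, not a real model's tokenizer):
# an LLM sees subword tokens, not characters.
VOCAB = {"straw", "berry", "st", "raw", "ber", "ry"}

def tokenize(word, vocab):
    """Greedy longest-match segmentation into subword tokens."""
    tokens = []
    i = 0
    while i < len(word):
        for end in range(len(word), i, -1):
            piece = word[i:end]
            if piece in vocab:
                tokens.append(piece)
                i = end
                break
        else:
            tokens.append(word[i])  # unknown: fall back to one character
            i += 1
    return tokens

print(tokenize("strawberry", VOCAB))  # → ['straw', 'berry']
```

The model's input here is two opaque token IDs; the fact that those two chunks jointly contain three R's is nowhere in its "percept", which is why letter-counting failures say little about whether the information processing itself counts as thinking.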