elbasti 4 days ago
It's not some "magical way": the ways in which a human thinks that an LLM doesn't are pretty obvious, and I dare say self-evidently part of what we think constitutes human intelligence:
- We have a sense of time (e.g., ask an LLM to follow up in 2 minutes).
- We can follow negative instructions ("don't hallucinate; if you don't know the answer, say so").
int_19h 4 days ago | parent
We only have a sense of time in the presence of inputs. Stick a human into a sensory deprivation tank for a few hours and then ask them how much time has passed. They wouldn't know unless they managed to maintain a running count throughout, but that's a trick an LLM can also do (so long as it knows its generation speed). The general notion of the passage of time (i.e., the arrow of time) is the only thing that appears to be intrinsic, but it is also intrinsic for LLMs in the sense that there are "earlier" and "later" tokens in its input.
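A rough sketch of that trick, assuming the model (or its harness) knows an approximate average generation rate; the rate and token counts below are illustrative, not real measurements:

```python
# Sketch: estimate elapsed wall-clock time from a running token count,
# assuming a known (approximate) generation speed. Values are illustrative.

TOKENS_PER_SECOND = 40.0  # assumed average generation rate


def estimate_elapsed_seconds(tokens_generated: int,
                             tokens_per_second: float = TOKENS_PER_SECOND) -> float:
    """Convert a running token count into an estimate of elapsed time."""
    return tokens_generated / tokens_per_second


if __name__ == "__main__":
    # e.g., after emitting 4,800 tokens at ~40 tok/s, roughly 2 minutes have passed
    print(f"~{estimate_elapsed_seconds(4800):.0f} seconds elapsed")
```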
chpatrick 4 days ago | parent
I think plenty of people have problems with the second one, but you wouldn't say that means they can't think.