coldtea 4 days ago

>Humans think even when not being prompted by other humans

That's more of an implementation detail. Humans take constant sensory input and have some mechanism for re-introducing past input later (e.g. remembering something).

Both could be added (even trivially) to LLMs.
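
A minimal sketch of what that could look like, assuming hypothetical llm_generate and read_sensor callables (neither is a real API): a driver loop that streams input into the model on a clock and re-injects its own stored outputs as memories:

    import time

    def retrieve_memories(store, k=3):
        # Toy recall: just the k most recent items; a real system would
        # rank by relevance (e.g. embedding similarity).
        return store[-k:]

    def run_unprompted(llm_generate, read_sensor):
        # Drive the model continuously instead of waiting for a prompt:
        # each tick mixes fresh "sensory" input with re-injected memories.
        memories = []
        while True:
            observation = read_sensor()      # constant external input
            context = "\n".join(retrieve_memories(memories) + [observation])
            thought = llm_generate(context)  # hypothetical model call
            memories.append(thought)         # available for recall later
            time.sleep(0.1)                  # discrete ticks, not truly continuous

The loop ticks discretely, which (per the sampling point below) may not matter much: the biological version appears to be periodic sampling too.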

And it's not at all clear that human thought is constant. It just appears so to our naive intuition (the same way we see a movie as continuous motion, not as 24 static frames per second). The underlying mechanism is discontinuous (propagation time, etc.), and this has been shown experimentally: EEG/MEG studies show the brain samples sensory input in a periodic pattern, stimuli separated by a small enough time difference are lost (as if there were a blind window in perception), and so on.

>and in some cases can learn new things by having intuition make a concept clear or by performing thought experiments or by combining memories of old facts and new facts across disciplines

Unless we define intuition in a way that excludes LLM-style mechanisms a priori, who's to say LLMs don't do all those things as well, even if in a simpler way?

They've been shown to combine material across disciplines, and also to develop concepts not directly present in their training set.

And "performing thought experiments" is not that different than the reasoning steps and backtracking LLMs also already do.

Not saying LLMs are at parity with human thinking/consciousness. Just that it's not clear they aren't doing more or less the same thing, even if at reduced capacity and with a different architecture and runtime setup.