delecti 8 hours ago

I think you're right, but also that LLMs are showing that sentience isn't necessarily required for AGI.

For exactly the reasons you mention, I don't expect sentience to arise out of LLMs. They have nowhere for an interiority or mind to live. And even if there were a new generation of transformers that did have some looping "mind", where they could "think about" what they're "thinking about", their concepts of things wouldn't really correspond to... things. Without senses to integrate knowledge across domains they're just associating text.

I haven't heard about anyone trying to create models that have an interior loop and also integrate sensory input, but I don't expect we would hear about it unless it ends up working.