▲ | nurettin 9 hours ago |
It just aligns generated words according to the input. It is missing individual agency and self-sufficiency, which are hallmarks of consciousness. We sometimes confuse the responses with actual thought because neural networks solved language so utterly and completely.
▲ | Zarathruster 7 hours ago | parent | next [-]
Not sure I'd use those criteria, nor have I heard them described as hallmarks of consciousness (though I'm open, if you'll elaborate). I think the existence of qualia, of a subjective inner life, would be both necessary and sufficient. Most concisely: could we ask, "What is it like to be Claude?" If there's no "what it's like," then there's no consciousness. Otherwise yeah, agreed on LLMs.
▲ | cma 7 hours ago | parent | prev [-]
> It is missing individual agency and self sufficiency which is a hallmark of consciousness.

You can be completely paralyzed and completely conscious.