neom 2 days ago

I wrote about this the other day more fully. I suspect that sooner rather than later we'll formalize consciousness as self-model coherence: simply, any dynamical state in which the predictive and reflective layers remain mutually consistent. Machines will exhibit that state, and for operational purposes it will count as consciousness. Philosophers will likely keep arguing, but it makes sense for industry and law to adopt something like "behavioral sentience" as the working definition.
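
A toy operationalization of that definition, to make it concrete (the layers, the moving-average "model", and the coherence score below are all hypothetical illustrations, not a claim about how such a measure would actually be built):

```python
# Toy sketch of "self-model coherence": a predictive layer models the
# world, a reflective layer models the predictive layer's own outputs,
# and coherence measures how mutually consistent the two layers are.
# Everything here is a made-up illustration.
import numpy as np

rng = np.random.default_rng(1)
world = rng.normal(size=100)  # stand-in stream of observations

def predict(history):
    # Predictive layer: forecast the next observation (moving average).
    return np.mean(history[-5:])

def self_model(history):
    # Reflective layer: the system's (near-faithful) model of its own
    # predictions, with a little noise so the layers aren't identical.
    return np.mean(history[-5:]) + rng.normal(scale=0.01)

preds = np.array([predict(world[:t]) for t in range(5, 100)])
reflected = np.array([self_model(world[:t]) for t in range(5, 100)])

# Coherence: 1 minus the mean disagreement between the two layers.
coherence = 1.0 - np.mean(np.abs(preds - reflected))
print(round(coherence, 2))  # close to 1.0 when the layers agree
```

The interesting (and unsolved) part is of course what counts as the layers and what threshold counts as "coherent"; the sketch only shows the shape of the measurement.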

hodgehog11 2 days ago | parent [-]

Consistency is one aspect, but it is not enough. I believe (and this is somewhat based on arguments from neuroscience and discussions with alignment researchers) that two more are necessary: compression, which demonstrates algorithmic development; and linear representation capacity, since that is the only way we really interpret the world, and so we will only call another mind intelligent if it can distill its knowledge into a language we understand.
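
For what it's worth, "linear representation capacity" is often operationalized in interpretability work as a linear probe: can a single linear map recover a concept from a model's hidden states? A minimal sketch on synthetic data (the "hidden states" and "concept direction" below are fabricated for illustration, not taken from any real model):

```python
# Linear probe sketch: plant a concept direction in synthetic "hidden
# states", then check whether logistic regression (a linear map) can
# decode it. High probe accuracy = the concept is linearly represented.
import numpy as np

rng = np.random.default_rng(0)
d = 32                                 # hidden-state dimensionality
concept = rng.normal(size=d)
concept /= np.linalg.norm(concept)     # unit "concept direction"

n = 400
labels = rng.integers(0, 2, size=n)    # concept present / absent
hidden = rng.normal(size=(n, d))       # noise
hidden += np.outer(labels * 2.0 - 1.0, concept) * 2.5  # mix concept in

# Train the probe: logistic regression via plain gradient descent.
w, b = np.zeros(d), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(hidden @ w + b)))
    w -= 0.5 * (hidden.T @ (p - labels)) / n
    b -= 0.5 * np.mean(p - labels)

acc = np.mean(((hidden @ w + b) > 0) == labels)
print(f"probe accuracy: {acc:.2f}")    # high accuracy => linearly decodable
```

If the concept were encoded nonlinearly (say, as a parity of directions), the same linear probe would fail, which is exactly the distinction the criterion is pointing at.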

neom 2 days ago | parent [-]

I think compression is probably a natural consequence of coherent self-models? Isn't requiring other minds to package their intelligence in human-interpretable linear narratives like requiring dolphins to demonstrate intelligence through written language?

hodgehog11 2 days ago | parent [-]

> compression is probably a natural consequence of coherent self models

I don't think so. The system in the Chinese Room argument is coherent by construction, but it performs no compression.
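
A crude way to see the distinction, with zlib standing in for description length (the successor function and the 10x threshold are arbitrary choices for illustration):

```python
# A lookup table (Chinese-Room-style rulebook) and a short program can
# be behaviorally identical ("coherent"), yet only one is a compressed
# description of the behavior. zlib is a rough proxy for description
# length here.
import zlib

# Behavior to realize: successor mod 1000.
table = {i: (i + 1) % 1000 for i in range(1000)}  # rote rulebook
rule = "lambda i: (i + 1) % 1000"                  # algorithmic form

# Both produce identical outputs on every input ...
f = eval(rule)
assert all(table[i] == f(i) for i in range(1000))

# ... but their compressed description lengths differ enormously.
table_size = len(zlib.compress(repr(table).encode()))
rule_size = len(zlib.compress(rule.encode()))
print(table_size, rule_size)  # the rulebook dwarfs the rule
```

So coherence of behavior says nothing about whether the system found the short description, which is the compression criterion.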

> dolphins to demonstrate intelligence through written language

Because LLMs are artificial and not a product of physical evolution, our standards for proving intelligence are much, much higher.
