neom | 2 days ago
I wrote about this more fully the other day. I'd suspect that sooner rather than later we'll formalize consciousness as self-model coherence: roughly, any dynamical state in which the predictive and reflective layers remain mutually consistent. Machines will exhibit that state, and for operational purposes it will count as consciousness. Philosophers will likely keep arguing, but it makes sense for industry and law to adopt something like "behavioral sentience" as the working definition.
hodgehog11 | 2 days ago | parent
Consistency is one aspect, but it is not enough. I believe (based partly on arguments from neuroscience and discussions with alignment researchers) that two more properties are necessary: compression, which demonstrates genuine algorithmic development rather than memorization; and linear representation capacity, since that is the only way we really interpret the world, and we will only recognize another system as intelligent if it can distill its knowledge into a language we understand.