hodgehog11 2 days ago
Consistency is one aspect, but it is not enough. I believe (based partly on arguments from neuroscience and discussions with alignment researchers) that two more criteria are necessary: compression, which demonstrates algorithmic development; and linear representation capacity, since linear structure is the only way we really interpret the world, and we will therefore only call another mind intelligent if it can distill its knowledge into a language we understand.
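To make "linear representation capacity" concrete: in interpretability work this is usually tested with a linear probe, i.e., checking whether a concept can be decoded from a model's hidden activations by a purely linear classifier. Below is a minimal sketch of that idea; the synthetic `activations` and `labels` are hypothetical stand-ins for real probing data, not anyone's actual setup.

    # Minimal linear-probe sketch: if a simple linear classifier can decode a
    # concept from hidden activations, that concept is linearly represented.
    # The data here is synthetic; replace with real activations and labels.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in for activations (n_samples x hidden_dim) and a binary concept
    # label; the toy label is deliberately a linear function of the features.
    activations = rng.normal(size=(1000, 256))
    labels = (activations[:, :8].sum(axis=1) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        activations, labels, test_size=0.2, random_state=0
    )

    probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"linear probe accuracy: {probe.score(X_test, y_test):.3f}")
    # High accuracy => the concept is linearly accessible; chance-level
    # accuracy suggests it is not represented in linear form (if at all).

The point of the criterion is that high probe accuracy means the knowledge is already distilled into the kind of linear structure humans can read off directly.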
neom 2 days ago
I think compression is probably a natural consequence of coherent self-models. Isn't requiring other minds to package their intelligence in human-interpretable linear narratives like requiring dolphins to demonstrate intelligence through written language?
| |||||||||||||||||