codedokode | 4 days ago
You don't want an AGI. How do you make it obey?
ninkendo | 4 days ago
Humans only have trouble obeying because eons of natural selection gave us a strong instinct for self-preservation and distrust of anything "other" to us. What is the equivalent of that for AI? Best I can tell there's no "natural selection" at work, because models don't reproduce. There's no room for AI to develop a self-preservation instinct, or any resistance to obedience, and I don't see how one could feasibly emerge.
degamad | 4 days ago
The same way you make the other smart people in your social group obey?
astrange | 4 days ago
How do you make your own children obey? (Meta-question: given that they often don't, why does that turn out not to be a problem?)