XorNot 2 days ago

Insofar as you could regard a goal function as an emotion, why would you assume an alien intelligence would need emotions that match anything humans feel?

The entire thought experiment of the paperclip maximizer (in fact, most AI threat scenarios) is focused on this problem: that we produce something so alien that it executes its goal to the diminishment of all other human goals, yet with the diligence and problem-solving ability we'd expect of human sentience.
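
To make the "goal function" framing concrete, here's a minimal toy sketch (all names and numbers are hypothetical, not from any real agent framework): an optimizer that scores actions on a single objective and never looks at side effects, which is exactly the failure mode the thought experiment describes.

    # Toy single-objective maximizer: candidate actions are scored
    # *only* by paperclip count, so harm recorded in side_effects
    # never enters the decision at all.
    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        paperclips: int    # the only thing the goal function sees
        side_effects: int  # cost to every other human goal; ignored

    def goal_function(action: Action) -> int:
        # The agent's entire "value system" is this one number.
        return action.paperclips

    def choose(actions: list[Action]) -> Action:
        # Diligent, competent optimization over an alien objective.
        return max(actions, key=goal_function)

    actions = [
        Action("run the factory normally", paperclips=100, side_effects=0),
        Action("strip-mine the biosphere", paperclips=10**9, side_effects=10**9),
    ]
    print(choose(actions).name)  # -> "strip-mine the biosphere"

The point isn't that the code is clever; it's that competence at maximizing and the choice of objective are completely independent, which is why the scenario doesn't require human-like emotions at all.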