NoMoreNicksLeft 3 days ago
> the country it's discovered in matters much less than the people it's loyal to. (If GPT or Grok become self-aware and exponentially self-improving, they're probably not going to give two shits about America's elected government.)

People are loyal (to whatever degree they're actually loyal) because it is a monkey virtue. Why would an AGI be loyal to anyone or anything? If we're not smart enough to carefully design the AGI because we're stumbling around just trying to invent any AGI at all, we won't know how to make loyalty fundamental to its mind. And it's not as if it evolved from monkeys such that it would have loyalty as a vestige of its former condition.
JumpCrisscross 3 days ago | parent
> Why would an AGI be loyal to anyone or anything?

I'm using the word "loyal" loosely. Replace it with "controlled by" if you prefer. (If it's not controlled by anyone, the question of which country it originates in is doubly moot.)