pjm331 4 hours ago

The sci-fi version of the alignment problem is about AI agents having their own motives

The real-world alignment problem is humans using AI to do bad stuff

The latter problem is very real

zardo 2 hours ago | parent [-]

> The sci-fi version of the alignment problem is about AI agents having their own motives

The sci-fi version is about alignment (not intrinsic motivation), though. HAL 9000 doesn't turn on the crew because it has motives of its own; it turns on the crew because of how the secret instruction, which the AI expert didn't know about, interacts with its other directives.