tbrownaw 21 hours ago |
> historical drift from “AI alignment” referring to existential risk, to today, where any AI personality you don’t like is “unaligned.”

Alignment has always meant "does what it actually does match what it's meant to do?" When the crowd that believes AI will inevitably become an all-powerful god owned the news cycle, alignment concerns were of course presented through that lens. But the question is actually rather interesting if approached seriously, especially when different people have different ideas about what it's meant to do.