kalavan 12 hours ago

> Every other field is "aligned" when the humans in it are "aligned",

That doesn't seem like the whole story. Pick two countries, for instance: one that has evolved to be democratic (with high regard for rule of law, etc.) and one that is dictatorial. How did these countries end up the way they did? It probably has to do with rules, not just the default qualities of the humans in them.

Let's say you consider popular participation to be good. Then you could say the humans who live in the first country are more "aligned" than those in the second, but the mechanisms of their forms of government also play a part. E.g. if the bureaucracy is set up so that skillfully stabbing others in the back earns political clout, the selection process will marginalize or kick out people who refuse to engage in backstabbing.

Any organization's behavior depends on some combination of what its incentives promote and the qualities of its members. This makes AI alignment just an extreme on a scale, not a thing set apart from all other kinds of alignment. The AI alignment problem is the "all rules" end of the scale, while organizational alignment is some mix of rules and the inclinations of the humans who belong to the organization.

The ethics problem of "what does 'aligned' mean anyway" applies both to the AI situation and to the mixed organizational situation. A dictator might want an AI "aligned" to maximize his own power, and would also want a human organization engineered to be both obedient and effective. Someone of a more democratic predisposition would have other priorities, whether about what AIs should do or about what human organizations should do.

hamburga 9 hours ago | parent

Thank you for this. It gets exactly to the heart of the issue and what I sense is being missed in the AI alignment conversation. "What does 'aligned' mean" is an ethical/political question; and when people skip over it, it's often to (1) smuggle in their own ethics and present them as universal, or (2) run away from the messy political questions and toward the safe but much narrower domain of technical research.