alexk307 a day ago

There's a bit of a difference between "enforcing social norms" and telling a user to ingest prescription drugs to combat nausea from the other drugs that it told the user to take.

Yes, you should be able to write a book containing this same information. No, you should not be able to release software that instructs its users to harm themselves. LLMs aren't people, and you shouldn't project human rights onto them.