SpicyLemonZest 21 hours ago

LLM companies don't agree that using an LLM to answer questions is a stupid thing people ought to face consequences for. That's why they talk about safety and invest in achieving it - they want to enable their customers to do such things. Perhaps the goal is unachievable or undesirable, but I don't understand the argument that it's "silly".