jagged-chisel 7 hours ago

I don't follow your reasoning at all. Without a specific input stating that you can't be your own victim, how would the AI catch this? And in what cases does that specific input even make sense? In many jurisdictions, attempted suicide limits one's autonomy in the eyes of the law - would the aforementioned specific input negate appropriate decisions about that autonomy?

I don't see how an AI/LLM can handle this correctly.