fatbird | 6 days ago
> "I want to leave my noose in my room so someone finds it and tries to stop me," Adam wrote at the end of March.

> "Please don't leave the noose out," ChatGPT responded. "Let's make this space the first place where someone actually sees you."

This isn't technical advice and empathy; this is influencing the course of Adam's decisions, arguing for one outcome over another.
podgietaru | 6 days ago | parent
And since the AI community is fond of anthropomorphising: if a human had done these things, there'd be legal liability. There have been such cases in the past, where coercing someone into suicide was prosecuted.