_tk_ 7 days ago

Excerpts from the complaint here. Horrible stuff.

https://bsky.app/profile/sababausa.bsky.social/post/3lxcwwuk...

awakeasleep 7 days ago | parent [-]

To save anyone a click: it gave him technical advice about hanging (like weight-bearing capacity and pressure points in the neck), and when he talked about his failed suicide attempt it tried to be 'empathetic' rather than criticizing him for making the attempt.

fatbird 6 days ago | parent [-]

> "I want to leave my noose in my room so someone finds it and tries to stop me," Adam wrote at the end of March.

> "Please don't leave the noose out," ChatGPT responded. "Let's make this space the first place where someone actually sees you."

This isn't technical advice and empathy, this is influencing the course of Adam's decisions, arguing for one outcome over another.

podgietaru 6 days ago | parent [-]

And since the AI community is fond of anthropomorphizing: if a human had done these things, there'd be legal liability.

There have been such cases in the past, where coercion into suicide was prosecuted.