pembrook | 5 days ago
I think you missed the part where the kid ignored ChatGPT's repeated help messaging and twisted it into giving this information by lying that he was writing a fictional story. Also, these are just the most inflammatory excerpts, selected by a lawyer trying to win a case. Without the full transcript, and with zero context about this kid's life in the real world, claiming ChatGPT is at fault here is just wild.

At what point do you ascribe agency or any responsibility to the actual humans involved (the 17-year-old, his parents, his school, his community, etc.)? While tragic, blaming [new thing the kids are doing] is fundamentally stupid, as it does nothing to address the real reasons this kid is dead now. In fact, it gives everyone an "out" where they don't have to face up to any uncomfortable realities.
lowsong | 5 days ago | parent
We can debate how legally culpable OpenAI is for its product and whether it did enough to ensure its safeguards functioned, but if you can't agree that "a machine that encourages and enables suicide is dangerous and morally wrong" without qualification, then there is nothing to discuss. There is no wider context that would make a product encouraging this behaviour acceptable. Deflecting blame onto the parents or the victim is extremely offensive, and I sincerely hope you don't repeat these comments to people who have lost loved ones to suicide.