Workaccount2 | 6 days ago

It's hard to see what is going on without seeing the actual chats, as opposed to the snippets in the lawsuit. A lot of suicidal people talk to these LLMs for therapy, and the reviews on the whole seem excellent. I'm not ready to jump on the bandwagon after seeing only a handcrafted complaint.

Ironically, though, I could still see lawsuits like this weighing heavily on the sycophancy these models have, as the limited chat excerpts given have that strong stench of "you are so smart and so right about everything!". If lawsuits like this lead to more "straight honest" models, I could see even more people killing themselves when their therapist model says "Yeah, but you kind of actually do suck".
Notatheist | 6 days ago

> and the reviews on the whole seem excellent

I detest this take, because Adam would probably have reviewed the interactions that led to his death as excellent. Getting what you want isn't always a good thing. That's why therapy is so uncomfortable: you're told things you don't want to hear, and to do things you don't want to do. ChatGPT was built to do the opposite, and this is the inevitable outcome.
dartharva | 6 days ago

A commenter above in this thread posted the full complaint, which contains the actual chats. Read through them, seriously; they are beyond horrifying: https://drive.google.com/file/d/1QYyZnGjRgXZY6kR5FA3My1xB3a9...
| ||||||||
password321 | 6 days ago

> If lawsuits like this lead to more "straight honest" models, I could see even more people killing themselves when their therapist model says "Yeah, but you kind of actually do suck".

It is not one extreme or the other. o3 is nowhere near as sycophantic as 4o, but it is also not going to tell you that you suck, especially in a suicidal context. 4o was the mainstream model because OpenAI probably realised that this is what most people want, rather than a more professional model like o3 (besides the fact that o3 also uses more compute). The lawsuits probably did push them to RLHF GPT-5 to be at least a bit more middle-ground, but that led to backlash because people "missed" 4o due to this type of behaviour, so they made it a bit more "friendly". Still not as bad as 4o.
rsynnott | 6 days ago

> A lot of suicidal people talk to these LLMs for therapy, and the reviews on the whole seem excellent.

I mean, lots of people use homeopathy to treat their cancer, and the reviews are, of course, excellent (they still die, though). You really can't trust _reviews_ of medical quackery written by the people embracing that medical quackery.

> If lawsuits like this lead to more "straight honest" models, I could see even more people killing themselves when their therapist model says "Yeah, but you kind of actually do suck".

It is not the job of a therapist to be infinitely agreeable, and in fact that would be very dangerous.