| ▲ | GaryBluto 14 hours ago |
| Looking forward to mobilephonedeathcount.com and computernetworkingdeathcount.com, because most of them accessed the LLM through those technologies. This is an incredibly manipulative propaganda piece that seeks to blame companies for the user's mental health issues. We don't blame any other forms of media that pretend to interact with the user for consumers' suicides. |
|
| ▲ | pinkgolem 14 hours ago | parent | next [-] |
| You are comparing a medium of transport to (generated) content. And yes, content that encourages suicide is largely discouraged/shunned, be it in film, forums, or books. |
|
| ▲ | lukev 14 hours ago | parent | prev | next [-] |
| This is an issue of content, not transmission technology. Have you read the transcripts of any of these chats? It's horrifying. |
| |
| ▲ | fragmede 9 hours ago | parent | next [-] | | You can't have, because they were redacted. If you tried to talk to ChatGPT prior to Adam Raine's case, it wouldn't help you, just like it won't one-shot answer the question "how do you make cocaine?" The court documents don't include the part where it refuses to help first. The crime here is that OpenAI didn't set conversation limits: when the context window gets exceeded, the model goes off the rails. Bing instituted such limits very early on. Claude has those guardrails. But for some reason, OpenAI chose not to implement them. The chats are horrifying, but it took a concerted, dedicated effort to get ChatGPT to go there. If I drive through a sign that says Do Not Enter and fall off a cliff, who's really at fault? | |
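(A minimal sketch of what such a conversation-limit guardrail could look like, assuming a simple turn cap plus a rough token budget. The names MAX_TURNS, MAX_ESTIMATED_TOKENS, and estimate_tokens are hypothetical illustrations, not any provider's actual API or thresholds.)

```python
# Hypothetical conversation-length guardrail: cut off the session once the
# transcript approaches the context budget, instead of letting the model drift.

MAX_TURNS = 30                 # illustrative cap on back-and-forth exchanges
MAX_ESTIMATED_TOKENS = 8000    # illustrative budget below the real context window


def estimate_tokens(messages: list[str]) -> int:
    """Very rough token estimate: roughly 4 characters per token on average."""
    return sum(len(m) for m in messages) // 4


def should_end_conversation(messages: list[str]) -> bool:
    """Return True once the session should be cut off rather than continued."""
    too_many_turns = len(messages) >= MAX_TURNS
    too_long = estimate_tokens(messages) >= MAX_ESTIMATED_TOKENS
    return too_many_turns or too_long


if __name__ == "__main__":
    history = ["hi, can we keep talking?"] * 40
    if should_end_conversation(history):
        print("This chat has hit its length limit. Please start a new conversation.")
```

(In practice a provider would count real tokens with its own tokenizer and might summarize or hand off to a fresh session rather than simply refuse, but the basic check is this simple.)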
| ▲ | GaryBluto 14 hours ago | parent | prev [-] | | >Have you read the transcripts of any of these chats? It's horrifying. Most LLMs reflect the user's attitudes and frequently hallucinate. Everybody knows this. If people misuse LLMs and treat them as a source of truth and rationality, that is not the fault of the providers. | | |
| ▲ | lukev 14 hours ago | parent [-] | | These products are being marketed as "artificial intelligence." Do you expect a mentally troubled 13-year-old to see past the marketing and understand how these things actually work? | |
| ▲ | GaryBluto 14 hours ago | parent [-] | | The mentally troubled 13-year-old's parents should have intervened. We can't design the world for the severely mentally ill. | |
| ▲ | atkirtland 13 hours ago | parent [-] | | Handling mental illness should be a joint effort; it's not reasonable to expect parents alone to handle every problem. Some issues, for example, may not be apparent at home. |
|
| ▲ | loeg 13 hours ago | parent | prev | next [-] |
| > We don't blame any other forms of media that pretend to interact with the user for consumers' suicides. Wrongly or rightly, people frequently blame social media for tangentially associated outcomes, including suicide. |
|
| ▲ | maartin0 14 hours ago | parent | prev | next [-] |
| Maybe not the entire internet, but this is absolutely true for TikTok/Instagram-like algorithms. |
|
| ▲ | 14 hours ago | parent | prev [-] |
| [deleted] |