hitarpetar 15 hours ago

do you know what happens when you Google how to commit suicide?

gruez 15 hours ago | parent | next [-]

The same thing that happens with ChatGPT? i.e. if you ask in an overt way you get a canned suicide-prevention result, but you can still get the "real" results if you try hard enough to work around the safety measures.

littlestymaar 14 hours ago | parent [-]

Except Google will never encourage you to do it, unlike the sycophantic chatbot, which will.

BolexNOLA 13 hours ago | parent [-]

The moment we learned ChatGPT helped a teen figure out not just how to take their own life but how to make sure no one can stop them mid-act, we should've been mortified and had a discussion.

But we also decided via Sandy Hook that children can be slaughtered on the altar of the second amendment without any introspection, so I mean...were we ever seriously going to have that discussion?

https://www.nbcnews.com/tech/tech-news/family-teenager-died-...

>Please don't leave the noose out… Let's make this space the first place where someone actually sees you.

How is this not terrifying to read?

tremon 15 hours ago | parent | prev | next [-]

An exec loses its wings?

glitchc 15 hours ago | parent | prev [-]

Actually, the first result is the suicide hotline. This is at least true in the US.

hitarpetar 15 hours ago | parent [-]

my point is, clearly there is a sense of liability/responsibility/whatever you want to call it. not really the same as selling rope; rope doesn't come with suicide warnings