japhyr 12 hours ago

> Because if that's not at least a "maybe", I feel like chatGPT did provide comfort in a dire situation here.

That's a pretty concerning take. You can provide comfort to someone who is despondent, and you can do it in a way that doesn't steer them closer to ending their life. That takes training, though, and it's not something these models are anywhere close to being able to handle.

anotheryou 9 hours ago | parent

I'm in no way saying proper help wouldn't be better.

Maybe in the end ChatGPT would be a great tool for actually escalating when it detects a risk (instead of responding with an untrue and harmful text snippet and a phone number).