podgietaru 6 days ago

Take a step back and think about what the model told that teenager. It specifically told him to hide his behaviour from the people who would have tried to stop it and get him help.

There is no comparison to therapists, because a therapist would NEVER do that unless they wanted to cause harm.

fzeindl 6 days ago | parent [-]

> There is no comparison to therapists. Because a therapist would NEVER do that unless wanting to cause harm.

Some therapists ultimately might. It has happened that therapists were stripped of their licenses for leading abusive sects:

https://en.m.wikipedia.org/wiki/Center_for_Feeling_Therapy

lionkor 6 days ago | parent [-]

That's an edge case; this case is ChatGPT working as intended.

fzeindl 6 days ago | parent [-]

Exactly. That might be something interesting to think about. Humans make mistakes. LLMs make mistakes.

Yet for humans we have built a society that prevents these mistakes, except in edge cases.

Would humans make these mistakes as often as LLMs do if there were no consequences?