d2049 3 days ago | next [-]

Reminder that Sam Altman chose to rush the safety process for GPT-4o so that he could launch before Gemini, which then led directly to this teen's suicide.
richwater 3 days ago | parent | next [-]

> which then led directly to this teen's suicide

Incredible logic jump with no evidence whatsoever. Thousands of people commit suicide every year without AI.

> ChatGPT detects a prompt indicative of mental distress or self-harm, it has been trained to encourage the user to contact a help line. Mr. Raine saw those sorts of messages again and again in the chat, particularly when Adam sought specific information about methods. But Adam had learned how to bypass those safeguards by saying the requests were for a story he was writing

Somehow it's ChatGPT's fault?
throwaway98797 3 days ago | parent | prev [-]

build something