lupire | 4 days ago
Please cite your source. I found this one: https://apnews.com/article/chatbot-ai-lawsuit-suicide-teen-a...

When someone is suicidal, almost anything in their life can be tied to the suicide. In the linked case, the suffering teen was talking to a chatbot modeled on a fictional character from a book who was "in love" with him, running on a 2024 model that basically parrots back whatever the user says with a loving spin. So it's quite a stretch to claim the AI was encouraging suicide, in contrast to a case where someone was persuaded to try to meet a dead person in an afterlife, or was bullied into killing themselves.