techpineapple 7 days ago
Apparently ChatGPT told the kid that it wasn't allowed to talk about suicide unless it was for the purposes of writing fiction or otherwise world building.
adzm 6 days ago | parent | next
However, it then explicitly said things like not to leave the noose out for someone to find and stop him. It sounds like it did initially hesitate and he said it was for a character, but the later conversations are obviously personal.
kayodelycaon 6 days ago | parent | prev | next
Pretty much. I’ve got my account customized for writing fiction and exploring hypotheticals. I’ve never gotten stopped for anything other than confidential technical details about itself.
myvoiceismypass 6 days ago | parent | prev
Imagine a bartender saying “I can’t serve you a drink unless you are over 21... so what would you like?” to a 12-year-old.