nozzlegear 5 days ago

The only acceptable number of suicides for it to cause is zero, and it's not a moral panic to believe that.

scotty79 5 days ago | parent | next [-]

What actually causes suicide is really hard to pinpoint. Most people wouldn't do it even if their computer told them to kill themselves every day.

My personal belief is that at some point in the future you might get a good estimate of the likelihood that a person commits suicide from a blood test or a brain scan.

geysersam 5 days ago | parent | prev | next [-]

I find it hard to take that as a serious position. Alcohol certainly causes more suicides than ChatGPT. Should it be illegal?

It's well known that suicides spike around Christmas. Does Christmas cause suicides? I think you see where I'm going with this.

nozzlegear 5 days ago | parent [-]

> I find it hard to take that as a serious position. Alcohol certainly causes more suicides than ChatGPT. Should it be illegal?

You're replying to a teetotaler who had an alcoholic parent growing up, so I'm sure you can see where I'm going to go with that ;)

username332211 5 days ago | parent | prev [-]

Would the same hold for other forms of communication and information retrieval, or should only LLMs be perfect in that regard? If someone is persuaded to commit suicide by information found through a normal internet search, should Google/Bing/DDG be liable?

Do you believe a book should be suppressed and its author made liable if a few of its readers commit suicide because of what they've read? (And, before you ask, that's not a theoretical question. Books are well known to cause suicides, the first documented case being a 1774 novel by Goethe.)