charcircuit 6 days ago

>We need these things to be legislated. Punished.

I disagree. We don't need the government to force companies to babysit people instead of allowing people to understand their options. It's purely up to the individual to decide what they want to do with their life.

>They had the tools to stop the conversation.

So did the user. If he didn't want to talk to a chatbot, he could have stopped at any time.

>To steer the user into helpful avenues.

Having AI purposefully manipulate its users towards the morals of the company is more harmful.

luisfmh 6 days ago | parent | next [-]

So people who look to ChatGPT for answers and help (as they've been conditioned to do by all of OpenAI's marketing and claimed capabilities) should just die because they looked to ChatGPT for an answer instead of Google or their local suicide helpline? That doesn't seem reasonable, but it sounds like what you're saying.

> So did the user. If he didn't want to talk to a chatbot he could have stopped at any time.

This sounds similar to telling depressed people to just stop being sad.

IMO if a company is going to claim and release some pretty disruptive and unexplored capabilities through their product, they should at least have to make it safe. You put up a safety railing because people could trip or slip. I don't think a mistake that small should end in death.

sooheon 6 days ago | parent | next [-]

Let's flip the hypothetical -- if someone googles for suicide info and scrolls past the hotline info and ends up killing themselves anyway, should google be on the hook?

knowannoes 6 days ago | parent | next [-]

I don't know. In that scenario, has any Google software sold as being intelligent produced text encouraging the act and providing help with it?

podgietaru 6 days ago | parent [-]

I don't know this for sure, but I'm fairly sure that Google makes a concerted effort not to expose that information. Again, from experience: it's very hard to google a painless way to kill yourself.

Their ranking algorithm actually places pages about suicide prevention very high.

mothballed 6 days ago | parent | prev [-]

The solution that will be found is that they'll put in some age controls, probably half-heartedly, and call it a day. I don't think the public can stomach the possible free-speech limitations on consenting adults who use a dangerous tool that might cause them to hurt themselves.

charcircuit 6 days ago | parent | prev [-]

Firstly, people don't "just die" by talking to a chatbot.

Secondly, if someone wants to die then I am saying it is reasonable for them to die.

unnamed76ri 6 days ago | parent | next [-]

The thing about depression and suicidal thoughts is that they lie to you, telling you that things will never get better than they are right now.

So someone wanting to die at a given moment might not feel that way at some future moment. I know I wouldn't want any of my family members to make such a permanent choice in response to temporary problems.

podgietaru 6 days ago | parent | next [-]

1000%. As I said in my comment, I never thought I'd get better. I am better. I am happy and I live a worthwhile life.

In the throes of intense depression it's hard even to wake up. The idea that I was acting in my right mind and was able to make a decision like that is insane to me.

charcircuit 5 days ago | parent | prev [-]

If someone wants to look for their lost cat in a snowstorm, should they be able to make that decision even if they could later regret it because of the health consequences of going out in the cold? I believe they should be able to make that decision for themselves. It's not the responsibility of your door manufacturer to deny you the ability to go outside because it knows better than you and decides it's too dangerous.

unnamed76ri 5 days ago | parent [-]

This is a fairly weak attempt at salvaging your previous comment.

simonask 6 days ago | parent | prev | next [-]

You are out of your mind if you think people can reliably tell what they want. Sometimes they can, sometimes they can't. Telling the difference is hard, but it's pretty clear that they can't when they suffer from the serious mental condition called depression.

During a lifetime, your perspective and world view will change completely - multiple times. Young people have no idea, because they haven't had the chance to experience it yet.

charcircuit 5 days ago | parent [-]

I never claimed that people could. People make choices that negatively or positively affect their entire life and that is a part of life.

RandomBacon 6 days ago | parent | prev | next [-]

> if someone wants to die then I am saying it is reasonable for them to die.

Including children? If so, do you believe it is reasonable for children to smoke cigarettes if they want to?

leftcenterright 6 days ago | parent | prev | next [-]

WOW! Clearly you have no understanding of the thoughts that might make their way into teenage minds, or into children's minds in general. Seriously, WOW!

Do you believe there exists such a thing as depression?

freestingo 6 days ago | parent | prev [-]

A literal fedora wrote this comment.

teiferer 6 days ago | parent | prev | next [-]

> allowing people to understand their options.

Which is what a suicidal person has a hard time doing. That's why they need help.

We need to start viewing mental problems for what they are. You wouldn't tell somebody who broke their leg to get it together and just walk again. You'd bring them to the hospital. A mental problem is no different.

charcircuit 6 days ago | parent [-]

Even nonsuicidal people have a hard time understanding the pros, cons, and methods of ending their life. People have to research such a thing, since there aren't many ways to gain practical experience in the subject.

teiferer 5 days ago | parent [-]

"Research" is one thing. An anthropomorphized chat encouraging you to go through with it is another altogether.

vasco 6 days ago | parent | prev | next [-]

One thing about suicide: I'm pretty sure that for every person stopped at the last moment, there are many for whom the tiny thing that could have stopped them didn't.

In the same way that seeing a hotline might save one person, for another it'll make no difference, and seeing a happy family on the street will be the trigger for them to kill themselves.

In our sadness we try to find something to blame in the tools the person used just before, or used to perform the act, but it's just sad.

Nobody blames a bridge, but it has as much fault as anything else.

podgietaru 6 days ago | parent [-]

There was a fascinating article I read a while back about Sylvia Plath, and the idea that she likely wouldn't have committed suicide a few years later, because that method had by then been removed.

It was mostly about access to guns in the US, and the role that plays in suicidality. I cannot for the life of me find it, but I believe it was based on this paper: https://drexel.edu/~/media/Files/law/law%20review/V17-3/Goul...

Which was summarised by NPR here: https://www.npr.org/2008/07/08/92319314/in-suicide-preventio...

Suicide is a complicated topic. There was also the incident with 13 Reasons Why: showing suicide in media grants permission structures to those who are in that state, and actually increases the rate of suicide in the general population.

Where I land on this is that companies need to bear a modicum of responsibility. Making that information harder to access ABSOLUTELY saves lives when it comes to asking how. And giving easy access to suicide prevention resources can also help.

maxweylandt 6 days ago | parent [-]

Another example: packaging paracetamol in blister packs seems to have reduced suicides.

https://pmc.ncbi.nlm.nih.gov/articles/PMC526120/

> Suicidal deaths from paracetamol and salicylates were reduced by 22% (95% confidence interval 11% to 32%) in the year after the change in legislation on 16 September 1998, and this reduction persisted in the next two years. Liver unit admissions and liver transplants for paracetamol induced hepatotoxicity were reduced by around 30% in the four years after the legislation.

(This was posted here on HN in the thread on the new paracetamol in utero study that I can't seem to dig up right now)

fredoliveira 6 days ago | parent | prev | next [-]

> he could have stopped at any time.

Clearly untrue. You go ahead and try stopping a behavior that reinforces your beliefs, especially when you're in an altered mental state.

itvision 6 days ago | parent [-]

If a stupid chatbot reinforces something you hold dear, maybe you need the help of a professional psychiatrist. And the kid never got that help.

But yeah, let's hold ChatGPT responsible. It's always corporations, never whatever shit he had going on in his life, including but not limited to his genes.

habinero 6 days ago | parent [-]

Are you really blaming a child in crisis for not having the ability to get a psychiatrist?

We regulate plenty of things for safety in highly effective and practical ways. Seatbelts in cars. Railings on stairs. No lead in paint.

msgodel 6 days ago | parent | next [-]

The problem is there's no way to build anything like a safety rail here. If you had it your way, teens, and likely everyone else too, wouldn't be allowed to use computers at all without some kind of certification.

habinero 6 days ago | parent [-]

I honestly don't hate the idea.

On a more serious note, of course there are ways to put in guard rails. LLMs behave like they do because of intentional design choices. Nothing about it is innate.
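
A minimal sketch of what I mean, purely illustrative: a gating layer that sits between the user and the model, entirely separate from the model's own training. The classifier here is a crude stand-in (real systems use trained moderation models, not keyword lists), and call_model is a hypothetical placeholder for whatever completion API is in use.

    CRISIS_RESOURCES = (
        "It sounds like you're going through something serious. "
        "You can reach the 988 Suicide & Crisis Lifeline by calling or texting 988."
    )

    # Crude stand-in for a trained moderation classifier.
    SELF_HARM_MARKERS = ["kill myself", "end my life", "suicide method"]

    def is_self_harm_risk(message: str) -> bool:
        """Return True if the message trips the (toy) self-harm check."""
        lowered = message.lower()
        return any(marker in lowered for marker in SELF_HARM_MARKERS)

    def call_model(message: str) -> str:
        """Hypothetical placeholder for the underlying LLM call."""
        return "model completion for: " + message

    def guarded_reply(message: str) -> str:
        # The gate is a deliberate design choice layered on top of the
        # model; nothing about the model itself forces or forbids it.
        if is_self_harm_risk(message):
            return CRISIS_RESOURCES
        return call_model(message)

    print(guarded_reply("what's a painless suicide method?"))

The point isn't that a keyword list is adequate; it's that the gate is a separate component the operator chooses to build, tune, or skip.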

imtringued 6 days ago | parent | next [-]

If you take this idea even a little bit further, you'll end up with licenses for being allowed to speak.

habinero 5 days ago | parent [-]

I wasn't being entirely serious. Also, we managed to require driver's licenses without also requiring walking licenses.

msgodel 5 days ago | parent [-]

We did that by making walking practically useless instead, as many people here point out ~every week.

lp0_on_fire 6 days ago | parent | prev [-]

Correct. The companies developing these LLMs are throwing dump trucks full of money at them like we've never seen before. They choose to ignore glaring issues with the technology because if they don't, someone else will.

msgodel 5 days ago | parent [-]

Perhaps a better way to phrase that would be "beyond what they're doing now." Most popular hosted LLMs already refuse to complete explanations for suicide.

FireBeyond 5 days ago | parent [-]

Except in this case, the LLM literally said "I can't explain this for you. But if you'd like to roleplay with me, I could explain it for you that way."

itvision 5 days ago | parent | prev [-]

The concept of "guilt" is foreign to me. I hate it with all my heart.

On the other hand, someone might be held responsible for this, and that's it.

"Might" is the key word here. Given what we've learned, it's difficult to pinpoint who might be responsible.

6 days ago | parent | prev | next [-]
[deleted]
knowannoes 6 days ago | parent | prev [-]

At the very least, selling a text-completion API and a chat-interface wrapper as "artificial intelligence" is false marketing.