Wowfunhappy a day ago

If someone published a book advising people to take drugs, would people be filing lawsuits? No—we would agree that people are allowed to write whatever they want, even if what they say is terrible, right?

I really think these criticisms are misguided. I realize an LLM is not a person—but it does still represent speech, and certainly, any guardrails put in place would themselves be human-authored speech. There are all sorts of social norms which I personally believe, but which I don’t want AI companies to be enforcing on everyone.

Imagine if ChatGPT had launched 50 years ago, before LGBT acceptance was mainstream. If ChatGPT had told users “it’s okay that you’re a boy and you like other boys, pursue your instincts”, people would have been screaming from the hills that ChatGPT was turning their children gay. They might have tried filing lawsuits. Do we really want to allow that?

OneDeuxTriSeiGo a day ago | parent | next [-]

> If someone published a book advising people to take drugs, would people be filing lawsuits? No—we would agree that people are allowed to write whatever they want, even if what they say is terrible, right?

That's not the situation here. The more accurate case would be:

> If someone without a medical license provided blatantly incorrect medical advice with respect to safe medication usage to an individual via a direct one-on-one discussion, would people be filing lawsuits?

And the answer is yes. You can be wrong and you can say incorrect things. What you can't do is provide medical advice unless you are a licensed medical professional. You can still speak about medical topics but you have to disclaim your lack of licensure. You have to make it clear that you are not providing medical advice.

If this was a person doing this it'd be a crime, clear as day. It's called "practicing medicine without a license" and in the US it is a criminal offense in all 50 states, Washington DC, and all 5 inhabited territories. Whether it is a misdemeanor or a felony is dependent on the jurisdiction and the case but it's a crime everywhere in the US.

Wowfunhappy a day ago | parent [-]

But ChatGPT doesn’t claim to have a medical license! You can give people whatever terrible medical advice you want—and people absolutely do—you just can’t claim to be a doctor!

OneDeuxTriSeiGo a day ago | parent [-]

> You can give people whatever terrible medical advice you want, you just can’t claim to be a doctor!

Fun fact: this is still practicing medicine without a license. You are just less likely to have someone come after you for it.

If you present yourself in a way that could be misconstrued as medical expertise, and you give medical advice, then you are practicing medicine even if you never explicitly claim to be a medical expert.

This is why you see the "This is not to be taken as medical advice"/"I am not a medical professional" verbal condoms all over the place WRT medical discussions. You see the same thing with IANAL for the legal profession as well.

Wowfunhappy a day ago | parent [-]

I don’t think it’s reasonable to interpret the output of ChatGPT as medical advice. Maybe once ChatGPT Health launches, but not now.

But, that’s not a hill I want to die on. If your position is that ChatGPT needs to have disclaimer text somewhere in the UI saying “ChatGPT is not a doctor and cannot provide medical advice”, I don’t disagree.

I just don’t think it would make a difference, because as I said, I don’t think anyone reasonably thinks that ChatGPT is a licensed doctor. They just choose to believe ChatGPT anyway, which is their choice in a free society.

UncleMeat a day ago | parent | prev | next [-]

The issue was not "you should take drugs."

The thing that killed this person was being advised to take xanax while having a lot of kratom and alcohol in their system. And yeah, if you published a book telling people that xanax is a great treatment for alcohol induced nausea and people died following this advice you should go to prison.

alexk307 a day ago | parent | prev | next [-]

There's a bit of a difference between "enforcing social norms" and telling a user to ingest prescription drugs to combat nausea from the other drugs that it told the user to take.

Yes, you should be able to write a book with this same information. No, you should not be able to release software that instructs its users to harm themselves. LLMs aren't people, and you shouldn't anthropomorphize human rights onto them.

tibbydudeza a day ago | parent | prev | next [-]

Agreed - people should learn that ChatGPT does not give good advice. But the question is: did OpenAI advertise ChatGPT as a good and reliable source of information on health?

bluefirebrand a day ago | parent | next [-]

OpenAI has advertised ChatGPT as a good and reliable source of information on everything

oompydoompy74 a day ago | parent | prev | next [-]

Sure as shit looks like it to me https://openai.com/index/introducing-chatgpt-health/

polski-g 8 hours ago | parent | next [-]

It's fine if they offer that. But states need to make them liable for damages if it causes harm. The First Amendment protects them from criminal liability, but not civil. A ToS should not be able to shield them from civil liability either.

anon291 a day ago | parent | prev [-]

To the contrary... this is a specific product which is on a waitlist. Normal ChatGPT is not for health advice.

justinclift 21 hours ago | parent [-]

The front-and-center, most-of-page video on that page literally has this in big-arse font displayed at page load time:

> Every day, millions of people ask ChatGPT for support with their health

Second paragraph of the main page text then says:

> Health is already one of the most common ways people use ChatGPT, with hundreds of millions of people asking health and wellness questions each week

Clearly ChatGPT is positioned for providing health advice in their main, non-specialised product, not just in their dedicated "Health" product.

Wowfunhappy 21 hours ago | parent [-]

Millions of people do this thing ≠ we recommend that people do this thing

zephen 20 hours ago | parent [-]

Millions of people do this thing that we know about and brag about and don't discourage because it makes us a metric shitton of money == ringing fucking endorsement, fuck yeah!

zephen a day ago | parent | prev [-]

It sure as shit wasn't advertised as a new-age alternative to The Onion.

sdwr a day ago | parent | prev [-]

There's soft guardrails for "reputable" content. A publishing house has to buy it, stores have to agree to distribute it, and if people are upset they can raise a stink and get the book pulled.

Technically, people can write whatever they want, but practically you can't walk into a bookstore and read whatever you want.

Wowfunhappy a day ago | parent [-]

You can go on the internet and read whatever you want.