kbelder a day ago

I feel like the son should take the blame. There's never been any shortage of bad advice being passed around. He made the credulous decision to mix party drugs with alcohol, and I can't believe he'd never been told that's a stupid idea.

It's sad and I'm not heartless, but sometimes kids make bad decisions. It's not always somebody else's fault.

vablings a day ago | parent | next [-]

I agree. There are a few simple, hard-and-fast rules you can follow to be a safe drug user, and never mixing drugs is paramount. That's one thing I will always explain to my children: mixing drugs adds another layer of gambling on top of already being dose-unaware and purity-unaware.

ComplexSystems a day ago | parent | prev | next [-]

Surely there's room for the view that this is misaligned behavior for ChatGPT to have. I would guess this happened during the "sycophantic" phase last year.

stuaxo 14 hours ago | parent [-]

Even if it wasn't sycophantic, this sort of thing should be checked outside of an LLM. But how was he to know that?

stuaxo 14 hours ago | parent | prev | next [-]

I don't know: we brand LLMs as "artificial intelligence," as if it were the computer out of Star Trek.

On this website, most of us know what an LLM is and how it works, but I could well see a young person asking it for advice without knowing it could say the wrong thing.

novemp a day ago | parent | prev | next [-]

If it's the son's fault, then AI companies need to stop acting like their products are genius machines. Can't have it both ways.

heavyset_go a day ago | parent | prev [-]

If a real person gave them this advice, like a doctor or pharmacist, there would be standing for a lawsuit; it might even be criminal.

Looking past "drugs bad, mkay": the same ChatGPT that gave this advice is just as capable of giving the same, or worse, advice to someone wondering if they can take an allergy medication like Benadryl with their MAOI antidepressant.

array_key_first 16 hours ago | parent | next [-]

Yes, but if you're wondering about drug interactions, you shouldn't ask AI, because there's always a risk of hallucination. You should ask your pharmacist. You can just call them; I promise they won't reject a consult.

jurgenburgen 12 hours ago | parent [-]

How will kids bootstrap that information? If you ask LLM vendors, we're right around the corner from AGI and the mass replacement of human labor; surely their models would be better at telling us about drug interactions than mere human doctors?

spoiler a day ago | parent | prev [-]

Yes, but if chokemegently420 on some random subreddit gave them that advice, nobody would be the wiser. It's not like ChatGPT is a certified clinician.

heavyset_go a day ago | parent [-]

Why would they believe that, when AI is supposedly smarter than any human and is going to replace doctors, and them too?

If it isn't going to replace doctors, why is ChatGPT giving medical advice at all, especially deadly medical advice?