zdragnar 4 days ago

It would still need to be regulated and licensed. There was this [0] I saw today about a guy who tried to replace sodium chloride in his diet with sodium bromide because ChatGPT said he could, and poisoned himself.

With a regulated license, there is someone to hold accountable for wantonly dangerous advice, much like there is with humans.

[0] https://x.com/AnnalsofIMCC/status/1953531705802797070

II2II 4 days ago | parent | next [-]

There are two different issues here. One is tied to how authoritative we view a source, and the other is tied to the weaknesses of the person receiving advice.

With respect to the former, I firmly believe that the existing LLMs should not be presented as a source of authoritative advice. Giving advice that is not authoritative is okay as long as the recipient realizes as much; it is something people have to deal with outside of the technological realm anyway. For example, if you ask a friend for help, you do so with the understanding that, as a friend, they are helping to the best of their ability. Yet you don't automatically assume they are right: either they do the footwork to ensure accuracy, or you check the accuracy of what they are telling you yourself. Likewise, you don't trust the advice of a stranger unless they are certified, and even that depends upon trust in the certifying body.

I think the problem with technology is that we assume it is a cure-all. While we may not automatically trust the results returned by a basic Google search, a search result with an authoritative-sounding name automatically sounds more accurate than one that is a blog post. (I'm not suggesting this is the only criterion people use. You are welcome to insert your own criteria in its place.) Our trust of LLMs, as they stand today, is even worse. Few people have developed criteria beyond: it is an LLM, so it must be trustworthy; or, it is an LLM, so it must not be trustworthy. And, to be fair, it is bloody difficult to develop criteria for the trustworthiness of LLMs (even arbitrary criteria) because they provide so few cues.

Then there's the bit about the person receiving the advice. There's not a huge amount we can do about that beyond encouraging people to regard the results from LLMs as stepping stones. That is to say, they should take the results and do research that will either confirm or refute them. But, of course, many people are lazy, and nobody has the expertise to analyze the output of an LLM outside of their personal experience and training.

terminalshort 4 days ago | parent | prev | next [-]

You cite one case for LLMs, but I can cite 250,000 cases a year of licensed doctors doing the same: https://pubmed.ncbi.nlm.nih.gov/28186008/. Bureaucracy doesn't work for anyone but the bureaucrats.

laserlight 4 days ago | parent | next [-]

Please show me one doctor who recommended taking a rock each day. LLMs have a different failure mode than professionals. People are aware that doctors or therapists may err, but I've already seen countless instances of people asking relationship advice from sycophantic LLMs and thinking that the advice is “unbiased”.

pmarreck 3 days ago | parent | next [-]

> LLMs have a different failure mode than professionals

That actually supports the use-case of collaboration, since the weaknesses of both humans and LLMs would potentially cancel each other out.

shmel 4 days ago | parent | prev | next [-]

Homeopathy is a good example. To an uneducated person it sounds convincing enough, and yes, there are doctors prescribing homeopathic pills. I am still fascinated that it exists.

fl0id 4 days ago | parent [-]

That’s actually an example of something different. And since it’s basically a placebo, it mostly only harms people’s wallets. That cannot be said for random LLM failure modes. And whether it can be prescribed by doctors depends very much on the country.

ivell 4 days ago | parent [-]

I don't think it is that harmless. Belief in homeopathy often delays patients from seeking timely intervention.

pmarreck 3 days ago | parent [-]

Yes. See: Steve Jobs, maybe.

terminalshort 4 days ago | parent | prev [-]

An LLM (or doctor) recommending that I take a rock can't hurt me. Screwing up in more reasonable-sounding ways is much more dangerous.

zdragnar 4 days ago | parent [-]

Actually, swallowing a rock will almost certainly cause problems. Telling your state medical board that your doctor told you to take a rock will have a wildly different outcome than telling a judge that you swallowed one because ChatGPT told you to do so.

Unless the judge has you examined and found to be incompetent, they're most likely to just tell you that you're an idiot and throw out the case.

terminalshort 4 days ago | parent [-]

They can't hurt me by telling me to do it because I won't.

pmarreck 3 days ago | parent | prev [-]

Wow. I had never heard this before.

nullc 4 days ago | parent | prev | next [-]

You don't need a "regulated license" to hold someone accountable for harm they caused you.

The reality is that professional licensing in the US often works to shield its communities from responsibility, though its primary function is just preventing competition.

oinfoalgo 4 days ago | parent | prev [-]

I would suspect at some point we will get models that are licensed.

Not tomorrow, but I just can't imagine this not happening in the next 20 years.