pmarreck 3 days ago:

> LLMs have a different failure mode than professionals

That actually supports the use case of collaboration, since the weaknesses of humans and LLMs could potentially cancel each other out.
|
shmel 4 days ago:

Homeopathy is a good example. To an uneducated person it sounds convincing enough, and yes, there are doctors who prescribe homeopathic pills. I'm still fascinated that it exists.
fl0id 4 days ago:

That's actually an example of something different. Since it's basically a placebo, it mostly only harms people's wallets. The same can't be said for random LLM failure modes. And whether doctors can prescribe it depends very much on the country.
ivell 4 days ago:

I don't think it's that harmless. Belief in homeopathy often delays patients from seeking timely intervention.
|
|
|
terminalshort 4 days ago:

An LLM (or doctor) recommending that I take a rock can't hurt me. Screwing up in more plausible-sounding ways is much more dangerous.
zdragnar 4 days ago:

Actually, swallowing a rock will almost certainly cause problems.

Telling your state medical board that your doctor told you to take a rock will have a wildly different outcome than telling a judge that you swallowed one because ChatGPT told you to. Unless the judge has you examined and found to be incompetent, they're most likely to just tell you that you're an idiot and throw out the case.