Johnny555 9 hours ago

>It sounds like you should never trust any medical advice you receive from ChatGPT and should seek proper medical help instead. That makes sense. The OpenAI company doesn't want to be held responsible for any medical advice that goes wrong.

While what you're saying is good advice, that's not what they are saying. They want people to be able to ask ChatGPT for medical advice, have it give answers that sound authoritative and well grounded in medical science, but then disavow any liability if someone follows that advice because "Hey, we told you not to act on our medical advice!"

If ChatGPT is so smart, why can't it stop itself from giving out advice that should not be trusted?

navigate8310 8 hours ago | parent [-]

At times the advice is genuinely helpful. However, it's practically impossible to determine under exactly which circumstances the advice will be accurate.

the_af 7 hours ago | parent [-]

I think ChatGPT is capable of giving reasonable medical advice, but given that we know it will hallucinate the most outlandish things, and that it has a propensity to agree with whatever the user says, I think it's simply too dangerous to follow its advice.