bonsai_spool 3 days ago

> Physicians need to have it pounded into them that every hallucination is downstream harm.

I think anyone using 'AI' knows it makes mistakes. Medical notes already contain errors today. A consumer of a medical note has to decide what makes sense and what to ignore, and AI isn't meaningfully changing that. If something matters, it gets asked again at follow-up.

theshackleford 2 days ago | parent [-]

> I think any person using 'AI' knows it makes mistakes.

You think wrong. I'm now regularly encountering people who argue "those days are behind us" and that hallucinations are "old news."