dghlsakjg 5 hours ago

I would argue that the ED is the least similar to code. You have the most unknowns, unreliable data and history, non-deterministic options, and time constraints.

ER staff are frequently making inferences based on a variety of things: the weather, what the patient is wearing, what smells are present, and a whole lot of other intangibles. Frequently the patients are just outright lying to the doctor. An AI will not pick up on any of that.

TurdF3rguson 4 hours ago | parent [-]

> An AI will not pick up on any of that.

It will if it trains on data like that. It's all about the training data.

n8henrie 4 hours ago | parent | next [-]

Unfortunately the training data is absolute garbage.

Diagnostic standards in (at least emergency, but I think other specialties) medicine are largely a joke -- ultimately it's often either autopsy or "expert consensus."

We get to bill more for more serious diagnoses. The number of patients I see with a "stroke" or "heart attack" diagnosis who clearly had no such thing is truly wild.

We can be sued for tens of millions of dollars for missing a serious diagnosis, even if we know an alternative explanation is more likely.

If AI is able to beat an average doctor, it will be due to alleviating perverse incentives. But I can't imagine where we could get training data that would let it be any less of a fountain of garbage than many doctors.

Without a large amount of good training data, how could AI possibly be good at doctoring IRL?

TurdF3rguson an hour ago | parent [-]

You just get 1M doctors to wear body cams for a year. Now you have a model that has thousands of times your experience with patients, encyclopedic knowledge of every ailment (including ones that never present in your geography), has read all the latest papers, etc.

I don't understand how you think this doesn't win vs a human doctor.

xarope 20 minutes ago | parent [-]

In healthcare, HIPAA/GDPR equivalents would block this. Let's be realistic in our discussion; this is not the same as Google buying up a library's worth of books, scanning them, and destroying them.

mrbungie 4 hours ago | parent | prev [-]

The user will be adversarial and will probably learn new ways to trick the machine; this is not solvable (only) via training data.

bonesss 37 minutes ago | parent [-]

We have that expression: "garbage in, garbage out."

My sense is that doctors and AI would both be doing a lot better if they were just doing medicine, not serving as a contact surface for failures of housing, mental health and addiction services, and other social systems. Drug seeking and the rest should be non-issues, but drug seekers are informed and adaptive adversaries.