thewanderer1983 7 hours ago

I was diagnosed with a rare blood disease called Essential Thrombocythemia (ET), which is part of a group of diseases called myeloproliferative neoplasms. This happened about three years ago. Recently, I decided to get a second opinion, and my new specialist changed my diagnosis from ET to Polycythemia Vera (PV). She also highly recommended I quickly go and give blood to lower my haematocrit levels, as they put me at a much higher risk of a blood clot. This is standard practice for people with PV but not for people with ET. I decided to put the details into Google AI in the same way the original specialist had used to diagnose me. Google AI predicted I very likely had PV instead of ET. I also asked Google AI how one could misdiagnose my condition as ET instead of PV, and it correctly explained how. My first specialist had used my high platelet count, a blood test that came back with a JAK2 mutation, and then a bone marrow biopsy to incorrectly diagnose me with ET. My high hemoglobin levels should have been checked by my first specialist as an indication of PV, not ET. Only the second specialist picked up on this. Google AI took five seconds, and is free. The specialists cost $$$ and took weeks.

But yeah AI slop and all that...

boring-human 14 minutes ago | parent | next [-]

I think AI "slop" will improve medical diagnoses dramatically. Let's assume for a second that the first specialist did not graduate at the top of their class.

The year is 2030, when LLMs are more pervasive. The first specialist now asks you to wait, heads into the other room and double-checks their ET diagnosis with AI. Doing so has become standard practice to avoid malpractice suits. The model persuades them to diagnose PV, avoiding a Type-II error.

But let's say the model gets it wrong too. You eventually visit the second specialist, who did graduate at the top of their class. The model says ET, but the specialist is smart enough to tell that the model is wrong. There is some risk that the second specialist takes the CYA route, but I'd expect them not to. They diagnose PV, avoiding a Type-I error.

Aurornis 6 hours ago | parent | prev [-]

I’m glad you figured it out, but there are a lot of situations like this that look good with the benefit of hindsight.

I have some horror stories from a friend who started trusting ChatGPT over his doctors and declined rapidly. Be careful about accepting any one source as accurate.