01100011 · 6 hours ago
I wouldn't put much weight in this study, but I think a lot of us can still attest to the usefulness of LLMs in self-diagnosis. The reality in the US is that it's difficult to get a doctor's attention and care, so we're left doing it ourselves. Ten years ago you'd hear docs complaining about patients coming in with things they found on Google, but now I don't think there's an alternative. Case in point: I went to a podiatrist for foot and ankle issues. He diagnosed my foot issues from the X-ray but just shrugged his shoulders at the ankle issues and said the X-ray didn't show anything. My 15-minute allocation of his attention expired, and I left without a clue as to the cause or what corrective actions to take. Five minutes with an LLM and I had a plausible explanation for the ankle issues that aligned with the diagnosis for my foot.
guidedlight · an hour ago
I agree. I think the issue with LLMs is not the correct diagnoses but the incorrect ones. Real doctors tend to have a degree of caution. I would rather have a real doctor hesitate and seek more information than an alarmist LLM suggesting I have cancer.
NegativeK · 6 hours ago
I don't think that using LLMs for medicine is an appropriate fix for the US's healthcare issues. Unless healthcare businesses decide to improve patient care with AI instead of increasing patients seen per day, I think it's going to make things even worse.