stevekemp · 21 hours ago:

That sounds like the kind of hallucinated statement you might expect from ChatGPT. Which doctors, in which countries, are using LLMs to treat patients?
gordian-mind · 11 minutes ago:

My experience with ChatGPT is that it rarely dares to make short, generalizing, opinionated statements without an excruciating amount of hedging.

Doctors pay subscriptions for specialized software that relies on LLMs enriched with medical context. But like other professionals, they also use ChatGPT as a search engine and verify what it tells them by virtue of being, well, doctors.
lurking_swe · 18 hours ago:

I'm not the person you replied to, but a quick Google search is just as much effort on your part as replying with a sassy "this sounds like a hallucination". A low-value comment, in my opinion.

I found this: https://www.who.int/europe/news/item/19-11-2025-is-your-doct...

Quote:

> "AI is already a reality for millions of health workers and patients across the European Region," said Dr Hans Henri P. Kluge, WHO Regional Director for Europe. "But without clear strategies, data privacy, legal guardrails and investment in AI literacy, we risk deepening inequities rather than reducing them."