rafterydj 3 days ago

"Given a detailed list of symptoms" is sure holding a lot of weight in that statement. There's way too much information that doctors tacitly understand from interactions with patients that you really cannot rely on those patients supplying in a "detailed list". Could it diagnose correctly, some of the time? Sure. But the false positive rate would be huge given LLMs suggestible nature. See the half dozen news stories covering AI induced psychosis for reference.

Regardless, its diagnostic capability is distinct from the dangers it presents, which is what the parent comment was pointing out.