eqvinox 3 hours ago
It matters because for medical questions, you [are supposed to] go to a medical professional, and those very much care about and make that distinction. Which is exactly the problem here; it "used to be" that reasonable people would disbelieve random things they found on the internet, at least to some degree. "Media literacy". LLMs don't seem to have that capability, and a good number of people are using LLMs in blissful ignorance of that fact. They very confidently exclaim things that make them sound like experts in the field in question. Would it have made a difference for the AI data center heat island thing you're quoting? Maybe not. But for medical matters? Most people wouldn't even have caught wind of this odd fake disease. LLMs just amplify it and serve it to everyone.
simianwords 2 hours ago | parent
I agree with you, and I don't think the companies have solved it. They should be more skeptical of medical articles in general and more conservative.

> Which is exactly the problem here; it "used to be" that reasonable people would disbelieve random things they found on the internet, at least to some degree. "Media literacy". LLMs don't seem to have that capability, and a good number of people are using LLMs in blissful ignorance of that fact.

I completely disagree with this part. LLMs absolutely have the ability to be skeptical, but skepticism comes at a cost. LLMs did what used to be a reasonable thing: trust articles published in reputable sources. But maybe they shouldn't do that; they should spend more time and processing power being skeptical.