Terr_ 6 hours ago

LLMs aren't described as hallucinators (just) because they sometimes give results we don't find useful, but because the process by which they produce those results is flawed.

For example, the trivial algorithm is_it_lupus(){return false;} could have an extremely competitive success rate for medical diagnostics, simply because lupus is rare... But it's also obviously the wrong way to go about making a diagnosis.
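
A minimal sketch of that base-rate point (Python, with a made-up prevalence of roughly 1 in 2000, so the numbers are illustrative only): the always-no "diagnostic" scores ~99.95% accuracy on a synthetic cohort while catching zero actual cases.

    import random

    def is_it_lupus(patient):
        # The "always no" diagnostic: never even looks at the patient.
        return False

    # Hypothetical cohort; the 1-in-2000 prevalence is an assumption for illustration.
    random.seed(0)
    cohort = [{"has_lupus": random.random() < 1 / 2000} for _ in range(100_000)]

    correct = sum(is_it_lupus(p) == p["has_lupus"] for p in cohort)
    print(f"accuracy: {correct / len(cohort):.2%}")  # ~99.95%, yet zero true cases found

The high score comes entirely from the skewed base rate, not from the method doing anything right, which is the analogy to judging an LLM only by how often its output happens to be acceptable.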