fainpul 8 hours ago
True, but people can use qualifier words like "I think …" or "Wasn't there this thing …", which let you judge their certainty about the answer. LLMs are always super confident and tell you how it is. Period. You would soon stop asking a coworker who repeatedly behaved like that.
illuminator83 7 hours ago | parent
Yeah, for the most part. But I've even had a few instances where someone was very sure about something and still wrong. Usually not about APIs, but rather about stuff that is more work to verify or not quite as timeless: cache optimization issues, or even the suitability of certain algorithms for some problems. The world changes a lot, and sometimes people don't notice and stick to what was state of the art a decade ago. But I think the point of the article is that you should have measures in place which make hallucinations not matter, because they will be noticed in CI and tests.
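For example (a minimal sketch of my own, not something from the article, with a made-up slugify() function): even a trivial unit test run in CI will surface a hallucinated API, because the bogus call simply fails when the test executes.

    # slugify() is a hypothetical function under test; the point is only that
    # a hallucinated call has nowhere to hide once a test exercises the code.
    def slugify(title: str) -> str:
        return "-".join(title.lower().split())

    def test_slugify():
        # If an assistant had rewritten slugify() to call a non-existent
        # str.to_slug() method, this test would raise AttributeError in CI
        # long before the change reached review.
        assert slugify("Hello World") == "hello-world"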