vrighter 5 days ago:
If you insist that they are different, then please find one logical, non-subjective way to distinguish between a hallucination and not-a-hallucination. Looking at the output and deciding "this is clearly wrong" does not count. No vibes.
esafak 5 days ago:
> Looking at the output and deciding "this is clearly wrong" does not count.

You need the ground truth to make that determination, so using your own knowledge does count. If you press the model to answer even when it does not know, you get confabulation. What today's models lack is the ability to measure their own confidence, so that they know when to abstain.
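A toy sketch of what "knowing when to abstain" could look like, assuming access to per-token log-probabilities (which most inference APIs expose). The threshold, function name, and example values are all hypothetical, and a real system would calibrate the threshold on held-out data rather than hard-code it:

    import math

    # Hypothetical minimum average per-token probability below which we abstain.
    ABSTAIN_THRESHOLD = 0.5

    def should_abstain(token_logprobs: list[float]) -> bool:
        """Abstain when the model's average per-token confidence is low.

        This is the crude version of "measuring confidence"; more robust
        signals exist, e.g. self-consistency across multiple samples.
        """
        if not token_logprobs:
            return True  # no output at all: abstain
        avg_prob = math.exp(sum(token_logprobs) / len(token_logprobs))
        return avg_prob < ABSTAIN_THRESHOLD

    # Illustrative values: a confident answer vs. a low-probability one.
    confident = [-0.05, -0.10, -0.02]   # avg prob ~0.94 -> answer
    uncertain = [-1.20, -2.30, -0.90]   # avg prob ~0.23 -> abstain
    print(should_abstain(confident))    # False
    print(should_abstain(uncertain))    # True

The catch, and arguably the grandparent's point, is that raw token probabilities measure fluency, not truth, which is why calibration is the hard part.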