juancn 5 days ago
This is fluff. Hallucinations are not avoidable with current models: they are part of the latent space the model defines and the way we explore it, so you'll always find some. Inference is kinda like doing energy minimization over a high-dimensional space; the hallucinations are already there in the landscape, and for some inputs you're bound to land on them.
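A toy sketch of the energy-landscape analogy (entirely illustrative, assuming nothing about any real model: the 1-D "energy" function and starting points below are made up). Gradient descent can only find the nearest basin, so different inputs inevitably settle into different minima, some of them spurious:

    # Toy illustration: gradient descent on a bumpy 1-D "energy" landscape.
    # Depending on the starting point (the "input"), descent settles into
    # different basins -- by analogy, some inputs land in spurious minima
    # ("hallucinations") that are baked into the landscape itself.
    import numpy as np

    def energy(x):
        # A global quadratic trend plus oscillations gives several
        # local minima of varying quality.
        return 0.05 * x**2 + np.sin(3 * x)

    def grad(x, eps=1e-5):
        # Numerical gradient of the energy.
        return (energy(x + eps) - energy(x - eps)) / (2 * eps)

    def descend(x0, lr=0.01, steps=2000):
        # Plain gradient descent; it can only reach the nearest basin.
        x = x0
        for _ in range(steps):
            x -= lr * grad(x)
        return x

    # Different "inputs" (initializations) end up in different minima.
    for x0 in (-4.0, -1.0, 0.5, 3.0):
        xm = descend(x0)
        print(f"start {x0:+.1f} -> minimum at x = {xm:+.3f}, "
              f"energy = {energy(xm):+.3f}")

Running it shows the four starts converging to distinct local minima with different energies: the bad minima were always there, and which one you get depends only on where you start.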
kdnvk 5 days ago | parent
Did you read the linked paper?