ChadNauseam 8 hours ago
So if you set temperature=0 and run the LLM serially (making it deterministic), it would stop hallucinating? I don't think so. I would guess that the nondeterminism issues mentioned in the article are not at all a primary cause of hallucinations.
joquarky 7 hours ago
I thought that temperature can never actually be zero, or it creates a division problem or something similar. I'm no ML or math expert, just repeating what I've heard.
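
(For context on the "division problem": here is a minimal sketch of how samplers commonly apply temperature, assuming standard softmax sampling over logits; the function name and structure are illustrative, not any particular library's API. Because temperature divides the logits, a literal 0 would divide by zero, which is why implementations typically treat temperature=0 as greedy argmax decoding instead.)

    import numpy as np

    def sample_token(logits, temperature=1.0, rng=None):
        rng = rng or np.random.default_rng()
        if temperature == 0.0:
            # "Temperature 0" is conventionally special-cased as greedy
            # decoding, avoiding the division by zero below.
            return int(np.argmax(logits))
        # Temperature scales the logits before softmax: lower values
        # sharpen the distribution, higher values flatten it.
        scaled = np.asarray(logits, dtype=np.float64) / temperature
        scaled -= scaled.max()  # subtract max for numerical stability
        probs = np.exp(scaled) / np.exp(scaled).sum()
        return int(rng.choice(len(probs), p=probs))
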