burnte 3 days ago

I agree with the folks who call these screwups rather than hallucinations, because the point of LLMs isn't to be right; it's to produce output that is statistically highly likely. If making something up fits that model, then that's what the model will do. That's literally how it works.