Panzerschrek 5 hours ago

I find the term "hallucination" very misleading. What LLMs produce is really "lies" or "misinformation". The term "hallucination" is so common nowadays only because the corporations developing LLMs prefer it to telling the truth: that their models are just huge machines for making things up. I am still wondering why there are no legal consequences for the authors of these LLMs because of that.

s-macke 4 hours ago

Author here. The discussion about this wording is actually the opening section of the article.

> Unfortunately, the term hallucination quickly stuck to this phenomenon — before any psychologist could object.

leobg 5 hours ago

“Confabulation” is the better term imho (literally: making things up). But I guess OpenAI et al. stuck with “hallucination” because it generalizes across text, audio, and image generation.