saghm 5 days ago

> so if you ask, "what is the capital of colorado" and it answers "denver" calling it a Hallucination is nihilistic nonsense that paves over actually stopping to try and understand important dynamics happening in the llm matrices

On the other hand, calling it anything other than a hallucination misrepresents truth as something these models can actually distinguish in their outputs, and it recasts a fundamentally unsolved problem as a mere engineering tradeoff.

ComplexSystems 5 days ago | parent [-]

A correct answer like that isn't a hallucination, because that isn't how the term is defined. "Hallucination" refers, very specifically, to "plausible but false statements generated by language models."

At the end of the day, the goal is to train models that can differentiate between true and false statements, at least to a much better degree than they do now, and the linked article has some very interesting suggestions about how to get them to do that.
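For concreteness, here is a minimal sketch (in Python) of the kind of scoring scheme that gets proposed for this: grade evaluations so that a confident wrong answer costs more than abstaining, which makes guessing a losing strategy unless the model's confidence exceeds a threshold. The grade function, the threshold t, and the exact penalty are illustrative assumptions, not taken verbatim from the article.

    # Hypothetical grading rubric: correct = +1, "I don't know" = 0,
    # wrong = -t / (1 - t). Guessing then only has positive expected
    # value when the model believes it is right with probability > t.
    def grade(answer, correct_answer, t=0.75):
        if answer is None:            # None models an explicit abstention
            return 0.0
        if answer == correct_answer:
            return 1.0
        return -t / (1 - t)

    print(grade("Denver", "Denver"))    # 1.0
    print(grade("Boulder", "Denver"))   # -3.0 with t = 0.75
    print(grade(None, "Denver"))        # 0.0

Under a rubric like this, a model that blurts out plausible guesses scores worse than one that admits uncertainty, which is the behavior the training signal is supposed to encourage.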

player1234 3 days ago | parent | next [-]

Why use a word that you have to redefine the meaning of? The answer is to deceive.

throwawaymaths 5 days ago | parent | prev [-]

your point is good and taken, but I would amend slightly -- I don't think "absolute truth" is itself the goal, but rather "how aware is it that it doesn't know something?" This negative space is frustratingly hard to capture in the LLM architecture (though there are almost certainly signs -- if you had direct access to the logits array, for example).
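As a rough illustration of what that access could look like -- a sketch assuming a Hugging Face causal LM, with the model name and prompt as placeholders -- you can read the next-token distribution and compute its entropy as a crude "does it know?" signal:

    # Token-level entropy as a rough proxy for model uncertainty.
    # Low entropy: probability mass concentrated on a few tokens (confident);
    # high entropy: mass spread out (the model "doesn't know").
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "gpt2"  # placeholder; any causal LM works
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer("The capital of Colorado is", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, seq_len, vocab_size)

    # Distribution over the next token after the prompt.
    probs = torch.softmax(logits[0, -1], dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum().item()
    top_prob, top_id = probs.max(dim=-1)

    print(f"top token: {tokenizer.decode(top_id.item())!r}  "
          f"p={top_prob.item():.3f}  entropy={entropy:.2f} nats")

Of course, low token-level entropy is no guarantee of factual accuracy -- models can be confidently wrong -- which is exactly why this negative space is so hard to pin down.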