fn-mote, 18 hours ago:
The title is misleading. This isn't the correct use of the term "hallucination". Hallucination refers to making up facts, not extrapolating into the future. I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).
oriettaxx, 13 hours ago:
> I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).

omg, the same for me, I was halfway through telling my colleague about the 100% Rust kernel...
jrm4, 18 hours ago:
You're right, this is how people are PRESENTLY using the term "hallucination", but to me this illustrates the deeper truth about that term and that concept: as many have said, but it still bears repeating -- they're always hallucinating. I'm of the opinion that it's a huge mistake to use "hallucination" to mean "the opposite of getting it right." It's just not that. They're doing the same thing either way.
rrr_oh_man, 18 hours ago:
Don't LLMs only ever hallucinate?
alexwebb2, 13 hours ago:
You're correct, OP used the word "hallucination" wrong. A lot of these other comments are missing the point – some deliberately ('don't they ONLY hallucinate, har har'), some not.

For those who genuinely don't know – hallucination specifically means a false positive identification of a fact or inference (accurate or not!) that isn't supported by the LLM's inputs.

- Ask for the capital of France, get "London" => hallucination
- Ask for the current weather in London, get "It's cold and rainy!", and that happens to be correct despite the model not having live weather data => hallucination
- Ask for the capital of DoesNotExistLand, get "DoesNotExistCity" => hallucination
- Ask it to give its best GUESS for the current weather in London, and it guesses "cold and rainy" => not a hallucination
madeofpalk, 18 hours ago:
It's apt, because the only thing LLMs do is hallucinate, since they have no grounding in reality. They take your input and hallucinate to do something "useful" with it.
adastra22, 18 hours ago:
There is no technical difference.
hombre_fatal, 18 hours ago:
Extrapolation is a subset of hallucination. The ubiquitous use of "hallucination" that I see is merely "something the LLM made up".