alexwebb2 13 hours ago
You're correct, OP used the word "hallucination" wrong. A lot of these other comments are missing the point – some deliberately ('don't they ONLY hallucinate, har har'), some not.

For those who genuinely don't know: a hallucination specifically means asserting a fact or inference as true (whether accurate or not!) when it isn't supported by the LLM's inputs.

- Ask for the capital of France, get "London" => hallucination
- Ask for the current weather in London, get "It's cold and rainy!", and that happens to be correct despite the model having no live weather data => hallucination
- Ask for the capital of DoesNotExistLand, get "DoesNotExistCity" => hallucination
- Ask it to give its best GUESS for the current weather in London, and it guesses "cold and rainy" => not a hallucination