solid_fuel 4 hours ago

> True, but no more true than it is if you replace the antecedent with "people".

Incorrect. People are capable of learning by observation, introspection, and reasoning. LLMs can only be trained by rote example. Hallucinations are, in fact, an unavoidable property of the technology - something which is not true for people. [0]
TheOtherHobbes 4 hours ago

The suggestion that hallucinations are avoidable in humans is quite a bold claim.
CamperBob2 4 hours ago

What you (and the authors) call "hallucination," other people call "imagination." Also, you don't know very many people, including yourself, if you think that confabulation and self-deception aren't integral parts of our core psychological makeup. LLMs work so well because they inherit not just our logical thinking patterns, but our faults and fallacies.