SubiculumCode 2 days ago

Humans also hallucinate. We have an error rate. Your argument makes little sense in absolutist terms.

bossyTeacher a day ago | parent

> Humans also hallucinate

"LLM hallucinations" and hallucinations are essentially different. Human hallucinations are related to perceptual experiences not memory errors like in the case of LLMs. Humans with certain neurological conditions hallucinate. Humans with healthy brains don't.

This habit of misapplying terms needs to stop. Humans are not backpropagation algorithms, nor whatever random concept you read about in a comp sci book.

SubiculumCode a day ago | parent

The more appropriate term is confabulate, and healthy humans do it all the time. I merely used the common but technically incorrect term for the phenomenon in LLMs. FYI, my PhD focused on human memory.