SubiculumCode 2 days ago
Humans also hallucinate. We have an error rate too, so an argument framed in absolutist terms makes little sense.
bossyTeacher a day ago
> Humans also hallucinate

"LLM hallucinations" and human hallucinations are essentially different things. Human hallucinations are perceptual experiences, not memory or recall errors like the ones LLMs produce. Humans with certain neurological conditions hallucinate; humans with healthy brains don't. This habit of misapplying terms needs to stop. Humans are not backpropagation algorithms, nor whatever other concept you picked up from a comp sci book.
| ||||||||