tracerbulletx 2 days ago
Ugh, you just fancy auto-completed a sequence of electrical signals from your eyes into a sequence of nerve impulses in your fingers to say that, and how do I know you're not hallucinating? Last week a different human told me an incorrect fact and they were totally convinced they were right!
adamredwoods 2 days ago
Humans base their "facts" on consensus-driven education and knowledge. Anything that falls into the range of "I think this is true", "I read this somewhere", or "I have a hunch" is more acceptable coming from a human than from an LLM. Humans also more often hedge their uncertain answers with phrasing. LLMs can't do this, because they don't have a way to track which of their answers might be incorrect.
deadbabe 2 days ago
The human believes it was right. The LLM doesn’t believe it was right or wrong. It doesn’t believe anything any more than a mathematical function believes 2+2=4.