xlerb 19 hours ago
Humans don’t have an internal notion of “fact” or “truth.” They generate statistically plausible text. Reliability comes from scaffolding: retrieval, tools, validation layers. Without that, fluency can masquerade as authority. The interesting question isn’t whether they’re coworkers or exoskeletons. It’s whether we’re mistaking rhetoric for epistemology.
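To make "scaffolding" concrete, here is a minimal sketch of the pattern (every name here is a hypothetical stand-in, not any real library or API): the generator's fluent draft is only trusted after a retrieval step and a validation check.

    # Hypothetical sketch of a validation layer around a fluent generator.
    # generate/retrieve/validate are stand-ins for a model call, a document
    # search, and a fact-consistency check; none of this is a real API.
    def answer_with_scaffolding(question, generate, retrieve, validate):
        sources = retrieve(question)           # ground the answer in documents
        draft = generate(question, sources)    # fluent, but not yet trusted
        ok, issues = validate(draft, sources)  # check claims against sources
        if not ok:
            return "Unverified draft; issues: " + ", ".join(issues)
        return draft

    # Toy usage with stub components:
    print(answer_with_scaffolding(
        "Who wrote Dune?",
        generate=lambda q, docs: "Frank Herbert wrote Dune.",
        retrieve=lambda q: ["Dune (1965) is a novel by Frank Herbert."],
        validate=lambda draft, docs: (True, []),
    ))

The point of the pattern is that whatever reliability exists lives in retrieve and validate, not in the generator's fluency.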
whyenot 19 hours ago
> LLMs aren’t built around truth as a first-class primitive.

Neither are humans.

> They optimize for next-token probability and human approval, not factual verification.

While there are outliers, most humans also tend to tell people what they want to hear and to fit in.

> factuality is emergent and contingent, not enforced by architecture.

Like humans; as far as we know, there is no "factuality" gene, and we lie to ourselves, to others, in politics, in scientific papers, to our partners, etc.

> If we’re going to treat them as coworkers or exoskeletons, we should be clear about that distinction.

I don't see the distinction. Humans exhibit many of the same behaviours.
kiba 19 hours ago
A much more useful tool is a technology that checks for our blind spots and bugs: for example, fact-checking a news article and making sure what gets reported lines up with base reality. I once fact-checked a virology lecture and found that the professor had confused two brothers for a single individual. I'm sure the professor has a super solid grasp of how viruses work, but errors like these probably creep in all the time.
emp17344 18 hours ago
Ethical realists would disagree with you.
AlexandrB 40 minutes ago
> Humans don’t have an internal notion of “fact” or “truth.” They generate statistically plausible text.

This doesn't jibe with reality at all. Language is a relatively recent invention, yet somehow Homo sapiens were able to survive in the world and even use tools before the appearance of language. You're saying they did this without an internal notion of "fact" or "truth"?

I hate the trend of downplaying human capabilities to make the wild promises of AI more plausible.