amelius 5 days ago
Your comparison to hallucination is spot on. LLMs have shown the general public how AI can be plain wrong and shouldn't be trusted for everything. Maybe this will influence how they, and regulators, think about self-driving cars.
bbarnett 5 days ago | parent
Well, I wish this were true. But loads of devs on here will claim LLMs are infallible. And the general public?! No way. Most are completely unaware of the foibles of LLMs.