pton_xd 2 days ago

> Saying "I don't know" is sort of like an error message. Clear error messages make systems easier to use.

"I don't know" is not a good error message. "Here's what I know: ..." and "here's why I'm not confident about the answer ..." would be a helpful error message.

Then the question is: when it says "here's what I know, and here's why I'm not confident," is it telling the truth, or is that just another layer of hallucination? If the latter, you're back to square one.
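The "here's what I know / here's why I'm not confident" format could be sketched as a structured response. This is a minimal illustration with a hypothetical `HedgedAnswer` schema (not any real chatbot API), and note that the `confidence` field is still model-reported, which is exactly the problem:

```python
from dataclasses import dataclass, field

@dataclass
class HedgedAnswer:
    # Hypothetical schema for illustration, not any real API.
    answer: str
    confidence: float                 # model-reported, 0.0-1.0; may itself be hallucinated
    caveats: list = field(default_factory=list)

    def render(self) -> str:
        lines = [f"Here's what I know: {self.answer}"]
        if self.confidence < 0.8:     # arbitrary threshold for illustration
            lines.append("Here's why I'm not confident:")
            lines.extend(f"- {c}" for c in self.caveats)
        return "\n".join(lines)

msg = HedgedAnswer(
    answer="The rate limit is documented as 100 requests/min.",
    confidence=0.4,
    caveats=["training data may be outdated", "no source document to cite"],
)
print(msg.render())
```

The structure is easy; the hard part is the thread's point: the confidence value and caveats come from the same model, so they can be fabricated just like the answer.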

skybrian 2 days ago | parent [-]

Yeah, AI chatbots are notoriously bad at understanding their own limitations. I wonder how that could be fixed?

Terretta 18 hours ago | parent [-]

Easier to think about if LLM means Large Limitations Mirror.