Agree. I deeply suspect the problem of getting an LLM not to hallucinate is equivalent to the classic Halting Problem.