That's an extremely exaggerated take. LLMs don't hallucinate nearly that much, and even when they do, it doesn't rule them out of every control loop: you can gate the model's output with deterministic validation and fall back to a safe default whenever the check fails (see the sketch below).
Frankly, I don't think you've put much thought into this theory.
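
To make the control-loop point concrete, here's a minimal sketch, not anyone's production setup: a hypothetical `llm_propose` call whose output goes through a deterministic validator before it ever reaches the actuator, with a plain proportional fallback when validation fails. All names and the toy plant model are illustrative assumptions.

```python
import json
import random

def llm_propose(state: dict) -> str:
    """Hypothetical LLM call. Here it just fakes a JSON reply, occasionally
    malformed, to stand in for a hallucinated output."""
    if random.random() < 0.1:
        return "sure! set the valve to eleventy percent"  # unusable hallucination
    cmd = 0.5 + 0.1 * (state["setpoint"] - state["temp"])
    return json.dumps({"valve": min(1.0, max(0.0, cmd))})

def validate(raw: str) -> float | None:
    """Deterministic gate: accept only a JSON object with a valve value in [0, 1]."""
    try:
        v = float(json.loads(raw)["valve"])
        return v if 0.0 <= v <= 1.0 else None
    except (ValueError, KeyError, TypeError):
        return None

def fallback(state: dict) -> float:
    """Safe default: a dumb proportional controller, no LLM involved."""
    return min(1.0, max(0.0, 0.5 + 0.05 * (state["setpoint"] - state["temp"])))

def step(state: dict) -> float:
    """One loop iteration: use the LLM's proposal only if it passes validation."""
    proposal = validate(llm_propose(state))
    return proposal if proposal is not None else fallback(state)

if __name__ == "__main__":
    state = {"temp": 18.0, "setpoint": 21.0}
    for _ in range(5):
        action = step(state)
        state["temp"] += 2.0 * (action - 0.4)  # toy plant model
        print(f"valve={action:.2f}  temp={state['temp']:.2f}")
```

With that structure, the hallucination rate bounds how often you drop to the fallback controller, not how often the system actually misbehaves.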