So were you just saying that LLMs aren't AGI?
Or was there something more to it, specifically related to ambiguity/illusions?