I suspect this is also partly because LLMs tend to have a kind of tunnel vision and lack broader situational awareness. Solving this is probably an important step toward AGI.