If LLMs work less like people and more like mathematical models, why would I expect to be able to carry over intuition from the former rather than the latter?