I'd presume it could reason its way around the wrong answer, or at least realize something was off. Current LLMs will sometimes hallucinate having done exactly that while they're "thinking".