Determinism is a red herring here. The problem is that LLMs are inductive systems, not deductive ones: they generalize from patterns in their training data rather than derive conclusions from fixed rules. This makes them powerfully general yet inherently unreliable.