vidarh 5 days ago
This is meaningless. An LLM is also deterministic if configured to be so, and any library, framework, or module can be non-deterministic if built to be. It's not a distinguishing factor.
strix_varius 5 days ago | parent
That isn't how LLMs work. They are probabilistic. Running them even on different hardware yields different results, and those deltas compound the longer your context grows and the more tokens you generate (like when writing code). But more importantly, greedy decoding (always selecting the most likely token) traps the LLM in repetitive loops, reduces overall quality, and is infeasible at scale. There are reasons that literally no LLM you use runs deterministically.
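The distinction the comment draws, between always taking the argmax token and sampling from the output distribution, can be sketched in a few lines. This is a toy illustration over raw logits, not any particular model's API; the `temperature` parameter is the standard knob where lower values push sampling toward the greedy choice.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale logits by temperature, then normalize into probabilities.
    # As temperature -> 0, the distribution collapses toward the argmax.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def greedy_pick(logits):
    # Deterministic: always returns the index of the most likely token.
    return max(range(len(logits)), key=lambda i: logits[i])

def sampled_pick(logits, temperature=1.0, rng=random):
    # Probabilistic: draws a token index according to the softmax weights,
    # so repeated calls can return different tokens.
    probs = softmax(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

logits = [2.0, 1.0, 0.5]
print(greedy_pick(logits))          # always index 0
print(sampled_pick(logits, 1.0))    # usually 0, sometimes 1 or 2
```

Note that even the "deterministic" path is only deterministic on fixed hardware and kernels: floating-point reductions are non-associative, so the same logits computed on different GPUs can differ in the last bits, which is enough to flip an argmax on near-ties and then compound over a long generation.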
| ||||||||