▲ lelanthran 5 hours ago
If I understand your argument, you're saying that models can be deterministic, right? Care to point to any that are set up to be deterministic? Did you ever stop to think about why no one can get any use out of a model with temp set to zero?
▲ mrob 4 hours ago | parent
llama.cpp is deterministic when run with a specified PRNG seed, at least when running on CPU without caching. This is true regardless of temperature. But when people say "non-deterministic", they really mean something closer to "chaotic", i.e. the output can vary greatly with small changes to input, and there is no reliable way to predict when this will happen without running the full calculation. This is very different behavior from traditional compilers.
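A toy sketch of the point about seeding (hypothetical code, not llama.cpp itself): once the PRNG seed is fixed, temperature sampling over a fixed set of logits produces the exact same token sequence on every run, at any temperature.

```python
import math
import random

def sample_sequence(logits, temperature, seed, n=10):
    # Fixed seed -> identical random draws on every run.
    rng = random.Random(seed)
    # Softmax-style weighting; temperature only reshapes the distribution,
    # it does not introduce any non-determinism by itself.
    weights = [math.exp(l / temperature) for l in logits]
    return [rng.choices(range(len(logits)), weights=weights)[0]
            for _ in range(n)]

logits = [2.0, 1.0, 0.5, 0.1]
run_a = sample_sequence(logits, temperature=0.8, seed=42)
run_b = sample_sequence(logits, temperature=0.8, seed=42)
assert run_a == run_b  # same seed, same sequence, despite nonzero temperature
```

What this sketch can't show is the "chaotic" part: in a real model the logits themselves shift with every small change to the prompt (and, on GPU, with floating-point reduction order), which is where the practical unpredictability comes from.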
▲ cortesoft 5 hours ago | parent
No, LLMs ARE deterministic, just like all computer programs are. I get why that is in practice different than the manner in which compilers are deterministic, but my point is the difference isn't because of determinism.