throw-qqqqq 3 days ago

> AI is neither deterministic nor chaotic. It is nondeterministic because it works based on probability

A deterministic function/algorithm always gives the same output given the same input.

LLMs are deterministic if you control all parameters, including the “temperature” and random “seed”. Same input (and params) -> same output.
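
For concreteness, here is roughly what pinning those parameters looks like with a locally hosted model. This is only a sketch using Hugging Face transformers; the gpt2 model and the prompt are just stand-ins:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    torch.manual_seed(0)  # fix the sampling RNG (only matters if do_sample=True)

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    inputs = tok("The capital of France is", return_tensors="pt")
    # do_sample=False is greedy decoding, i.e. the "temperature 0" case
    out = model.generate(**inputs, do_sample=False, max_new_tokens=8)
    print(tok.decode(out[0], skip_special_tokens=True))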

mejutoco 3 days ago | parent | next [-]

I thought this too, but it seems that is not the case. I couldn't remember the reason I'd seen for it, so I googled it (AI excerpt below).

Large Language Models (LLMs) are not perfectly deterministic even with temperature set to zero, due to factors like dynamic batching, floating-point variations, and internal model implementation details. While temperature zero makes the model choose the most probable token at each step, which is a greedy, "deterministic" strategy, these other technical factors introduce subtle, non-deterministic variations in the output.
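
A toy illustration of that (made-up numbers, not from the excerpt): greedy decoding takes the argmax of the logits, so two near-tied tokens can swap when batching or kernel choices nudge the values by a hair.

    import numpy as np

    logits = np.array([4.1000000, 4.0999999, 1.2])   # tokens 0 and 1 are almost tied
    perturbed = logits + np.array([0.0, 2e-7, 0.0])  # tiny numeric difference

    # "temperature 0" = pick the argmax, yet the winner flips: 0 vs 1
    print(np.argmax(logits), np.argmax(perturbed))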

Calavar 3 days ago | parent [-]

You were probably thinking about this piece on nondeterminism in attention by Thinking Machines: https://thinkingmachines.ai/blog/defeating-nondeterminism-in...

andai 3 days ago | parent [-]

If I understood correctly, the reason for this is that some floating-point operations are not associative?
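
E.g., a quick demonstration with plain Python floats (my own example, not from the linked post):

    # addition is commutative but not associative: the grouping changes the result
    a, b, c = 1e16, -1e16, 1.0
    print((a + b) + c)   # 1.0
    print(a + (b + c))   # 0.0 -- c vanishes when added to -1e16 first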

District5524 3 days ago | parent | prev | next [-]

Not that it's incorrect, but there is data showing variability even with the very same input and parameters, especially when we have no control over the model behind the API, its engineering optimizations, etc. See Berk Atil et al., "Non-Determinism of 'Deterministic' LLM Settings": https://arxiv.org/abs/2408.04667v5

viccis 3 days ago | parent | prev | next [-]

Ignoring that you are making an assumption about how the randomness is handled, this is a very vacuous definition of "deterministic" in the context of the discussion here, which is AI controlling large and complex systems. The fact that each inference can be repeated if and only if you know and control the seed and it is implemented with a simple PRNG matters much less to the conversation than the system's high-level behavior, which is nondeterministic in this application.

If your system is only deterministic if it processes its huge web of interconnected agentic prompts in exactly the same order, then its behavior is not deterministic in any sense that could ever be important in the context of predictable and repeatable system behavior. If I ask you whether it will handle the same task the same exact way, and its handling of it involves lots of concurrent calls that are never guaranteed to be ordered the same way, then you can't answer "yes".
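
A toy sketch of that last point (asyncio standing in for the agent calls; the names and timings are made up): even if each call is individually repeatable, anything that depends on which one finishes first is not.

    import asyncio, random

    async def agent(name):
        await asyncio.sleep(random.random() / 100)   # stand-in for network/LLM latency
        return name

    async def main():
        # take whichever "agent" answers first
        first_done = next(asyncio.as_completed([agent("A"), agent("B")]))
        print("took answer from", await first_done)

    asyncio.run(main())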

mbesto 3 days ago | parent | prev | next [-]

> LLMs are deterministic if you control all parameters, including the “temperature” and random “seed”.

This is not true. Even my LLM told me this isn't true: https://www.perplexity.ai/search/are-llms-deterministic-if-y...

cnnlives78 3 days ago | parent | prev [-]

The LLMs most of us are using have some element of randomness in every token selected, which is non-deterministic. You can try to corral that, but statistically, with enough iterations, it may produce nonsensical, unintentional, dangerous, or opposite solutions/answers/actions, even if you have system instructions defining otherwise and a series of LLMs checking each other. Be sure that you fully understand this. Even if you could make it fully deterministic, it would only be deterministic for a given model and state, and you'll surely be updating those. It amazes me how little people know about what they're using.
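
For reference, roughly how a token gets picked when sampling is on (a sketch with made-up logits, not any particular vendor's code): the model's scores are turned into a probability distribution and one token is drawn from it, so repeated runs can diverge.

    import numpy as np

    rng = np.random.default_rng()             # unseeded on purpose, like most hosted APIs
    logits = np.array([2.0, 1.5, 0.3])        # made-up scores for three candidate tokens
    temperature = 0.8

    probs = np.exp(logits / temperature)
    probs /= probs.sum()                      # softmax
    print(rng.choice(len(logits), p=probs))   # drawn token index; can differ run to run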