pkaye 16 hours ago

There is a parameter in LLMs called temperature that controls creativity/randomness. If you set it to 0, it makes the model deterministic. I think some LLM APIs expose this as a tunable parameter.
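A minimal sketch of what temperature does during sampling (an illustration, not any vendor's actual implementation): logits are divided by the temperature before the softmax, so low temperatures sharpen the distribution toward the top token, and temperature 0 is conventionally treated as greedy argmax, which is what makes the output repeatable:

```python
import math
import random

def sample(logits, temperature, rng):
    """Sample a token index from logits at the given temperature."""
    if temperature == 0:
        # Common convention: temperature 0 means greedy argmax.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale logits by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the resulting categorical distribution.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(logits) - 1

rng = random.Random(0)
logits = [2.0, 1.0, 0.5]
# Temperature 0: always the same (highest-logit) token.
print([sample(logits, 0, rng) for _ in range(5)])
# Temperature 1: the same logits yield varying tokens.
print([sample(logits, 1.0, rng) for _ in range(5)])
```

At temperature 0 the rng is never consulted, so repeated calls return the same index; raising the temperature flattens the distribution and reintroduces randomness.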

muwtyhg 16 hours ago | parent | next [-]

The study used a temperature of 0.01.

> "Thirteen food photographs were each submitted 495–561 times to four LLM vision APIs (GPT-5.4, Claude Sonnet 4.6, Gemini 2.5 Pro, Gemini 3.1 Pro Preview) using an identical structured prompt adapted from the iAPS automated insulin delivery system (26,904 total queries, temperature 0.01)"

jihadjihad 15 hours ago | parent | prev [-]

> If you set it to 0 it makes the model deterministic.

No, it doesn't. It can help make the model more deterministic, but it does not guarantee it.

azakai 14 hours ago | parent [-]

The hardware can also add nondeterminism. GPUs may reorder floating-point operations between runs, and because floating-point addition is not associative, the same sum computed in a different order can give a different result.
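A concrete illustration of the non-associativity behind this: grouping the same three additions differently changes the result in IEEE 754 doubles.

```python
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # sum left-to-right
right = a + (b + c)  # same numbers, different grouping

print(left)   # 0.6000000000000001
print(right)  # 0.6
assert left != right
```

A GPU reduction that splits a sum across threads in a different order each run hits exactly this effect, so even with identical inputs the accumulated logits can differ slightly between runs, occasionally flipping which token has the highest score.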

Vendors might also be running A/B testing or who knows what, even when you ask for a temperature of 0.

But if you run a fixed model with temperature 0 on your local CPU, it will be deterministic (unless there are bugs).