nemo1618 2 hours ago

LLMs are not inherently non-deterministic. This is a common misconception. You used to be able to set temp=0 and a fixed seed and get the same output every time. This broke when labs started implementing batching, and no one bothered fixing it because the benefits of batching vastly outweighed the demand for deterministic output.
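A minimal sketch of why batching breaks bitwise reproducibility (my own illustration, not from the thread): floating-point addition is not associative, so a kernel that changes its reduction order depending on batch size can return bitwise-different results for the same input.

```python
import numpy as np

# Same data, two reduction orders. A batched kernel may switch between
# strategies like these depending on batch shape, so the "same" forward
# pass need not be bitwise identical run to run.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000).astype(np.float32)

seq = np.float32(0.0)
for v in x:                 # strict left-to-right accumulation
    seq += v

blocked = x.reshape(-1, 2).sum(axis=1).sum()  # pairwise/blocked accumulation

# The two sums agree to float32 precision but typically differ in the
# low-order bits -- enough to flip an argmax at a near-tie and send
# greedy decoding down a different path.
print(float(seq), float(blocked), float(seq) - float(blocked))
```

A "batch-invariant" kernel fixes the reduction order regardless of batch size, trading some throughput for bitwise-identical outputs.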

I am hopeful deterministic output will return, though; DeepSeek v4 claims to have implemented "bitwise batch-invariant and deterministic kernels," though I haven't tested it myself.

sroussey an hour ago | parent | next [-]

Thinking Machines Lab uses batch-invariant kernels, btw.

iLoveOncall an hour ago | parent | prev [-]

> LLMs are not inherently non-deterministic.

Reproducible does not mean deterministic. You cannot determine in advance what output a prompt will produce, even with a temperature of 0 and a fixed seed; therefore they are not deterministic.