hnlmorg 7 hours ago

Have you actually tried high temperature values for coding? Because I don’t think it’s going to do what you claim it will.

LLMs don’t “reason” the way humans do; they predict text based on statistical likelihood. So raising the temperature is more likely to produce unexecutable pseudocode than a valid but more esoteric implementation of the problem.
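
For reference, temperature only rescales the model's next-token distribution before sampling; it adds no notion of "valid but creative". A minimal sketch in Python, with hypothetical logits:

    import numpy as np

    def sample_token(logits, temperature=1.0, rng=None):
        # Temperature rescales logits before softmax: <1 sharpens the
        # distribution toward the top token, >1 flattens it so unlikely
        # (often incoherent) continuations get sampled more often.
        rng = rng or np.random.default_rng()
        scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
        scaled -= scaled.max()          # subtract max for numerical stability
        probs = np.exp(scaled)
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)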

Terr_ 6 hours ago | parent | next [-]

To put it another way, a high-temperature mad-libs machine will write a very unusual story, but that isn't necessarily the same as a clever story.

balamatom 4 hours ago | parent [-]

So why is this "temperature" not on, like, a rotary encoder?

So you can just, like, tweak it when it's working against your intent in either direction?
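
It essentially is, at least at the API level: most hosted LLM APIs expose temperature as a per-request parameter you can dial up or down. A sketch using the OpenAI Python SDK (model name illustrative):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",   # illustrative model name
        messages=[{"role": "user",
                   "content": "Write a quicksort function in Python."}],
        temperature=0.2,       # low: conservative, repeatable completions
    )
    print(response.choices[0].message.content)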

bob1029 5 hours ago | parent | prev [-]

High temperature seems fine for my coding uses on GPT-5.2.

Code that fails to compile or execute is my default expectation. That's why we feed compile and runtime errors back into the model after each proposal.

I'd much rather have the code sometimes not work than get stuck in an infinite tool-calling loop.
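
A minimal sketch of that repair loop, assuming a generate() stand-in for the actual model call; it runs the proposed code and feeds stderr back on failure:

    import subprocess, sys, tempfile

    def generate(prompt: str) -> str:
        raise NotImplementedError  # stand-in for the real LLM call

    def repair_loop(task: str, max_rounds: int = 5) -> str | None:
        prompt = task
        for _ in range(max_rounds):
            code = generate(prompt)
            with tempfile.NamedTemporaryFile("w", suffix=".py",
                                             delete=False) as f:
                f.write(code)
                path = f.name
            result = subprocess.run([sys.executable, path],
                                    capture_output=True, text=True,
                                    timeout=30)
            if result.returncode == 0:
                return code  # ran cleanly; accept this version
            # Feed the compile/runtime error back for the next attempt.
            prompt = (task + "\n\nPrevious attempt:\n" + code
                      + "\n\nIt failed with:\n" + result.stderr
                      + "\nPlease fix it.")
        return None  # give up rather than retry forever

Bounding the loop at max_rounds is what avoids the infinite-retry failure mode mentioned above.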