the_duke 3 hours ago

You shouldn't be downvoted - LLMs could in theory be deterministic, but they currently are not, due to how inference is implemented: batching and GPU kernels with non-fixed floating-point reduction order can change results from run to run.

otabdeveloper4 an hour ago | parent [-]

All my self-hosted inference has temperature zero and no randomness.

It is absolutely workable; current inference engines are just lazy and dumb.
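A toy sketch of what "temperature zero" means at the sampling step: it degenerates to argmax over the logits, which is deterministic given deterministic logits (the function name and toy logits below are illustrative, not from any particular engine; real engines special-case T=0 rather than dividing by it):

```python
import math
import random

def sample_next(logits, temperature):
    """Pick the next token id from raw logits.

    temperature == 0: pure argmax, no randomness at all.
    temperature  > 0: softmax-weighted random draw.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    weights = [math.exp(l / temperature) for l in logits]
    r = random.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(logits) - 1

# At T=0 the same logits always yield the same token.
assert sample_next([0.1, 2.5, -1.0], 0) == sample_next([0.1, 2.5, -1.0], 0)
```

Note the remaining caveat from the parent comment: the *logits themselves* can still differ between runs if the kernels or batching are nondeterministic, so T=0 removes sampling randomness but not every source of variation.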

(I use a Zobrist hash to track and prune loops.)
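The commenter doesn't show their implementation, but one plausible reading is Zobrist-style hashing over a sliding window of recent tokens to detect when the (deterministic) decoder has entered a repetition loop. A minimal sketch under that assumption (the window size, vocab size, and function name are all hypothetical choices, not the commenter's actual code):

```python
import random

random.seed(42)   # fixed seed so the key table is reproducible
VOCAB = 256       # hypothetical vocabulary size
WINDOW = 8        # hypothetical loop-detection window length

# Zobrist table: one random 64-bit key per token id.
KEYS = [random.getrandbits(64) for _ in range(VOCAB)]

def find_repeated_window(tokens):
    """Slide a WINDOW-token window over the stream, maintaining its
    hash incrementally (XOR the outgoing token's key out, the incoming
    token's key in). Returns the start index of the first window whose
    hash was seen before, else None.

    Caveat: XOR is order-insensitive, so permuted windows collide;
    a production version would verify the actual tokens on a hash hit
    before pruning/aborting generation.
    """
    if len(tokens) < WINDOW:
        return None
    h = 0
    for tok in tokens[:WINDOW]:
        h ^= KEYS[tok]
    seen = {h}
    for i in range(1, len(tokens) - WINDOW + 1):
        h ^= KEYS[tokens[i - 1]]           # remove outgoing token
        h ^= KEYS[tokens[i + WINDOW - 1]]  # add incoming token
        if h in seen:
            return i
        seen.add(h)
    return None
```

With deterministic argmax decoding, a repeated window implies the model will repeat forever from that point, so a hash hit is a cheap O(1)-per-token signal to stop or backtrack.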