▲ adevilinyc 9 hours ago
How do you configure LLM temperature in coding agents, e.g. opencode?
▲ kabr 8 hours ago | parent | next [-]
https://opencode.ai/docs/agents/#temperature — set it in your opencode.json
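A minimal sketch of what that config could look like, assuming the per-agent `temperature` field described in the linked docs; the agent name `build` and the value are illustrative, not prescriptive:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "agent": {
    "build": {
      "temperature": 0.2
    }
  }
}
```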
▲ Der_Einzige 8 hours ago | parent | prev [-]
You can't without hacking it! That's my point! The only places you can easily do it are via the API directly, or "coomer" frontends like SillyTavern, Oobabooga, etc. Same problem with image generation (lack of support for different SDE solvers, the image-generation analogue of LLM sampling), but that world has its own "coomer" tools, i.e. ComfyUI or Automatic1111.
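For the "via the API directly" route, a minimal sketch of an OpenAI-style chat-completions request body: `temperature` is just a top-level field next to the model and messages. The model name and prompt here are placeholders, and the endpoint/auth handling is omitted.

```python
import json

# OpenAI-style chat-completions request body (a sketch, not tied to
# any particular SDK). Model name and prompt are placeholders.
payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "user", "content": "Refactor this function."}
    ],
    "temperature": 0.2,  # lower = more deterministic sampling
}

# This JSON string is what you'd POST to the completions endpoint.
body = json.dumps(payload)
```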