brcmthrowaway 3 hours ago
Basically, LM Studio includes a server that exposes loaded models over HTTP on localhost. Enable the server, then point OpenCode at it. This article walks through the setup: https://advanced-stack.com/fields-notes/qwen35-opencode-lm-s... That said, I'm looking for an alternative to OpenCode; I can barely see its UI.
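For reference, LM Studio's server speaks an OpenAI-compatible API (by default at http://localhost:1234/v1), and OpenCode can be pointed at custom providers via its `opencode.json` config. A rough sketch of what that might look like, assuming the default LM Studio port and a placeholder model ID (swap in whatever model you've actually loaded):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio",
      "options": {
        "baseURL": "http://localhost:1234/v1"
      },
      "models": {
        "your-model-id": {}
      }
    }
  }
}
```

The model ID should match what LM Studio reports at `GET http://localhost:1234/v1/models`; check the article linked above for the exact field names your OpenCode version expects.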
AstroBen 3 hours ago
Codex also supports pointing at an alternative model provider, so you could try that instead: https://unsloth.ai/docs/basics/codex#openai-codex-cli-tutori...
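From memory, the Codex CLI reads `~/.codex/config.toml`, where you can register a custom OpenAI-compatible provider. A hedged sketch, again assuming LM Studio's default port and a placeholder model name (the linked Unsloth tutorial has the authoritative details):

```toml
# ~/.codex/config.toml -- point Codex at a local OpenAI-compatible server
model = "your-model-id"          # placeholder: must match the model loaded in LM Studio
model_provider = "lmstudio"

[model_providers.lmstudio]
name = "LM Studio"
base_url = "http://localhost:1234/v1"
```

If your Codex version differs, the key names may vary slightly; the tutorial above covers the current format.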