paool | 5 days ago
Can you use your Qwen instance in CLIs like Claude Code, Codex, or whatever open-source coding agent? Or do you have to copy-paste into LM Studio?
evilduck | 4 days ago | parent
Yeah you can, as long as you're hosting your local LLM behind an OpenAI-compatible API (which is a given for almost all local servers at this point, including LM Studio). https://opencode.ai and https://github.com/QwenLM/qwen-code both let you configure any such API as the LLM provider. That said, running agentic workloads on local LLMs is a short and losing battle against context size unless you bought hardware specifically for this purpose. You can get it running, and it will work for several autonomous actions, but not nearly as long as a hosted frontier model will.
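Wiring an agent like qwen-code to a local server usually comes down to a few environment variables. A minimal sketch, with the caveat that the variable names and the port are assumptions (standard OpenAI-client conventions and LM Studio's default), not verified against any particular qwen-code release:

```shell
# Hypothetical sketch: variable names and port are assumptions, check
# your agent's docs for the exact names it reads.
export OPENAI_BASE_URL="http://localhost:1234/v1"   # local OpenAI-compatible server
export OPENAI_API_KEY="local"                       # local servers ignore the key
export OPENAI_MODEL="qwen3-coder-30b"               # whatever name your server reports
# then launch the agent in your project directory, e.g.:
# qwen
```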
DrAwdeOccarim | 4 days ago | parent
LM Studio lets you run a model as a local API (an OpenAI-compatible REST server).
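Because the server speaks the OpenAI chat-completions protocol, any HTTP client can talk to it, not just coding agents. A minimal stdlib-only sketch of building such a request, assuming LM Studio's default port (1234) and a placeholder model name:

```python
import json
from urllib import request

# Assumption: LM Studio serving on its default port with the
# OpenAI-compatible /v1/chat/completions route.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request for a local server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# "qwen3-coder-30b" is a placeholder; use whatever name your server reports.
req = build_chat_request("qwen3-coder-30b", "Write hello world in Go.")
print(req.full_url)
# Sending it (only works with the server actually running):
#   with request.urlopen(req) as resp:
#       print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same endpoint also works with the official `openai` client by setting `base_url` to the local address.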