unethical_ban | 4 days ago
I still can't figure out how to set up a completely free, completely private, no-accounts method of connecting an IDE to LM Studio. I thought it would be the "Continue" extension for VS Code, but even for local LM integration it insists I sign in to their service before continuing.
mikestaas | 4 days ago | parent | next
Roo Code in VS Code, and Qwen Coder in LM Studio, is a decent local-only combo.
maxsilver | 4 days ago | parent | prev | next
Both Roo and Continue support local models (via LM Studio). For Continue, you add a fake account (type in literally anything) and then click 'edit' -- it will take you to the settings JSON, where you can set LM Studio as your source.

The main problem I'm seeing is that a lot of the tooling doesn't work as well "agentically" with local models. (Most of these tools say something like 'works best with Claude, tested with Claude, good luck with any local models'.) Local models via LM Studio already work really well for pure chat, but trip up semi-regularly on basic things like writing files or running commands -- stuff that, say, GitHub Copilot has mostly polished already. Those are basically just bugs in the tooling that will likely get fixed.

The local-only setup is behind the current commercial market -- but not much behind. I strongly agree with the commenter above: if the commercial models and tooling slow down at any point, the free/open models and tooling will absolutely catch up -- I'd guess within 9 months or so.
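For anyone trying the settings-JSON route described above: a minimal sketch of what the Continue config entry might look like, pointed at LM Studio's local server. Continue's config supports an `lmstudio` provider; the model name and port here are assumptions (LM Studio defaults to port 1234, and the model string should match whatever you've loaded in LM Studio):

```json
{
  "models": [
    {
      "title": "Qwen Coder (LM Studio)",
      "provider": "lmstudio",
      "model": "qwen2.5-coder-7b-instruct",
      "apiBase": "http://localhost:1234/v1"
    }
  ]
}
```

No API key is needed; LM Studio exposes an OpenAI-compatible endpoint locally, so nothing leaves your machine.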
taneq | 4 days ago | parent | prev
Huh? I have Continue on Codium talking to Ollama, all local, and I never signed up to nuffin'.
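For the Ollama route this commenter describes, the equivalent Continue config entry is even shorter -- Continue has a built-in `ollama` provider that defaults to Ollama's local port (11434), so no `apiBase` is usually needed. The model name below is an assumption; use whatever you've pulled with `ollama pull`:

```json
{
  "models": [
    {
      "title": "Local model (Ollama)",
      "provider": "ollama",
      "model": "qwen2.5-coder"
    }
  ]
}
```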