aiscoming 2 hours ago
VS Code supports local models (bring your own key/model); you need a model server such as Ollama, llama.cpp, or LM Studio.
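As a concrete sketch of the "model server" part: Ollama is one of the servers named above, and it exposes an OpenAI-compatible API on localhost. The commands below assume Ollama is already installed, and `llama3` is just an example model name:

```shell
# Pull an example model (assumes Ollama is installed; "llama3" is a placeholder)
ollama pull llama3

# Start the local server; by default it listens on port 11434 and exposes
# an OpenAI-compatible API under http://localhost:11434/v1
ollama serve
```

Once the server is running, any client that accepts an OpenAI-compatible base URL can be pointed at that local endpoint.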
no-name-here an hour ago | parent
> bring your own key

Do you mean supporting OAI-compatible API URLs in Copilot? If so, then you need either VS Code Insiders or a VS Code extension, I believe?
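Whichever client ends up talking to it, a quick way to check that a local OpenAI-compatible endpoint is actually up is a plain HTTP request. This sketch assumes Ollama's default port (11434) and uses `llama3` as an example model name:

```shell
# Sketch: send a chat completion request to a local OpenAI-compatible
# endpoint (Ollama's default port is assumed; model name is an example)
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hi"}]}'
```

If the server is running, the response is a JSON chat-completion object in the same shape the OpenAI API uses, which is what makes these local servers drop-in targets for OAI-compatible clients.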