drcongo | 3 days ago
This looks pretty neat. I just spotted in the docs that it has an MCP server too; however, I haven't found anything in the docs about using a locally hosted model. Running this on a box in the corner of the office would be great, but having to use external AI providers would be a deal breaker.
bshzzle | 3 days ago | parent
Running Sourcebot with a self-hosted LLM is something we plan to support and have documented in the golden path very soon, so stay tuned. We are using the Vercel AI SDK, which supports Ollama via a community provider, but that provider doesn't support v5 yet (which is what Sourcebot is on): https://v5.ai-sdk.dev/providers/community-providers/ollama
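In the meantime, a rough sketch of what wiring up a local model could look like (this isn't something we ship, just an illustration): since Ollama exposes an OpenAI-compatible endpoint, you can point the AI SDK's OpenAI-compatible provider at it. The base URL and model name below are assumptions for a default local Ollama install.

    // Sketch only: AI SDK v5 + local Ollama via its OpenAI-compatible endpoint.
    // Assumes Ollama is running locally on its default port with `llama3.1` pulled.
    import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
    import { generateText } from 'ai';

    const ollama = createOpenAICompatible({
      name: 'ollama',
      baseURL: 'http://localhost:11434/v1', // Ollama's OpenAI-compatible API
    });

    const { text } = await generateText({
      model: ollama('llama3.1'), // any locally pulled model id
      prompt: 'Summarize this repository in one sentence.',
    });

    console.log(text);

The upside of this route is that it sidesteps the community Ollama provider entirely, so it isn't blocked on v5 support landing there.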