_ache_ (6 hours ago):
I'm using ollama with a local LLM for completion (tabby-ml) and Open WebUI for chat. What would be the go-to local ACP server that works with ollama? Ideally one that works with Toad, so I can experiment with it.
evalstate (4 hours ago):
fast-agent has ACP support and works well with ollama. Once installed, you can just use `toad acp "fast-agent-acp --model generic.<ollama-model>"`.
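A minimal sketch of the full setup, under a few assumptions: that fast-agent is installed from PyPI as `fast-agent-mcp`, that ollama is already running locally, and that `llama3.2` stands in for whatever model tag you actually pull. The `toad acp` invocation and the `generic.<ollama-model>` prefix are taken from the parent comment.

```sh
# Install fast-agent (assumed PyPI package name: fast-agent-mcp)
uv tool install fast-agent-mcp      # or: pip install fast-agent-mcp

# Pull a model for ollama to serve locally (llama3.2 is just an example tag)
ollama pull llama3.2

# Point toad at the fast-agent ACP server; the generic. prefix routes
# the request to the ollama-backed model (command from the parent comment)
toad acp "fast-agent-acp --model generic.llama3.2"
```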
willm (6 hours ago):
You may be confusing Agent Communication Protocol with Agent Client Protocol. Yeah, two ACP protocols; I had no hand in the naming. If an agent can be configured to use Ollama, then you could use it from Toad. It might be possible right now.