harvey9 | 8 days ago
Do you have the option to run on a local model? Lots of firms don't want data or prompts going outside the local network.
jorgeoguerra | 8 days ago | parent
Yep: if you have a local model served behind an OpenAI-compatible `v1/chat/completions` endpoint (most local model servers offer one), you can route Erdos to it in the Erdos AI settings.
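For reference, the `v1/chat/completions` route mentioned above takes the same JSON body regardless of which local server is behind it (llama.cpp, Ollama, vLLM, etc.). A minimal sketch of the request shape — the base URL, port, and model name here are assumptions, not anything Erdos-specific:

```python
import json

# Hypothetical local server; adjust host/port to wherever your
# OpenAI-compatible endpoint is actually listening.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build the URL and JSON body for a v1/chat/completions call.

    This only constructs the request; sending it (e.g. with urllib or
    the `openai` client pointed at BASE_URL) is left to the caller.
    """
    return {
        "url": f"{BASE_URL}/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("Summarize this function for me.")
print(req["url"])
print(json.dumps(req["body"], indent=2))
```

Any client that lets you override the API base URL can then be pointed at `BASE_URL` instead of the hosted provider, which is all "routing to a local model" amounts to in practice — no prompt data leaves the machine.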