accrual 7 days ago

I got Jan working with Ollama today. Jan reported it couldn't connect to my Ollama instance on the same host, despite Ollama working fine for other apps.

I captured loopback traffic and noticed Ollama returning an HTTP 403 Forbidden response to Jan.
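
If you want to reproduce that 403 without a packet capture, you can send a request with an Origin header Ollama doesn't whitelist by default (the exact Origin Jan sends is a guess here, but any non-localhost value behaves the same):

    curl -i http://localhost:11434/api/tags -H "Origin: http://some-other-app"
    # Before setting OLLAMA_ORIGINS=* this comes back 403 Forbidden;
    # after the change (and an Ollama restart) it returns the model list.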

The solution was to set these environment variables:

    OLLAMA_HOST=0.0.0.0
    OLLAMA_ORIGINS=*
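
If Ollama runs as a systemd service (the typical Linux install), one way to make these stick is a unit override; this is a sketch assuming the default service name:

    # sudo systemctl edit ollama.service, then add:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
    # then: sudo systemctl daemon-reload && sudo systemctl restart ollama
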
Here's the rest of the steps:

- Jan > Settings > Model Providers

- Add a new provider called "Ollama"

- Set the API key to "ollama" and point it at http://localhost:11434/v1

- Ensure the variables above are set

- Click "Refresh" and the models should load

Note: Even though an API key is not required for local Ollama, Jan apparently doesn't consider the endpoint valid unless a key is provided. I set mine to "ollama" and then it let me start a chat.
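
A quick way to sanity-check the endpoint Jan will use is to hit Ollama's OpenAI-compatible API directly (assuming the default port; the "ollama" key is just a dummy value and is ignored locally):

    curl http://localhost:11434/v1/models -H "Authorization: Bearer ollama"
    # Should return a JSON list of your local models; if it does,
    # Jan's "Refresh" should pick them up too.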
