reader9274 7 days ago

Tried running gpt-oss:20b in Ollama (runs perfectly), then tried to connect Ollama to Jan, but it didn't work.

accrual 7 days ago | parent | next [-]

I got Jan working with Ollama today. Jan reported it couldn't connect to my Ollama instance on the same host, even though it worked fine for other apps.

I captured loopback traffic and saw Ollama returning an HTTP 403 Forbidden response to Jan.
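If I understand Ollama's origin checking correctly, you can reproduce that 403 without Jan by sending a browser-style Origin header. A diagnostic sketch (assumes Ollama listening on its default port, and no `OLLAMA_ORIGINS` set):

```shell
# A request carrying an unrecognized Origin header should be rejected
# with 403 Forbidden, while the same request without it succeeds.
curl -i -H "Origin: http://example.com" http://localhost:11434/api/tags
curl -i http://localhost:11434/api/tags
```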

The solution was to set these environment variables:

    OLLAMA_HOST=0.0.0.0
    OLLAMA_ORIGINS=*
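These have to be set in the environment of the Ollama server process itself. A sketch of the two common setups (assumes a Linux install; adjust for your system):

```shell
# If Ollama runs as a systemd service, put the variables in a
# drop-in override, then restart the service:
sudo systemctl edit ollama
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
#   Environment="OLLAMA_ORIGINS=*"
sudo systemctl restart ollama

# If you launch Ollama manually, export them first:
export OLLAMA_HOST=0.0.0.0
export OLLAMA_ORIGINS='*'
ollama serve
```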
Here are the rest of the steps:

- Jan > Settings > Model Providers

- Add new provider called "Ollama"

- Set API key to "ollama" and point to http://localhost:11434/v1

- Ensure variables above are set

- Click "Refresh" and the models should load

Note: Even though an API key is not required for local Ollama, Jan apparently doesn't consider it a valid endpoint unless a key is provided. I set mine to "ollama" and then it allowed me to start a chat.
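The steps above can be sanity-checked from the command line: Jan's "Refresh" is effectively a model-list call against Ollama's OpenAI-compatible endpoint. A sketch (the key value is arbitrary for local Ollama, but OpenAI-style clients expect one):

```shell
# Should return a JSON list of installed models if the variables
# above took effect and Ollama is running on the default port.
curl -H "Authorization: Bearer ollama" http://localhost:11434/v1/models
```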

thehamkercat 7 days ago | parent | prev [-]

Exactly: https://github.com/menloresearch/jan/issues/5474

Can't make it work with the Ollama endpoint.

This seems to be the problem, but they're not focusing on it: https://github.com/menloresearch/jan/issues/5474#issuecommen...