dymk 6 hours ago

Can the base URL be overridden so I can point it at eg Ollama or any other OpenAI compatible endpoint? I’d love to use this with local LLMs, for the speed and privacy boost.

jedbrooke 6 hours ago | parent | next [-]

https://github.com/chr15m/runprompt/blob/main/runprompt#L9

Seems like it would be: just swap the OpenAI URL here, or add a way to configure it.
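
A rough sketch of that swap, pointing at a local Ollama server's OpenAI-compatible endpoint (the constant name and exact URL in runprompt may differ; this is illustrative only):

    # Illustrative only: runprompt's actual variable name may differ.
    # Hard-coded OpenAI endpoint, roughly what the linked line does:
    API_URL = "https://api.openai.com/v1/chat/completions"

    # Swapped for a local Ollama server's OpenAI-compatible endpoint:
    API_URL = "http://localhost:11434/v1/chat/completions"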

chr15m 5 hours ago | parent | prev [-]

Good idea. Will figure out a way to do this.

khimaros 3 hours ago | parent | next [-]

simple solution: honor OPENAI_API_BASE env var
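
A minimal sketch of that approach in Python, assuming the script builds its request URL from a single base constant (names here are hypothetical, not runprompt's actual code):

    import os

    # Honor OPENAI_API_BASE if set, otherwise fall back to the OpenAI default,
    # e.g.  OPENAI_API_BASE=http://localhost:11434/v1 runprompt my.prompt
    API_BASE = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
    CHAT_COMPLETIONS_URL = API_BASE.rstrip("/") + "/chat/completions"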

benatkin 4 hours ago | parent | prev [-]

Perhaps instead of writing an LLM abstraction layer, you could use a lightweight existing one, such as @simonw's llm.
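
For reference, a small sketch of the llm library's Python API (the model id is illustrative; plugins can add local models, including Ollama-backed ones, and API keys are picked up from the usual env vars):

    import llm

    # Resolve a model by name via llm's plugin system.
    model = llm.get_model("gpt-4o-mini")  # illustrative model id

    # Send a prompt and read back the completion text.
    response = model.prompt("Summarize the plot of Hamlet in one sentence.")
    print(response.text())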