segmenta 8 hours ago

Yes - you can use local LLMs through LiteLLM and Ollama. Would you like us to support anything else?

thedangler 7 hours ago | parent [-]

LM Studio?

ramnique 7 hours ago | parent [-]

Yes, because LM Studio is OpenAI-compatible. The first time you run rowboatx, it creates a ~/.rowboat/config/models.json; you can configure LM Studio there. Here is an example: https://gist.github.com/ramnique/9e4b783f41cecf0fcc8d92b277d...
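
To illustrate, a config along these lines should work: LM Studio's local server speaks the OpenAI API at http://localhost:1234/v1 by default, so you point a provider entry at that base URL. Note the exact field names in models.json are a guess here (the gist above has the real schema); the model name must match whatever model you have loaded in LM Studio.

```json
{
  "providers": [
    {
      "name": "lmstudio",
      "baseUrl": "http://localhost:1234/v1",
      "apiKey": "lm-studio",
      "models": ["qwen2.5-7b-instruct"]
    }
  ]
}
```

The apiKey is a placeholder; LM Studio's local server accepts any value, but OpenAI-compatible clients typically require the field to be non-empty.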