mentalgear 5 hours ago:
The reported alternatives seem pretty user-friendly to me:

> LM Studio gives you a GUI if that's what you want. It uses llama.cpp under the hood, exposes all the knobs, and supports any GGUF model without lock-in.

> Jan (https://www.jan.ai/) is another open-source desktop app with a clean chat interface and local-first design.

> Msty (https://msty.ai/) offers a polished GUI with multi-model support and built-in RAG.

koboldcpp is another option with a web UI and extensive configuration options.

API-wise: LM Studio exposes a REST API, including an OpenAI-compatible endpoint, so existing OpenAI client code can point at it.
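A minimal sketch of what "OpenAI-compatible" buys you in practice: you build the same chat-completions request body you'd send to OpenAI, but POST it to the local server. The base URL below (`http://localhost:1234/v1`) is LM Studio's usual default, and the model name is a placeholder for whatever model you've loaded — both are assumptions, so check your own setup.

```python
import json

# Assumed local endpoint -- LM Studio's common default port; adjust as needed.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat-completions body for POST {BASE_URL}/chat/completions."""
    return {
        "model": model,  # placeholder; LM Studio uses the loaded model's identifier
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

body = build_chat_request("Why is the sky blue?")
print(json.dumps(body, indent=2))

# To actually send it (requires a running local server):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{BASE_URL}/chat/completions",
#       data=json.dumps(body).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

The point is the lack of lock-in: the same request body works against any server that speaks the OpenAI chat-completions shape, so swapping backends is just a URL change.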
shantnutiwari 2 hours ago (parent):
All of those options were either too slow or didn't work for me (Mac with Intel). I could have spent hours googling, but instead I downloaded Ollama and it just worked. So no, for me they are not alternatives to Ollama.