lolinder 7 months ago
Llama.cpp forms the base for both Ollama and Kobold.cpp, and probably a bunch of others I'm not familiar with. So it's less a question of whether to use llama.cpp or one of the others, and more a question of whether you benefit from one of the wrappers. I can imagine use cases where you'd really want llama.cpp directly, and there will always be people who argue that all wrappers are bad wrappers, but for myself I like the combination of ease of use and flexibility that Ollama offers. I wrap it in Open WebUI for a GUI, but I also have some apps that reach out to Ollama's API directly.
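As a rough sketch of what "reaching out to Ollama directly" can look like (assuming Ollama on its default port 11434 and a model you've already pulled; "llama3" below is just a placeholder):

    # Minimal sketch: calling Ollama's REST API directly, stdlib only.
    # Assumes Ollama is running on localhost:11434 and the model named
    # below ("llama3") has already been pulled locally.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",   # placeholder; substitute any model you have pulled
        "prompt": "Why is the sky blue?",
        "stream": False,     # ask for a single JSON object instead of a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    # Print just the generated text from the response JSON.
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])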
freehorse 7 months ago | parent
What is the advantage of, say, Ollama/Open WebUI over llama-server? I have been using llama.cpp since it came out, I'm used to the syntax, and I don't have problems building it (probably because I'm on macOS, which it supports well), so am I missing something by not using Ollama?