easygenes 3 hours ago
Why is Ollama so many people's go-to? Genuinely curious — I've tried it, but it feels overly stripped down and dumbed down compared to nearly everything else I've used. Lately I've been playing with Unsloth Studio and think that's probably a much better "give it to a beginner" default.
diflartle 2 hours ago
Ollama is good enough to dabble with, and getting a model is as easy as `ollama pull <model name>`, versus figuring it out yourself on Hugging Face, trying to make sense of all the goofy letters and numbers across forty different variants of a model's name — and you don't need a Hugging Face account to download. So you start there, and eventually you want to get off the happy path; then you need to learn more about the server, and it's all much more complicated than just using Ollama. You just want to try models, not learn the intricacies of hosting LLMs.
DiabloD3 an hour ago
Advertising, mostly. Ollama's org had people flood various LLM- and programming-related subreddits, Discords, and elsewhere, claiming it was an "easy frontend for llama.cpp", and tricked people. The only way to win is to uninstall it and switch to llama.cpp.
polotics 2 hours ago
Ollama got some first-mover advantage back when actually building and git-pulling llama.cpp was a bit of a moat. The devs' Docker past probably made them overestimate how much mindshare they could lay claim to. However, no one really could have known how quickly things would evolve... Now I mostly recommend LM Studio to people. What does Unsloth Studio bring on top of that?
| ||||||||||||||||||||||||||