robotswantdata 9 hours ago
Why are people still using Ollama? Serious question. Lemonade or even llama.cpp is much better optimised and arguably just as easy to use.
eddieroger 2 hours ago
`ollama serve` and `ollama run`. The devex is great and familiar to anyone who has used Docker. Reading through the Lemonade documentation, it does look like a natural migration, but we're talking about two steps to get started versus just one. So I'd need a reason to make that much of a change when I'm happy enough with Ollama.
hamdingers 2 hours ago
Why not? Also serious. It seems to just work every time I try to use it, the API is easy to work with, and the model library is convenient. I've never hit any kind of snag that made me look elsewhere.
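For context, the API referred to here is Ollama's JSON-over-HTTP interface. A minimal sketch of calling it from the standard library, assuming a local server on the default port 11434 and a model named `llama3` already pulled (both are assumptions, not shown in the thread):

```python
import json
import urllib.request

# Ollama's /api/generate endpoint takes a model name and a prompt;
# "stream": False asks for one JSON object instead of a streamed response.
payload = {"model": "llama3", "prompt": "Why is the sky blue?", "stream": False}

def generate(payload, host="http://localhost:11434"):
    """POST the payload to a local Ollama server and return the model's text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires a running server:
# print(generate(payload))
```

The same request works from curl or any HTTP client, which is a large part of why the API is considered easy to work with.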
niek_pas 6 hours ago
Serious answer: I don't use it that much; it's what I happened to download about 1.5 years ago, and it works fine. I'm happy to see what may be a speed boost, and I have little interest in switching to something else (unless my situation changes, of course).
vorticalbox 6 hours ago
I like Ollama, mostly because the CLI is pretty nice. Its desktop app makes some odd choices, though: if a model supports tools, the UI should give me the "search" option, but it only shows it for cloud models. I ran LM Studio for a while, but I don't really use local models much other than to mess about.