| ▲ | thehamkercat 5 hours ago |
LM Studio is not open source though; Ollama is, but people should use llama.cpp instead.
| ▲ | smcleod 4 hours ago | parent | next [-] |
I suspect Ollama is at least partly moving away from open source as they look to raise capital; when they released their replacement desktop app, they did so as closed source. You're absolutely right that people should be using llama.cpp: not only is it truly open source, it's significantly faster, has better model support and many more features, is better maintained, and its development community is far more active.
| ▲ | nateb2022 3 hours ago | parent | prev | next [-] |
> but people should use llama.cpp instead

MLX is a lot more performant than Ollama and llama.cpp on Apple Silicon, comparing both peak memory usage and tok/s output.

edit: LM Studio benefits from MLX optimizations when running MLX-compatible models.
| ▲ | behnamoh 4 hours ago | parent | prev [-] |
> LMStudio is not open source though, ollama is

And why should that affect usage? It's not like Ollama users fork the repo before installing it.