▲ | sorenjan 11 hours ago |
> Some of the supported models are large and wouldn't fit on most local devices. Why would I use those models on your cloud instead of using Google's or Anthropic's models? I'm glad there are open models available and that they get better and better, but if I'm paying money to use a cloud API I might as well use the best commercial models; I think they will remain much better than the open alternatives for quite some time.
▲ | mchiang 10 hours ago | parent | next [-]
When we started Ollama, we were told that open source (open weight wasn't a term back then) would always be inferior to closed-source models. That was 2 years ago (Ollama's birthday is July 18th, 2023). Fast forward to now: open models are quickly catching up, come at a significantly lower price point for most use cases, and can be customized for specific tasks instead of staying general purpose. For general-purpose use, the closed models absolutely dominate right now.
▲ | ineedasername 10 hours ago | parent | prev [-]
A person can use Google's Gemma models on Ollama's cloud and possibly pay less, and have more quality control that way (and other kinds of control, I guess), since there's no need to wonder whether a recent model update or load-balancing throttling affected results. Your use case doesn't generalize.
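For what it's worth, a minimal sketch of pinning a model that way. It assumes Ollama's cloud exposes the same /api/chat interface as a local server, that an API key is passed as a bearer token, and that a Gemma tag like gemma3:27b is available there; treat the host, tag, and auth details as placeholders for whatever your account actually uses:

    import os
    import requests

    # Assumed host and model tag; a local server would instead be
    # http://localhost:11434 and need no Authorization header.
    HOST = "https://ollama.com"
    MODEL = "gemma3:27b"  # pinning an explicit tag is what keeps results stable

    resp = requests.post(
        f"{HOST}/api/chat",
        headers={"Authorization": f"Bearer {os.environ['OLLAMA_API_KEY']}"},
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": "Hello, Gemma."}],
            "stream": False,  # one JSON object back instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])

The point isn't the exact endpoint; it's that an explicit tag plus your own client means a provider-side model swap can't silently change your outputs.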