well_ackshually 4 hours ago

>solved the UX problem.

>One command

Notwithstanding the fact that there's about zero difference between `ollama run model-name` and `llama-cli -hf model-name`, and that running things in the terminal is already a gigantic UX blocker (Ollama's popularity comes from the fact that it has a GUI), why are you putting the blame back on an open source project that owes you approximately zero communication?
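For anyone comparing the two, the invocations really are nearly identical on the surface; a sketch (the model names below are illustrative, not from the thread):

```shell
# Ollama: pull from Ollama's own library and start an interactive chat
ollama run llama3.2

# llama.cpp: pull a GGUF from a Hugging Face repo and start an interactive chat
llama-cli -hf bartowski/Llama-3.2-1B-Instruct-GGUF
```

Both download on first run and cache locally; the differences show up in *where* the weights come from and how they're stored, which is what the replies below dig into.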

zozbot234 4 hours ago | parent | next [-]

> Ollama's popularity comes from the fact that it has a GUI

It's not the GUI, it's the curated model hosting platform. Way easier to use than HF for casual users.

kgwgk 3 hours ago | parent [-]

It also made it easy for casual users to think they were running DeepSeek.

Eisenstein 2 hours ago | parent | prev [-]

> Notwithstanding the fact that there's about zero difference between `ollama run model-name` and `llama-cli -hf model-name`

There is a TON of difference. Ollama downloads the model from its own model library server, sticks it somewhere in your home folder under a hashed name, and wraps it in a proprietary configuration that ignores the built-in metadata specified by the model creator. So you can't share the file with any other tool, you can't change parameters like temperature on the fly, and you're stuck with whatever quants they offer.
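To make the "hashed name" point concrete: Ollama stores downloaded weights as content-addressed blobs (filenames derived from the file's SHA-256 digest) rather than as `<model-name>.gguf`. A minimal sketch of that naming scheme, assuming the default `~/.ollama/models/blobs/` layout:

```python
import hashlib
from pathlib import Path

def ollama_blob_name(data: bytes) -> str:
    """Ollama-style blob filename: 'sha256-' plus the hex digest of the file contents."""
    return "sha256-" + hashlib.sha256(data).hexdigest()

def blob_path(data: bytes,
              root: Path = Path.home() / ".ollama" / "models" / "blobs") -> Path:
    # The GGUF weights land at an opaque path like
    #   ~/.ollama/models/blobs/sha256-6a0746a1ec1a...
    # so other llama.cpp frontends can't locate them by model name.
    return root / ollama_blob_name(data)
```

This is why sharing a model between Ollama and another tool usually means either symlinking the blob under a readable name or downloading the GGUF a second time.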