all2 10 hours ago

What sort of monetization model would you like to see? What model would you deem acceptable?

dcreater 9 hours ago

Ollama, the local inference platform, stays completely local. It's maintained by a non-profit org, with dev time contributed by a for-profit company. That company can be VC-backed and can build its own cloud inference platform, using ollama as its backend and as a platform to market from. But keep the cloud offering a separate product (not named ollama).

This is almost exactly how duckdb/motherduck functions, and I think they're doing an excellent job.

EDIT: grammar and readability
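One way to picture the split being described: local use stays the default, and the cloud product is a deliberately separate, opt-in endpoint speaking the same API, much like duckdb vs. motherduck. A minimal Python sketch against Ollama's local /api/generate endpoint; the model name and the cloud URL are illustrative placeholders, not a real product.

  import json
  import urllib.request

  def generate(prompt, base_url="http://localhost:11434"):
      # POST to an Ollama-compatible /api/generate endpoint; with
      # stream=False it returns one JSON object with the full text.
      payload = json.dumps({
          "model": "llama3",  # any locally pulled model
          "prompt": prompt,
          "stream": False,
      }).encode()
      req = urllib.request.Request(
          base_url + "/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          return json.load(resp)["response"]

  # Local by default; the hypothetical cloud product is an explicit opt-in.
  print(generate("Why is the sky blue?"))
  # print(generate("Why is the sky blue?", base_url="https://cloud.example.com"))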

troyvit 9 hours ago

If I were them I'd go whole-hog on local models and:

* Work with somebody like System76 or Framework to create great hardware systems that come with their ecosystem preinstalled.

* Build out a PaaS, perhaps in partnership with an existing provider, that makes it easy for anybody to do what Ollama search does (see the sketch below). I'm more than half certain I could convince our cash-strapped organization to ditch Elasticsearch for that.

* Partner with Home Assistant, get into home automation and wipe the floor with Echo and its ilk (yeah basically resurrect Mycroft but add whole-house automation to it).

Each of those is half-baked, but it also took me 7 minutes to come up with them, and they seem more in line with what Ollama tries to represent than a pure cloud play using low-power models.
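For the search bullet, a rough sketch of the core loop such a PaaS would wrap: embed documents through a local Ollama server and rank by cosine similarity. The /api/embeddings request/response shape matches Ollama's local API; the model name, corpus, and query are illustrative.

  import json
  import math
  import urllib.request

  def embed(text, model="nomic-embed-text"):
      # Ollama's embeddings endpoint takes {"model", "prompt"} and
      # returns {"embedding": [floats]}.
      payload = json.dumps({"model": model, "prompt": text}).encode()
      req = urllib.request.Request(
          "http://localhost:11434/api/embeddings",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          return json.load(resp)["embedding"]

  def cosine(a, b):
      dot = sum(x * y for x, y in zip(a, b))
      norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
      return dot / norm

  docs = ["Ollama runs models on your own machine",
          "Elasticsearch is a distributed search engine"]
  doc_vecs = [embed(d) for d in docs]
  query = embed("local model runtime")
  ranked = sorted(zip(docs, doc_vecs),
                  key=lambda pair: cosine(query, pair[1]), reverse=True)
  print(ranked[0][0])  # best match first

A real service would swap the brute-force loop for a vector index, but the point stands: the whole pipeline runs against local models.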

Cheer2171 7 hours ago

Have the ollama server support auth / API keys (a feature request that was closed as out of scope) and monetize around SSO the way everyone else does.
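Since ollama itself doesn't ship auth, the usual workaround is a thin reverse proxy that checks an API key before forwarding requests, and that layer is exactly where a paid tier could live. A minimal Python sketch; the key set, port, and non-streaming handling are illustrative assumptions (real keys would come from a billing/SSO backend).

  import urllib.request
  from http.server import BaseHTTPRequestHandler, HTTPServer

  VALID_KEYS = {"sk-example-key"}      # placeholder; issue these per customer
  UPSTREAM = "http://localhost:11434"  # the unauthenticated ollama server

  class AuthProxy(BaseHTTPRequestHandler):
      def do_POST(self):
          # Reject requests without a known bearer token.
          auth = self.headers.get("Authorization", "")
          token = auth[len("Bearer "):] if auth.startswith("Bearer ") else ""
          if token not in VALID_KEYS:
              self.send_response(401)
              self.end_headers()
              return
          # Forward the body to ollama and relay the (non-streaming) reply.
          body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
          req = urllib.request.Request(
              UPSTREAM + self.path, data=body,
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as upstream:
              data = upstream.read()
              self.send_response(upstream.status)
              self.send_header("Content-Type", "application/json")
              self.send_header("Content-Length", str(len(data)))
              self.end_headers()
              self.wfile.write(data)

  HTTPServer(("0.0.0.0", 8080), AuthProxy).serve_forever()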