dcreater | 9 hours ago
Ollama, the local inference platform, stays completely local. It is maintained by a non-profit org, with dev time contributed by a for-profit company. That company can be VC backed and can build its own cloud inference platform, using Ollama as its backend and as a platform for marketing, but it should keep that as a separate product (not named Ollama). This is almost exactly how DuckDB/MotherDuck functions, and I think they're doing an excellent job. EDIT: grammar and readability