Maxious 5 hours ago
They spent several years in stealth mode, but the initial release was built on llama.cpp. Ollama v0.0.1: "Fast inference server written in Go, powered by llama.cpp" https://github.com/ollama/ollama/tree/v0.0.1
em-bee 4 hours ago | parent
They spent several years in stealth mode doing what? Trying to build themselves what llama.cpp ended up doing for them?