fy20 6 hours ago
It feels like a bit of history is missing... If Ollama was founded three years before llama.cpp was released, what engine did they use then? When did they transition?
wolvoleo 5 hours ago
I don't think that is the case. llama.cpp appeared within weeks of Meta releasing LLaMA to select researchers (from whom it then made its way to the public). Three years before that, nobody knew the name "llama". I'm sure llama.cpp existed first.

Maxious 5 hours ago
They spent several years in stealth mode, but their initial release was built on llama.cpp. Ollama v0.0.1 describes itself as a "Fast inference server written in Go, powered by llama.cpp": https://github.com/ollama/ollama/tree/v0.0.1