rshemet 4 days ago
True - but Cactus is not just an app. We are a dev toolkit to run LLMs cross-platform locally in any app you like.
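
In practice, a "dev toolkit" here means the inference runtime links directly into each app's own process rather than running as a standalone app. A minimal sketch of what that could look like from application code follows; the module and function names are hypothetical placeholders, not Cactus's actual API:

    // Embedded-SDK approach: the runtime and model run inside the app's
    // own process; no separate service or daemon is involved.
    // All names below are hypothetical placeholders, not the real Cactus API.
    import { loadModel } from "hypothetical-local-llm-sdk";

    async function demo(): Promise<void> {
      // Model weights are stored on-device; inference never leaves the phone.
      const model = await loadModel({ path: "models/small-model.gguf" });
      const reply = await model.generate("Summarize my last note in one line.");
      console.log(reply);
    }

    demo();
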
jadbox 4 days ago
How does it work? How does one model on the device get shared with many apps? Does each app have its own inference SDK running, or is there one inference engine shared by many apps (like Ollama does)? If it's the latter, what's the communication protocol to the inference engine?
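
For the second option the question describes, Ollama is a useful reference point: one local daemon owns the models, and each app is a thin client talking to it over a plain HTTP API on localhost. A short TypeScript sketch of that protocol (the model tag is just an example; whether Cactus works this way is exactly the open question here):

    // Shared-engine approach (as Ollama implements it): one daemon loads
    // the model once, and every app calls it over HTTP on localhost.
    async function askLocalEngine(prompt: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: "llama3.2", prompt, stream: false }),
      });
      const data = await res.json();
      return data.response; // non-streaming call returns the full completion
    }

    askLocalEngine("Hello from another app").then(console.log);
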

pogue 4 days ago
I would like to see it as an app, tbh! If I could run it as an APK with a nice GUI for picking different models to run, that would be a killer feature.