▲ PullJosh | 8 hours ago
This is awesome!

1) I am able to run the model on my iPhone and get good results. Not as good as Gemini in the cloud, but good.

2) I love the "mobile actions" tool calls that allow the LLM to turn on the flashlight, open maps, etc. It would be fun if they added Siri Shortcuts support. I want the personal automation that Apple promised but never delivered.

3) I am so excited for local models to be normalized. I build little apps for teachers, and there are stringent privacy laws involved that mean I strongly prefer writing code that runs fully client-side when possible. When I develop apps and websites, I want easy API access to on-device models for free. I know this sort of exists on iOS and Chrome right now, but as far as I'm aware it's not particularly good yet.
▲ buzzerbetrayed | 7 hours ago | parent
For me, the hallucination and gaslighting are like taking a step back in time a couple of years. It even fails the "r's in strawberry" question. How nostalgic. It's very impressive that this can run locally, and I hope we will continue to be able to run couple-year-old-equivalent models locally going forward.
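For context, the "r's in strawberry" test just asks the model to count letters in a word, something trivial in code. A common explanation for why LLMs stumble on it is that they process subword tokens rather than individual characters. A minimal sketch of the ground-truth check:

```python
# Count occurrences of "r" in "strawberry" -- the answer the model should give.
# Plain string operations see characters directly, unlike an LLM's tokenizer.
word = "strawberry"
count = word.count("r")
print(count)  # → 3 (st-r-awbe-r-r-y)
```

Any model that answers something other than 3 here fails the test.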