rootusrootus 11 hours ago:
I'd hope it could be the other way around. Some stuff should be relatively straightforward -- summarizing notifications and emails, setting timers, things like that are obviously on-device territory. Beyond that, I would hope the on-device AI can determine whether it needs to go to a datacenter AI for a better answer. But you may be right: maybe on-device won't be smart enough to decide it isn't smart enough. Though the local LLMs do seem to have gotten awfully good.
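A minimal sketch of the routing idea described above, assuming a hypothetical on-device model that reports its own confidence and escalates to a hosted model below a threshold. All names here (run_local_model, run_datacenter_model, the confidence field) are illustrative placeholders, not any real Apple or LLM API:

```python
from dataclasses import dataclass


@dataclass
class LocalResult:
    text: str
    confidence: float  # the local model's own estimate, 0.0-1.0 (assumed capability)


def run_local_model(prompt: str) -> LocalResult:
    """Placeholder for an on-device LLM call."""
    # Trivial stand-in heuristic: simple requests get high confidence.
    simple = any(kw in prompt.lower() for kw in ("timer", "summarize", "remind"))
    return LocalResult(text=f"[local answer to: {prompt}]",
                       confidence=0.9 if simple else 0.4)


def run_datacenter_model(prompt: str) -> str:
    """Placeholder for a round trip to a larger hosted model."""
    return f"[datacenter answer to: {prompt}]"


def answer(prompt: str, threshold: float = 0.7) -> str:
    """Try on-device first; escalate only if the local model
    judges itself unlikely to answer well."""
    local = run_local_model(prompt)
    if local.confidence >= threshold:
        return local.text
    return run_datacenter_model(prompt)


if __name__ == "__main__":
    print(answer("Set a timer for 10 minutes"))     # stays on-device
    print(answer("Plan a two-week trip to Japan"))  # escalates to the datacenter
```

The open question in the thread is exactly the threshold step: whether a small local model can estimate its own confidence well enough for that check to be trustworthy.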
layer8 10 hours ago (reply):
I can see them going that route, but it would cause breaks in the flow similar to current Siri offering to delegate to ChatGPT, or to on-device Siri deciding it can handle a task and then failing or doing it wrong. It certainly wouldn't be an "it just works" experience.