shagie 7 hours ago

It was driven by privacy and on device compute.

Anything you ask an Android device or an Alexa device to do goes to their clouds to be 100% processed there.

Apple tried to make a small and focused interface that could do a limited set of things on device, without going to the cloud to do them.

This was built around the idea of "Intents," and it only handled the standard intents... app developers were supposed to register for and hook into them.

https://developer.apple.com/documentation/intents
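
To make that concrete, here's a rough sketch of what hooking into one of the standard intents looks like in a SiriKit app extension. The handler class name and activity type are made up for illustration; this isn't any shipping app's code, just the general shape of it:

    import Intents

    // Rough sketch of a SiriKit handler for the standard "start workout"
    // intent - the kind of thing an app extension registers so that
    // "Hey Siri, start my run" can be resolved on device.
    // Class name and activity type are illustrative.
    class StartWorkoutHandler: NSObject, INStartWorkoutIntentHandling {

        func resolveWorkoutName(for intent: INStartWorkoutIntent,
                                with completion: @escaping (INSpeakableStringResolutionResult) -> Void) {
            // Accept whatever name Siri heard; a real app would match it
            // against its own list of workouts.
            if let name = intent.workoutName {
                completion(.success(with: name))
            } else {
                completion(.needsValue())
            }
        }

        func handle(intent: INStartWorkoutIntent,
                    completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
            // Hand off to the app itself to actually start the workout.
            let activity = NSUserActivity(activityType: "com.example.start-workout")
            completion(INStartWorkoutIntentResponse(code: .continueInApp,
                                                    userActivity: activity))
        }
    }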

Some of the intents never really got fleshed out, and some prompt an "oh, that's in there?" reaction (restaurant reservations? ride booking?). The whole set feels more like the half-baked MySQL interfaces in PHP.

However, as part of the privacy story, you can create (and dictate) a note with Siri without a data connection. Your "start workout" command doesn't leave your device.

Part of that is privacy. Part of that is that Apple was trying to minimize its cloud spend (on GCP or AWS) by keeping as much of that activity on device as possible. It wasn't entirely on device, but a lot more of it was than on Android... and Alexa is a speaker and a microphone hooked up to AWS.

This was OK (kind of meh, but OK) pre-ChatGPT. With ChatGPT, expectations changed, and the architecture Apple had built couldn't pivot to meet those expectations.

https://en.wikipedia.org/wiki/Apple_Intelligence

> Apple first implemented artificial intelligence features in its products with the release of Siri in the iPhone 4S in 2011.

> ...

> The rapid development of generative artificial intelligence and the release of ChatGPT in late 2022 reportedly blindsided Apple executives and forced the company to refocus its efforts on AI.

ChatGPT was as much a blindside to Apple as the iPhone was to BlackBerry.

npunt 6 hours ago

I think all of these are true:

1. Apple is big enough that it needs to take care of edge cases like offline use and limited cell reception, which affect millions of users at any given moment.

2. Launching a major UI feature (Siri) that people will come to rely on requires offline support for common operations like basic device control and dictation. Major UI features shouldn't stop working when users enter bad reception zones.

3. Apple builds devices with great CPUs, which allows it to pursue a strategy of using edge compute to reduce cloud spend.

4. A consequence of building products with good offline support is that they are more private.

5. Apple didn't even build a full set of intents for most of its apps, hence 'remind me at this location' doesn't work. App developers haven't either, because ...

6. Siri (both the local version and the remote service) isn't very good; it regularly misunderstands or fails at basic comprehension tasks that don't even require user data to be understood or relayed back to the device to execute.

I don't buy that privacy is somehow an impediment to #5 or #6. It's only an issue when user data is involved, and Apple has been investing in technologies like differential privacy to work around those limitations to some extent. But that is further downstream from #5 and #6.
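
(As a toy illustration of the differential privacy idea: add noise, calibrated to a privacy budget, to a value before it leaves the device, so no individual contribution can be pinned down in the aggregate. The epsilon value and the "workout minutes" query below are made up; this is just the textbook Laplace mechanism, not Apple's actual implementation.)

    import Foundation

    // Inverse-CDF sampling of a Laplace(0, scale) distribution.
    func laplaceNoise(scale: Double) -> Double {
        let u = Double.random(in: -0.5...0.5)
        let magnitude = max(Double.ulpOfOne, 1 - 2 * abs(u))  // avoid log(0)
        return (u < 0 ? scale : -scale) * log(magnitude)
    }

    // Add noise scaled to sensitivity/epsilon before a count is shared.
    // Epsilon here is an arbitrary illustrative privacy budget.
    func privatized(count trueCount: Double,
                    sensitivity: Double = 1,
                    epsilon: Double = 0.5) -> Double {
        return trueCount + laplaceNoise(scale: sensitivity / epsilon)
    }

    let reported = privatized(count: 42)   // e.g. minutes of workouts today
    print(reported)                        // noisy value, safer to aggregate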

bombcar 2 hours ago

Dragon NaturallySpeaking was way more accurate than Siri is now, and it ran on-device on ancient computers.

I don't care if I have to carefully say "bibbidy bobbity boo, set an alarm for two" - I just need it to be reliable.