pradn 2 days ago

Apple needs on-device AI to do chores for me with the apps I have installed. Apple has everything it needs:

* Apps are already logged in, so no extra friction to grant access.

* Apps mostly use Apple-developed UI frameworks, so Apple could turn them into AI-readable representations, instead of raw pixels. In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.

* iPhones already have specialized hardware for AI acceleration.

I want to be able to tell my phone to a) summarize my finances across all the apps I have, b) give me a list of new articles on a certain topic from my magazine/news apps, and c) combine internet search with on-device files to generate personal reports.
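The plumbing for this half-exists already in the App Intents framework; a rough sketch of how a finance app might expose its data to an on-device assistant (the intent name and values below are made up):

    import AppIntents

    // Hypothetical intent: a finance app exposing balances so a
    // system assistant could aggregate them across apps.
    struct AccountBalancesIntent: AppIntent {
        static var title: LocalizedStringResource = "Get Account Balances"

        func perform() async throws -> some IntentResult & ReturnsValue<String> {
            // A real app would read its own local data here.
            return .result(value: "Checking: $1,240.11, Savings: $8,902.55")
        }
    }

An assistant that can call intents like this across every installed app gets you most of (a) without ever scraping pixels.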

All this is possible, but Apple doesn't care to do it. The path not taken is invisible, and no one will criticize them for squandering this opportunity. That's a more subtle drawback of having only two phone operating systems.

darth_avocado 2 days ago | parent | next [-]

> iPhones already have specialized hardware for AI acceleration.

This really is the problem. Why do I spend hundreds of dollars more for specialized hardware that's better than last year's specialized hardware if all the AI features are going to be an API call to ChatGPT? I'm pretty sure I don't need all of that hardware to watch YouTube videos or scroll Instagram/the web, which is what 95% of users do.

glxxyz a day ago | parent [-]

"Do you want me to use ChatGPT to answer that?"

teeray 2 days ago | parent | prev | next [-]

> Apple needs on-device AI to do chores for me with the apps I have installed

Never mind that—iOS just needs to reliably be able to play the song I’m telling it to without complaining “sorry, something went wrong with the connection…”

dpoloncsak a day ago | parent [-]

Honestly, I don't think I've ever had this happen, apart from when I'm in a tunnel on a train without service and streaming.

some_random 2 days ago | parent | prev | next [-]

I agree completely, it's really unfortunate how AI on Apple devices has been going. The message summarization is borderline useless and widely mocked, and their giant billboard ads for it are largely stupid and uncompelling. Let me choose to give it access to my data if I want to do really useful stuff with on-device processing. They've been leaning into the privacy angle, so do the stuff that would be creepy if it left my device: generate push notification reminders for things I forgot to put in the calendar, track my location and tell me I'm going to the wrong airport, suggest birthday gifts for my friends and family, idk.

Edit: And add strong controls to limit what it can and cannot access, especially for the creepy stuff.

pradn 12 hours ago | parent [-]

They're stuck on the privacy angle, because it means they can't call remote services. You'll always have access to more resources in a data center than on a phone. So, while the frontier of what's possible with purely local models will keep advancing, it'll never exceed what's possible with remote models.

People care about extra privacy when the delta in capability is minimal. But people won't accept a massive discrepancy, like the difference between an 8B model and a 700B model.
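Rough back-of-the-envelope numbers (assuming 4-bit quantized weights and ignoring activations and KV cache):

    // Approximate weight memory: parameters * bitsPerWeight / 8 bytes
    func weightMemoryGB(parameters: Double, bitsPerWeight: Double) -> Double {
        parameters * bitsPerWeight / 8 / 1_000_000_000
    }

    weightMemoryGB(parameters: 8e9, bitsPerWeight: 4)    // ~4 GB: borderline on an 8GB-RAM phone
    weightMemoryGB(parameters: 700e9, bitsPerWeight: 4)  // ~350 GB: data-center territory

That gap isn't closing from the phone side.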

_mu 2 days ago | parent | prev | next [-]

I think on-device AI will show up front and center, but not for a few more years.

A big issue to solve is battery life. Right now there's already a lot that goes on at night while the user sleeps with their phone plugged in. This helps to preserve battery life because you can run intensive tasks while hooked up to a power source.

If apps are doing a lot of AI stuff in the course of regular interaction, that could drain the battery fairly quickly.
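To be fair, the scheduling half already exists; a rough sketch of deferring heavy on-device work until the phone is on power, using the BackgroundTasks framework (the task identifier is made up):

    import BackgroundTasks

    // Register the handler at launch; the identifier must also be listed in
    // Info.plist under BGTaskSchedulerPermittedIdentifiers.
    BGTaskScheduler.shared.register(forTaskWithIdentifier: "com.example.nightly-ai", using: nil) { task in
        // Run the battery-hungry inference here, then:
        task.setTaskCompleted(success: true)
    }

    // Ask the system to run it only while charging, fully on-device.
    let request = BGProcessingTaskRequest(identifier: "com.example.nightly-ai")
    request.requiresExternalPower = true
    request.requiresNetworkConnectivity = false
    try? BGTaskScheduler.shared.submit(request)

The hard part is the interactive stuff that can't wait for the charger.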

Amazingly, I think the memory footprint of the phones will also need to get quite a bit larger to really support the big use cases and workflows. (I do feel somewhat crazy that it's already possible to purchase an iPhone with 1TB of storage and 8GB of RAM.)

throwaway81523 2 days ago | parent [-]

2TB microSDXC cards have been available for a year or so, and 1TB cards have been available for several years and are even quite affordable. They work in many Android phones, including my cheap Motorola. So it's Apple's sky-high premiums that have made their 1TB phones surprising.

https://www.bhphotovideo.com/c/product/1868375-REG/sandisk_s... 2TB $185

https://www.bhphotovideo.com/c/product/1692704-REG/sandisk_s... 1TB $90

https://www.bhphotovideo.com/c/product/1712751-REG/sandisk_s... 512GB $40

2 days ago | parent [-]
[deleted]
astrange 2 days ago | parent | prev | next [-]

> In the same way a browser can give the AI the accessibility DOM, Apple could give AIs an easier representation to read and manipulate.

Apps already have such an accessibility tree; it's used for VoiceOver and you can use it to write UI unit tests. (If you haven't tested your own app with VoiceOver, you should.)
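A UI test talks to the app through that same tree rather than through pixels; a minimal example (the identifier below is made up):

    import XCTest

    final class AccessibilityTreeExample: XCTestCase {
        func testReadsElementsNotPixels() {
            let app = XCUIApplication()
            app.launch()

            // Elements are addressed by role + accessibility identifier/label.
            let balance = app.staticTexts["accountBalanceLabel"]
            XCTAssertTrue(balance.waitForExistence(timeout: 5))
            print(balance.label) // the human-readable value exposed to VoiceOver

            app.buttons["Refresh"].tap()
        }
    }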

pradn 12 hours ago | parent [-]

I have used it, actually! It's been years, so the fact that it's an accessibility tree just like in a browser didn't come to mind immediately. Both Mac and Windows have such representations for native apps. The functionality that apps and accessibility clients actually support is something like a two-way negotiation: a lot of stuff that should be supported in apps, in theory, is not, just because no client supports it, etc.

rcxdude 2 days ago | parent | prev | next [-]

This is all possible, but it's an absolutely terrible idea from a security point of view while prompt injection attacks are still a thing, and there's little evidence they will stop being a thing any time soon.

pradn 12 hours ago | parent [-]

We can work toward closing security gaps with new technology, yes. It is necessary for large-scale adoption of LLM tech.

kodefreeze 2 days ago | parent | prev | next [-]

They've been doing some research on this: https://machinelearning.apple.com/research/ferretui-mobile

pradn 12 hours ago | parent [-]

I didn't see this, thank you! They have a follow-up as well:

https://machinelearning.apple.com/research/ferret-ui-2

mgh2 2 days ago | parent | prev | next [-]

Apple is generally averse to market hype. It's a smart PR move to avoid mentioning AI after the Apple Intelligence fiasco, their researchers leaving, and the bubble sentiment at the moment.

pradn 12 hours ago | parent | next [-]

It's not a smart move to avoid integrating the most important capability advance in computing in the past decade: LLMs. They do support small use cases, like summarizing text, but there's scope to do more.

adastra22 2 days ago | parent | prev [-]

You are missing the point. Why was Apple Intelligence a fiasco? Because they failed to understand what users like GP wanted.

mgh2 a day ago | parent [-]

It failed to deliver on its promises; investors sued them for overstating AI capabilities.

IMO, it was the research team's fault; good riddance.

adastra22 a day ago | parent [-]

No, I place this squarely on Apple's shoulders. There are real use cases for new AI tools that are actually useful, use cases Apple is already invested in: Siri, text to speech and vice versa, etc. Many of these have open-source models that Apple could very easily be integrating into its products, even if it didn't have a partnership with the premier AI research lab.

Instead, we got, what? An automated Memoji maker? Holy hell, they dropped the ball on this.

pixxel 2 days ago | parent | prev [-]

[dead]