alwillis 10 hours ago

> Private cloud is used AFAIK for virtually 0 use cases so far.

Applications using Apple's foundation models can seamlessly switch from on-device models to Private Cloud Compute.
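
A minimal sketch of what that looks like from an app's side, assuming the Foundation Models framework API shape Apple announced at WWDC 2025 (`LanguageModelSession`, `respond(to:)`; names and signatures are approximate): the app just asks the session for a response, and where the request runs, on device or in Private Cloud Compute, is handled by the system rather than chosen by the app.

    import FoundationModels

    // Sketch only: assumes the WWDC 2025 Foundation Models API surface;
    // exact names may differ by SDK version.
    func summarize(_ text: String) async throws -> String {
        // The session fronts the system-managed model; the app never picks
        // on-device vs. Private Cloud Compute execution itself.
        let session = LanguageModelSession(
            instructions: "Summarize the user's text in two sentences."
        )
        let response = try await session.respond(to: text)
        return response.content
    }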

Research is already showing people using LLMs for their most intimate relationship and medical issues. The usual suspects will try to monetize that, which is why Private Cloud Compute has been a thing from the jump.

> Then they've got OpenAI/Gemini/Anthropic via API. But this completely goes against all their private cloud messaging

When you use ChatGPT via Siri today, no personally identifying information is shared with OpenAI, and those prompts aren't used for training. I suspect Apple will want something similar for Google, Anthropic, etc.

At some point, there will be the inevitable enshittification of AI platforms to recoup the billions VCs have invested, which means ads; that won't happen to Apple users using foundation-model-based apps.

> Nearly all their devices do not have enough RAM and

Every Apple Silicon Mac (going back to the M1 in 2020) can run Apple Intelligence; 8 GB of RAM is all it needs. The iPhone 15 Pro and Pro Max and the entire iPhone 16 line can run it as well.

Flagship iPhone 17 models are expected to ship with 12 GB of RAM, and all current Mac models come with at least 16 GB.
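
From an app's point of view, the hardware list matters less than a runtime check. Here's a hedged sketch assuming the `SystemLanguageModel` availability API from the same Foundation Models framework (names approximate), which reports whether the on-device model can be used on a given device and, if not, why.

    import FoundationModels

    // Sketch only: assumes SystemLanguageModel.default.availability exists
    // with .available / .unavailable(reason) cases.
    func appleIntelligenceStatus() -> String {
        switch SystemLanguageModel.default.availability {
        case .available:
            return "On-device model is ready to use."
        case .unavailable(let reason):
            // Covers unsupported hardware, Apple Intelligence turned off,
            // or model assets still downloading.
            return "Model unavailable: \(reason)"
        }
    }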

Apple sells over 200 million iPhones in a given year.

There's no doubt Apple stumbled out of the gate regarding AI; these are early days. They can't be counted out.