newswasboring 6 days ago
My guess would be local AI. Apple Silicon is uniquely suited to it thanks to its unified memory.
theptip 6 days ago | parent
Yeah exactly. The MacBook Pro is by far the most capable consumer device for local LLM inference. A beefed-up NPU could provide a big edge here. More speculatively, Apple is also one of the few companies positioned to market an ASIC for a specific transformer architecture, which they could use for their Siri replacement. (Google has on-device inference too, but their business model depends on them not being privacy-focused, and their go-to-market with Android precludes the tight coordination between OS and hardware that would be required to push SOTA models into hardware.)
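The unified-memory point can be made concrete with a back-of-envelope calculation: a discrete consumer GPU typically tops out at 16-24 GB of VRAM, while a high-end MacBook Pro can give the GPU access to 128 GB of unified memory. A minimal sketch of the weights-only footprint (the model sizes and quantization widths below are illustrative assumptions, not figures from the thread):

```python
# Rough memory needed just to hold an LLM's weights in memory
# (ignores KV cache, activations, and runtime overhead).

def weight_memory_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate resident size of the weights alone, in GiB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 2**30

for params in (8, 70):  # hypothetical 8B and 70B parameter models
    for bits in (16, 4):  # full-precision vs. 4-bit quantized
        gib = weight_memory_gib(params, bits)
        print(f"{params}B model @ {bits}-bit: ~{gib:.0f} GiB")
```

Even 4-bit quantized, a 70B model needs on the order of 33 GiB for weights alone, which is out of reach for most discrete consumer GPUs but comfortable on a 64-128 GB unified-memory Mac.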