▲ Someone1234 7 hours ago
Apple's AI strategy threads the needle cleverly. "AI" (LLMs) may or may not have a bubble-pop moment, but until it does, Apple gets to ride the wave with these press releases and claims. And if the pop does occur, Apple ends up with really fantastic hardware that just happens to be good at AI workloads (as well as general computing). For example, image classification (e.g. face recognition/photo tagging), ASR + vocoders, image enhancement, and OCR were all popular before the current boom and will likely remain popular after it. Even if LLM usage dries up or falls out of vogue, this hardware still offers a significant user benefit.
▲ lamontcg 4 hours ago | parent | next [-]
LLM usage is not very likely to "dry up". What is more likely is that building out models stops requiring tens of billions of dollars in datacenters and capital, and performance against LLM benchmarks starts to plateau to the point where throwing more capital at it no longer makes enough of a difference to matter. Once the cost shrinks below $1B, Apple could start building its own models with the $139B in cash and marketable securities it has on hand, while everyone else has burned through $100B trying to be first. Of course, the problem with this strategy right now is that Siri really, really sucks. Apple does need to come up with some product improvements now so that it doesn't get completely lapped.
▲ ChrisGreenHeur 7 hours ago | parent | prev [-]
Those things could likely just run fine on the GPU, though.