bigyabai 3 days ago
A lot of good that's done them. The Neural Engine is dark silicon on most devices I've seen, and now we're getting another product segment with the M5's matmul GPUs. To me, it feels like Apple should have supported CUDA from the start: sell the ARM-hungry datacenter some rackmount Macs with properly fast GPUs, and Apple can eventually bring the successful inference technology down to cheaper devices. Apple's current all-or-nothing strategy has produced nothing but redundant hardware accelerators, while Nvidia's vertical integration only gets stronger.
robotresearcher 3 days ago
> The Neural Engine is dark silicon on most devices I've seen

At the very least it's used by the Photos app [1]. Likely other Apple apps too.

[1] https://machinelearning.apple.com/research/recognizing-peopl...
rickdeckard 3 days ago
Maybe. But Apple tried the server business and found that they couldn't compete there. Not because of engineering deficiencies, but because datacenters buy based on facts, not fluff.

Their ARM silicon is top-notch now, no doubt about that. But will they earn a higher margin by putting it in a datacenter instead of in a consumer device that is then used to consume Apple Services? I don't think so.