bigyabai 5 days ago
It's not a fantasy-land idea at all. Apple has already tried penetrating the datacenter before; they've proven they can ship a product to market if they want to. They just don't. They don't want to support Nvidia drivers, complex GPGPU primitives, non-raster GPU architectures, or cross-platform acceleration libraries. Which is frankly a braindead decision from an opportunity-cost perspective: if your consumers don't care, why not deliver what developers want? Apple can have their cake and eat it too here. If they had reacted fast enough (eg. ~2018), they could have had a CUDA competitor in time for the crypto and AI craze - that's a fact! But they missed out on those markets because they had their blinders on, dead set on the consumer market they're losing control over. It's starting to verge on pathetic how gimped the Mac is as a "real computer" product.