chao- 2 days ago

It is very hard to put any stock in the rumor mill surrounding Intel's discrete desktop GPUs. Already this year, there have been at least three "leaks" saying "It's canceled!", and every time a counter-rumor comes out saying "It isn't canceled!"

By all accounts I have seen, the single SKU from their second-generation consumer lineup has been well received. Yet the article speaks of "what can only be categorized as a shaky and often rudderless business", without any justification.

Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?

belval 2 days ago | parent | next [-]

Honestly, the rumor mill surrounding Intel is very similar to AMD circa 2015-2016, pre-Zen (not saying they will see the same outcome). I swear I saw the same "the x86 license is not transferable, [other company] might sue them" nine years ago, along with "Product Y will be discontinued".

When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units in a segment where Nvidia faces no real competition, and are not even shipping in large volume. If Nvidia actually asked them to kill it, it would be 100% out of pettiness.

Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.

chao- 2 days ago | parent [-]

From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.

Intel has so many other GPU-adjacent products, and they will doubtless continue most of them even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not shipping a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack scale" datacenter "GPUs").

With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.

I want to acknowledge that I am speaking out of my depth a bit: I have not read all of Intel's quarterly financials, and I have not followed every zig and zag of every product line. Yet while I can see it both ways, in no world do I trust these supposed leaks.

AnthonyMouse 2 days ago | parent | next [-]

GPUs are parallel compute engines. The difference between a high-performance CPU core design from Intel/AMD/Apple and the low-end stuff is a bunch of fancy branch prediction, out-of-order execution, cache hierarchies, etc., all designed to improve single-thread performance. The primary difference between a large GPU and a small GPU is that a large GPU has more cores.
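To make that concrete, here is a minimal CUDA sketch (my own illustration, not anything Intel-specific): the kernel code is identical whether the device has 8 SMs or 120; the grid is sized from the data, and the hardware scheduler simply spreads the blocks across however many SMs exist.

    // saxpy.cu -- the same kernel runs unchanged on a small or a large GPU;
    // a bigger part just has more SMs for the scheduler to spread blocks over.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];  // one element per thread
    }

    int main() {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);
        printf("%s: %d SMs\n", prop.name, prop.multiProcessorCount);

        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Grid size comes from the data, not from the GPU model.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        saxpy<<<blocks, threads>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

The scaling knobs (clocks, memory bandwidth, number of SMs) are hardware decisions; the software and the core microarchitecture work largely carry over.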

Sufficiently far in the past you might have been able to get away with an integrated GPU that didn't even have meaningful 3D acceleration, but those days are gone. Even web browsers lean on the GPU to render content, which for iGPUs makes battery life, and therefore performance per watt, the name of the game. And that is the same thing that matters most for large GPUs, because the constraint on performance is power and thermals.

Which is to say, if you're already doing the work to make a competitive iGPU, you've done most of the work to make a competitive discrete GPU.

The thing Intel really needs to do is get the power efficiency of their own process on par with TSMC's. Without that they're dead; they can't even fab their own GPUs.

belval 2 days ago | parent | prev [-]

> From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.

Me too, I just really, really doubt that it would come from Nvidia.

cubefox 2 days ago | parent | prev [-]

Yeah, that is bizarre. They have been very focused, and even managed to upstage AMD by several years in the ML acceleration department (XeSS).