tart-lemonade 6 days ago

Intel's iGPUs don't seem to be at much risk, because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.

Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had lying around, and given the prominence of halfway decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.

That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that the dGPU business isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.

hakfoo 6 days ago

Aren't a lot of those cards sold to an audience that needs more display heads rather than more performance?

This has improved somewhat: some mainboards have both HDMI and DisplayPort plumbed to the iGPU, but the classic "trader desk" with 4-6 screens hardly needs a 5090.

They could theoretically sell the same 7xx and 1030 chips indefinitely. I figure it's a static market, like those strange 8/16 MB VGA chipsets you sometimes see on server mainboards: just enough hardware to run diagnostics on a normally headless box.

xp84 6 days ago

Agree. Not only would there be no money in trying to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those for whom gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is something like 80%+ of the PC market, but it's already perfectly well served.

I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.

behringer 6 days ago

Intel sells discrete cards, and their next card was set up to do AI and games competently. They were poised to compete with the low- to mid-range Nvidia cards at HALF the cost.

It was definitely going to upset the market. Now I understand the radio silence on a card that was supposed to be coming by Xmas.

xp84 5 days ago

Oh for sure. Arc is in jeopardy. Though tbh it was already, wasn't it? Can't you see an alternate universe where this story never happened, and Intel instead announced today, "Sorry, because our business is dying in general, and since Arc hasn't made us a ton of money yet anyway, we need to cut Arc to focus on our core blah blah blah"?

I just meant their integrated GPUs are what's completely safe here.

behringer 5 days ago

I doubt it's safe; it competes directly with Nvidia on handhelds.

Also, Arc wasn't in jeopardy; the Arc cards have been improving with every release, and the latest one got pretty rave reviews.

xp84 5 days ago

It wasn't in jeopardy for being no good; it was in jeopardy because Intel is so troubled. Like the Bombardier C-Series jet: everyone agreed it was a great design and very promising, but in the end Bombardier had no choice but to sell it to Airbus (which calls it the A220), I think because they didn't really have the money to scale up production. Likewise, Intel lacks the resources to make Arc the success it technically deserves to be, and without enough scale they'll lose money on Arc, which Intel can hardly afford at this point.

kmacdough 6 days ago

Calling BS on "gaming not part of the equation". Several of my friends and I game exclusively on integrated graphics. Sure we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.

utternerd 6 days ago

RDR2 is quite optimized. We spent a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is highly optimized, as exhibited by the large number of benchmarks on the web.

purpleflame1257 6 days ago

This is why I love HN. You get devs from any software or hardware project you care to name showing up in the comments.

uncircle 6 days ago

RDR2 ran beautifully on Linux for me. If you were part of the team, excellent work.

xp84 6 days ago

Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.

(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)

og_kalu 6 days ago

RDR2, Ghost of Tsushima, Black Myth: Wukong. These games will play at 40-50+ fps at 1080p on low to medium settings on the Intel Arc iGPUs (no AI upscaling).

To anyone actually paying attention, iGPUs have come a long way. They are no longer an 'I can play Minecraft' thing.

xp84 5 days ago

That performance is not surprising; Arc seems pretty dope in general.

I hadn't realized that "Arc" and "integrated" overlapped; I thought that brand and that level of power were only being used on discrete cards.

I do think integrated Arc will probably be killed by this deal, though: not for being bad, since it's obviously great, but for being a way for Intel to cut costs with no downside for Intel. If they can make RTX iGPUs now, and the Nvidia/RTX brand is the strongest in the gaming space, Intel isn't going to keep investing in developing Arc. Even if Nvidia made it clear they don't care, it just doesn't make any business sense now.

That is a loss for the cause of gaming competition. Although having Nvidia prop up Intel may prove to be a win for competition in silicon generally, versus Intel being sold off in parts, which seems like a real possibility.

fluoridation 6 days ago

"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.

KronisLV 6 days ago

> Sure we don't play the most abusively unoptimized AAA games like RDR2.

Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but I couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). And given the Borderlands 4 controversy, I'd have thought there were way bigger fish to fry.

jandrese 6 days ago

It would be amusing to see Nvidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.

TiredOfLife 6 days ago

Out of 9 desktop GT 7xx cards, only 2 were Fermi; the rest were Kepler.

Out of 12 mobile GT 7xx cards, only 3 were Fermi (and 2 of those were M, not GT); the rest were Kepler.