scrlk 7 days ago

> For personal computing, Intel will build and offer to the market x86 system-on-chips (SOCs) that integrate NVIDIA RTX GPU chiplets. These new x86 RTX SOCs will power a wide range of PCs that demand integration of world-class CPUs and GPUs.

https://www.intc.com/news-events/press-releases/detail/1750/...

What’s old is new again: back in 2017, Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped: https://www.tomshardware.com/news/intel-discontinue-kaby-lak...

phkahler 7 days ago | parent | next [-]

I don't think this is Intel trying to save itself; it's nVidia. Intel GPUs have been in 3rd place for a long time, but their integrated graphics are widely available and come in 2nd place because nVidia can't compete in the x86 space. Intel graphics have been closing the gap with AMD and are now within what, a factor of 2 or less (1.5?).

IMHO we will soon see more small/quiet PCs without a slot for a graphics card, relying on integrated graphics. nVidia has no place in that future. But now, by dropping $5B on Intel they can get into some of these SoCs and not become irrelevant.

The nice thing for Intel is that they might be able to claim graphics superiority in SoC land since they are currently lagging in CPU.

jonbiggums22 7 days ago | parent | next [-]

Way back in the mid-to-late 2000s, Intel CPUs could be used with third-party chipsets not manufactured by Intel. This had been going on forever, but the space was particularly wild then, with Nvidia being the most popular chipset manufacturer for AMD and also making inroads on Intel CPUs. It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.

This was all for naught: AMD purchased ATi and shut out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at that point in time. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform, which could operate over an anemic PCIe x1 link. These really were the beginnings of strategic moves to lock Nvidia out of its own market.

Fortunately, Nvidia won a lawsuit against Intel that required Intel to provide PCIe x16 connectivity on its main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space, such that the APU takeoff never happened. If Intel had actually developed its integrated GPUs or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.

To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy and cooling solutions are getting crazy.

But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both of those spaces, so I'm not super surprised at the desire for this pairing. But I did think Nvidia had given up these ambitions and was planning to build an adjacent ARM-based platform as a potential escape hatch.

to11mtm 6 days ago | parent | next [-]

> It was an important enough market than when ALi introduced AMD chipsets that were better than Nvidia's they promptly bought the company and spun down operations.

This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].

> Intel started releasing platforms with very little PCIe connectivity,

This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, and 865GV, which all lacked AGP. [2]

[0] - The Aladdin V with its AGP instabilities; the MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e., two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings; 3 rows usually were 'ok-ish' and 2 were stable).

[1] - The SiS 730 and 735 were great chipsets for the money, and TBH the closest to the AMD 760 for stability.

[2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a GeForce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.

[3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon' would specifically call it an AMD Radon... and now here we are. Oddly prescient.

hakfoo 6 days ago | parent | next [-]

I'm thinking the era of "great ALi chipsets" came after they became ULi, in the Athlon 64 era.

I had a ULi M1695 board (the ASRock 939SLI32-eSATA2), and it was unusual for the era in that it was a $90 motherboard with two full x16 slots; even most of the nForce boards at the time had it set up as x8/x8. For like 10 minutes you could run SLI with it, until nVidia deliberately crippled the GeForce drivers to not permit it, but I was using it with a pretty unambitious (but fanless -- remember fanless GPUs?) 7600GS.

They also did another chipset pairing that offered a PCIe x16 slot and a fairly compatible AGP-ish slot for people who had bought an expensive graphics card (which then meant $300 for a 256MB card) and wanted to carry it over. There were a few other boards using other chipsets (maybe VIA) that tried to glue together something like that, but the support was much more hit-or-miss.

OTOH, I did have an Aladdin IV ("TXpro") board back in the day, and it was nice because it supported 83MHz bus speeds when a "better" Intel TX board wouldn't. A K6-233 overclocked to 250 (3x83) was detectably faster than at 262 (3.5x75).
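
To spell out the arithmetic (a back-of-envelope sketch; the bus-speed explanation in the comments is my assumed mechanism for why the slower core won, not something benchmarked here):

    # Core clock = multiplier x front-side bus (FSB)
    for label, mult, fsb in [("3.0 x 83", 3.0, 83.0), ("3.5 x 75", 3.5, 75.0)]:
        print(f"{label} MHz -> core ~{mult * fsb:.0f} MHz, bus {fsb:.0f} MHz")
    # 3.5 x 75 wins on core clock (~262 vs ~249 MHz, about 5%), but
    # 3.0 x 83 runs the bus ~11% faster -- and on Socket 7 boards the
    # motherboard L2 cache and memory are clocked at bus speed, so
    # memory-bound work can more than recover the core-clock deficit.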

RulerOf 5 days ago | parent | prev | next [-]

> most people who called it a 'Radon', would specifically call it an AMD Radon

If I had a dollar for the number of times I heard an IT professional say "Intel Xenon" I'd probably match your down payment.

jonbiggums22 6 days ago | parent | prev [-]

ALi was indeed pretty much on the avoid list for me for most of their history. It was only when they came out with the ULi M1695, made famous by the ASRock 939Dual-SATA2, that they were a contender for best out of nowhere. One of the coolest boards I ever owned, and it was rock solid for me even with all of the weird configs I ran on it. I kind of wish I hadn't sold it, even today!

I remember a lot of disappointed people on forums who couldn't upgrade their cheap PCs as well, but there were still motherboards available with AGP to slot Intel's best products into. Intel couldn't just remove AGP from the landscape altogether (assuming they wanted to), because they weren't the only company making chipsets that supported Intel. IIRC Intel/AMD/Nvidia weren't interested in making chipsets supporting both AGP and PCIe at all, but VIA/ALi and maybe SiS made them instead, because it was still a free-for-all space. Once that went away, Nvidia couldn't control their own destiny.

ninetyninenine 7 days ago | parent | prev | next [-]

Nvidia does build SOCs already: the AGX line and other offerings. I'm curious why they want Intel despite having the technical capability to build SOCs.

I realize the AGX is more of a low-power solution, and it's possible that Nvidia is still technically limited when building SOCs, but this is just speculation.

Does anybody know the actual ground-truth reasoning for why Nvidia is investing in Intel despite the fact that Nvidia can make its own SOCs?

KeplerBoy 6 days ago | parent [-]

Why is Nvidia partnering with MediaTek for CPU cores in the DGX Spark? Different question, probably the same answer.

whatevaa 7 days ago | parent | prev [-]

Nvidia just doesn't care about the console and handheld markets. They are unwilling to make customisations, and it's a low-margin business.

buildbot 7 days ago | parent [-]

? https://blogs.nvidia.com/blog/nintendo-switch-2-leveled-up-w...

dontlaugh 7 days ago | parent [-]

The point stands: they're not willing to make something that could go in an ROG Ally, for example.

sniffers 7 days ago | parent | next [-]

The ROG Ally probably won't sell a million units. The Switch will sell 100 million. The Switch is the mobile market, like it or not.

numpad0 6 days ago | parent | prev | next [-]

Sometimes HN users appear to have absolutely zero sense of scale. Lifetime sales of those devices are equivalent to somewhere between hours and days of Switch 2 sales.
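
Rough numbers, for a sense of it (assuming the ~3.5M-units-in-four-days figure Nintendo announced for the Switch 2 launch, and the ~1M lifetime guess for a PC handheld from upthread):

    # Days of launch-pace Switch 2 sales equal to a PC handheld's lifetime
    switch2_per_day = 3_500_000 / 4      # assumed launch-window rate
    handheld_lifetime = 1_000_000        # assumed lifetime sales
    print(f"~{handheld_lifetime / switch2_per_day:.1f} days")  # ~1.1 days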

bigyabai 7 days ago | parent | prev | next [-]

You can take a Nintendo Switch 1, hack open the bootloader, and install Linux with Vulkan-compatible drivers.

Make no mistake - there is no reason to do this besides shortening the hardware lifespan with Box86. But it is possible, most certainly.

Yokolos 7 days ago | parent | prev [-]

You mean like this? https://www.rockpapershotgun.com/msi-claw-8-ai-plus-review

wirybeige 7 days ago | parent | prev | next [-]

Xe2 is already superior to current AMD integrated graphics.

yujzgzc 7 days ago | parent | next [-]

I think the comparison was between Nvidia standalone graphics chips and Intel integrated graphics capabilities.

mrheosuper 6 days ago | parent | prev | next [-]

> Intel graphics have been closing the gap with AMD and are now within what? A factor of 2 or less (1.5?)

Apart from that one APU from AMD (the 395+), Intel iGPUs are on par with AMD's right now.

The 395+ is more like a dGPU and a CPU on the same die.

SilverbeardUnix 7 days ago | parent | prev | next [-]

Intel hasn't been making desktop GPUs for long. Your timescale is off compared to how long AMD and Nvidia have had to polish their GPUs.

edm0nd 6 days ago | parent | prev [-]

I'm curious, why do you type it as nVidia instead of NVIDIA or Nvidia?

joz1-k 7 days ago | parent | prev | next [-]

RIP Arc and Gaudi. There's no other way to read this. Fewer competitors => higher prices.

jonbiggums22 7 days ago | parent | next [-]

I think it's bad news for the GPU market (AMD has had a beachhead here with their integrated solutions as they've lost out elsewhere), but good for x86, which I'd worried would be greatly diminished as Intel became less competitive.

numpad0 6 days ago | parent | prev | next [-]

I just realized there's a worse possibility: they might offer these SOCs as the successor to the xx50/xx60 RTX GPUs, dropping CUDA support at the low end.

philistine 7 days ago | parent | prev [-]

Absolutely. This is terrible news for high-emission gamers, who have been living under the boot of Nvidia for decades.

ddalex 7 days ago | parent | prev | next [-]

That was targeted at supporting more tightly integrated and performant MacBooks... it flopped because Apple came up with the M1, not because it was bad per se.

JonChesterfield 7 days ago | parent | next [-]

The Ryzen APUs had a rocky start but are properly good now; the concept is sound.

intvocoder 7 days ago | parent | prev [-]

Apple never shipped a product with that, but it made for an excellent Hackintosh.

ddalex 6 days ago | parent [-]

Exactly

linuxftw 7 days ago | parent | prev | next [-]

To me, this just validates what AMD has been doing for over a decade. Integrated GPUs for personal computing are the way forward.

mrheosuper 6 days ago | parent | prev | next [-]

>They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped.

Not sure it flopped; the only machine with that CPU I could find is the Intel NUC.

22c 6 days ago | parent | prev | next [-]

> What’s old is new again

Let's go back even further... I get strong nForce vibes from that extract!

newsclues 7 days ago | parent | prev | next [-]

Stick some CUDA cores on the CPU and market it for AI?

herodoturtle 7 days ago | parent | prev [-]

> Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped

/me picturing Khaby Lame gesturing his hands at an obvious workaround.