jonbiggums22 7 days ago

Way back in the mid-to-late 2000s, Intel CPUs could be used with third-party chipsets not manufactured by Intel. This had been going on forever, but the space was particularly wild, with Nvidia being the most popular chipset manufacturer for AMD and also making inroads for Intel CPUs. It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, Nvidia promptly bought the company and spun down operations.

This was all for naught: AMD purchased ATi, shutting out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at this point. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform that could operate over an anemic PCIe x1 link. These really were the beginnings of strategic moves to lock Nvidia out of its own market.

Fortunately, Nvidia won a lawsuit against Intel that required Intel to provide PCIe x16 connectivity on its main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space, so the APU takeoff never happened. If Intel had actually developed its integrated GPUs or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.

To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy and cooling solutions are getting crazy.

But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both of those spaces, so I'm not super surprised at the desire for this pairing. But I did think Nvidia had given up these ambitions and was planning to build an adjacent ARM-based platform as a potential escape hatch.

to11mtm 6 days ago | parent | next [-]

> It was an important enough market than when ALi introduced AMD chipsets that were better than Nvidia's they promptly bought the company and spun down operations.

This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].

> Intel started releasing platforms with very little PCIe connectivity,

This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, and 865GV, which all lacked AGP. [2]

[0] - Aladdin V with its AGP instabilities; MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e. two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings. 3 rows usually was 'ok-ish' and 2 was stable.)

[1] - SiS 730 and 735 were great chipsets for the money and TBH the closest to the AMD 760 for stability.

[2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a Geforce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.

[3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon', would specifically call it an AMD Radon... and now here we are. Oddly prescient.

hakfoo 6 days ago | parent | next [-]

I'm thinking the era of "great ALi chipsets" came more after they became ULi, in the Athlon 64 era.

I had a ULi M1695 board (ASRock 939SLI32-eSATA2), and it was unusual for the era in that it was a $90 motherboard with two full x16 slots. Even most of the nForce boards at the time had it set up as x8/x8. For like 10 minutes you could run SLI with it, until Nvidia deliberately crippled the GeForce drivers to not permit it, but I was using it with a pretty unambitious (but fanless; remember fanless GPUs?) 7600GS.

They also did another chipset pairing that offered a PCIe x16 slot and a fairly compatible AGP-ish slot for people who had bought an expensive graphics card (which then meant $300 for a 256MB card) and wanted to carry it over. There were a few other boards using other chipsets (maybe VIA) that tried to glue together something like that, but the support was much more hit-or-miss.

OTOH, I did have an Aladdin IV ("TXpro") board back in the day, and it was nice because it supported 83MHz bus speeds when a "better" Intel TX board wouldn't. A K6-233 overclocked to 250MHz (3x83) was detectably faster than at 262MHz (3.5x75).

RulerOf 5 days ago | parent | prev | next [-]

> most people who called it a 'Radon', would specifically call it an AMD Radon

If I had a dollar for the number of times I heard an IT professional say "Intel Xenon" I'd probably match your down payment.

jonbiggums22 6 days ago | parent | prev [-]

ALi was indeed pretty much on the avoid list for me for most of their history. It was only when they came out with the ULi M1695, made famous by the ASRock 939Dual-SATA2, that they were a contender for best out of nowhere. One of the coolest boards I ever owned, and it was rock solid for me even with all of the weird configs I ran on it. I kind of wish I hadn't sold it, even today!

I remember a lot of disappointed people on forums who couldn't upgrade their cheap PCs as well, but there were still motherboards available with AGP to slot Intel's best products into. Intel couldn't just remove it from the landscape altogether (assuming they wanted to) because they weren't the only company making chipsets that supported Intel CPUs. IIRC Intel/AMD/Nvidia were not interested in making chipsets with both AGP and PCIe at all, but VIA/ALi and maybe SiS made them instead, because it was still a free-for-all space. Once that went away, Nvidia couldn't control their own destiny.

ninetyninenine 7 days ago | parent | prev | next [-]

Nvidia does build SoCs already: the AGX line and other offerings. I'm curious why they want Intel despite having the technical capability to build SoCs.

I realize the AGX is more of a low-power solution, and it's possible that Nvidia is still technically limited when building SoCs, but this is just speculation.

Does anybody know the actual ground-truth reasoning why Nvidia is buying Intel despite the fact that Nvidia can make its own SoCs?

KeplerBoy 6 days ago | parent [-]

Why is Nvidia partnering with MediaTek for CPU cores in the DGX Spark? Different question, probably the same answer.

whatevaa 7 days ago | parent | prev [-]

Nvidia just doesn't care about the console and handheld markets. They are unwilling to make customisations, and it's a low-margin business.

buildbot 7 days ago | parent [-]

? https://blogs.nvidia.com/blog/nintendo-switch-2-leveled-up-w...

dontlaugh 7 days ago | parent [-]

The point stands: they're not willing to make something that could go in a ROG Ally, for example.

sniffers 7 days ago | parent | next [-]

The ROG Ally probably won't sell a million units. The Switch will sell 100 million. The Switch is the mobile market, like it or not.

numpad0 6 days ago | parent | prev | next [-]

Sometimes HN users appear to have absolutely zero sense of scale. Lifetime sales numbers of those devices amount to the equivalent of hours to days of Switch 2 sales.

bigyabai 7 days ago | parent | prev | next [-]

You can take a Nintendo Switch 1, hack open the bootloader, and install Linux with Vulkan-compatible drivers.

Make no mistake: there is no reason to do this besides shortening the hardware lifespan with Box86. But it is possible, most certainly.

Yokolos 7 days ago | parent | prev [-]

You mean like this? https://www.rockpapershotgun.com/msi-claw-8-ai-plus-review