evanjrowley 7 days ago

Nvidia's stake in Intel could have terrible consequences. First, it is in Nvidia's interest to kill Intel's Arc graphics, and that would be very bad because Arc is the only thing bringing GPU prices down for consumers. Second, the death of Intel graphics / Arc would be extremely bad for Linux, because Intel's approach to GPU drivers is the best for compatibility, whereas Nvidia is actively hostile to drivers on Linux. Third, Intel is the only company marketing consumer-grade graphics virtualization (SR-IOV), and the loss of that would make Nvidia's enterprise chips the only game in town, meaning the average consumer gets less performance, less flexibility, and less security on their computers.
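
For readers unfamiliar with SR-IOV: it lets a single physical GPU expose multiple PCI virtual functions, each of which can be handed to a VM as its own graphics device. Below is a minimal sketch of how such virtual functions are typically enabled through Linux's generic PCI sysfs interface; the device address and VF count are illustrative assumptions, and the kernel/GPU driver must actually support SR-IOV for this device.

    /* Minimal sketch (not from the original comment): enabling SR-IOV virtual
     * functions for an Intel iGPU through the generic Linux PCI sysfs interface.
     * Assumes root privileges and a kernel/GPU driver that exposes SR-IOV for
     * this device; the PCI address and VF count below are illustrative. */
    #include <stdio.h>

    int main(void) {
        const char *path = "/sys/bus/pci/devices/0000:00:02.0/sriov_numvfs";
        FILE *f = fopen(path, "w");
        if (!f) { perror("open sriov_numvfs"); return 1; }
        /* Request two virtual functions; each shows up as its own PCI GPU
         * that can be passed through to a separate VM. */
        fprintf(f, "2\n");
        fclose(f);
        return 0;
    }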

ho_schi 6 days ago | parent | next [-]

Conclusion: Buy AMD. Excellent Linux support with in-tree drivers. For 15 years! A bug is something which will be fixed.

Nvidia’s GPUs are theoretically fast in initial benchmarks. But that’s mostly optimization by others for Nvidia. That’s it.

Everything Nvidia has done is a pain. Closed-source drivers (old pain), out-of-tree drivers (new pain), ignoring (or actively harming) Wayland (everyone handles implicit sync well, except Nvidia, which required explicit sync[1]), and awkward driver bugs declared as “it is not a bug, it is a feature”. The infamous bug:

    This extension provides a way for applications to discover when video
    memory content has been lost, so that the application can re-populate
    the video memory content as necessary.
https://registry.khronos.org/OpenGL/extensions/NV/NV_robustn...
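
For context, what the extension provides is an extra status code from the standard robustness query, which an application can poll to decide when to re-upload its textures and buffers. A minimal sketch of that usage, with the assumptions noted in the comments:

    /* Sketch only (not Nvidia's code): polling for purged video memory with
     * NV_robustness_video_memory_purge. Assumes an OpenGL 4.5+ context created
     * with the extension's reset-notification attribute and a loader such as
     * glad providing glGetGraphicsResetStatus. reupload_all_resources() is a
     * hypothetical app-specific helper. */
    #include <glad/glad.h>

    #ifndef GL_PURGED_CONTEXT_RESET_NV
    #define GL_PURGED_CONTEXT_RESET_NV 0x92BB  /* value from the extension spec */
    #endif

    extern void reupload_all_resources(void);  /* hypothetical helper */

    void check_video_memory_purge(void)
    {
        if (glGetGraphicsResetStatus() == GL_PURGED_CONTEXT_RESET_NV) {
            /* Video memory content was lost (suspend/resume, VT switch, ...):
             * re-create textures, buffers and other device-resident data. */
            reupload_all_resources();
        }
    }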

This extension will soon be ten years old. At least they intend to fix it? They just didn’t in the past nine years! Basically, video memory could be gone after Suspend/Resume, a VT switch, and so on. The good news is that, after years, someone figured that out and implemented a workaround. For X11 with GNOME:

https://www.phoronix.com/news/NVIDIA-Ubuntu-2025-SnR

I hope somebody has implemented a patch for Wayland in the meantime.

What we need? Reliability. And Linux support. That’s why I purchase AMD. And previously Intel.

[1] I don’t judge whether implicit or explicit sync is better.

adrian_b 6 days ago | parent | next [-]

AMD is not competing enough with NVIDIA, so they are not a solution.

What I mean is that whenever NVIDIA removed features from their "consumer" GPUs in order to reduce production costs and increase profits, AMD immediately followed them, instead of attempting to offer GPUs that have something that NVIDIA does not have.

Intel at least tries to be a real competitor, e.g. by offering much, much better FP64 performance or by offering more memory.

If Intel's discrete GPUs disappear, there will be no competition in consumer GPUs, as AMD tries to compete only in "datacenter" GPUs. I have ancient AMD GPUs that I cannot upgrade to newer AMD GPUs, because the newer GPUs are worse, not better (for computational applications; I do not care about games), while Intel offers acceptable substitutes, due to excellent performance per $.

Moreover, NVIDIA also had excellent Linux driver support for more than 2 decades, not only for games, but also for professional graphics applications (i.e. much better OpenGL support than AMD) and for GPU computing applications (i.e. CUDA). AMD gets bonus points for open-source drivers and much more complete documentation, but the quality of their drivers has been typically significantly worse.

NVIDIA always had good support even for FreeBSD, where I had to buy discrete NVIDIA GPU cards for computers with AMD APUs whose graphics were not supported on any OS other than Windows and Linux.

AMD "consumer" GPUs are a great choice for those who are interested only in games, but not for those interested in any other GPU applications. AMD "datacenter" GPUs are good, but they are far too expensive to be worthwhile for small businesses or for individuals.

clhodapp 6 days ago | parent | prev | next [-]

I've found the amdgpu Linux driver to be fairly buggy running dual monitors with my Radeon VII, and found things like the fTPM to be highly buggy on Threadripper 2k/X399, to the point that I had to add a dTPM. They never got things truly working properly with those more-niche products before they just... kind of... stopped working on them. And of course ROCm is widely regarded to be a mess.

On the other hand, my Steam Deck has been exceedingly stable.

So I guess I would say: Buy AMD but understand that they don't have the resources to truly support all of their hardware on any platform, so they have to prioritize.

mjevans 6 days ago | parent [-]

I seem to recall the Vega era as 'when I wouldn't buy a GPU because AMDs were just unstable' (and of course never closed source Nvidia).

Took me almost 5 min to drill through enough Wikipedia pages to find the Radeon VII string.

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processin... https://en.wikipedia.org/wiki/Radeon_RX_Vega_series

Contrast that with the earlier R9 285 that I used for nearly 10 years, until I was finally able to get a 9070 XT that I'm very happy with. They are still refining support for that aged GCN 1.2 part even today, even if fixes are a lower priority to backport.

Overall, the ONLY things I'm unhappy about with this GPU generation:

* Too damned expensive
* Not enough VRAM (and no ECC off of workstation cards?)
* Too hard for average consumers to just buy direct and cut out the scalpers

The only way I could get my hands on a card was to buy through a friend that lives within range of a Microcenter. The only true saints of computer hardware in the whole USA.

lmm 6 days ago | parent | prev | next [-]

> What we need? Reliability. And Linux support

Both of which NVidia does a lot better in practice! I'm all for open-source in-tree drivers, but in practice, 15 years on, AMD is still buggy on Linux, whereas NVidia works well (not just on Linux but on FreeBSD too).

> I don’t judge whether implicit or explicit sync is better.

Maybe you should.

shmerl 6 days ago | parent [-]

> Both of which NVidia does a lot better in practice!

Correction - if they care. And they don't care to do it on Linux, so you get them dragging their feet for decades on something like Wayland support, PRIME, you name it.

Basically, the result is that in practice they offer abysmally bad support, otherwise they'd have upstream kernel drivers and no userspace blobs. Linux users should never buy Nvidia.

lmm 6 days ago | parent | next [-]

> And they don't care to do it on Linux

I don't understand what you're saying here. I've used NVidia on Linux and FreeBSD a lot. They work great.

If your argument is they don't implement some particular feature that matters to you, fair enough. But that's not an argument that they don't offer stability or Linux support. They do.

shmerl 5 days ago | parent [-]

Taking very long to implement things is a perfect example of bad support for the platform. Timely support isn't any less important than support in general.

jpc0 5 days ago | parent [-]

Are you a product manager? Or do you just not see the irony in your comment?

Long term support means my thing that has been working great continues to work great. New feature implementation has nothing to do with that and is arguably directly against long term support.

And Nvidia seems justified in this, since effectively no distro dropped X11 until Nvidia had support.

shmerl 4 days ago | parent [-]

If you think taking decades is an acceptable pace while others do it in a timely manner, that's your own problem. For any normal user it's completely unacceptable and the opposite of great (add to that, even after decades of dragging their feet they only offer half-cooked support and still can't sort out upstreaming their mess). Garbage support is what it is.

jpc0 4 days ago | parent [-]

AMD is notorious for not having ROCm support on currently sold, in-production GPUs, and for horrendous bugs that make the devices unusable.

I use AMD GPUs on Linux, and I generally regret not just buying an Nvidia GPU, purely because of AMD's lacklustre support for compute use cases in general.

Intel is still too new in the dGPU market to trust and on top of that there is so much uncertainty about whether that entire product line will disappear.

So at this point the CUDA moat makes it a non-issue; on top of that, what works keeps working, whereas with AMD I constantly wonder whether something will randomly break after an update.

A timeline of decades for “features” your biggest consumers don’t care about is a reasonable tradeoff, even more so if actually pushing those features would reduce stability.

shmerl 4 days ago | parent [-]

That's exactly the point. Nvidia might care about industrial use cases, but they don't care about desktop Linux usage, and their support is garbage as a result.

bigyabai 6 days ago | parent | prev | next [-]

Wayland support hasn't been an issue since GLX was deprecated in favor of EGLStream. I think the Nvidia backend has been "functional" for ~3 years and nearly flawless for the past year or so.

Both Mutter and KWin have really good Nvidia Wayland sessions nowadays.

shmerl 5 days ago | parent [-]

It got better, but my point is how long it took to get better. That's an indicator of how much they care about Linux use cases in general, which is way below an acceptable level - it's simply not their priority (which is also exacerbated by their hostile approach to upstreaming).

I.e. if something new needs to be implemented tomorrow, Nvidia will make their users wait another decade. Which I consider an unacceptable level of support, and something that flies in the face of those who claim that Nvidia supports Linux well.

jgb1984 6 days ago | parent | prev [-]

I've been using Nvidia GPUs exclusively on Debian Linux for the past 20 years, using the binary Nvidia drivers. Rock-solid stability and excellent performance. I don't care for Wayland, as I plan to stay on Xorg + Openbox for as long as I can.

guerrilla 6 days ago | parent | prev | next [-]

Buying AMD (for graphics) has been the only ethical choice for a long time. We must support the underdogs. Since regulation has flown the coop, we must take responsibility ourselves to fight monopolies. The short-term costs may be a bit higher, but the long-term payoff is the only option for our self-interest!

/ steps down from soap box /

mort96 6 days ago | parent | prev | next [-]

> Conclusion: Buy AMD. Excellent Linux support with in-tree drivers.

Funnily, AMD's in-tree drivers are kind of a pain in the ass. For up to a year after a new GPU is released, you have to deal with using Mesa and kernel packages from outside your distro, while if you buy a brand-new Nvidia card, you just install the latest release of the proprietary drivers and it'll work.

Linux's driver model really is not kind to new hardware releases.

Of course, I still buy AMD because Nvidia's drivers really aren't very good. But that first half-year was not pleasant the last time I got a relatively recently released (as in, released half a year earlier) AMD card.

account42 6 days ago | parent [-]

Use a better distro that includes drivers for new hardware.

mort96 6 days ago | parent [-]

A lot of people want to use Ubuntu or Ubuntu-based distros.

I have since switched from Ubuntu to Fedora; maybe Fedora ships Mesa and kernel updates within a week or two of release, I don't know. But being unable to use your preferred distro is a serious downside for many people.

est31 6 days ago | parent | prev | next [-]

> Excellent Linux support with in-tree drivers. For 15 years!

Linux support has been excellent on AMD for less than 15 years though. It got really good around 10 years ago, not before.

kimixa 6 days ago | parent [-]

ATI/AMD open-source Linux support has been blowing hot and cold for over 25 years now.

They were one of the first to actually support open-source drivers, with the r128 and original Radeon (r100) drivers. Then they went radio silent for the next few years, though the community used that as a baseline to support the next few generations (r100 to r500).

Then they re-emerged, actually providing documentation for their Radeon HD series (r600 and r700) and some development resources, though limited ones - and often at odds with the community-run equivalents of the time (lots of parallel development with things like the "radeonhd" driver, and disagreements on how much they should rely on their "atombios" card firmware).

That "moderate" level of involvement continued for years, releasing documentation and some initial code for the GCN cards, but it felt like beyond the initial code drops most of the continuing work was more community-run.

Then only relatively recently (the last ~10 years) have they started putting actual engineering effort into things again, with AMDGPU and the majority of Mesa changes now being paid for by AMD (or Valve, which is "AMD by proxy" really, as you can guarantee every dollar they spend on an engineer is a dollar less they pay to AMD).

So hopefully that's a trend you can actually rely on now, but I've been watching too long to think that can't change on a dime.

ahartmetz 6 days ago | parent [-]

It is possible that at some point, maybe 15 years ago, AMD provided sufficient documentation to write drivers, but even 10 years ago, a lot of documentation was missing (without even mentioning that fact), which made trying to contribute rather frustrating. Not too bad, because as you said, they had a (smallish) number of employees working on the open drivers by then.

hdjfjzhej 6 days ago | parent | prev | next [-]

Agreed! This is great news for AMD and users.

Those who want to run Linux seriously will buy AMD. Intel will be slowly phased out, and this will reduce maintenance and increase the quality of anything that previously had to support both Intel and AMD.

However, if Microsoft or Apple scoop up AMD, all hell will break loose. I don’t think either would have interest in Linux support.

account42 6 days ago | parent [-]

> Agreed! This is great news for AMD and users.

Less competition is NOT good news for AMD users. Their CPUs are already a lot less competitively priced now that they beat Intel for market share.

trklausss 6 days ago | parent | prev | next [-]

Oh boy, that strikes a nerve with the "video memory could be gone after Suspend/Resume". Countless hours lost trying to fix a combination of drivers and systemd hooks for my laptop to be able to suspend/hibernate and wake back up without issues... which becomes even more complicated when using Wayland.

I have been looking at high-end laptops with a dedicated AMD graphics chip, but can't find many... So I will probably go with AMD+Nvidia with a MUX switch, let's see how it goes... Unless someone else has other suggestions?

codedokode 6 days ago | parent | prev | next [-]

> Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on.

This actually makes sense: for example, a new task has swapped out a previous task's data, or a host and guest are sharing the GPU and pushing each other's data away. I don't understand why this is not part of GPU-related standards.

As for a solution, wouldn't discarding all the GPU data after resume help? Or keeping a copy of the data in system RAM?
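
For what it's worth, a minimal sketch of the second idea (keeping a CPU-side shadow copy and re-uploading it once loss is detected); this is illustrative, not part of any standard, and the names are made up:

    /* Sketch of the "keep a copy in system RAM" idea (illustrative, not from
     * the comment): the application retains the pixel data it uploaded so it
     * can re-create a texture after the driver reports that video memory was
     * purged. Assumes a GL loader such as glad. */
    #include <glad/glad.h>

    typedef struct {
        GLuint id;              /* GL texture object */
        int width, height;
        unsigned char *pixels;  /* shadow copy kept in system RAM */
    } ShadowTexture;

    void reupload(ShadowTexture *t)
    {
        glDeleteTextures(1, &t->id);   /* old object may reference lost storage */
        glGenTextures(1, &t->id);
        glBindTexture(GL_TEXTURE_2D, t->id);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, t->width, t->height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, t->pixels);
    }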

jacobgorm 6 days ago | parent | prev | next [-]

The last time I tried to file a bug for a crash in an AMD Windows driver, I had to go through an anonymous developer I found on Discord, and despite weeks of effort writing and sharing test cases, they chose to ignore the bug report in the end. The developer even asked not to be named, as he might face repercussions for trying to help out.

ekianjo 6 days ago | parent | prev | next [-]

Excellent Linux support. Except for ROCm, which is a big mess.

bobajeff 6 days ago | parent | prev | next [-]

I once had a mini PC with Nvidia. I got it for CUDA dev. One day support for it was dropped, so I was unable to update my system without it messing things up. So regardless of CUDA, I decided Nvidia is not for me.

However, doing research when buying a new PC, I've found that AMD kind of sucks too. ROCm isn't even supported on many of the systems I was looking into. Also, I've heard their Linux graphics drivers are poor.

So basically I just rock a potato with Intel integrated graphics now. GPUs cost too much to deal with that nonsense.

ahartmetz 6 days ago | parent [-]

I would really disagree that AMD's Linux graphics drivers are poor. Only ROCm is.

bobajeff 6 days ago | parent [-]

In your case maybe, but not according to some of the comments here in this very thread, and also some forums and YouTube videos from when I last checked.

bigyabai 6 days ago | parent | prev [-]

FWIW, my experience gaming/web browsing/coding on a 3070 with modern drivers has been fine. Mutter and KWin both have very good Wayland sessions if you're running the new (>550-series) drivers.

bee_rider 6 days ago | parent | prev | next [-]

Apparently it is 5% ownership. Does that give them enough leverage to tank Intel’s iGPUs?

That would seem weird to me. Intel's iGPUs are an incredibly good solution for their (non-glamorous) niche.

Intel’s dGPUs might be in a risky spot, though. (So… what’s new?)

Messing up Intel’s iGPUs would be a huge practical loss for, like, everyday desktop Linux folks. Tossing out their dGPUs, I don’t know if it is such a huge loss.

sodality2 6 days ago | parent | next [-]

> Tossing out their dGPUs, I don’t know if it is such a huge loss

It would be an enormous loss to the consumer/enthusiast GPU buyer, as a third major competitor is finally improving a market coming off what feels like years and years of dreadful price/perf ratios.

behringer 6 days ago | parent | next [-]

They were one release away from completely upending the market. A sad day this is.

cluckindan 6 days ago | parent | prev [-]

You don’t say… on the very same day AMD launched a new RDNA3 card (RX 7700).

Literally a previous gen card.

sim7c00 6 days ago | parent [-]

AMD is slow and steady. They were behind many times, and many times they surprised everyone with amazing innovations, overtaking Intel. They will do it again, for both CPUs and GPUs.

tart-lemonade 6 days ago | parent | prev | next [-]

Intel's iGPUs don't seem very at risk because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.

Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had laying about, and given the prominence of halfway decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.

That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that it isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.

hakfoo 6 days ago | parent | next [-]

Aren't a lot of those cards sold for the audience that needs more display heads rather than necessarily performance?

This has been somewhat improved-- some mainboards will have HDMI and DisplayPort plumbed to the iGPU, but the classic "trader desk" with 4-6 screens hardly needs a 5090.

They could theoretically sell the same 7xx and 1030 chips indefinitely. I figure it's a static market, like those strange 8/16 MB VGA chipsets that you sometimes see on server mainboards: just enough hardware to run diagnostics on a normally headless box.

xp84 6 days ago | parent | prev | next [-]

Agree. Not only would there be no money in it to try to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those where gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is like 80%+ of the PC market, but it's perfectly well served already.

I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.

behringer 6 days ago | parent | next [-]

Intel sells discrete cards, and their next card was set up to do AI and games competently. They were poised to compete with the low-to-mid-range Nvidia cards at HALF the cost.

It was definitely going to upset the market. Now I understand the radio silence on a card that was supposed to have been coming by Xmas.

xp84 5 days ago | parent [-]

Oh for sure. Arc is in jeopardy. Though tbh it was already, wasn't it? Can't you see an alternate universe where this story never happened, but Intel announced today "Sorry, because our business is dying in general and since Arc hasn't made us a ton of money yet anyway, we need to cut Arc to focus on our core blah blah blah".

I just meant their integrated GPUs are what's completely safe here.

behringer 5 days ago | parent [-]

I doubt it's safe; it competes directly with Nvidia on handhelds.

Also, Arc wasn't in jeopardy; the Arc cards have been improving with every release, and the latest one got pretty rave reviews.

xp84 5 days ago | parent [-]

It wasn't in jeopardy for being no good, it was in jeopardy because Intel is so troubled. Like the Bombardier C-Series jet: Everyone agreed it was a great design and very promising, but in the end they had no choice but to sell it to Airbus (who calls it the A220), I think because they didn't really have the money to scale up production. In like manner, Intel lacks the resources to make Arc the success it technically deserves to be, and without enough scale, they'll lose money on Arc, which Intel can hardly afford at this point.

kmacdough 6 days ago | parent | prev [-]

Calling BS on "gaming not part of the equation". Several of my friends and I game exclusively on integrated graphics. Sure we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.

utternerd 6 days ago | parent | next [-]

RDR2 is quite optimized. We spend a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized, as exhibited by the large number of benchmarks on the web.

purpleflame1257 6 days ago | parent | next [-]

This is why I love HN. You get devs from any software or hardware project you care to name showing up in the comments.

uncircle 6 days ago | parent | prev [-]

RDR2 ran beautifully on Linux for me. If you were part of the team, excellent work.

xp84 6 days ago | parent | prev | next [-]

Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.

(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)

og_kalu 6 days ago | parent [-]

RDR2, Ghost of Tsushima, Black Myth: Wukong - these games will play at 40 to 50+ fps at 1080p low to medium settings on the Intel Arc iGPUs (no AI upscaling).

To anyone actually paying attention, iGPUs have come a long way. They are no longer an "I can play Minecraft" thing.

xp84 5 days ago | parent [-]

That performance is not surprising; Arc seems pretty dope in general.

I hadn't realized that "Arc" and "integrated" overlapped; I thought that brand and that level of power were only being used on discrete cards.

I do think that integrated Arc will probably be killed by this deal though, not for being bad (it's obviously great), but rather as a way for Intel to cut costs with no downsides for Intel. If they can make RTX iGPUs now, with the Nvidia and RTX brands being the strongest in the gaming space... Intel isn't going to invest the money in continuing to develop Arc, even if Nvidia made it clear that they don't care; it just doesn't make any business sense now.

That is a loss for the cause of gaming competition. Although having Nvidia prop up Intel may prove to be a win for competition in silicon in general, versus Intel being sold off in parts, which it seems could be a real possibility.

fluoridation 6 days ago | parent | prev | next [-]

"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.

6 days ago | parent [-]
[deleted]
KronisLV 6 days ago | parent | prev [-]

> Sure we don't play the most abusively unoptimized AAA games like RDR2.

Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.

jandrese 6 days ago | parent | prev | next [-]

It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.

TiredOfLife 6 days ago | parent | prev [-]

Out of 9 desktop GT 7xx cards, only 2 were Fermi; the rest were Kepler.

Out of 12 mobile GT 7xx cards, only 3 were Fermi (and 2 of those were M and not GT); the rest were Kepler.

freedomben 6 days ago | parent | prev | next [-]

I would guess Nvidia doesn't care at all about the iGPUs, so I agree they are probably not at risk. The dGPUs, though, I absolutely agree are in a risky spot. Perhaps Intel was planning to kill their more ambitious GPU goals anyway, but that seems extremely unhealthy for pretty much everyone except Nvidia.

dijit 6 days ago | parent | prev | next [-]

5% of Ubisoft was all it took for Tencent to have very deep reaching ramifications.

They were felt at an IC level.

monocasa 6 days ago | parent | prev | next [-]

We'd have to see their approximate cap table, but I've seen functional control over a company with just a hair over 10% ownership, given the voting patterns of the other stockholders.

5% by about any accounting makes you a very, very influential stockholder in a publicly traded company with a widely distributed set of owners.

misiek08 6 days ago | parent | prev | next [-]

Intel was already dead; even money from the government didn't help them. It is an old, legacy, bad corp. I think NV just wants to help them and use them however it wants - Intel management will do anything they say.

beached_whale 6 days ago | parent | prev | next [-]

Intel's GPUs are a better solution for almost all computing outside high-end gaming, AI, and a few other tasks. For most things a better GPU is overkill and wastes energy.

JustExAWS 6 days ago | parent [-]

So they are better except in all the ways that people care about…

beached_whale 2 days ago | parent [-]

Most computing tasks are not those. They may care, but boring is the norm.

giveita 6 days ago | parent | prev [-]

That would be antitrust, right?

monocasa 6 days ago | parent [-]

Which would take an administration that cared about enforcing antitrust for the stated reasons behind antitrust laws.

arkmm 6 days ago | parent | prev | next [-]

This misses the forest for the trees IMO:

- The datacenter GPU market is 10x larger than the consumer GPU market for Nvidia (and it's still growing). Winning an extra few percentage points in consumer is not a priority anymore.

- Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.

- Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshoring, which significantly improves Nvidia's supplier negotiation position and hedges geopolitical risk.

tw04 6 days ago | parent | next [-]

> Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.

Someone should tell nvidia that. They sure seem to think they have a datacenter CPU.

https://www.nvidia.com/en-us/data-center/grace-cpu-superchip...

kimixa 6 days ago | parent | next [-]

I wonder if this signals a lack of confidence in their CPU offerings going forward?

But there's always TSMC being a pretty hard bottleneck - maybe they just can't get enough capacity (and can't charge anywhere near what their GPU offerings earn per wafer), and pairing with Intel themselves is preferable to just using Intel's Foundry services?

gpm 6 days ago | parent | prev | next [-]

> Someone should tell nvidia that

To be fair from what I hear someone really should tell at least half of nvidia that.

high_na_euv 6 days ago | parent | prev [-]

Jensen was literally talking about the need for an x86 CPU on yesterday's webcast.

trhway 6 days ago | parent | prev | next [-]

>Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC.

The East India Company conducted continental wars on its own. A modern company with a $4T valuation, country-GDP-sized revenue, and possession of the key military technology of today's and tomorrow's wars - AI software and hardware, including robotics - could successfully wage such a continental war through a suitable proxy, say an oversized private military contractor (especially if massively armed with drones and robots), and in particular is capable of defending an island like Taiwan. (Or, thinking backwards: an attack on Taiwan would cause a trillion- or two-trillion-dollar drop in NVDA's valuation. What options get on the table when there is a threat of a trillion-dollar loss ... To compare, 20 years of Iraq cost $3 trillion, i.e. $150B/year buys you a lot of military hardware and action, and an efficient defense of Taiwan would cost much less than that.)

purpleflame1257 6 days ago | parent [-]

Defending against territorial conquest is considerably easier than defending against kinetic strikes on key manufacturing facilities

trhway 5 days ago | parent | next [-]

Not necessarily. Territorial war requires people. Defense from kinetic strikes on key objects concentrated on a smallish territory requires mostly high tech - radars and missiles - and that would be much easier for a very rich, high-tech US corporation.

An example: the Starlink antenna, sub-$500, a phased array which is actually like a half or a third of such an array on a modern fighter jet, where it costs several million. Musk naturally couldn't go the way of a million-per-antenna, so he had to develop and source it on his own. The same with anti-missile defense - if/when NVDA gets to it to defend the TSMC fabs, NVDA would produce such defense systems orders of magnitude cheaper, and that defense would work much better than modern military systems.

AtlasBarfed 6 days ago | parent | prev [-]

If China bombs TSMC, we blockade the Strait of Malacca.

China's economy shuts down in a month; their population starves in another month.

throwaway2037 6 days ago | parent | prev | next [-]

    > Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshor[e]
TSMC is building state-of-the-art fabs in Arizona, USA. Samsung in Texas, USA. I assume these are being built to reduce geopolitical risk on all sides.

Something that I never read about: Why can't NVidia use Samsung fabs? They are very close to TSMC state of the art.

re-thc 6 days ago | parent | next [-]

> They are very close to TSMC state of the art.

They're not. Most have tried at one point. Apple had a release with TSMC + Samsung and users spotted a difference. There was quite a bit of negativity.

high_na_euv 6 days ago | parent | prev | next [-]

TSMC will not have state of the art on US soil.

Taiwanese gov prevents them from doing it. Leading node has to be on Taiwanese soil

throwaway2037 4 days ago | parent [-]

    > Taiwanese gov prevents them from doing it. Leading node has to be on Taiwanese soil
This is a bold claim. Do you have any public evidence to share? I have never once seen this mentioned in any newspaper articles that I have read about TSMC and their expansion in the US.
high_na_euv 3 days ago | parent [-]

https://www.tomshardware.com/tech-industry/semiconductors/ta...

mrheosuper 6 days ago | parent | prev [-]

Maybe after being bitten by Samsung on their RTX 3000 GPUs: power spikes and a lot of heat.

behringer 6 days ago | parent | prev [-]

Intel just released a halfway decent workstation (e.g. datacenter) card, and we were expecting an even better set of cards by Xmas before this happened.

lacy_tinpot 6 days ago | parent | prev | next [-]

Not necessarily true. This might be a Microsoft funding a bankrupt Apple kind of moment.

American competition isn't a zero-sum game, and it's in Nvidia's best interest to keep the market healthy.

rapind 6 days ago | parent | next [-]

> American competition isn't a zero-sum game, and it's in Nvidia's best interest to keep the market healthy.

Looking at Google's recent antitrust settlement, I'm not sure this is true at present.

tonyhart7 6 days ago | parent | next [-]

Google literally "won" the antitrust case???

The fact that Google pays Firefox annually means it's in Google's best interest that there is no monopoly, the judge says.

lacy_tinpot 6 days ago | parent | prev [-]

Nvidia's options are: fund your competition to keep the market dynamic, or let the government do it by breaking you apart.

So yes. That's how American competition works.

It isn't a zero-sum game. We try to create a market environment that is competitive and dynamic.

Monopolies are a threat to both the company and a free, open, dynamic market. If Nvidia feels it could face an antitrust suit, which is reasonable, it is in its best interest to fund the future of Intel.

That's American capitalism.

NooneAtAll3 6 days ago | parent | next [-]

> or let the government do it by breaking you apart.

Looking at Google's recent antitrust settlement, I'm not sure this is true at present.

prasadjoglekar 6 days ago | parent | next [-]

There are at least 2 more antitrust suits against Google ongoing. One is about to enter the remedies phase in Virginia.

lacy_tinpot 6 days ago | parent | prev [-]

Because the recent settlement determined, in my opinion correctly, that the market is still dynamic and competitive.

Google search is genuinely being threatened.

Google is not a monopoly, not entirely.

If AI usage also starts accruing to Google then there should be a new antitrust suit.

mcintyre1994 6 days ago | parent | prev | next [-]

I can’t imagine Nvidia has any concerns about that with the current administration.

account42 6 days ago | parent | next [-]

We have at least seen anti-trust suits proceed against Google under the current (US) administration. The same cannot be said for the previous one.

mcintyre1994 6 days ago | parent [-]

Are you referring to the case that started in 2023 under the previous administration?

pedroma 6 days ago | parent | prev | next [-]

Will Nvidia continue to exist beyond the current administration? If yes, then would it be prudent to consider the future beyond the current administration?

at-fates-hands 6 days ago | parent | prev [-]

But it did when Biden was in office?

siva7 6 days ago | parent | prev [-]

Which government? This one?

rasz 6 days ago | parent | prev | next [-]

Microsoft wasn't funding a bankrupt Apple; Microsoft was settling a lawsuit with Jobs just on the cusp of the DOJ monopoly lawsuit. Microsoft was stealing and shipping Apple QuickTime source code.

https://www.theregister.com/1998/10/29/microsoft_paid_apple_...

> handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million

humanfromearth9 6 days ago | parent [-]

Wow, QuickTime... That's a name I haven't heard in a long time.

villgax 6 days ago | parent | prev | next [-]

You might want to re-read about that Apple-Microsoft incident.

QuickTime got stolen by an ex-Apple employee, and in return Apple had Microsoft commit money and promise to keep the Office suite available on macOS/OS X.

andirk 6 days ago | parent [-]

According to [0] it was a contractor working for both Apple and Microsoft. Not an ex-Apple employee but still an interesting read, if true.

[0] https://thisdayintechhistory.com/12/06/apple-sues-over-quick...

AtlasBarfed 6 days ago | parent [-]

Wouldn't we call that industrial espionage, not simple contracting?

aDyslecticCrow 6 days ago | parent | prev [-]

One interesting parallel is the Intel and AMD x86 dispute back in 1991, which is today the reason AMD is allowed to produce x86 at all without massive patent royalties to Intel. [Asianometry](https://youtu.be/5oOk_KXbw6c) had a nice summary of it.

Nvidia is leaning more into data centres, but lacks a CPU architecture and the expertise. Intel is struggling financially, but has knowledge in iGPUs and a vast number of patents.

They could have a lot to give one another, and it's a massive win if it keeps Intel afloat.

fnord123 6 days ago | parent | prev | next [-]

> Nvidia is actively hostile to drivers on Linux

Nvidia is contributing to Nova, the new Nvidia driver for GSP-based hardware.

https://rust-for-linux.com/nova-gpu-driver

Alexandre Courbot, an Nvidia dev, is a co-maintainer.

https://www.phoronix.com/news/NOVA-Core-Co-Maintainer

OccamsMirror 6 days ago | parent [-]

Yeah, I think Nvidia were hostile to Linux when they saw no value in it. Now it's where the machine learning is. It's the OS powering the whole AI hype train. Then there is also the Steam Deck making Linux gaming not a complete write-off anymore.

The days of Nvidia ignoring Linux are over.

spoaceman7777 6 days ago | parent [-]

For real. Nvidia is even selling desktop Linux computers now, with the launch of the DGX Spark.

The "F you Nvidia" Linus Torvalds moment in 2012 is a meme that will not die.

agentcoops 6 days ago | parent | prev | next [-]

The article hints at it, but my guess would be that this investment is intended towards Intel Foundry and getting it to a place where NVIDIA can eventually rely on it over TSMC, with the ownership stake largely there to give them upside if/when Intel stock goes up on news of an NVIDIA contract, etc. It isn't that uncommon an arrangement for enterprise deals of such potential magnitude. Long-term, however, and even without NVIDIA making the call, that could definitely have the effect of leading Intel to divest from directly competing in as many markets, i.e. Arc.

For context, I highly recommend the old Stratechery articles on the history of Intel foundry.

janc_ 6 days ago | parent | next [-]

My first thought was also that this relates to Intel's foundry business. Even if only to be able to use it in price negotiations with TSMC (it's hard to threaten to go elsewhere when there is no elsewhere left).

ldng 6 days ago | parent | prev [-]

Which is relatively funny, since the new CEO has made it clear he wants to divest on the foundry side ...

ksec 4 days ago | parent | prev | next [-]

Do we want Intel to fall and go bankrupt, or do we want Intel to survive? I don't think most people are clear on what is happening here. This is it: the margin call moment.

Intel could either transform into a fabless company, compete on design, and manufacture with TSMC, or continue to be a foundry player, crucial to US strategic interests. You can only pick one, and competing in just one of them is already a monumental task.

The GPU business is burning money, with no short-term success in sight that could make it cash-flow positive in a 3-4 year time frame. I have been stating this since 2016, and we are now coming close to 2026; recent figures put Intel at less than 1% discrete market share. Especially given the strong roadmap Nvidia has.

This gives Intel a perfect excuse to quit GPUs, with Nvidia providing the cash flow to hopefully continue developing 18A and 14A: manufacture Nvidia GPUs for them, and slowly transition to an x86 + foundry-only model, or even solely manufacture for Nvidia, with the US administration further pushing Apple, Qualcomm and Broadcom to use Intel in some capacity. Assuming Intel can keep up with TSMC, which is probably a comparatively easier task than tackling the GPU market.

I am assuming the Intel board is happy with that direction, though. Because so far they have shown a complete lack of any strategic vision.

random3 6 days ago | parent | prev | next [-]

If only antitrust laws would exist

bobby_mcbrown 6 days ago | parent | next [-]

Or if monopoly laws like copyright didn't

dsr_ 6 days ago | parent | prev [-]

and be enforced

upboundspiral 6 days ago | parent | prev | next [-]

This seems like it could be a long-term existential threat for AMD. AMD CPU + GPU combos are finally coming out strong, with the MI300+ series in the supercomputing space, Strix Halo in laptops, etc. They have the advantage of being able to run code already optimized for x86 (important for gamers and HPC code), which NVIDIA doesn't have. Imagine if Grace Blackwell had x86 chips instead of Arm. If NVIDIA can pair Intel CPUs with its chip offerings, it could be poised to completely take over many new portions of the market and consolidate its current position by using its already existing mindshare and market dominance.

hakfoo 6 days ago | parent [-]

x86 is important for gamers, but is it for HPC? That tends to be far less dependent on binary blobs.

high_na_euv 6 days ago | parent [-]

Jensen was talking about it during the webcast, so apparently yes.

jm4 6 days ago | parent | prev | next [-]

This seems more like the deal where Microsoft invested in Apple. It’s basically charity and they will flip it in a few years when Intel gets back on their feet.

freedomben 6 days ago | parent | prev | next [-]

You absolutely nailed it IMHO. I wish I had more upvotes to give. I guess time will tell, but this seems like a clear conflict of interest.

dchftcs 5 days ago | parent | prev | next [-]

Yeah, Nvidia has trillions at stake, Intel a mere $100B. It's more in Nvidia's interest to interfere with Intel's GPU business than to help it, and the only thing they want from Intel is the fabs.

mihaaly 6 days ago | parent | prev | next [-]

Now the whole purchase makes sense! : /

Using fortunes that fall into its lap to kill competition is a common practice of economics-oriented (vs. technology-oriented) organizations. That brings benefits only for the organization; for everyone else it brings damage and disappointment.

classicmotto 6 days ago | parent | prev | next [-]

At this point Nvidia is just shooting themselves in the foot with hostility towards Linux - they are actively using Linux for their DGX systems, and the dependency on Linux is only going to grow internally.

citizenpaul 6 days ago | parent | prev | next [-]

Something about this reminds me of other industry-gobbling purchases. None of them ever turned out better for the product, the price, or the general well-being of society.

mensetmanusman 6 days ago | parent | prev | next [-]

Microsoft’s investment in Apple was helpful for the world.

xp84 6 days ago | parent [-]

As an Apple user (and even an Apple investor), I'd rather that Apple went out of business back then. If we could re-roll the invention of the (mainstream) smartphone, maybe we'd get something other than two monopolistic companies controlling everything.

For instance, maybe if there were 4 strong vendors making the devices with diverse operating systems, native apps wouldn't have ever become important, and the Web platform would have gotten better sooner to fill that gap.

Or maybe it'd have ended up the same or worse. But I just don't think Apple being this dominant has been good for the world.

tracker1 6 days ago | parent | next [-]

Or... we could still be using BlackBerry-like devices without much in the way of active/touch interface development at all. Or worse, Windows CE or Palm with the pen things.

magarnicle 6 days ago | parent | next [-]

Nah, the LG Prada phone would have taken over the world.

slater 6 days ago | parent [-]

I think you mean the Vertu!

edit: god they're still around https://vertu.com/

rhetocj23 6 days ago | parent | prev | next [-]

Lol, exactly. That poster should quickly realise he's got it pretty good given the alternatives.

xp84 5 days ago | parent [-]

Why? Was Steve Jobs literally the only human who was capable of seeing the massive unserved demand that existed back then?

Sidekick was amazing for its time, but only on one also-ran carrier. BlackBerry had great features like BBM (essentially iMessage) but was underpowered for multimedia and more difficult to learn. If Apple was out of business, one or more companies would have made the billions on MP3 players that the iPod made, and any of them could have branched into phones and made a splash the same way. Perhaps Sony, perhaps Microsoft. Microsoft eventually figured it out -- the only reason they failed was that they waited for both Apple and Android to become entrenched, so in this timeline they could have been the second mover. And unlike with Apple and Android, maybe neither MS nor Google would have automatically owned the US market share the way Apple does[1]. If that were the case, we might have competition, instead of the unhealthy thing we have where Apple just does whatever it wants.

[1] https://gs.statcounter.com/vendor-market-share/mobile/united...

rhetocj23 5 days ago | parent [-]

With all due respect, there's a simple answer to why Apple was destined to win the smartphone race - they had a 5-year lead over everyone else because they had the OS and touch interface tightly integrated. On top of that, they managed to scale up production of the glass necessary for the touch to work, and partnered with a network provider to overcome the control network providers had over handset producers.

They had such a lead that nobody was going to catch up and eat into their economic profits. Sure, Samsung et al. have captured market share, but they have not eaten into Apple's economic profits.

Whether you like it or not, this hard work, effort and creativity deserves to be rewarded - in the form of monopoly/oligopoly profits.

Apple has shown itself to be very disciplined with its cash. The same cannot be said of Google, which, instead of funding an endless stream of vanity projects, should return that cash to shareholders.

Lorin 6 days ago | parent | prev | next [-]

I still miss my KeyOne keyboard.

ahartmetz 6 days ago | parent | prev | next [-]

BB10 was the shit. Fantastic OS and (some models) a great hardware keyboard. But it was already a response to the iPhone, wouldn't have happened without...

mrheosuper 6 days ago | parent | prev [-]

With a focus on privacy and security? Sign me up.

mensetmanusman 6 days ago | parent | prev [-]

Nope, we know exactly where it was headed. Phones controlled by carriers full of NFL ads.

xp84 5 days ago | parent [-]

There's nothing supernatural about Apple that meant only they could do something better than that shitty generation of devices. Remember, the portable consumer electronics market would certainly have other huge players if Apple hadn't existed to make the iPod. BlackBerry, Microsoft, and Sony come to mind. iPhone, based mainly on Apple's popularity from the iPod era, got a huge jump from that, and then the rush for native apps, which encourages consolidation, smothered every other company's competing devices (such as WebOS, BlackBerry 10, Windows Mobile) before they had a chance to compete.

To be honest, Android may have met a similar fate if Apple had been able to negotiate a non-exclusive contract with Cingular/AT&T. My understanding though was that they had to give exclusivity as a bargaining chip to get all the then-unthinkable concessions, as yeah, every phone was full of garbage bloatware and festooned with logos.

bitexploder 6 days ago | parent | prev | next [-]

Markets seem to be at least reasonably competitive with 3 vendors. 2 vendors in a space often leads to less desirable outcomes for consumers.

TiredOfLife 6 days ago | parent | prev | next [-]

> whereas Nvidia is actively hostile to drivers on Linux

Is that why, to use CUDA on Windows, you have to use WSL2 (virtualized Linux)?

6r17 6 days ago | parent | prev | next [-]

Reading this after that memo about China's attitude to Nvidia is actually chilling - they just don't care, do they?

sim7c00 6 days ago | parent | prev | next [-]

Why would they want to kill Intel? AMD has better CPUs and also does GPUs better than Intel :?? (Yes, I may be missing things, please do tell!)

bentt 6 days ago | parent | prev | next [-]

I don't agree here. Nvidia could simply segment the market and keep Intel on the low end of GPUs. I think AMD is the real target/victim here.

itsthecourier 6 days ago | parent | prev | next [-]

A usable top-tier gaming Intel GPU at a good price is a myth :,(

tonyhart7 6 days ago | parent | prev | next [-]

Nah, Nvidia wouldn't do that.

It would invite a DOJ case.

bootsmann 6 days ago | parent | next [-]

That's something Jensen Huang can do away with by bringing a golden GPU statue to the White House.

jandrese 6 days ago | parent | prev [-]

Assuming the DoJ is functional and paying attention.

cyanydeez 6 days ago | parent | prev | next [-]

No no, it's in Nvidia's interest to ensure it's just good enough for the plebs, so they can continue to gouge the high-end market.

matheusmoreira 6 days ago | parent | prev [-]

I agree. As a Linux user who favors Intel hardware due to their Linux support, I gotta say the future looks bleak.

throwaway2037 6 days ago | parent | next [-]

    > As a Linux user who favors Intel hardware due to their Linux support
I'm confused here. Are you talking about Intel CPUs or GPUs? And does AMD not have excellent Linux support for their own CPUs and GPUs?
matheusmoreira 6 days ago | parent [-]

> Intel CPUs or GPUs

Both. Also things like sound cards, network cards, peripherals in general.

My happiness and stability while using Linux has been well correlated with the number of devices with Intel in the name. Every single device without Intel invariably becomes a major pain in my ass every single time.

It's gotten to the point I assume it will just work if it's Intel.

> And does AMD not have excellent Linux support for their own CPUs and GPUs?

They're making a lot of progress but Intel is still years ahead of them.

Earlier this year I was researching PC parts for a build and discovered AMD was still working on merging functionality like on-die temperature sensors into the kernel. It makes me think I won't have the full feature set on Linux if I buy one of their processors.

tliltocatl 6 days ago | parent | prev [-]

Well, AMD isn't going away yet, and they do seem to have finally realized the advantage of open-source drivers. But that's still very bad for competition and prices.