| ▲ | rich_sasha 4 days ago |
| Can someone ELI5 this to me? Nvidia has the market cap of a medium-sized country precisely because apparently (?) no one else can make chips like them. Great tech, hard to manufacture, etc - Intel and AMD are nowhere to be seen. And I can imagine it's very tricky business! China, admittedly full of smart and hard-working people, then just wakes up one day and in a few years covers the entire gap, to within some small error? How is this consistent? Either: - The Chinese GPUs are not that good after all - Nvidia doesn't have any magical secret sauce, and China could easily catch up - Nvidia IP is real but Chinese people are so smart they can overcome decades of R&D advantage in just a few years - It's all stolen IP. To be clear, my default guess isn't that it is stolen IP, rather I can't make sense of it. NVDA is valued near infinity, then China just turns around and produces their flagship product without too much sweat..? |
|
| ▲ | rsynnott 4 days ago | parent | next [-] |
| > because apparently (?) no one else can make chips like them No, that's not really why. It is because nobody else has their _ecosystem_; they have a lot of soft lock-in. This isn’t just an Nvidia thing. Why was Intel so dominant for decades? Largely not due to secret magic technology, but due to _ecosystem_. A PPC601 was substantially faster than a Pentium, but of little use to you if your whole ecosystem was x86, say. Now Nvidia’s ecosystem advantage isn’t as strong as Intel’s was, but it’s not nothing, either. (Eventually, even Intel itself was unable to deal with this; Itanium failed miserably, largely due not to external competition but due to competition with the x86, though it did have other issues.) It’s also notable that Nvidia’s adventures in markets where someone _else_ has the ecosystem advantage have been less successful. In particular, see their attempts to break into mobile chip land; realistically, it was easier for most OEMs just to use Qualcomm. |
| |
| ▲ | zenmac 4 days ago | parent | next [-] | | If what you say is true, isn't one of the big contributions of DeepSeek that they wrote a custom lower-level GPU-cluster-to-GPU-cluster communication protocol instead of using the Nvidia software ecosystem? And that it is open sourced? | | |
| ▲ | rsynnott 3 days ago | parent [-] | | Well, they wrote it _for_ Nvidia stuff, though; if anything that was a contribution to the Nvidia ecosystem! Though it does show a willingness to go outside the _established_ Nvidia ecosystem. I'm always a little surprised that Nvidia is _so_ highly valued, because it seems inevitable to me that there is a tipping point where big companies will either make their own chips (see Google) or take the hit and build their own giant clusters of AMD or Huawei or whoever chips, and that knowledge will leak out, and ultimately there will be alternatives. Nvidia to me feels a bit like dot-com era Sun. For a while, if you wanted to do internet stuff, you pretty much _had_ to buy Sun servers; the whole ecosystem was kinda built around Sun. Sun's hardware was expensive, but you could just order a bunch of it, shove it in racks, and it worked and came with good tooling. Admins knew how to run large installations of Sun machines. You could in theory use cheaper x86 machines running Linux or BSD, but no-one really knew how to do that at scale. And then, as the internet companies got big, they started doing their own thing (usually Linux-based), building up administration tooling and expertise, and by the early noughties Linux/Apache was the default and Sun was increasingly irrelevant. |
| |
| ▲ | robotnikman 4 days ago | parent | prev [-] | | >In particular, see their attempts to break into mobile chip land; I wouldn't exactly say it was a failure, all those chips ended up being used in the Nintendo Switch | | |
| ▲ | rsynnott 4 days ago | parent | next [-] | | If you are aiming to have your chips in a decent portion of all mid/high-end phones sold, which they appear to have been aiming for, then the Nintendo Switch isn't really that much of a consolation prize. The Switch had very high sales... for a console, with 150 million over 7 years. Smartphone sales peaked at 1.5 billion units a year. You'd probably prefer to be Qualcomm than Nvidia in this particular market segment, all things considered. | | |
| ▲ | oblio 4 days ago | parent [-] | | Yearly global smartphone sales are around 300 million. | | |
| ▲ | rsynnott 4 days ago | parent [-] | | ... Where are you getting that? The iPhone _alone_ sells about 200 million units a year. There are almost 5 billion smartphone users; sales of 300 million a year would imply that those are only replaced every 16 years, which is obviously absurd. | | |
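A quick back-of-the-envelope check of that replacement-cycle arithmetic, using only the rough figures cited in this subthread (a minimal Python sketch, not market data):

    users = 5_000_000_000       # "almost 5 billion smartphone users" (figure from the comment)
    annual_sales = 300_000_000  # the disputed yearly sales figure
    print(users / annual_sales) # ~16.7 years implied replacement cycle, which is implausibly long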
| ▲ | oblio 3 days ago | parent [-] | | Oh, that was quarterly: https://canalys.com/newsroom/global-smartphone-market-q2-202... On a separate note, speaking of the average lifespan of a phone, I'm fairly sure that with how expensive they're becoming, smartphone lifespans are increasing. Especially with: * hardware performance largely plateauing (not in the absolute sense, that of "this phone can do most of what I need") * the EU pushing for easy battery and screen replacement and also for 7 years of OS updates * the vast majority of phones having cases to protect against physical damage | | |
| ▲ | rsynnott 3 days ago | parent [-] | | Yeah, peak sales per year were a few years back. People are definitely keeping them longer than they used to. |
|
|
|
| |
| ▲ | xgkickt 4 days ago | parent | prev [-] | | I thought those came from the automotive sector. | | |
| ▲ | rsynnott 4 days ago | parent [-] | | Nah, they also managed to fob them off on the auto sector to some extent, but they were originally envisaged as a mobile chip. |
|
|
|
|
| ▲ | bbatha 4 days ago | parent | prev | next [-] |
| It’s several factors, and all of your alternatives are true to some degree: 1. An H20 is about 1.5 generations behind Blackwell. This chip looks closer to about 2 generations behind top-end Blackwell chips. So being ~5ish years behind is not as impressive, especially since EUV, which China has no capacity for, is likely going to be a major obstacle to catching up. 2. Nvidia continues to dominate on the software side. AMD chips have been competitive on paper for a while and have had limited uptake. Now Chinese government mandates could obviously correct this after substantial investment in the software stack, but this is probably several years behind. 3. China has poured trillions of dollars into its academic system and graduates more than 3x the number of electrical engineers the US does. The US has also been training Chinese students, but with a much more limited work visa program, a lot of that knowledge has transferred back without even touching IP issues. 4. Of course IP theft covers some of it. |
| |
| ▲ | Melatonic 3 days ago | parent | next [-] | | They also have some insane power generation capability - doesn't seem that far-fetched that they just build a shitload of slower chips and eat the costs of lower power efficiency. | |
| ▲ | bangaladore 4 days ago | parent | prev [-] | | > China has poured trillions of dollars into its academic system and graduates more than 3x the number of electrical engineers the US does. This metric is not as important as it seems when they have ~5x the population. | | |
| ▲ | jacomoRodriguez 4 days ago | parent | next [-] | | It is. The output will not grow with the number of electrical engineers relative to the population but with the absolute number of engineers. | |
| ▲ | bangaladore 4 days ago | parent [-] | | In theory, but I'm not sure that's true in practice. There are plenty of mundane, non-groundbreaking tasks that will likely be done by those electrical engineers, and the more people and the more space there are, the more tasks there are to be done. Not to mention that more engineers does not equal better engineers. And the types to work on these sorts of projects are going to be the best engineers, not the "okay" ones. It's certainly non-linear. | |
| ▲ | immibis 4 days ago | parent | next [-] | | The more engineers you can sample from (in absolute number), the better (in absolute goodness, whatever that is) the top, say, 500 of them are going to be. | | |
| ▲ | bangaladore 4 days ago | parent | next [-] | | That's assuming top-tier engineers are a fixed percent of graduates. That's not true and has never been. Does 5x the number of math graduates increase the number of people with ability like Terence Tao? Or even meaningfully increase the number of top-tier mathematicians? It really doesn't. Same with any other science or art. There is a human factor involved. | |
| ▲ | immibis 3 days ago | parent [-] | | Suppose there's only one Terence Tao. Then sampling from 5x the number of people increases the probability he's in the sample (by about 5x). Suppose there's more than one. Then sampling from 5x the number of people increases the average number of him that you get (by about 5x). |
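A minimal simulation of the sampling argument made in this subthread; the skill distribution and pool sizes below are purely illustrative assumptions, not data about any real population:

    import random

    random.seed(0)

    def top_k_mean(pool_size, k=500):
        # Draw hypothetical "skill" scores from a heavy-tailed distribution;
        # the distribution is a made-up stand-in, only the relative comparison matters.
        scores = [random.lognormvariate(0, 1) for _ in range(pool_size)]
        return sum(sorted(scores, reverse=True)[:k]) / k

    print(top_k_mean(100_000))  # mean skill of the top 500 drawn from a pool of 100k
    print(top_k_mean(500_000))  # top 500 drawn from a pool 5x larger: noticeably higher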
| |
| ▲ | KerryJones 4 days ago | parent | prev [-] | | This is not necessarily true. Hypothetically, if most breakthroughs are coming from PhDs and they aren't making any PhDs, then that pool is not necessarily larger. |
| |
| ▲ | tonyhart7 4 days ago | parent | prev [-] | | "not to mention more engineers does not equal better engineers." Funny that you mention this, because much of the top AI talent at big tech companies are graduates of China's Ivy League equivalents. The US is literally importing AI talent, the talent war is hotter than ever, and yet you still have doubts. | |
| ▲ | bangaladore 4 days ago | parent [-] | | You just said what I said. I didn't say that 100% of the graduates are stupid, but certainly not all high tier either. We aren't in extreme need of the average electrical engineer or the average software engineer. That's a fact. Look at unemployment rates. | | |
| ▲ | tonyhart7 4 days ago | parent [-] | | I don't like this argument, since you can apply it to any country on earth and the answer would be the same. You are trying too hard to be right. Meanwhile, 40% of the top AI talent in big tech is Chinese, so "higher numbers = more chance of smart people" is indeed true, and your argument is just a waste of time. |
|
|
|
| |
| ▲ | stefan_ 4 days ago | parent | prev [-] | | Doesn’t seem to work for India. Wuhan University alone probably has more impact than the sum of India's universities. Of course, a competent state and strategic investment matter. |
|
|
|
| ▲ | FooBarWidget 4 days ago | parent | prev | next [-] |
| What gave you the impression that it's "without too much sweat"? They sweated insanely for the past 6 years. They also weren't starting from scratch, they already had a domestic semiconductor ecosystem, but it was fragmented and not motivated. The US sanctions united them and gave them motivation. Also "good" is a matter of perspective. For logic and AI chips they are not Nvidia level, yet. But they've achieved far more than what western commentators gave them credit for 4-5 years ago. And they're just getting started. Even after 6 years, what you're seeing is just the initial results of all that investment. From their perspective, not having Nvidia chips and ASML equipment and TSMC manufacturing is still painful. They're just not paralyzed, and use all that pain to keep developing. With power chips they're competitive, maybe even ahead. They're very strong at GaN chip design and manufacturing. Western observers keep getting surprised by China's results because they buy into stereotypes and simple stories too much ("China can't innovate and can only steal", "authoritarianism kills innovation", "China is collapsing anyway", "everything is fake, they rely on smuggled chips lol" are just a few popular tropes) instead of watching what China is actually doing. Anybody even casually paying attention to news and rumors from China instead of self-congratulating western reports about China could have seen this day coming. This attitude, and the phenomenon of being surprised again and again, is not limited to semiconductors. |
|
| ▲ | FuriouslyAdrift 4 days ago | parent | prev | next [-] |
| AMD's chips outperform nVidia's (Instinct is the GPU compute line at AMD), and at a lower cost per watt and per dollar. AMD literally can't make enough chips to satisfy demand because nVidia buys up all the fab capacity at TSMC. |
| |
| ▲ | greenpizza13 4 days ago | parent | next [-] | | Would you care to provide sources? It's NVIDIA, not nVIDIA. I don't think AMD outperforms NVIDIA chips at price per watt. You need to defend this claim. | | |
| ▲ | FuriouslyAdrift 4 days ago | parent | next [-] | | By NVIDIA's own numbers and widely available testing numbers for FP8, the AMD MI355X just edges out the NVIDIA B300 (both the top performers) at 10.1 PFLOPs per chip at around 1400 W per chip. Neither of these things is available as a discrete device... you're going to be buying a system, but typically AMD Instinct systems run about 15% less than the comparable NVIDIA ones. NVIDIA is a very pricey date. https://wccftech.com/mlperf-v5-1-ai-inference-benchmark-show... https://semianalysis.com/2024/04/10/nvidia-blackwell-perf-tc... https://semianalysis.com/2025/06/13/amd-advancing-ai-mi350x-... | |
| ▲ | SamFold 4 days ago | parent [-] | | There’s a difference between raw numbers on paper and actual real-world differences when training frontier models. There’s a reason no frontier lab is using AMD chips for training: the raw performance benchmarks for a single chip on a single operation type don’t translate to performance during an actual full training run. | |
| ▲ | FuriouslyAdrift 4 days ago | parent [-] | | Meta, in particular, is heavily using AMD GPUs for inference. Also, anyone doing very large models tends to prefer AMD because they have 288GB per chip and outperform for very large models. Outside of these use cases, it’s a toss-up. AMD is also much more aligned with the supercomputing (HPC) world, where they are dominant (AMD CPUs and GPUs power around 140 of the top 500 HPC systems and 8 of the top 10 most energy-efficient). |
|
| |
| ▲ | random_ind_dude 3 days ago | parent | prev [-] | | >It's NVIDIA, not nVIDIA Take a look at their logo. It starts with a lowercase n. |
| |
| ▲ | markus92 4 days ago | parent | prev [-] | | Per dollar, sure, but they’re quite a bit off per watt. Plus, the software ecosystem is still not there. |
|
|
| ▲ | amelius 4 days ago | parent | prev | next [-] |
| My question would be: how did they fab it without access to ASML's high-end lithography machines? https://www.theguardian.com/technology/2024/jan/02/asml-halt... |
| |
| ▲ | FooBarWidget 4 days ago | parent | next [-] | | They've gone all-in with using less advanced equipment (DUV instead of EUV) but advanced techniques (multi-patterning). Also combined with advanced packaging techniques. They're also working hard on replacing ASML DUV machines, since the US is also sanctioning the higher end of DUV machines. Not to mention multiple parallel R&D tracks for EUV. You also need to distinguish between design and manufacturing. A lot of Chinese chip news is about design. Lots of Chinese chip designers are not yet sanctioned, and fabricate through TSMC. A chip design talent pool is important to have, although I find that news a bit boring. The real excitement comes from chip equipment manufacturers, and designers that have been banned from manufacturing with TSMC and need to collaborate with domestic manufacturers. | |
| ▲ | amelius 4 days ago | parent [-] | | > They've gone all-in with using less advanced equipment (DUV instead of EUV) but advanced techniques (multi patterning). But that still seems like a huge step behind using EUV + advanced techniques. Anyway, I'm curious to know how far that gets them in terms of #transistors per square mm. Also, do we know there aren't secret contracts with TSMC? | | |
| ▲ | FooBarWidget 4 days ago | parent | next [-] | | You need to see it from their perspective. "huge step behind" is better than "we have nothing, let's just die". This is the best they have right now, and they're going all in with that until R&D efforts produce something better (e.g., domestic EUV). It could also happen that all their DUV investment allows them to discover a valuable DUV-derived tech tree branch that the west hasn't discovered yet. Results are at least good enough that Huawei can produce 7nm-5nm-ish phones and sell them at a profit. A teardown of the latest Huawei phone revealed that the chips produced more heat than the TSMC equivalent. However, Huawei worked around that by investing massively into advanced heat dissipation technology improvements, and battery capacity improvements. Success in semiconductor products is not achieved along only a single dimension; there are multiple ways to overcome limitations. Another perspective is that, by domestically designing and producing chips, they no longer need to pay the generous margins for foreign IP (e.g., Qualcomm licensing fees), which is a huge cost saving and is beneficial for the economics of everything. | |
| ▲ | martinald 4 days ago | parent | next [-] | | Yes, exactly. Also, to a certain degree, you can just throw loads of GPUs at the problem. So instead of 100k GB200s, you have ~1m of these cards. One thing China _is_ good at is mass manufacturing. There are all sorts of caveats to that, but I really think people are overlooking this scenario. I strongly suspect that they could ramp output of (much?) weaker cards far quicker than TSMC can ramp EUV fabrication. Plus China has vastly superior grid infrastructure. They have a massive oversupply of heavy industry, so even if they hit capacity issues with such gargantuan amounts of cards I can easily see aluminium plants and what not being totally mothballed and supply rerouted to nearby newly built data centres. | |
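A rough sketch of that scaling trade-off; the per-chip ratios below are assumptions made up for the arithmetic, not measured specs:

    gb200_cluster_chips = 100_000  # the hypothetical 100k-GB200 cluster from the comment
    perf_ratio = 1 / 10            # assumed: each weaker card delivers 1/10 the throughput
    power_ratio = 0.6              # assumed: each weaker card draws 60% of the power

    chips_needed = gb200_cluster_chips / perf_ratio
    relative_power = chips_needed * power_ratio / gb200_cluster_chips

    print(int(chips_needed))       # 1,000,000 cards to match aggregate throughput
    print(relative_power)          # ~6x the total power draw, absorbed by cheap grid capacity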
| ▲ | amelius 4 days ago | parent | prev [-] | | > You need to see it from their perspective. "huge step behind" is better than "we have nothing, let's just die". Yes but that doesn't answer the question of how they got so close to nvidia. > It could also happen that all their DUV investment allows them to discover a valuable DUV-derived tech tree branch that the west hasn't discovered yet. But why wouldn't the west discover that same branch but now for EUV? > Results are at least good enough that Huawei can produce 7nm-5nm-ish phones and sell them at profit. Sidenote, I'd love to see some photos and an analysis of the quality of their process. | | |
| ▲ | FooBarWidget 4 days ago | parent | next [-] | | > Yes but that doesn't answer the question of how they got so close to nvidia. Talent pool and market conditions. China was already cultivating a talent pool for decades, with limited success. But it had no market. Nobody, including Chinese, wanted to buy Chinese stuff. Without customers, they lacked practice to further develop their qualities. The sanctions gave them a captive market. That allowed them to get more practice to get better. > But why wouldn't the west discover that same branch but now for EUV? DUV and EUV are very different. They will have different branches. The point however is not whether the west can reach valuable branches or not. It's that western commentators have a tendency to paint Chinese efforts as futile, a dead end. For the Chinese, this is about survival. This is why western commentators keep being surprised by Chinese progress: they expected the Chinese to achieve nothing. From the Chinese perspective, any progress is better than none, but no progress is ever enough. | |
| ▲ | hadlock 4 days ago | parent | prev | next [-] | | China has been producing ARM chips like the A20, H40 (Raspberry Pi-class competitors, dual- and quad-core SoCs; went into a lot of low-end 720p tablets in the early 2010s) for a while now; their semiconductor industry is not zero. The Biden administration turning off the chip supply in 2022 was nearly 3 years ago; three years is not nothing, especially with existing industry, and virtually limitless resources to focus on it. Probably more R&D capacity will be coming online here in the next year or two as the first crop of post-export-control grads start entering the workforce in China. |
| ▲ | 4 days ago | parent | prev [-] | | [deleted] |
|
| |
| ▲ | FooBarWidget 2 days ago | parent | prev [-] | | Another perspective: they don't need to create chips that are as good as Nvidia's. The current strategy is to create less powerful chips that have better yields due to smaller die size. They then scale out huge multi-node AI clusters. This requires more power and bandwidth, but they have plenty of those. Data centers are located near renewable energy sources, for example in the desert, where power is nearly free. They are very good at building networking, so bandwidth is not an issue. They are very good at building efficient power systems (less heat when routing energy) because they are not behind, even in some areas ahead, in power semiconductors (GaN). They still need to innovate in power delivery systems and cooling systems to be able to handle the scale that's required, but that's easier than solving lithography. In other words, they are working on lithography and nanometers, but they're not very worried about those areas because they don't really need them. HN is too myopic, focusing only on single-chip performance and logic chips. |
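The die-size point follows from standard yield arithmetic: under the classic Poisson yield model, the fraction of good dies falls off exponentially with die area. A minimal sketch (the defect density and die sizes below are illustrative assumptions, not real fab figures):

    import math

    def poisson_yield(die_area_mm2, defect_density_per_mm2=0.001):
        # Poisson yield model: Y = exp(-D * A)
        return math.exp(-defect_density_per_mm2 * die_area_mm2)

    for area_mm2 in (200, 400, 800):  # hypothetical die sizes in mm^2
        print(area_mm2, round(poisson_yield(area_mm2), 3))
    # Smaller dies lose exponentially less to defects, so many small chips can be
    # cheaper per unit of compute on a less mature process.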
|
| |
| ▲ | RyanShook 4 days ago | parent | prev [-] | | I think Alibaba uses TSMC for their foundries, like everyone else. I would assume that they did use ASML machines for this. |
|
|
| ▲ | BrawnyBadger53 4 days ago | parent | prev | next [-] |
| The article seems to depict it as similar to the H20 only in memory specs (and still a bit short). Regardless, Nvidia has their moat through CUDA, not the hardware. |
|
| ▲ | impossiblefork 4 days ago | parent | prev | next [-] |
| >- Nvidia doesn't have any magical secret sauce, and China could easily catch up This is the simple explanation. We'll also see European companies matching them in time, probably on inference first. |
| |
| ▲ | 3eb7988a1663 4 days ago | parent [-] | | This is more my thinking as well. How many big tech companies are working on their own internal TPU chip? Google started using them in 2015. It sounds like the basic theory of getting silicon to do matrix multiplication is well established. Sure, you can always be more efficient, but getting a working chip sounds very approachable. AMD hardware has been ~competitive the entire time, but they have squandered all goodwill with their atrocious software support. If China sees an existential risk to getting compute capacity, I can easily see an internal decree to make something happen. Even if it requires designing the hardware + their own CUDA-like stack. |
|
|
| ▲ | mdemare 4 days ago | parent | prev | next [-] |
| > has the market cap of a medium-sized country "According to investors, today's value of Nvidia's expected future profits over its lifetime equals the total monetary value of all final goods and services produced within a medium-sized country in a year." Don't compare market cap with GDP; when you spell it out, it's clear how nonsensical it is. |
|
| ▲ | lotsofpulp 4 days ago | parent | prev | next [-] |
| > Nvidia has the market cap of a medium-sized country This makes no sense. Market cap is share price times number of shares, there is no analog for a country. It’s also not comparable to the GDP of a country, since GDP is a measure of flow in a certain time period, whereas market cap is a point in time measurement of expected performance. |
|
| ▲ | anothernewdude 4 days ago | parent | prev | next [-] |
| Flagship? No, H20 was their cut-down chip they were allowed to sell to China. |
| |
| ▲ | tmottabr 4 days ago | parent [-] | | No, that was the H800. The H200 is the next generation of the H100. |
|
|
| ▲ | gchadwick 4 days ago | parent | prev | next [-] |
| I'd say there's a mix of 'Chinese GPUs are not that good after all' and 'Nvidia doesn't have any magical secret sauce, and China could easily catch up' going on. Nvidia GPUs are indeed remarkable devices with a complex software stack that offers all kinds of possibilities that you cannot replicate overnight (or over a year or two!). However, they've also got a fair amount of generality: anything you might want to do that involves huge amounts of matmuls and vector maths you can probably map to a GPU and do a half-decent job of it. This is good for things like model research and exploration of training methods. Once this is all developed, you can cherry-pick a few specific things to be good at and build your own GPU concentrating on making those specific things work well (such as inference and training on Transformer architectures) and catch up to Nvidia on those aspects, even if you cannot beat or match a GPU on every possible task; you don't care, as you only want to do some specific things well. This is still hard, and model architectures and training approaches are continuously evolving. Simplify things too much and target some ultra-specific things and you end up with some pretty useless hardware that won't allow you to develop next year's models, nor run this year's particularly well. You can just develop and run last year's models. So you need to hit a sweet spot between having enough flexibility to keep up with developments and not adding so much that you have to totally replicate what Nvidia have done. Ultimately the 'secret sauce' is just years of development producing a very capable architecture that offers huge flexibility across differing workloads. You can short-cut that development by reducing flexibility or not caring that your architecture is rubbish at certain things (hence no magical secret sauce). This is still hard and your first gen could suck quite a lot (hence not that good after all), but when you've got a strong desire for an alternative hardware source you can probably put up with a lot of short-term pain for the long-term payoff. |
| |
| ▲ | FooBarWidget 4 days ago | parent [-] | | What does "are not good after all" even mean? I feel there are too many value judgements in that question's tone, of the kind that blindside western observers. I feel like the tone has the hidden implication of "this must be fake after all, they're only good at faking/stealing, nothing to see here, move along". Are they as good as Nvidia? No. News reporters have a tendency to hype things up beyond reality. No surprises there. Are they useless garbage? No. Can the quality issues be overcome with time and R&D? Yes. Is being "worse" a necessary interim step to become "good"? Yes. Are they motivated to become "good"? Yes. Do they have a market that is willing to wait for them to become "good"? Also yes. It used to be no, but the US created this market for them. Also, comparing Chinese AI chips to Nvidia is a bit like comparing AWS with Azure. Overcoming compatibility problems is not trivial; you can't just lift and shift your workload to another public cloud; you are best off redesigning your entire infra for the capabilities of the target cloud. | |
| ▲ | rich_sasha 4 days ago | parent | next [-] | | I think my question made it clear I'm not simply assuming China is somehow cheating here - either in the specs of their current product, or in stealing IP. No, I just struggle to reconcile (but many answers here go some way to clarifying) Nvidia being the pinnacle of the R&D-driven tech industry - not according to me but to global investors - and China catching up seemingly easily. | | |
| ▲ | FooBarWidget 4 days ago | parent [-] | | Unfortunately I think global investors are quite dumb. For example, all the market analysts were very positive about ASML, Nvidia, etc., but they all assumed sales to China would continue according to projections that don't take US sanctions or Chinese competition into account. Every time a sanction landed or a Chinese competitor made a major step forward, it was surprise pikachu, even though enthusiasts who follow news on this topic saw it coming years ago. |
| |
| ▲ | gchadwick 4 days ago | parent | prev [-] | | To me at least "not good after all" means their current latest hardware has issues which mean it cannot replace Nvidia GPUs yet. This is a hard problem, so not getting there yet doesn't imply bad engineering, just a reflection of the scale of the challenge! It also doesn't imply that if this generation is a miss, following generations couldn't be a large win. Indeed I think it would be very foolish to assume that Alibaba or other Chinese firms cannot build devices that can challenge Nvidia here on the basis of the current generation not being up to it yet. As you say, they have a large market that's willing to wait for them to become good. Plus, it may not be true; this new Alibaba chip could turn out to be brilliant. |
|
|
|
| ▲ | ndai 4 days ago | parent | prev | next [-] |
| Isn’t NVIDIA fabless? I imagine (I jump to conclusions) that design is less of a challenge than manufacturing. EUV lithography is incredibly difficult - almost implausible. Perhaps one day a clever scientist will come up with a new, seemingly implausible, yet less difficult way, using “fractal chemical” doping techniques. |
| |
| ▲ | hollerith 4 days ago | parent [-] | | >design is less of a challenge than manufacturing. If so, can you explain why Nvidia's market cap is much higher than TSMC's? (4.15 trillion versus 1.10 trillion) | | |
| ▲ | JeremyNT 4 days ago | parent | next [-] | | I'd just say "market irrationality" and call it a day. TSMC is far closer to a monopoly than NVIDIA is, and they win no matter which fabless company is buying their capacity. | |
| ▲ | ndai 4 days ago | parent | prev [-] | | You could be right. But it could also be due to things like: automatic 401k injections into the market, easy retail investing, and general speculative attitudes. |
|
|
|
| ▲ | SilverElfin 4 days ago | parent | prev | next [-] |
| Perhaps China’s actions are less of a problem for Nvidia and more of a problem for other chip makers. After all, if Alibaba can make this chip, what justifies the valuation of companies like Groq? |
|
| ▲ | spacephysics 4 days ago | parent | prev | next [-] |
| Defaulting to China stealing IP is a perfectly reasonable first step. China is known for countless thefts of European and especially American IP, selling it for a quarter of the price, and destroying the original company nearly overnight. It's so bad even NASA has begun to restrict hiring Chinese nationals (which is more about national defense; however, illegally killing American companies can be seen as a national defense threat as well) https://www.bbc.com/news/articles/c9wd5qpekkvo.amp https://www.csis.org/analysis/how-chinese-communist-party-us... |
| |
| ▲ | robotnikman 4 days ago | parent [-] | | I'm not sure why you are being downvoted; this is well known, and many hacks in the past decade and a half involved exfiltrating stolen IP from various companies. |
|
|
| ▲ | tsoukase 4 days ago | parent | prev | next [-] |
| Just my 2c, totally off the top of my head: - Chinese labs managed to "overcome decades of R&D" because they have been trying for many years now, with unlimited resources, government support, and total disrespect for IP laws - Chinese chips may not be competitive with Western ones in processing power/W, but they have cheaper electricity and, again, unlimited loss capacity - they will probably hit a wall at the software/ecosystem level. CUDA ergonomics are something very difficult to replicate and, you know, developers love ease of use |
|
| ▲ | fearmerchant 4 days ago | parent | prev | next [-] |
| China's corporate espionage might have surpassed France's on the winners' podium. |
|
| ▲ | buckle8017 4 days ago | parent | prev [-] |
| It's all stolen IP. Virtually all products out of China still are. If you want something manufactured, the best way is still to fake a successful crowdsourcing campaign. You'll be able to buy whatever it is on AliExpress (minus any safety features) within 6 months. |
| |
| ▲ | edm0nd 4 days ago | parent [-] | | Yup, this right here. The Chinese are estimated to steal hundreds of billions of dollars' worth of US IP every single year. It's the Chinese way; they just steal or copy everything. Whatever gets them ahead. | |
| ▲ | yunyu 3 days ago | parent [-] | | I wonder what the ethnicities of the Americans who founded, run, and developed NVIDIA and TSMC are. |
|
|