Fiveplus 8 hours ago

Calling Nvidia niche feels a bit wild given their status quo right now, but from a foundry perspective, it seems true. Apple is the anchor tenant that keeps the lights on across 12 different mature and leading-edge fabs.

Nvidia is the high-frequency trader hammering the newest node until the arb closes. Stability usually trades at a discount during a boom, but Wei knows the smartphone replacement cycle is the only predictable cash flow. Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?

anoojb 7 hours ago | parent | next [-]

So let's say TSMC reciprocated Apple's consistency as a customer by giving them preferential treatment for capacity. It's good business, after all.

However, everyone knows that good-faith reciprocity at that scale is not rewarded. Apple is ruthless. There are probably thousands of untold stories of how hard Apple has hammered its suppliers over the years.

While Apple has good consumer brand loyalty, they arguably treat their suppliers relatively poorly compared to a gold standard like Costco.

Aurornis 5 hours ago | parent | next [-]

At this scale and volume, it's not really about good faith.

Changing fabs is non-trivial. If TSMC pushed Apple to the point of having to find an alternative (which is another story) and Apple did switch, TSMC would have to work extra hard to win them back in the future. Apple wouldn't want to invest twice in changing back and forth.

On the other hand, TSMC knows that changing fabs is not really an option and Apple doesn't want to do it anyway, so they have leverage to squeeze.

At this level, everyone knows it's just business and it comes down to optimizing long-term risk/reward for each party.

philistine 4 hours ago | parent | next [-]

Apple has used both Samsung and TSMC for its chips in the past. Until the A7 it was Samsung, A8 was TSMC, and the A9 was dual-sourced by both! Apple is used to switching between suppliers fairly often for a tech company; it's not that it's too hard for them to switch fab, it's that TSMC is the only competitive fab right now.

There are rumours that Intel might win some business from them in the next two years. I could totally see Apple turning to Intel for the Mac chips, since they're much lower volume. I know it sounds crazy, we just got rid of Intel, but I'm talking about using Intel as a fab, not going back to x86. Those days are done.

lukan 3 hours ago | parent | next [-]

But wasn't the reason they split with Samsung that, in Jobs's view, Samsung had copied the iPhone (to which he reacted with thermonuclear threats)?

They did have the expertise from building it, after all. What would happen if TSMC now built an M1 clone? I doubt anyone wants to go down that road, but it seems like an implied threat that gets priced in.

thewebguyd 3 hours ago | parent | next [-]

Jobs' thermonuclear threats were about Android and Google, not Samsung, because Schmidt was on Apple's board during the development of Android.

> "I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go thermonuclear war on this."

The falling out with Samsung was related, but more about the physical look of the phone.

fragmede 2 hours ago | parent | prev [-]

Doesn't seem likely, TBH. Never mind the legal agreements they would be violating, TSMC fabs Qualcomm's Snapdragon line of ARM processors. The M1 is good, but not that good (it's a couple of generations old by this point, for one). Samsung had a phone line of their own to put it in, as well. TSMC does not.

chippiewill an hour ago | parent | prev [-]

I thought Intel was too far behind on their process nodes?

wtallis an hour ago | parent [-]

At the end of the month, laptops with Intel's latest processors will start shipping. These use Intel's 18A process for the CPU chiplet. That makes Intel the first fab to ship a process using backside power delivery. There's no third party testing yet to verify if Intel is still far behind TSMC when power, performance and die size are all considered, but Intel is definitely making progress, and their execs have been promising more for the future, such as their 14A process.

CodeWriter23 4 hours ago | parent | prev | next [-]

Apple is the company that just over 10 years ago made a strategic move to remove Intel from their supply chain by purchasing a semiconductor firm and licensing ARM. Managing 'painful' transitions is a core competency of theirs.

Zafira 3 hours ago | parent | next [-]

I think you’re correct that they’re good at just ripping the band-aid off, but the details seem off. AFAIK, Apple has always had a license with ARM and a very unique one since they were one of the initial investors when it was spun out from Acorn. In fact, my understanding is that Apple is the one that insisted they call themselves Advanced RISC Machines Ltd. because they did not want Acorn (a competitor) in the name of a company they were investing in.

wtallis 4 hours ago | parent | prev [-]

Which acquisition are you referring to? Apple bought PA Semi in 2008 and Intrinsity in 2010.

MBCook 2 hours ago | parent | prev | next [-]

Not all of Apple's chips need to be fabbed at the smallest size, those could certainly go elsewhere. I'm sure they already do.

Is there anyone who can match TSMC at this point for the top of the line M or A chips? Even if Intel was ready and Apple wanted to would they be able to supply even 10% of what Apple needs for the yearly iPhone supply?

7speter 3 hours ago | parent | prev | next [-]

I would imagine they could split their orders between different fabricators; they can put in orders for the most cutting edge chips for the latest Macs and iPhones at TSMC and go elsewhere for less cutting edge chips?

fsckboy 3 hours ago | parent [-]

presumably they already do that (since non-cutting-edge chip fab is likely to be more competitive and less expensive) so, given they are already doing that, this problem refers to the cutting-edge allocations, which are getting scarce, as exemplified at least by Nvidia's growth

jongjong 2 hours ago | parent | prev [-]

It's ridiculous that a trillion dollar company feels beholden to a supplier. With that kind of money, it should be trivial to switch. People forget Nvidia didn't even exist 35 years ago. It would probably take like 3 to 5 years to catch up with the benefit of hindsight and existing talent and tools?

And anyway, consumers don't really need beefy devices nowadays. Running a local LLM on a smartphone is a terrible idea given battery life and the lack of a discrete GPU; AI is going to be running on servers for quite some time, if not forever.

It's almost as if there is a constant war to suppress engineer wages... That's the only variable being affected here which could benefit from increased competition.

If the tech sector is so anti-competitive, the government should just seize it and nationalize it. It's not capitalism when these megacorps apply all this superficial pressure but end up making deals all the time. We need more competition, no deals! If they don't have competition, we might as well have communism.

weslleyskah 2 hours ago | parent | next [-]

I know you are maybe joking, but I don't think the government nationalizing the tech sector would be a good idea. They could pull salaries down even more if they wanted. It can become a dead-end job where you're stuck maintaining archaic technology from older systems.

Government jobs should only be an option if there are enough social benefits.

jongjong 2 hours ago | parent [-]

I'm joking yes but as an engineer who has seen the bureaucracy in most big tech companies, the joke is getting less funny over time.

I've met many software engineers who call themselves communists. I can kind of understand. This kind of communist-like bureaucracy doesn't work well in a capitalist environment.

It's painful to work in tech. It's like our hands are tied and we're forced to do things in a way we know is inefficient. Companies use 'security' as an excuse to restrict options (tools and platforms) and treat engineers as replaceable cogs as an alternative to trusting them to do their job properly... And the companies reap what they sow. They get reliable cogs, well versed in compliance and groupthink, and also, coincidentally, full-blown communists; they're the only engineers remaining who actually enjoy the insane bureaucracy and the social-climbing opportunities it represents, given the lack of talent.

weslleyskah an hour ago | parent [-]

I understand completely.

I'm going through a computer engineering degree at the moment, but I am thinking about pursuing Law later on.

Looking at other paths: Medicine requires expensive schooling and isn't really an option after a certain age; law, on the other hand, opened its doors too widely and now has a large underclass of people with third-tier law degrees.

Perhaps you can try to accept the realities of the system while trying to live the best life that you can?

Psyching yourself all the way, trying to find some sort of escape towards a good life with freedom later on...

cgio 2 hours ago | parent | prev [-]

It can be interpreted a different way too. Apple is just a channel for TSMC's technology. Also, the cost to build a fab that advanced, on say a 3-year horizon, let alone immediately available, is not one even Apple can commit to without cannibalising its core business.

sellmesoap 2 hours ago | parent | prev | next [-]

About 17 years ago I worked at a company that was clamoring to get products into Costco, when we did I was shocked at the fees they charged us for returns. If they're the gold standard for supplier relations it's a wonder anyone bothers being a supplier.

an hour ago | parent [-]
[deleted]
hinkley 4 hours ago | parent | prev | next [-]

Apple loaned TSMC money in order to build manufacturing capacity back around the M1 era. They’ve done that for a number of suppliers and the “interest payments” were priority access to capacity. Everyone was complaining about how Apple got ARM chips while others had to wait in line.

That said, they did that for a sapphire glass supplier for the Apple Watch and when their machines had QC problems they dropped them like a rock and went back to Corning.

But is that really any different from any other supplier? And who tf do you think they’re going to drop TSMC for right now? They are the cock of the walk.

bigyabai 3 hours ago | parent [-]

> And who tf do you think they’re going to drop TSMC for right now?

Don't look now: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...

boringg 6 hours ago | parent | prev | next [-]

The counterargument is: is NVIDIA friendly to their supply chain? I have to think that maybe they are, with their massive margins, because they can be; their end buyers are currently willing to absorb costs, sparing no expense. But I don't know, and that will change as their business changes.

Your underlying statement implies that whoever is replacing Apple is a better buyer, which I don't think is necessarily true.

philistine 4 hours ago | parent | next [-]

Nvidia is famously a pain to work with. Apple vowed never to use their chips; Microsoft and Sony can't get them to make any GPU for their consoles.

The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.

7speter 3 hours ago | parent | next [-]

> The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.

And that's probably because Nintendo isn't adding any pressure to either TSMC or Nvidia, capacity-wise; iirc Nintendo uses something like Maxwell or Pascal on really mature processes for Switch chips/SoCs.

Macha 3 hours ago | parent [-]

And also, the Switch 1 was just the hardware of an Nvidia Shield tablet from Nvidia's perspective, without the downside of managing the customer-facing side and with the greater volume from Nintendo's market reach. (Not that it wasn't more than that for consumers or Nintendo, just talking Nvidia here.)

randall 3 hours ago | parent | prev [-]

I think that works out tremendously well for Nintendo, especially when you look at the Wii-U vs the Switch.

I shot a video at CNET in probably 2011 featuring a single touchscreen display (I think it was the APX 2500 prototype, iirc?) and it had precisely the dimensions of the Switch 1.

Nintendo was reluctantly a hardware company... they're a game company who can make hardware, but they know they're best when they own the stack.

Y-bar 5 hours ago | parent | prev | next [-]

> EVGA Terminates Relationship With Nvidia, Leaves GPU Business

> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.

https://www.gamespot.com/articles/evga-terminates-relationsh...

boringg 5 hours ago | parent [-]

So not favorable to Apple as a buyer.

Y-bar 4 hours ago | parent [-]

Funnily enough, Apple and Nvidia have old beef with one another; this especially led them to sever ties:

https://www.semiaccurate.com/2010/07/11/investigation-confir...

marcosdumay 6 hours ago | parent | prev [-]

If your customers are known to be antagonistic to business partners, the correct answer is to diversify them as much as you can, even at reasonable cost to everything else.

That means deprioritizing your largest customer.

boringg 5 hours ago | parent [-]

Fair. I feel like that also speaks to nation-states' trade policy.

Also, there's the devil you know and the devil you don't know.

simonh 4 hours ago | parent [-]

Yep, you can be close allies with a nation and have many shared interests, and even a trade deficit with them as we in Britain did, and then they stab you in the back with tariffs.

leoc 4 hours ago | parent | prev | next [-]

Even if Apple isn't very good at reciprocating faithful service from its suppliers, there's also the matter of how it treats suppliers who cause it problems instead.

5 hours ago | parent | prev | next [-]
[deleted]
boplicity 3 hours ago | parent | prev | next [-]

Suppliers really hate working with Costco. They're slow to pay, allow for only small margins, and often demand too high a percentage of a business's revenue, none of which is friendly towards suppliers.

bethekidyouwant 7 hours ago | parent | prev | next [-]

Agreed, TSMC can do whatever they want. In 2027 no other fab will match what TSMC has today; anything that requires the latest process node is going to get more expensive, so your Apple silicon and your AMD chips.

high_na_euv 3 hours ago | parent [-]

As of today, Intel is very close to the leading node.

girvo 2 hours ago | parent [-]

I'll believe it when I see it (at scale). I hope 18A is good enough as competition is good, and a weak Intel is bad for us all.

dheera 5 hours ago | parent | prev | next [-]

No public company will be loyal or nice to their suppliers. That is just not in the playbook for public companies. They have "fiduciary duty", not human duty.

Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.

Forgeties79 7 hours ago | parent | prev [-]

> they arguably treat their suppliers relatively poorly compared to the Gold standard like Costco.

I’m not saying you’re wrong, but your previous paragraph sounded like you were wondering whether it was the case, whereas here you’re saying it’s known. Is this all true? Do they have a reputation for hammering their suppliers?

xp84 6 hours ago | parent | next [-]

Apple is so notoriously ravenous for profit margin that they can’t not be that way.

Forgeties79 4 hours ago | parent [-]

It felt like a more confident statement, and I was legitimately asking. I have little love for Apple. I ditched my Mac Studio earlier this year for a Linux-only build after 20 years of being on Macs. I say this because I think folks assume I am trying to sealion/"just ask questions™" or some nonsense, when I am legitimately asking whether this is a documented practice and what its extent is. I am not finding it easy to find info on this.

bigyabai 6 hours ago | parent | prev [-]

Apple dealt exclusively with Chinese labor prices until they were directly threatened by the POTUS. You tell me.

yurishimo 6 hours ago | parent [-]

I got a bridge to sell you if you think that Apple is going to bring any of their manufacturing to the US...

WillPostForFood 5 hours ago | parent | next [-]

https://www.bbc.com/news/articles/c86jx18y9e2o

Apple has responded and has started moving a lot of manufacturing out of China. It just makes sense for risk management.

mullingitover 4 hours ago | parent | next [-]

Well, from your article:

> China will remain the country of origin for the vast majority of total products sold outside the US, he added.

And international sales are a solid majority of Apple's revenue.

Forgeties79 4 hours ago | parent | prev [-]

From your article:

> Meanwhile, Vietnam will be the chief manufacturing hub "for almost all iPad, Mac, Apple Watch and AirPods product sold in the US".

> "We do expect the majority of iPhones sold in US will have India as their country of origin," Mr Cook said.

Still not made in the US and no plan to change that. They will be selling products made in India/Vietnam domestically and products made in China internationally.

The tariffs are not bringing these jobs home.

bigyabai 6 hours ago | parent | prev | next [-]

I've seen the leaked BOMs, I'm not dumb enough to think that Americans can match it.

savorypiano 6 hours ago | parent | next [-]

Where do you find the leaked BOMs?

bigyabai 2 hours ago | parent [-]

RedNote usually, before it's deleted.

cindyllm 5 hours ago | parent | prev [-]

[dead]

godzillabrennus 6 hours ago | parent | prev [-]

It would be a $6000 phone if they built it in America.

rafterydj 8 hours ago | parent | prev | next [-]

I tend to agree with you, feels to me like the root of this is essentially whether foundries will "go all in" on AI like the rest of the S&P 500. But why trade away one trillion-dollar customer for another trillion-dollar customer if the first one is never going away, and the second one might?

Fiveplus 8 hours ago | parent | next [-]

I think it is less of a trade and more of a symbiotic capital cycle, if I can call it that?

Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs...the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity and amortize that depreciation schedule over a decade.

alex43578 8 hours ago | parent [-]

Isn’t the smaller die aspect more valuable early in the node’s maturity, where defects are less punishing?

Fiveplus 8 hours ago | parent [-]

That is the traditional textbook yield curve logic, if I'm not wrong? Smaller area = higher probability of a surviving die on a dirty wafer. But I wonder if the sheer margin on AI silicon basically breaks that rule? If Nvidia can sell a reticle-sized package for 25k-30k USD, they might be perfectly happy paying for a wafer that only yields 30-40% good dies.

Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?

genocidicbunny 8 hours ago | parent | next [-]

I am curious about the binning factor too since in the past, AMD and Intel have both made use of defect binning to still sell usable chips by disabling cores. Perhaps Apple is able to do the same with their SoCs? It's not likely to be as granular as Nvidia who can disable much smaller areas of the silicon for each of their cores. On the other hand, the specifics of the silicon and the layout of the individual cores, not to mention the spread of defects over the die might mitigate that advantage.

ricw 7 hours ago | parent [-]

They do bin their chips. Across the range (A- and M-series) they have the same chip with fewer / disabled CPU and GPU cores. You pay a premium for the ones with more cores. Unsure about chip frequencies; Apple doesn't disclose those openly, from what I know.

nebula8804 8 hours ago | parent | prev | next [-]

I thought they binned CPUs for things like AppleTV and lower cost iPads?

jsheard 8 hours ago | parent [-]

Yeah, most of their chips have two or more bins with different core configs, and the lower bins probably use salvaged dies.

For example the regular M4 can have 4 P-cores / 6 E-cores / 10 GPU cores, or 3/6/10 cores, or 4/4/8 cores, depending on the device.

They even do it on the smaller A-series chips - the A15 could be 2/4/5, 2/4/4, or 2/3/5.
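A toy way to see why those salvage bins matter for the economics: if you assume (hypothetically) that per-core defects are independent and a die with a couple of bad cores can still be sold as a lower bin, the sellable fraction jumps dramatically:

```python
from math import comb

def sellable_fraction(n_cores: int, p_core_bad: float, max_bad: int) -> float:
    """P(at most max_bad of n_cores are defective), assuming independent defects."""
    return sum(
        comb(n_cores, k) * p_core_bad**k * (1 - p_core_bad) ** (n_cores - k)
        for k in range(max_bad + 1)
    )

p = 0.02  # hypothetical per-core defect probability

top_bin_only = sellable_fraction(10, p, 0)  # all 10 GPU cores must be perfect
with_salvage = sellable_fraction(10, p, 2)  # up to 2 bad cores go to lower bins
print(f"top bin only: {top_bin_only:.1%}, with salvage bins: {with_salvage:.1%}")
```

With these made-up numbers, requiring a perfect die sells about 82% of them, while allowing two-core salvage bins sells over 99%; that gap is pure margin recovered from otherwise scrapped silicon.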

alex43578 8 hours ago | parent | prev | next [-]

With current AI pricing for silicon, I think the math’s gone out the window.

For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.

NVIDIAs flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they’re cutting some of those SKUs for being too vram heavy relative to MSRP.

atq2119 7 hours ago | parent | next [-]

Datacenter GPU dies cannot be binned for GeForce because they lack fixed-function graphics features. Raytracing acceleration in particular must take up non-trivial area that you wouldn't want to spend on a datacenter die. Not to mention the data fabric is probably pretty different.

touisteur an hour ago | parent | next [-]

The A40, L40S and Blackwell 6000 Pro Server have RT cores. Three datacenter GPUs.

If you want binning in action, the RTX ones other than the top ones are it. Look at the A30 too, which I was surprised to see no successor for. Either they had better yields on Hopper or they didn't get enough out of the A30...

alex43578 7 hours ago | parent | prev [-]

I’m not saying they’re binning between datacenter and 3060s, but within gaming, and between gaming and RTX Pro cards, there’s binning.

As you cut SMs from a die you move from the 3090 down the stack, for instance. That’s yield management right there.

wtallis 8 hours ago | parent | prev [-]

> For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads

The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.

> NVIDIAs flexibility came from using some of those binned dies for GeForce cards

NVIDIA's datacenter chips don't even have display outputs, and have little to no fixed-function graphics hardware (raster and raytracing units), and entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).

alex43578 7 hours ago | parent | next [-]

They’re binning within those product lines - both NVIDIA and Apple.

Not binning an M4 Max for an iPhone, but an M4 Pro with a few GPU or CPU cores disabled is clearly a thing.

Same for NVIDIA. The 4080 is a 4090 die with some SMs disabled.

wtallis 6 hours ago | parent [-]

> The 4080 is a 4090 die with some SMs disabled.

The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.

seanmcdirmid 7 hours ago | parent | prev [-]

There are levels inside pro, max, and ultra that might be the product of binning?

sgjohnson 7 hours ago | parent [-]

"Ultra" isn't even binned - it's just 2x "Max" chips connected together.

Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.

And even all of the above mentioned marketing names come in different core configurations. M4 Max can be 14 CPU Cores / 32 GPU cores, and it can also be 16 CPU cores and 40 GPU cores.

So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.

Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.

In retrospect, Apple ditching Intel was truly a gamechanging move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.

wtallis 7 hours ago | parent | next [-]

> yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.

No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure a M4 Max or M4 Pro SoC package could even fit in a MacBook Air.

As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.

seanmcdirmid 6 hours ago | parent | prev | next [-]

No, I think you have it wrong.

There are two levels of Max chip, but think of a Max as two Pros on one die (this is a simplification; you can also think of a Pro as two base chips tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.

7 hours ago | parent | prev [-]
[deleted]
7 hours ago | parent | prev [-]
[deleted]
alt227 5 hours ago | parent | prev [-]

Why are foundries going 'all in' on AI? They fab chips for customers; it doesn't matter what the chips are or who the customer is. 'Who will pay the most for us to make their chips first' is the only question TSMC will be asking. The customer's market is irrelevant.

jonas21 3 hours ago | parent | prev | next [-]

AI capex may or may not flatten in the near future (and I don't necessarily see a reason why it would). But smartphone capex already has.

Like smartphones, AI chips also have a replacement cycle. AI chips depreciate quickly -- not because the old ones go bad, but because the new ones are so much better in performance and efficiency than the previous generation. While smartphones aren't making huge leaps every year like they used to, AI chips still are -- meaning there's a stronger incentive to upgrade every cycle for these chips than smartphone processors.

chuckadams 3 hours ago | parent [-]

> AI chips depreciate quickly -- not because the old ones go bad

I've heard that it's exactly that, reports of them burning out every 2-3 years. Haven't seen any hard numbers though.

TeMPOraL 2 hours ago | parent [-]

Lifetime curve is something they can control. If they can predict replacement rate, makes sense to make chips go bad on the same schedule, saving on manufacturing costs.

nialv7 28 minutes ago | parent | prev | next [-]

> the smartphone replacement cycle is the only predictable cash flow

people are holding onto their phones for longer: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...

to11mtm 16 minutes ago | parent [-]

Still more predictable than GPU buys in the current climate. Power-connector melting aside, GPUs in most cases get replaced less frequently than cell phones, unless of course you have lots of capital/profit infusion to, for whatever reason, stay ahead of the game.

Heck, if Apple wanted to be super cheeky, they could probably still pivot on the reserved capacity to do something useful (e.g. a revised older design for whatever node they reserved, where they can get more chips per wafer for the cheaper models).

NVDA, on the other hand, is burning a lot of goodwill in their consumer space, and if a competitor somehow manages to outdo them it could be catastrophic.

onion2k 4 hours ago | parent | prev | next [-]

Nvidia have been using TSMC since the Riva 128. That's before Apple started making any of their own silicon. GPUs are easily as predictable as mobile phones.

AceJohnny2 4 hours ago | parent [-]

> GPUs are easily as predictable as mobile phones

They really, absolutely, are not.

It's not about "will there be a new hardware", it's about "is their order quantity predictable"

3 hours ago | parent | prev | next [-]
[deleted]
Spooky23 5 hours ago | parent | prev | next [-]

Apple has to price in the risk of the US government forcing their hand in various ways. They have a negotiating disadvantage.

kelnos 4 hours ago | parent | prev | next [-]

On the other hand, it's not like Apple can just switch fabs without any cost or difficulty. Sure, TSMC is undoubtedly happy to have a customer with predictable needs, but Apple is also subject to some level of lock-in.

8 hours ago | parent | prev | next [-]
[deleted]
epolanski 8 hours ago | parent | prev | next [-]

Regardless of that, the fab industry is based on short- and mid-term auction-like planning.

If Nvidia pays more, Apple has to match.

swiftcoder 7 hours ago | parent [-]

> Regardless of that, fab industry is based on a short and mid term auction-like planning

Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.

You can't let all your other customers die just because Nvidia is flush with cash this quarter...

xp84 6 hours ago | parent | next [-]

> die

Is the argument that Apple will go out of business? AAPL?

Wait,

> one player has a short-term ability to vastly outspend all the rest.

I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.

Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.

epolanski 5 hours ago | parent | prev | next [-]

That's exactly how it is supposed to work, and Apple has outspent competitors for ages to get priority.

TSMC isn't running a charity; it sells capacity to the highest bidder.

Of course, customers as big as Apple will have a relationship and insane volumes such that they'll be guaranteed substantial allocations regardless.

michaelt 4 hours ago | parent [-]

Why should it be short term, though?

If it takes 4 years to build a new fab and Apple is willing to commit to paying the price of an entire fab for chips to be delivered in 4 years' time, why not take the order and build the capacity?

epolanski 4 hours ago | parent [-]

I mean, these things are likely already written down and Apple still gets lots of capacity for the reasons you mention.

But Nvidia has also spent billions a year at TSMC for more than a decade, and that just keeps increasing.

bigyabai 6 hours ago | parent | prev [-]

> Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.

Well yeah, people were identifying that back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.

swiftcoder 3 hours ago | parent [-]

> It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight

It's not "build better hardware" though, it's "continue to ship said hardware for X number of years". If someone buys out the entire fab capacity and then goes under next year, TSMC is left holding the bag

bigyabai 2 hours ago | parent [-]

It's not that, either. Low-margin, high-volume contracts are the worst business you can take. It devalues TSMC's work and creates an unnatural downward force on the price of cutting-edge silicon. By ignoring Apple's demands they're creating natural competition that raises the value of their entire portfolio.

It really is about making better hardware. Apple would be outbidding Nvidia right now if the iPhone had equivalent value-add to Nvidia hardware. Alas, iPhones are overpriced and underpowered, as most people will agree.

827a 6 hours ago | parent | prev | next [-]

I would also bet significant money that Apple's unique market position will give them the confidence to invest in in-house fabrication before 2030.

paulmist 4 hours ago | parent | next [-]

Would it be feasible for them to buy Intel instead? Starting your own foundry would likely take over a decade.

827a 2 hours ago | parent [-]

Yup; or potentially just purchasing a fab from them, given that Intel has signaled they want to leverage TSMC more, and much of Intel's remaining value is wrapped up in server-grade chips that Apple wouldn't be interested in.

But also: Apple is one of the very few companies at their size with the political environment to make, and more importantly succeed at, decade-long investments. The iPhone wasn't an obvious success for 5 or 6 years. They started designing their own iPhone chips around the iPhone 4, iirc, and pundits remarked that it wasn't a good idea; today, the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world by 25%, at a tenth the power draw and with no active cooling (e.g. vs. the 9950X3D). Apple Maps (enough said). We're seeing similar investments today, things we could call "failures" that in 10 years we'll think were obviously going to be successes (cough, Vision Pro).

eschneider 6 hours ago | parent | prev [-]

Very much this.

Bombthecat 7 hours ago | parent | prev | next [-]

I doubt that we will hit diminishing returns in AI. We keep finding new ways to make models faster, cheaper, better, or even able to train themselves...

The flat-line prediction is now two years old...

eikenberry 5 hours ago | parent | next [-]

I thought the prediction was that the scaling of LLMs making them better would plateau, not that all advancement would stop? And that has pretty much happened as all the advancements over the last year or more have been architectural, not from scaling up.

aaronblohowiak 7 hours ago | parent | prev | next [-]

Feels like the top of an S-curve lately.

sfn42 an hour ago | parent | prev [-]

You say that, but to me they seem roughly the same as they've been for a good while. Wildly impressive technology, very useful, but also clearly and confidently incorrect a lot. Most of the improvement seems to have come from other avenues - search engine integration, image processing (it still blows my mind every time I send a screenshot to an LLM and it gets it) and stuff like that.

Sure maybe they do better in some benchmarks, but to me the experience of using LLMs is and has been limited by their tendency to be confidently incorrect which betrays their illusion of intelligence as well as their usefulness. And I don't really see any clear path to getting past this hurdle, I think this may just be about as good as they're gonna get in that regard. Would be great if they prove me wrong.

apercu 7 hours ago | parent | prev | next [-]

"Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?"

That's the take I would pursue if I were Apple.

A quiet threat of "We buy wafers on consumer demand curves. You're selling them on venture capital and hype."

Tuna-Fish 3 hours ago | parent | next [-]

Why should that change TSMC's decision making even a little?

The reality is that TSMC has no competition capable of shipping an equivalent product. If AI fizzles out completely, the only way Apple can choose to not use TSMC is if they decide to ship an inferior product.

A world where TSMC drains all the venture capital out of the AI startups, using Nvidia as an intermediary, and then the bubble pops and they all go under is a perfectly happy place for TSMC. In these market conditions they are asking for cash upfront. The worst that can happen is that they overbuild capacity using other people's money they don't have to pay back, leaving them in an even more dominant position in the crash that follows.

apercu an hour ago | parent [-]

Because Apple can play hard(er) ball in 12 or 18 or 24 months when this (likely) irrational spending spree dies?

Business is a little more nuanced than this audience thinks, and it’s silly to think Apple has no leverage.

bigyabai 6 hours ago | parent | prev | next [-]

Nvidia is not a venture capital outlet. They are a self-sustaining business with several high-margin customers that will buy out their whole product line faster than any iPhone or Mac.

From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.

toasterlovin 6 hours ago | parent [-]

> several high-margin customers

This is the "venture capital and hype" being referred to, not Nvidia themselves.

apercu 5 hours ago | parent | next [-]

Thanks. I didn't think my comment was super nuanced.

bigyabai 3 hours ago | parent | prev [-]

But Nvidia has had high-profile industry partners for decades. Nintendo isn't "venture capital and hype", nor are PC gaming and HPC datacenter workloads.

That line is purified cope.

toasterlovin an hour ago | parent [-]

But Nvidia wasn't able to compete with Apple for capacity on new process nodes with Nintendo volumes (the concept is laughable; compare Apple device unit volumes to game console unit volumes). What has changed in the semiconductor industry is overwhelming demand for AI focused GPUs, and that is paid for largely with speculative VC money (at this point, at least; AI companies are starting to figure out monetization).

dude250711 7 hours ago | parent | prev [-]

[flagged]

morsch 6 hours ago | parent [-]

Louis Vuitton didn't make 18% of all handbags in 2024.