darth_avocado 10 hours ago

Isn’t that what Michael Burry is complaining about? That five years is actually too generous when it comes to depreciation of these assets, and that companies are being too relaxed with that estimate. The real depreciation is more like 2-3 years for these GPUs that cost tens of thousands of dollars apiece.

https://x.com/michaeljburry/status/1987918650104283372

enopod_ 2 hours ago | parent | next [-]

That's exactly the thing. It's only about bookkeeping.

The big AI corps keep pushing depreciation for GPUs further into the future, no matter how long the hardware is actually useful. Some of them are now at 6 years. But GPUs are advancing fast, and new hardware brings more flops per watt, so there's a strong incentive to switch to the latest chips. Also, they run 24/7 at 100% capacity, so after only 1.5 years a fair share of the chips is already toast. How much hardware do they have on their books that's actually not useful anymore? No one knows!

Slower depreciation means more profit right now (for those companies that actually make a profit, like MS or Meta), but it's just kicking the can down the road. Eventually, all these investments have to come off the books, and that's where it will eat their profits. In 2024, the big AI corps invested about $1 trillion in AI hardware; next year is expected to be $2 trillion. The interest payments alone on that are crazy. And all of this comes on top of the fact that none of these companies actually make any profit at all with AI (except Nvidia, of course). There's just no way this will pan out.
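
A minimal sketch of the bookkeeping effect, assuming straight-line depreciation and entirely made-up numbers:

    # Same total write-off, spread over different schedules.
    capex = 100e9  # hypothetical $100B of GPUs

    for years in (3, 6):
        annual_expense = capex / years
        print(f"{years}-year schedule: ${annual_expense / 1e9:.1f}B expense/year")

    # 3-year schedule: $33.3B expense/year
    # 6-year schedule: $16.7B expense/year
    # Stretching the schedule halves the near-term hit to reported
    # profit, but the remaining book value still has to be written
    # off eventually.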

duped 7 hours ago | parent | prev [-]

How different is this from rental car companies changing over their fleets? I don't know, this is a genuine question. The cars cost 3-4x as much and last about 2x as long, as far as I know, and the secondary market is still alive.

logifail 5 hours ago | parent | next [-]

> How different is this from rental car companies changing over their fleets?

New generations of GPUs leapfrog in efficiency (performance per watt) and vehicles don't? Cars don't get exponentially better every 2–3 years, meaning the second-hand market is alive and well. Some of us are quite happy driving older cars (two parked outside our home right now, both with well over 100,000 km on the clock).

If you have a datacentre with older hardware, and your competitor has the latest hardware, you face the same physical space constraints, same cooling and power bills as they do? Except they are "doing more" than you are...

Perhaps we could call it "revenue per watt"?

wongarsu 2 hours ago | parent | next [-]

The traditional framing would be cost per flop. At some point your total cost per flop over the next 5 years will be lower if you throw out the old hardware and replace it with newer, more efficient models. With traditional servers that's typically after 3-5 years; with GPUs, 2-3 years sounds about right.

The major reason companies keep their old GPUs around much longer right now is supply constraints.
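
A back-of-the-envelope version of that calculus, with made-up numbers (a paid-off old card vs. a new card with 4x the throughput at the same power draw):

    # Total cost per FLOP over a fixed horizon, electricity only.
    KWH_PRICE = 0.10      # hypothetical $/kWh
    HOURS = 5 * 365 * 24  # 5-year horizon

    def cost_per_flop(price_usd, watts, flops_per_sec):
        energy_usd = (watts / 1000) * HOURS * KWH_PRICE
        total_flops = flops_per_sec * HOURS * 3600
        return (price_usd + energy_usd) / total_flops

    old = cost_per_flop(0, 700, 1e15)       # capex already sunk
    new = cost_per_flop(30_000, 700, 4e15)  # 4x flops per watt

    print(f"keep old: {old:.2e} $/FLOP")    # ~1.9e-20
    print(f"buy new:  {new:.2e} $/FLOP")    # ~5.2e-20

    # On raw cost per FLOP the paid-off card wins here; replacement
    # only pays off once the efficiency gap is much larger, or once
    # power and rack space are the binding constraint ("revenue per
    # watt", as mentioned above).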

bbarnett 2 hours ago | parent | prev [-]

The used market is going to be absolutely flooded with millions of old cards. I imagine shipping will be the biggest cost for them. The supply side will be insane.

Think a ratio of 100 cards to 1 buyer. Profit for eBay sellers will come from the "handling" in "shipping and handling", i.e. inflated shipping costs.

3form 2 hours ago | parent [-]

I assume NVIDIA and co. already protect themselves in some way, either by these cards not being very useful after resale, or by requiring them to go to the grinder after they expire.

bbarnett an hour ago | parent [-]

Cards don't "expire". There are alternatives to selling cards, but if they don't sell the cards, then there's no transfer of ownership, and NVIDIA would effectively be entering some form of leasing model.

If NVIDIA is leasing, then you can't use those cards as collateral. You also can't write off depreciation. Part of what we're discussing is that terms of credit are being extended too generously, with depreciation in the mix.

They could require some form of contractual arrangement, perhaps volume discounts if buyers agree to destroy the cards at a fixed time. That's very weird though, and I've never heard of such a thing for datacenter gear.

They may protect themselves on the driver side, but someone could still write open-source drivers.

afavour 7 hours ago | parent | prev | next [-]

Rental car companies aren’t offering rentals at deep discount to try to kickstart a market.

It would be much less of an issue if these companies were profitable and could cover the costs of renewing hardware, like car rental companies can.

cjonas 7 hours ago | parent | prev | next [-]

I think it's a bit different because a rental car generates direct revenue that covers its cost. These GPU data centers are being used to train models (which themselves quickly become obsolete) and provide inference at a loss. Nothing in the current chain is profitable except selling the GPUs.

sho 5 hours ago | parent [-]

> and provide inference at a loss

You say this like it's some sort of established fact. My understanding is the exact opposite: inference is plenty profitable; the reason the companies are perpetually in the red is that they're always heavily investing in the next, larger generation.

I'm not Anthropic's CFO so I can't really prove who's right one way or the other, but I will note that your version relies on everyone involved being really, really stupid.

elktown 4 hours ago | parent | next [-]

“like it's some sort of established fact” -> “My understanding”?! A.k.a. pure speculation. Some of you AI fans really need to read your posts out loud before posting them.

teodosin 3 hours ago | parent [-]

You misread the literal first snippet you quoted. There's no contradiction in what you replied to.

elktown 2 hours ago | parent [-]

No?

darkwater 4 hours ago | parent | prev | next [-]

The current generation of today was the next generation of yesterday. So unless the services sold on inference can cover the cost of inference + training AND make money on top, they are still operating at a loss.
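
Put as arithmetic, with entirely hypothetical numbers:

    # Profitability requires revenue to cover both serving costs and
    # the amortized training of the model being served.
    inference_revenue = 10e9  # hypothetical $/year from inference
    inference_cost = 6e9      # hypothetical $/year to serve it
    training_amortized = 7e9  # hypothetical training cost per year

    margin = inference_revenue - inference_cost - training_amortized
    print(f"margin: ${margin / 1e9:.1f}B")  # prints: margin: $-3.0B
    # "Profitable inference" can still be an overall loss once
    # training is included.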

rvba 3 hours ago | parent | prev [-]

Or just "everyone" being greedy

chii 7 hours ago | parent | prev [-]

> the secondary market is still alive.

This is the crux. If a newer, more efficient model comes out, will these data center cards have a secondary market to sell into?

It could be that second-hand AI hardware going into consumers' hands is how they offload it without huge losses.

vesrah 6 hours ago | parent | next [-]

The GPUs going into data centers aren't the kind that can just be reused by putting them into a consumer PC and playing some video games; most don't even have video output ports, and they put out FPS similar to cheap integrated GPUs.

geerlingguy 5 hours ago | parent [-]

And the big ones don't even have typical PCIe sockets; they're useless outside of behemoth rackmount servers requiring massive power and cooling capacity that even well-equipped homelabs would have trouble providing!

physicsguy 6 hours ago | parent | prev | next [-]

Data centre cards don’t have fans and don’t have video out these days.

chii 6 hours ago | parent [-]

I don't mean the consumer market for video cards - I mean consumers buying AI chips to run themselves so they can have it locally.

If I could buy a $10k AI card for less than $5,000, I probably would, if I could use it to run an open model myself.

mike_hearn 2 hours ago | parent | next [-]

Why would you do that when you can pay someone else to run the model for you on newer more efficient and more profitable hardware? What makes it profitable for you and not for them?

mkjs 5 hours ago | parent | prev | next [-]

At that point it isn't a $10k card anymore, it's a $5k card. And possibly not a $5k card for very long in the scenario that the market has been flooded with them.

darkwater 4 hours ago | parent | prev | next [-]

How many "yous" are there in the world? Probably just enough to buy what's inside a single Azure DC?

physicsguy 6 hours ago | parent | prev | next [-]

Ah, well, yes, to a degree that's possible, but at least at the moment you'd still be better off buying a $5k Mac Studio if it's just inference you're doing.

esseph 2 hours ago | parent | prev [-]

You need the hardware to wrap that in, and the power draw is going to be... significant.
