bluGill 16 hours ago

I question the depreciation. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not the jumps we got in the 1990s).

rlpb 16 hours ago | parent | next [-]

> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question

Doesn't one follow from the other? If newer GPUs aren't worth an upgrade, then surely the old ones aren't obsolete by definition.

bluGill 13 hours ago | parent | next [-]

That is the question: will they be worth the upgrade? Either because they are that much faster, or that much more energy efficient. (And also assuming you can get them; unobtainium is worth less than what you already have.)

Also a nod to the other reply that suggests they will wear out in 5 years. I cannot comment on whether that is correct, but it is a valid worry.

carlCarlCarlCar 16 hours ago | parent | prev [-]

MTBF for data center hardware is short; DCs breeze through GPUs compared to even the hardest of hardcore gamers.

And there is the whole FOMO effect in business purchases; decision makers will worry their models won't be as fast.

Obsolete doesn't mean the reductive notion you have in mind, where a card is fine as long as it can theoretically still push pixels. Physics will burn them up, and "line go up" will drive demand to replace them.

zozbot234 10 hours ago | parent [-]

Source? Anecdotally, GPUs sourced from cryptomining were absolutely fine MTBF-wise. Zero apparent issues of wear-and-tear or any shortened lifecycle.

dghlsakjg 10 hours ago | parent | next [-]

My bellybutton-fluff, uninformed opinion is that heat cycling and effective cooling are probably much bigger limiting factors.

If you are running a GPU at 60°C for months at a time, but never idling it (the crypto use case), I would actually hazard a guess that that is better than cycling it through intermittent workloads, due to thermal expansion.

That of course presupposes effective, consistent cooling.

brokenmachine 6 hours ago | parent | prev [-]

Anecdotally, I killed two out of two that I was hobby-mining on for a couple of years. They certainly didn't sound like they would work forever.

Negitivefrags 16 hours ago | parent | prev | next [-]

I recently compared performance per dollar on benchmarks for CPUs and GPUs, today vs 10 years ago, and surprisingly, CPUs had much bigger gains. Until I saw that for myself, I thought exactly the same thing as you.

It seems shocking given that all the hype is around GPUs.

This probably wouldn't be true for AI-specific workloads, because one of the other things that happened there in the last 10 years was optimising specifically for math with smaller floats.
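
As a rough illustration of what "smaller floats" buy you (a minimal numpy sketch; the dedicated low-precision units come on top of this):

    import numpy as np

    # The same matrix stored in two precisions: float16 takes half the
    # bytes of float32, so twice as many values fit through the same
    # memory bandwidth, before counting any tensor-core-style hardware.
    a32 = np.random.rand(1024, 1024).astype(np.float32)
    a16 = a32.astype(np.float16)

    print(a32.nbytes // 1024, "KiB vs", a16.nbytes // 1024, "KiB")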

PunchyHamster 10 hours ago | parent | next [-]

It's because of the use cases. Consumer-wise, if you're a gamer, the CPU just needs to be at "not the bottleneck" level for the majority of games, as the GPU does most of the work once you start increasing resolution and detail.

And many pro-level tools (especially in the media space) offload to the GPU simply because of its much higher raw compute power.

So, basically, for many users the gain in performance won't be as visible in their use cases.

selectodude 15 hours ago | parent | prev [-]

That makes sense. Nvidia owns the market and is capturing all the surplus value. They’re competing with themselves to convince you to buy a new card.

levocardia 10 hours ago | parent | prev | next [-]

It's not that hard to see the old GPUs being used, e.g., for inference on cheaper models, or sub-agents, or mid-scale research runs. I bet Karpathy's $100 / $1000 nanochat models will be <$10 / <$100 to train by 2031.

maxglute 16 hours ago | parent | prev | next [-]

I think the real issue is that current costs/demand mean Nvidia is gouging on GPU prices, so the hardware:power cost split is 70:20 instead of 50:40 (with 10 for the rest of the datacenter). The reality is that GPUs are a serendipitous, path-dependent lock-in from gaming -> mining. TPUs are more power efficient. If the bubble pops and demand for compute goes down, Nvidia + TSMC will still be around, but the premium on next-gen AI-first bespoke hardware will revert towards the mean, and we're looking at 50% less expensive hardware (no AI-race scarcity tax, i.e. 75% Nvidia margins) that uses 20% less power / opex. All of a sudden, existing data centers become unprofitable stranded assets even if they can be stretched past 5 years.
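
To put rough numbers on that (a back-of-the-envelope sketch using only the ratios above, not real figures):

    # Toy cost index: today's datacenter = 100 total.
    old_hw, old_power, old_rest = 70, 20, 10

    # Hypothetical post-correction build: hardware at half today's price,
    # on silicon that uses 20% less power; everything else unchanged.
    new_hw = old_hw * 0.5
    new_power = old_power * 0.8
    new_total = new_hw + new_power + old_rest

    print(new_total)  # 61.0 -- a new build runs at ~60% of the old cost base,
                      # which is what strands the existing fleet.

On those assumptions, anyone building after the correction undercuts the incumbent by roughly 40%, which is the stranded-asset scenario.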

lo_zamoyski 16 hours ago | parent | prev [-]

> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them

Then they won't be obsolete.