gmm1990 4 days ago

Strange unit of measurement. Who would find that more useful than expected compute, or even just the number of chips?

skhameneh 4 days ago | parent | next [-]

I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

I imagine this as a subtractive process starting with the maximum energy window.

zozbot234 4 days ago | parent | prev | next [-]

It's a very useful reference point actually, because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I heard, this was rumored as a prediction for AI 2027, so we're almost there already.

outside2344 4 days ago | parent | next [-]

Is this a crafty reference to Back to the Future? If so I applaud you.

the_70x 2 days ago | parent | prev | next [-]

Only came here searching for 1.21GW

jsnell 4 days ago | parent | prev | next [-]

1.21GW is an absurd level of precision for this kind of prediction.

leptons 4 days ago | parent [-]

It's from the movie "Back to the Future"

isoprophlex 4 days ago | parent | prev | next [-]

If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

Because if some card with more FLOPS becomes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y, with no appreciable change in how much you're spending to operate.

(I have no idea if y is actually much larger than x)
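A quick back-of-envelope sketch of that x-versus-y comparison in Python; every number here (card price, power draw, electricity rate) is a made-up assumption, so it only shows the shape of the argument, not the real ratio:

    # Back-of-envelope comparison of card capex (x) vs. yearly electricity cost (y).
    # Every number below is an illustrative assumption, not a real figure.

    CARD_PRICE_USD = 30_000          # assumed purchase price of one accelerator (x)
    CARD_POWER_KW = 1.0              # assumed draw per card, incl. cooling overhead
    ELECTRICITY_USD_PER_KWH = 0.08   # assumed industrial electricity rate
    HOURS_PER_YEAR = 24 * 365

    yearly_energy_cost = CARD_POWER_KW * HOURS_PER_YEAR * ELECTRICITY_USD_PER_KWH

    print(f"capex x = ${CARD_PRICE_USD:,.0f}")
    print(f"opex  y = ${yearly_energy_cost:,.0f} per year")
    print(f"y / x   = {yearly_energy_cost / CARD_PRICE_USD:.2f}")

    # With these (assumed) numbers y comes out well below x, so whether
    # "y >> x" actually holds depends entirely on the real figures. The
    # separate point stands either way: at a fixed power budget y, swapping
    # in a card with better FLOPS-per-watt changes the compute you get,
    # not the electricity bill.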

credit_guy 4 days ago | parent | prev | next [-]

A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.

aprdm 4 days ago | parent | prev | next [-]

At large scales a lot of it is measured in power instead of compute, as power is the limitation.

ben_w 4 days ago | parent | prev | next [-]

For a while now, it's been increasingly clear that the current AI boom's growth curve quickly runs into the limits of the existing electricity supply.

Therefore, they are listing in terms of the critical limit: power.

Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.

leetharris 4 days ago | parent | prev [-]

Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.

sedawkgrep 4 days ago | parent [-]

That, and compute always goes up.