onlyrealcuzzo 4 days ago

I dunno.

Google is pretty useful.

It uses >15 TWh per year.

Theoretically, AI could be more useful than that.

Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

It could be a short-term crunch to pull forward (slightly) AI advancements.

Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near it's current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).

dns_snek 4 days ago | parent

According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's the total consumption of their datacenters, which would include everything from Google Search to Gmail, YouTube, and every Google Cloud customer. Is it broken down by product somewhere?

30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is roughly 3x as energy-intensive.
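
A quick sanity check on that conversion, as a Python sketch (8,766 is the average number of hours in a year; the 10 GW is the headline figure for this deal):

    HOURS_PER_YEAR = 8766  # 365.25 days/year * 24 h/day

    def avg_power_gw(twh_per_year: float) -> float:
        """Average power draw (GW) implied by an annual energy total (TWh)."""
        return twh_per_year * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then GWh/h = GW

    google_avg = avg_power_gw(30)  # ~3.42 GW for all of Google's datacenters
    print(10 / google_avg)         # ~2.9x for the 10 GW partnership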

Ultimately, the `real value/MWh` of these two must differ by many orders of magnitude.

[1] https://sustainability.google/reports/google-2025-environmen...

onlyrealcuzzo 3 days ago | parent

Data centers typically draw 60% (or less) of their max rating on average.

You over-provision so that you (almost) always have enough compute to meet your customers' needs (even at planet scale, your demand is bursty); you're always doing maintenance on some section, spinning up new hardware, and turning down old hardware.

So, apples to apples, this deal would likely not even be 2x Google at 30 TWh: ~3.4 GW average at 60% utilization implies roughly 5.7 GW of provisioned capacity, against the 10 GW here.
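
Back-of-the-envelope for that claim (a sketch; the 60% utilization figure is the assumption above, not a published number):

    UTILIZATION = 0.6  # assumed average fraction of max rating actually drawn

    google_avg_gw = 30 * 1000 / 8766               # ~3.42 GW average draw
    google_capacity = google_avg_gw / UTILIZATION  # ~5.7 GW provisioned
    print(10 / google_capacity)                    # ~1.75x -- under 2x, as claimed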

tmiku 4 days ago | parent

For other readers: "15 TWh per year" is equivalent to an average of 1.71 GW, 17.1% of the "10 GW" number used to describe the deal.

mNovak 4 days ago | parent

This is ignoring the utilization factor though. Both Google and OpenAI have to overprovision servers for the worst-case number of simultaneous users. So 1.71 GW average doesn't tell us Google's maximum instantaneous GW capacity -- if we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.
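
Same arithmetic with that (admittedly hand-waved) 4x peak factor:

    PEAK_FACTOR = 4                      # assumed peak/average ratio
    google_avg_gw = 15 * 1000 / 8766     # ~1.71 GW on the 15 TWh figure
    print(google_avg_gw * PEAK_FACTOR)   # ~6.8 GW of provisioned capacity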

More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.

Capricorn2481 4 days ago | parent

Does Google not include AI?
