ComputerGuru 4 days ago

Framing it in gigawatts is very interesting given the controversy about skyrocketing electric prices for residential and small-business users over the past three years, driven primarily by AI-fueled datacenter growth. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need to have a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure, and the already exploding costs that have been shifted onto residential users in order to guarantee electric supply to these biggest datacenters (so they can keep paying peanuts for electricity and avoid shouldering any of the infrastructural burden of maintaining or improving the grid and plants their massive power needs require).

I'm not even anti-datacenter (wouldn't be here if I were); I just think there needs to be a serious rebalancing of these costs, because this increase in US residential electric prices in just five years (from 13¢ to 19¢ per kWh, a ridiculous 46% increase) is neither fair nor sustainable.

So where is this 10GW electric supply going to come from and who is going to pay for it?

Source: https://fred.stlouisfed.org/series/APU000072610

EDIT:

To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute for the DC owner that is giving these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin of the form "OpenAI announces new 10GW datacenter"; this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is a tangent from the question of power supply, consumption, grid expansion/stability, and who is paying for all that.

elbasti 4 days ago | parent | next [-]

I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe its size.

Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.

The fact that we use this unit really underscores that AI is basically refining energy.

aurareturn 4 days ago | parent | next [-]

  A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
This underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advances), but it uses 36% less energy at the same speed as N3, or runs 18% faster than N3 at the same energy. It's coming at the right time for consumer AI chips and energy-starved data centers.
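
Under a fixed power envelope, that 36% figure compounds into roughly 1.5x more compute. A minimal sketch in Python, treating compute as scaling inversely with energy per operation (an idealization; only the 36% headline number comes from the claim above):

  # Throughput gain from "36% less energy at the same speed" at a fixed
  # power cap. Everything beyond the 36% figure is illustrative.
  n3_energy_per_op = 1.0           # normalized energy per op on N3
  n2_energy_per_op = 1.0 - 0.36    # 36% less at the same speed

  speedup = n3_energy_per_op / n2_energy_per_op
  print(f"Same power envelope, ~{speedup:.2f}x throughput")  # ~1.56x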

N2 is shaping up to be TSMC's most important node since N7.

alberth 4 days ago | parent [-]

> N2 is shaping up to be TSMC's most important node since N7

Is it?

N2's energy & perf improvements seem on par with any generational node update:

          N2:N3   N3:N5   N5:N7
  Power   ~30%    ~30%    ~30%
  Perf    ~15%    ~15%    ~15%
https://www.tomshardware.com/news/tsmc-reveals-2nm-fabricati...
aurareturn 4 days ago | parent [-]

Yes. It has more tape-outs at this stage of development than either N5 or N3 had. It seems wildly popular with chip designers.

alberth 3 days ago | parent [-]

I thought Apple gets exclusive access to the latest node for the first 1-2 years. Is that not the case?

aurareturn 3 days ago | parent [-]

No. That's not the case. Maybe for a few months only.

alberth 3 days ago | parent [-]

Correct me if I'm wrong, but didn't TSMC launch N3 in 2022? And still only Apple uses this latest/smallest node.

Both AMD and NVIDIA are using N4.

aurareturn 2 days ago | parent [-]

Apple, MediaTek, Qualcomm, Intel

pseudosavant 4 days ago | parent | prev | next [-]

I love that term "refining energy". We need to plan for massive growth in electricity production to have the supply to refine.

tmalsburg2 4 days ago | parent [-]

Sounds smart but it’s abusing the semantics of “refine” and is therefore ultimately vacuous.

pseudosavant 3 days ago | parent [-]

I think it is really just the difference between chemically refining something and electrically refining something.

Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.

Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."

It sure seems like a series of processes for refining something.

jacquesm 4 days ago | parent | prev | next [-]

It is the opposite of refining energy. Electrical energy is steak; what leaves the datacenter is heat, the lowest form of energy that we might still have a use for at that concentration (but most likely we are just dumping it into the atmosphere).

Refining is taking a lower quality energy source and turning it into a higher quality one.

What you could argue is that it adds value to bits. But for the bits themselves, their state is what matters, not the energy that transports them.

elbasti 4 days ago | parent [-]

I think you're pushing the metaphor a bit far, but the parallel was to something like ore.

A power plant "mines" electrons, which the data center then refines into words, or whatever. The point is that energy is the raw material that flows into data centers.

fuzzfactor 4 days ago | parent [-]

Maybe more like converting energy to data, as a more specific type of refinement.

phkahler 4 days ago | parent [-]

Using energy to decrease the entropy of data. Or to organize and structure data.

LaGrange 3 days ago | parent | next [-]

This is OpenAI, they are not decreasing the entropy. This is refining coal into waste heat and CO2.

fuzzfactor 4 days ago | parent | prev [-]

I like that. Take random wild electrons and put them neatly into rows & columns where they can sit a spell.

reubenmorais 4 days ago | parent | prev | next [-]

All life is basically refining energy - standing up to entropy and temporarily winning the fight.

HPsquared 4 days ago | parent | next [-]

It's all about putting the entropy somewhere else and keeping your own little area organised.

xnickb 4 days ago | parent [-]

People of the earth, remember: unnecessary arm and leg movements increase the entropy! Fear of the heat death of the universe! Lie down when possible!

antihipocrat 4 days ago | parent | prev | next [-]

Yes, in a very local context it appears so, but net entropy across the system is increased by life's activities.

ithkuil 4 days ago | parent | prev [-]

"the purpose of life is to hydrogenate carbon dioxide"

-- Michael Russell

casey2 3 days ago | parent | prev | next [-]

Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards they can't use? TSMC has more than doubled its packaging capacity and is planning to double it again.

protocolture 4 days ago | parent | prev | next [-]

This.

A ~40W datacenter local to me used to be in really high demand, and despite having excess rack space, it had no excess power. It was crazy.

nixass 4 days ago | parent [-]

40W - is that an ant datacenter? :)

protocolture 4 days ago | parent [-]

Yeah, it was the company's pilot site, and everything about it is tiny.

But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.

Even when bigger local DCs went in, a lot of what they were doing was just landing virtual cross-connects to the tiny one, because that's where everyone was.

inemesitaffia 4 days ago | parent [-]

You lost an M or a K next to your W.

I still have an Edison bulb that consumes more power.

protocolture 4 days ago | parent [-]

Yep I see that haha.

pabs3 4 days ago | parent | prev | next [-]

> the power requirements are basically locked-in

Why is that? To do with the incoming power feed or something else?

brendoelfrendo 4 days ago | parent | next [-]

Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.

pabs3 3 days ago | parent [-]

You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space, so they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar; do the economics not work out yet?

brendoelfrendo 3 days ago | parent | next [-]

I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.

XorNot 3 days ago | parent | prev [-]

Compared to their footprint, they absolutely do not: solar irradiance tops out around 1 kW/m^2, and panels convert only ~20% of that. Some quick googling suggests a typical DC workload would be about 50 kW/m^2, rising to 100 for AI workloads.
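
A minimal sketch of that mismatch, taking the load figures above at face value and assuming ~20% panel efficiency (and ignoring capacity factor, which only makes it worse):

  # Areal mismatch between rooftop solar and datacenter load.
  irradiance_kw_m2 = 1.0       # peak solar irradiance at the surface
  panel_efficiency = 0.20      # assumed typical commercial panel
  solar_kw_m2 = irradiance_kw_m2 * panel_efficiency   # ~0.2 kW/m^2 at noon

  dc_kw_m2 = 50                # conventional DC load per m^2 (cited above)
  ai_kw_m2 = 100               # AI-era load per m^2 (cited above)

  print(f"{solar_kw_m2 / dc_kw_m2:.1%} of a conventional DC's load")  # 0.4%
  print(f"{solar_kw_m2 / ai_kw_m2:.1%} of an AI DC's load")           # 0.2%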

jl6 4 days ago | parent | prev | next [-]

Cooling too. A datacenter that takes in 200MW has to dissipate 200MW of heat somewhere.
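
For scale, a rough sketch of the coolant flow that implies, assuming a water loop with a 10K temperature rise (both the loop and the 10K figure are assumptions):

  # Coolant flow implied by 200MW of heat rejection: Q = m_dot * c_p * dT.
  heat_w = 200e6       # 200 MW of heat to reject
  cp_water = 4186      # J/(kg*K), specific heat of water
  delta_t_k = 10       # assumed temperature rise across the loop

  flow_kg_s = heat_w / (cp_water * delta_t_k)
  print(f"~{flow_kg_s:,.0f} kg/s (~{flow_kg_s / 1000:.1f} m^3/s) of water")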

djtriptych 4 days ago | parent | prev [-]

guessing massive capital outlays and maybe irreversible site selection/preparation concerns.

kulahan 4 days ago | parent | prev | next [-]

That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.

libraryofbabel 4 days ago | parent [-]

Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.

vrighter 3 days ago | parent [-]

Exactly. To add to that, I'd like to point out that when this person says every joule, they are exaggerating only a teeny tiny bit. The actual computation itself barely uses any energy at all.
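
To put "barely any" on a scale: Landauer's bound gives the thermodynamic minimum for a bit operation, and real hardware sits many orders of magnitude above it. A sketch with an assumed, ballpark GPU figure:

  # Landauer's bound vs. a ballpark GPU. Everything above the floor
  # leaves the machine as the heat discussed above.
  import math

  k_b = 1.380649e-23                  # J/K, Boltzmann constant
  t = 300                             # K, room temperature
  floor_j = k_b * t * math.log(2)     # ~2.9e-21 J per bit erased

  gpu_j_per_op = 700 / 1e15           # assumed: ~700 W at ~1e15 ops/s
  print(f"Landauer floor: {floor_j:.1e} J/bit")
  print(f"GPU ballpark:   {gpu_j_per_op:.1e} J/op, "
        f"~{gpu_j_per_op / floor_j:.0e}x the floor")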

pjc50 3 days ago | parent | prev [-]

Refining it into what? Stock prices?

deelowe 4 days ago | parent | prev | next [-]

DC infra is always allocated in terms of watts. From this number, everything else is extrapolated (e.g. rough IT load, cooling needed, etc).

epolanski 4 days ago | parent | prev | next [-]

> is neither fair nor sustainable

That's half of what I pay in Italy. I'm sure the richest country in the world will do fine.

FirmwareBurner 3 days ago | parent | next [-]

>I'm sure the richest country in the world will do fine.

You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.

Remember how your lifestyle always expands to fill the available resources, no matter how good you have it? Well, if tomorrow they had to pay EU prices, the country would be up in arms.

When you've lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have to scale back and be frugal, even if that price would still be less than what other countries pay.

impjohn 2 days ago | parent [-]

It's hard to appreciate the difference in 'abundance mentality' between the median US and EU person. It has always struck me as an interesting cultural difference. While both the EU and the US grew in prosperity post-WWII, I feel the US narrative was on quite another level.

modo_mario 3 days ago | parent | prev | next [-]

Here in Belgium a stupid amount of that bill is hidden taxes. I kind of assume it's similar in Italy.

epolanski 3 days ago | parent | next [-]

We import most of our energy; that's really it.

port11 3 days ago | parent | prev [-]

And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.

abstractwater 4 days ago | parent | prev | next [-]

> So where is this 10GW electric supply going to come from and who is going to pay for it?

I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.

https://www.investopedia.com/applied-digital-stock-soars-on-...

https://ir.applieddigital.com/news-events/press-releases/det...

gitpusher 4 days ago | parent | prev | next [-]

> Framing it in gigawatts is very interesting given the controversy

Exactly. When I saw the headline I assumed it would contain some sort of ambitious green-energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think of to brag about energy consumption.

7952 4 days ago | parent [-]

Or this brings power and prestige to the country that hosts it. And it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government who either go "drill baby drill" or throw wind/solar/nuclear at the problem.

paulsutter 4 days ago | parent | prev | next [-]

Datacenters need to provide their own power/storage, and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around.

mensetmanusman 4 days ago | parent | prev | next [-]

> skyrocketing electric prices for residential and small business users as a result of datacenters over the past three years

This is probably naïve. Prices skyrocketed in Germany for similar reasons before AI data centers were a thing.

paxys 4 days ago | parent | prev | next [-]

Watts are the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past; there's plenty of all that to go around (and if not, it can be easily bought). Success or failure now depends on whether you can beg and plead your way to a large enough kilowatt/megawatt allocation over every other team fighting for it. Everything is measured this way.

monkeydust 4 days ago | parent [-]

Explains why Meta is entering the power-trading space:

https://subscriber.politicopro.com/article/eenews/2025/09/22...

elphinstone 4 days ago | parent [-]

That gives me Enron vibes, even though these are vastly different situations. But the idea of a social media company trading in this space is nuts.

apercu 3 days ago | parent | prev | next [-]

Last month I had my highest power bill in 4 years, in a month that was unseasonably cool, so no AC most of the time. Why are we as citizens, without equity in these businesses, subsidizing the capital class?

ianks 4 days ago | parent | prev | next [-]

To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.

sothatsit 4 days ago | parent [-]

I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.

XorNot 3 days ago | parent | prev | next [-]

This feels like a return to the moment just before DeepSeek, when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".

Framing it in GW is just giving them what they want, even if it makes no sense.

apimade 4 days ago | parent | prev | next [-]

An 8% increase year-over-year is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, energy prices doubled that year.

Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.

So power has become somewhat less affordable, but wage growth has absorbed much of the real impact, and power prices remain a small share of household income.

You can make it sound shocking with statements like "In 1999, a household's wholesale power cost was about $150 a year; in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x", but the real impact isn't major on average (there are outliers, of course, and low-income households are disproportionately affected where the government doesn't subsidise).
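
Taking those quoted figures at face value, the implied annual growth rates are easy to back out; a quick sketch:

  # Implied annual growth rates from the 1999-vs-2022 framing above.
  years = 2022 - 1999                              # 23 years
  power_growth = (1000 / 150) ** (1 / years) - 1   # ~8.6%/yr
  wage_growth = 2.5 ** (1 / years) - 1             # ~4.1%/yr
  print(f"Power ~{power_growth:.1%}/yr vs wages ~{wage_growth:.1%}/yr")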

https://www.aer.gov.au/industry/registers/charts/annual-volu...

atkailash 4 days ago | parent | next [-]

I wouldn't call a $100-270 electric bill a "fraction" when it's about 5% of post-tax income. And I use a single light on a timer and have a small apartment.

Especially since these sorts of corporations can get tax breaks or have ways of getting regulators to allow spreading the cost. Residential customers shouldn't see any increase due to data centers, but they do, and will, subsidize them while seeing minimal improvements to the infrastructure.

When people are being told to minimize air conditioning while these big datacenters get built without ever being told to reduce their consumption, then it doesn't matter how big or small the electric bill is; it's subsidizing a multi-billion-dollar corporation's toy.

bushbaba 4 days ago | parent | prev | next [-]

6% YoY is much higher than the 2-3% inflation target

richrichardsson 4 days ago | parent | prev [-]

So a 6.6x increase in power bills, offset by only a 2.5x wage increase, has no major impact?

I'm sure none of the other outgoings for a household saw similar increases. /s

mvanbaak 4 days ago | parent | prev | next [-]

0,19 per kWh. Damn man, here it is like 0,97 per kWh (Western Europe) … stop complaining.

Rexxar 4 days ago | parent | next [-]

Regulated price in France:

- 0,1952 per kWh for the uniform price.

- 0,1635 / 0,2081 for day/night pricing.

- 0,1232 / ... / 0,6468 for variable pricing.

https://particulier.edf.fr/content/dam/2-Actifs/Documents/Of...

You have a very bad deal if you pay 0.97€ per kWh.

patrickmcnamara 4 days ago | parent | prev | next [-]

This is not true. The average in the EU is 0,287 €/kWh. I pay 0,34 €/kWh in Berlin.

distances 3 days ago | parent [-]

And in Germany the price includes transmission and taxes; it's the final consumer price. Keep in mind that some countries report electricity prices without transmission or taxes, even in a consumer context, so you need to be careful with comparisons.

whatever1 4 days ago | parent | prev | next [-]

DCs need to align their training cycles with the peak of renewable power generation

justincormack 4 days ago | parent [-]

They are starting to include batteries so they don't have to adjust to external factors.

cavisne 4 days ago | parent | prev | next [-]

Utilities always need to justify rate increases with the regulator.

The bulk of the cost increases comes from the transition to renewable energy. You can check your local utility and see.

It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.

Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.

Generation is a tiny fraction of electricity charges though.

stogot 4 days ago | parent | prev | next [-]

I actually don't like this measurement, as it's vague and dilutes the announcement. Each product delivers a different amount of compute per watt.

Imagine Ford announced “a strategic partnership with FedEx to deploy 10 giga-gallons of ICE vehicles”

mensetmanusman 4 days ago | parent [-]

It's a sticky metric though, because Moore's-law-style scaling of performance per watt died years ago.

dantillberg 4 days ago | parent | prev | next [-]

Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.

basilgohar 4 days ago | parent [-]

Elsewhere it was mentioned that DCs pay less for electricity per kWh than residential customers. If that is the case, then it's not just about inflation, but also about unfair pricing that puts more of the infrastructure costs on residential customers while the demand increase comes from commercial ones.

aaronmdjones 4 days ago | parent [-]

Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.

This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).
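
A toy illustration of that flip, with assumed tariffs (real rates vary widely): billed per kVAh, a poor power factor inflates the effective per-kWh cost by 1/pf.

  # How kVAh billing can erase the industrial discount at a bad power factor.
  power_factor = 0.80            # e.g. a lightly loaded induction motor

  industrial_kvah_rate = 0.16    # $/kVAh (assumed)
  residential_kwh_rate = 0.19    # $/kWh  (assumed)

  # Each real kWh drags 1/pf of apparent energy (kVAh) with it.
  effective_rate = industrial_kvah_rate / power_factor
  print(f"Industrial, effective: ${effective_rate:.3f}/kWh")       # $0.200
  print(f"Residential:           ${residential_kwh_rate:.3f}/kWh") # $0.190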

Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.

There are two sides to this coin.

Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.

Edit: Practical Engineering (YouTube channel) has a pretty decent video on the subject. https://www.youtube.com/watch?v=ZwkNTwWJP5k

randomNumber7 4 days ago | parent | prev | next [-]

I mean, gigawatts is a concise metric for grasping how much GPU compute they're installing, but the honesty of it seems a bit strange to me.

fuzzfactor 4 days ago | parent [-]

Total gigawatts is the maximum amount of power that can be supplied from the power generating station and consumed at the DC through the infrastructure and hardware as it was built.

Whether they use all those gigawatts and what they use them for would be considered optional and variable from time to time.

mullingitover 4 days ago | parent | prev | next [-]

> So where is this 10GW electric supply going to come from

If the US petro-regime weren't fighting against cheap energy sources, this would be a rounding error in the country's solar deployment.

China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.

Voters should be livid that their power bills are going up instead of plummeting.

Saline9515 4 days ago | parent | next [-]

FYI, announced capacity is very far from real capacity when dealing with renewables. It's like saying that because you bought a Ferrari, you can now drive at 300km/h on the road all of the time.

In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).

But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)

FooBarWidget 3 days ago | parent [-]

You think they don't know that? You can bet they're investing heavily in grid-level storage too.

Saline9515 3 days ago | parent [-]

I was commenting on the initial number announcement. And storage at this scale doesn't exist right now. The most common form, pumped-hydro reservoirs, requires hard-to-find sites that are typically in the Himalaya, far from where the power is produced. And the environmental cost isn't pretty either.

parineum 4 days ago | parent | prev | next [-]

I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear, then replacing it with... tbd.

bushbaba 4 days ago | parent | prev [-]

How would that solar power a DC at night or on a cloudy day? Energy storage isn’t cheap.

mullingitover 4 days ago | parent [-]

In 2025 it’s cheaper to demolish an operating coal plant and replace it with solar and battery, and prices are still dropping.

parineum 4 days ago | parent [-]

Why aren't all these businesses doing that then?

p1necone 4 days ago | parent | prev [-]

Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?

quasse 4 days ago | parent | next [-]

Given that steam-turbine efficiency depends on the temperature delta between the steam input and the condenser: unlikely, unless you're somehow going to adapt Nvidia GPUs to run with cooling-loop water at 250C+.
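
The Carnot limit makes that quantitative; a minimal sketch with assumed temperatures:

  # Carnot ceiling on converting waste heat back into work:
  # eta_max = 1 - T_cold / T_hot (absolute temperatures).
  def carnot_eta(t_hot_c, t_cold_c):
      return 1 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

  print(f"60C GPU loop, 25C ambient: {carnot_eta(60, 25):.1%}")   # ~10.5%
  print(f"250C steam, 25C ambient:   {carnot_eta(250, 25):.1%}")  # ~43.0%
  # Real cycles capture only a fraction of the Carnot figure.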

pjc50 3 days ago | parent | prev | next [-]

Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.

(Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)

distances 3 days ago | parent | prev | next [-]

In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition, you usually get abundant water and lower population density (meaning it's easier to build renewables with excess capacity).

Blackthorn 4 days ago | parent | prev [-]

No.