lambda 11 hours ago

Data center power usage was fairly flat for the last decade (until 2022 or so): while new capacity kept coming online, efficiency improvements kept pace, so total usage stayed mostly flat.

The AI boom has completely changed that. Data center power usage is now rocketing upwards; it is estimated that data centers will account for more than 10% of all US electricity usage by 2030.

It's a completely different order of magnitude from pre-AI-boom data center usage.

Source: https://escholarship.org/uc/item/32d6m0d1
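
As a rough sanity check on that 10% figure, here's a compound-growth sketch. The starting share and growth rates are illustrative assumptions, not numbers from the report:

    # Back-of-envelope projection of data centers' share of US electricity.
    # Assumed, illustrative inputs (not taken from the report):
    #   ~4% of US electricity in 2023, data center demand growing ~20%/yr,
    #   total US demand growing ~2%/yr.
    share = 0.04
    dc_growth, total_growth = 0.20, 0.02

    for year in range(2023, 2031):
        print(f"{year}: data centers ~ {share:.1%} of US electricity")
        share *= (1 + dc_growth) / (1 + total_growth)

Under those assumptions the share crosses 10% around 2029, so the claim is at least arithmetically plausible if current growth rates hold.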

azakai 10 hours ago | parent | next [-]

The first chart in your link doesn't show "flat" usage until 2022? It is clearly rising at an increasing rate, and it more than doubles over 2014-2022.

It might help to look at global power usage, not just the US, see the first figure here:

https://arstechnica.com/ai/2024/06/is-generative-ai-really-g...

There isn't an inflection point around 2022: it has been rising quickly since 2010 or so.

lambda 8 hours ago | parent | next [-]

I think you're referring to Figure ES-1 in that paper, but that's kind of a summary of different estimates.

Figure 1.1 is the chart I was referring to; it shows the data points from the original sources that the summary draws on.

Between 2010 and 2020, it shows a very slow linear growth. Yes, there is growth, but it's quite slow and mostly linear.

Then the slope increases sharply. And the estimates after that point follow the new, sharper growth.

Sorry, when I wrote my original comment I didn't have the paper in front of me; I linked it afterwards. But you can see that distinct change in rate at around 2020.

azakai 8 hours ago | parent | next [-]

ES-1 is the most important figure, though? As you say, it is a summary, and the authors consider it their best estimate, hence they put it first, and in the executive summary.

Figure 1.1 does show a single source from 2018 (Shehabi et al.) that estimates almost flat growth up to 2017, that's true, but the same graph shows other sources that overlap the same time frame, and their estimates differ (though they don't span enough years to really tell one way or another).

NewsaHackO 6 hours ago | parent | prev [-]

I still wouldn't say your assertion that data center energy use was fairly flat until 2022 holds up. Even Figure 1.2, for global data center usage, tracks more in line with the estimates in the executive summary. It just looks like a run-of-the-mill exponential increase at roughly the same rate since at least 2014, a good amount of time before genAI was used heavily.
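
One quick way to tell "same rate" from "inflection" is to look at year-over-year percentage growth: a constant-rate exponential keeps that percentage roughly flat, while an inflection shows up as a jump. A minimal sketch with made-up numbers, not data from the report:

    # Made-up illustrative series (TWh/yr), not data from the report.
    years = list(range(2014, 2024))
    twh = [190, 200, 212, 224, 238, 252, 270, 330, 400, 480]

    # Year-over-year growth rate; a constant-rate exponential keeps this flat.
    for prev_year, prev, cur in zip(years, twh, twh[1:]):
        print(f"{prev_year + 1}: {cur / prev - 1:+.1%} vs prior year")

In this fake series the rate jumps from ~5-7% to ~20%+ after 2020, which is the kind of break lambda is describing; if the printed rates stay roughly constant instead, that supports the "same rate all along" reading.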

techjamie 9 hours ago | parent | prev [-]

Going by Yahoo's historical price data, Bitcoin prices first started being tracked in late 2014. So my guess would be that the increase from then to 2022 could largely be attributed to crypto mining.

somenameforme 8 hours ago | parent | next [-]

The energy impact of crypto is rather exaggerated. Most estimates on this front aim to demonstrate as high a value as possible, and so should be treated as an upper bound, and yet even that upper bound is 'only' around 200 TWh a year. Annual global electricity consumption is in the 24,000 TWh range, with growth averaging around 2% per year.

So if you looked at a graph of energy consumption, you wouldn't even notice crypto. In fact even LLM stuff will just look like a blip unless it scales up substantially more than it's currently trending. We use vastly more energy than most appreciate. And this is only electrical energy consumption. All energy consumption is something like 185,000 TWh. [1]

[1] - https://ourworldindata.org/energy-production-consumption
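
Putting those numbers side by side makes the scale difference concrete (rough, rounded figures from the comment above):

    # Rough, rounded figures quoted above (TWh per year).
    crypto_upper_bound = 200     # high-end estimate for crypto mining
    global_electricity = 24_000
    global_all_energy = 185_000

    print(f"crypto vs electricity: {crypto_upper_bound / global_electricity:.1%}")  # ~0.8%
    print(f"crypto vs all energy:  {crypto_upper_bound / global_all_energy:.2%}")   # ~0.11%

Even taking the upper bound at face value, that's under 1% of global electricity and about a tenth of a percent of total energy.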

ToValueFunfetti 8 hours ago | parent | prev [-]

It looks like the number of internet users ~doubled in that time as well: https://data.worldbank.org/indicator/IT.NET.USER.ZS?end=2022...

jotras 5 hours ago | parent | prev | next [-]

This is where the debate gets interesting, but I think both sides are cherrypicking data a bit. The energy consumption trend depends a lot on what baseline you're measuring from and which metrics you prioritize.

Yes, data center efficiency improved dramatically between 2010-2020, but the absolute scale kept growing. So you're technically both right: efficiency gains kept per-unit costs down while total infrastructure expanded. The 2022+ inflection is real though, and it's not just about AI training. Inference at scale is the quiet energy hog nobody talks about enough.
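
To make the "both right" point concrete with made-up numbers: if energy per unit of compute halves but the amount of compute triples, total energy still rises.

    # Illustrative, made-up numbers only.
    energy_per_unit_2015, energy_per_unit_2020 = 1.0, 0.5   # efficiency doubles
    workload_2015, workload_2020 = 100, 300                 # compute triples

    total_2015 = workload_2015 * energy_per_unit_2015
    total_2020 = workload_2020 * energy_per_unit_2020
    print(total_2015, total_2020)   # 100.0 150.0 -> total still up 1.5x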

What bugs me about this whole thread is that it's turning into "AI bad" vs "AI defenders," when the real question should be: which AI use cases actually justify this resource spike? Running an LLM to summarize a Slack thread probably doesn't. Using it to accelerate drug discovery or materials science probably does. But we're deploying this stuff everywhere without any kind of cost/benefit filter, and that's the part that feels reckless.

serf 9 hours ago | parent | prev [-]

"google has been brainwashing us with ads deployed by the most extravagant uses of technology man has ever known since they've ever existed."

"yeah but they became efficient at it by 2012!"