kenjackson 15 hours ago

The article says, "Krishna said that it takes about $80 billion to fill up a one-gigawatt data center."

But thanks for your insight -- I used your basic idea to estimate it, and for 1GW it comes to about $30b just for enough GPUs to pull 1GW. And of course that doesn't take into account any other costs.
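For what it's worth, here's the back-of-envelope version of that $30b figure; the ~1 kW per GPU and ~$30k per GPU numbers are my own assumptions, not from the article:

    # Back-of-envelope: GPU capex needed to draw 1 GW continuously.
    # Assumed numbers (not from the article): ~1 kW per GPU including
    # server/host overhead, ~$30k per GPU installed.
    facility_power_w = 1e9        # 1 GW target draw
    power_per_gpu_w = 1_000       # assumed
    price_per_gpu_usd = 30_000    # assumed

    num_gpus = facility_power_w / power_per_gpu_w   # ~1,000,000 GPUs
    gpu_capex_usd = num_gpus * price_per_gpu_usd    # ~$30 billion

    print(f"{num_gpus:,.0f} GPUs, ~${gpu_capex_usd / 1e9:.0f}B in GPU capex")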

So $80b for a GW datacenter seems high, but it's within a small constant factor.

That said, power seems like a weird metric to use, although I don't know what sort of metric would make sense for AI (e.g., a FLOPS counterpart for AI workloads). I'd expect efficiency to get better and GPU cost to go down over time (???).

UPDATE: Below someone posted an article breaking down the costs. In that article they note that GPUs are about 39% of the cost. Taking the $30b I independently computed as 39% of the total, my estimate comes to $77b per GW -- remarkably close to the IBM CEO's figure. I guess he may know what he's talking about. :-)
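The scale-up is just dividing the GPU-only estimate by that 39% share:

    # Scale the GPU-only estimate up by the 39% cost share from the
    # linked breakdown; the $30b GPU figure is my own estimate above.
    gpu_capex_usd = 30e9
    gpu_share_of_total = 0.39

    total_capex_usd = gpu_capex_usd / gpu_share_of_total
    print(f"~${total_capex_usd / 1e9:.0f}B per GW")   # ~$77B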

coliveira 15 hours ago | parent

> power seems like a weird metric to use

Because this technology changes so fast, that's the only metric you can control across several data centers. It is also directly tied to the overall capacity of a data center, which is limited by the energy available to operate it.

pjdesno 14 hours ago | parent | next

To expand on rahimnathwani's comment below - the big capital costs of a data center are land, the building itself, the power distribution and the cooling.

You can get a lot of land for a million bucks, and it doesn't cost all that much to build what's basically a big 2-story warehouse, so the primary capital costs are power and cooling. (In fact, in some older estimates, the capital to build that power and cooling costs more per year than the electricity itself.)

My understanding is that although power and cooling infrastructure are long-lived compared to computers, they still depreciate faster than the building, so they dominate costs even more than the raw price would indicate.
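A toy example of that effect, with made-up capex figures and lifetimes (building ~$500M over 30 years, power+cooling ~$1.5B over 15 years -- purely illustrative):

    # Illustrative straight-line depreciation with made-up capex and lifetimes;
    # the point is only that shorter-lived assets dominate the annual cost.
    assets = {
        # name: (capex in $M, assumed lifetime in years)
        "building":      (500, 30),
        "power+cooling": (1500, 15),
    }

    for name, (capex_musd, life_years) in assets.items():
        print(f"{name}: ~${capex_musd / life_years:.0f}M/year")
    # power+cooling: ~6x the annual depreciation on only ~3x the capex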

The state of the art in power and cooling is basically defined by the cost to feed X MW of computing, where that cost includes both capital and operation, and of course lower is better. That means that at a particular SOTA, and at an appropriate scale for that technology, the cost of the facility is a constant overhead on top of the cost of the equipment it houses. To a rough approximation, of course.
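Or as a sketch, with per-MW rates invented purely for illustration:

    # Facility cost modeled as a fixed $/MW at the current state of the art,
    # so it shows up as a constant multiplier on the equipment spend.
    # Both per-MW rates below are assumptions, not sourced figures.
    facility_cost_per_mw = 10e6     # assumed capex + operation, per MW of IT load
    equipment_cost_per_mw = 30e6    # assumed GPU/server spend, per MW of IT load

    for it_load_mw in (100, 1000):
        overhead = (facility_cost_per_mw * it_load_mw) / (equipment_cost_per_mw * it_load_mw)
        print(f"{it_load_mw} MW: facility overhead = {overhead:.0%} of equipment cost")
    # same ~33% at either scale -- a constant overhead, to a rough approximation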

rahimnathwani 14 hours ago | parent | prev

And cooling capacity.