georgeecollins 17 hours ago
First, I think it's $80b per 100 GW datacenter. The way you figure it out: a GPU costs $x and draws y watts. The $x is pretty well known; for example, an H100 costs $25-30k and uses 350-700 watts (that's from Gemini and I didn't check my work). You add an infrastructure cost on top of the GPU cost as a multiplier i, but that should be pretty small, like 10% or less. So a 1 gigawatt data center uses n chips, where y·n = 1 GW, and it costs roughly x·i·n. I am not an expert, so correct me please!
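A minimal back-of-the-envelope sketch of that arithmetic, using the H100 figures quoted above; the midpoint price and the 10% infrastructure multiplier are assumptions for illustration, not numbers from the article:

```python
# Rough GPU cost to fill a 1 GW data center; inputs are the parent comment's figures.
GPU_PRICE_USD = 27_500       # midpoint of the quoted $25-30k H100 price (assumption)
INFRA_MULTIPLIER = 1.10      # assumed ~10% overhead on top of the GPU cost
DATACENTER_WATTS = 1e9       # 1 GW

for gpu_watts in (350, 700):                              # quoted H100 power range
    n_gpus = DATACENTER_WATTS / gpu_watts                 # y * n = 1 GW  =>  n = 1 GW / y
    total = GPU_PRICE_USD * INFRA_MULTIPLIER * n_gpus     # cost ~= x * i * n
    print(f"{gpu_watts} W/GPU: {n_gpus:,.0f} GPUs, ~${total / 1e9:,.1f}B")

# 700 W/GPU -> ~1.4M GPUs, ~$43B; 350 W/GPU -> ~2.9M GPUs, ~$86B.
# The result is very sensitive to the assumed watts per installed GPU.
```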
kenjackson 16 hours ago
The article says, "Krishna said that it takes about $80 billion to fill up a one-gigawatt data center." But thanks for your insight -- I used your basic approach to estimate it, and for 1 GW it comes to about $30b just for enough GPUs to pull 1 GW, which doesn't account for any other costs. So $80b for a 1 GW datacenter seems high, but it's within a small constant factor.

That said, power seems like a weird metric to use, though I don't know what metric would make more sense for AI workloads (some FLOPS counterpart, maybe). I'd also expect efficiency to improve and GPU prices to fall over time (though who knows).

UPDATE: Someone below posted an article breaking down the costs; it puts GPUs at about 39% of the total. Taking the ~$30b I independently computed for the GPUs as 39% of the total gives about $77b per GW -- remarkably close to the IBM CEO's number. I guess he may know what he's talking about. :-)
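For what it's worth, here is one set of assumptions that reproduces the ~$30b GPU-only figure and the scale-up by the 39% share; the $30k price and ~1 kW of facility power per installed GPU are guesses, not numbers from the article:

```python
# Reproducing the rough estimate above under assumed inputs.
GPU_PRICE_USD = 30_000     # assumed price per GPU
WATTS_PER_GPU = 1_000      # assumed facility power per installed GPU (chip + system overhead)
GPU_COST_SHARE = 0.39      # GPU share of total build cost, per the linked breakdown

n_gpus = 1e9 / WATTS_PER_GPU                # ~1M GPUs per GW
gpu_capex = n_gpus * GPU_PRICE_USD          # ~$30B for the GPUs alone
total_capex = gpu_capex / GPU_COST_SHARE    # scale up: GPUs are ~39% of the total
print(f"GPUs: ~${gpu_capex / 1e9:.0f}B, total: ~${total_capex / 1e9:.0f}B per GW")
# -> GPUs: ~$30B, total: ~$77B per GW, in line with the quoted $80B.
```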
zozbot234 10 hours ago
1 GW is not enough, you need at least 1.21 GW before the system begins to learn at a geometric rate and reaches AGI.