kenjackson 17 hours ago
I don't understand the math behind how we compute $80b for a gigawatt datacenter. What are the costs in that $80b? I literally don't understand how to get to that number -- I'm not questioning its validity. What percent is power consumption, versus land cost, versus building and infrastructure, versus GPUs, versus people, etc.?
wmf 17 hours ago
https://www.investing.com/news/stock-market-news/how-much-do...
georgeecollins 17 hours ago
First, I think it's $80b per 100 GW datacenter. The way you figure it out is that a GPU costs $x and consumes y watts. The $x is pretty well known: for example, an H100 costs $25-30k and uses 350-700 watts (that's from Gemini and I didn't check my work). You add an infrastructure cost (i) on top of the GPU cost, but that should be pretty small, like 10% or less. So a 1 gigawatt data center uses n chips, where y·n = 1 GW, and the total cost ≈ x·(1+i)·n. I am not an expert, so correct me please!
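As a sanity check, here is a minimal back-of-envelope sketch of that model, assuming the midpoints of the ranges quoted above (~$27.5k and ~500 W per H100, 10% infrastructure overhead); these specific inputs are assumptions, not figures from the thread:

```python
# Back-of-envelope cost of a 1 GW datacenter using the model above:
# total ≈ x * (1 + i) * n, where y * n = 1 GW.
# All inputs are assumed midpoints of the ranges quoted in the comment.

gpu_cost = 27_500       # x: dollars per H100 (midpoint of $25-30k)
gpu_power = 500         # y: watts per H100 (midpoint of 350-700 W)
infra_overhead = 0.10   # i: infrastructure as a fraction of GPU cost

datacenter_power = 1e9  # 1 GW in watts

n = datacenter_power / gpu_power            # number of GPUs that fit the power budget
total = gpu_cost * (1 + infra_overhead) * n

print(f"GPUs:  {n:,.0f}")                   # 2,000,000
print(f"Total: ${total / 1e9:.1f}B")        # ~$60.5B
```

Under these assumptions the GPU spend dominates everything else, and one gigawatt of H100s lands at roughly $60B, the same ballpark as the headline figure.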