jsnell | 3 hours ago
I don't see how that number could possibly be realistic. An H100 cost ~$30k when new and draws about 500 W. 500 W for a year is roughly 4,400 kWh, which at $0.10/kWh is about $440/year even at full utilization (unrealistic). The TCO of an AI data center should be entirely dominated by capex depreciation.
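A minimal sketch of that arithmetic (the 500 W, $30k, and $0.10/kWh figures are from the comment above; the 3-year depreciation window is my own assumption):

    # Back-of-envelope: annual energy cost vs. annual capex depreciation per GPU.
    gpu_price_usd = 30_000     # purchase price (from the comment)
    power_w = 500              # draw (from the comment)
    price_per_kwh = 0.10       # electricity price (from the comment)
    depreciation_years = 3     # assumption

    hours_per_year = 24 * 365
    energy_kwh = power_w / 1000 * hours_per_year        # ~4,380 kWh
    energy_cost = energy_kwh * price_per_kwh            # ~$438/yr
    capex_per_year = gpu_price_usd / depreciation_years # ~$10,000/yr

    print(f"energy: ${energy_cost:,.0f}/yr")
    print(f"capex depreciation: ${capex_per_year:,.0f}/yr")
    print(f"ratio: {capex_per_year / energy_cost:.0f}x")

On these assumptions, depreciation comes out roughly 20x the electricity bill.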
creddit | 25 minutes ago
In fairness, your calculation looks at the most expensive element of the DC but ignores all of the associated parts required to utilize the H100: CPU, memory, cooling, etc. Not to say that that flips the calculation (I don't have the answer), but it does leave a lot of power out.
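A rough extension of the sketch above with system-level overheads (the per-GPU host power share and the PUE figure here are assumptions, only to gauge the scale, not measurements):

    # Add a per-GPU share of host power (CPU, memory, NICs, fans) and a PUE
    # multiplier for cooling/facility overhead. All overhead numbers assumed.
    gpu_power_w = 500      # from the parent comment
    host_share_w = 300     # assumed per-GPU share of the rest of the server
    pue = 1.3              # assumed facility overhead (cooling, distribution)
    price_per_kwh = 0.10

    total_w = (gpu_power_w + host_share_w) * pue         # ~1,040 W
    kwh_per_year = total_w / 1000 * 24 * 365              # ~9,100 kWh
    print(f"~${kwh_per_year * price_per_kwh:,.0f}/yr per GPU in energy")

Even with those overheads folded in, energy lands around $900/yr per GPU under these assumptions, still small next to ~$10k/yr of capex depreciation.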