trollbridge 16 hours ago
A typical data centre runs about $2,500 per year per kW of load (including overhead, HVAC and so on). If it costs $800,000 to replace the whole rack, that pays off in a year if it eliminates 320 kW of consumption. Back when we ran servers, we wouldn't assume 100% utilisation, but AI workloads do run at that level; normal server loads would be around 10 kW per rack, and AI is closer to 100 kW. So it's not hard to imagine power savings equivalent to 3.2 racks being worth it.
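The payback arithmetic above can be sketched in a few lines (a minimal check, assuming the figures quoted: $2,500 per kW-year all-in cost, $800,000 rack replacement, a one-year payback target, and roughly 100 kW per AI rack):

```python
# Back-of-envelope payback check, using the assumed figures from the comment:
cost_per_kw_year = 2_500    # USD per kW of load per year, incl. overhead/HVAC
rack_replacement = 800_000  # USD to replace the whole rack

# kW of load you'd need to eliminate to recoup the cost in one year
break_even_kw = rack_replacement / cost_per_kw_year

ai_rack_kw = 100  # rough AI rack draw (vs ~10 kW for a normal server rack)
racks_worth = break_even_kw / ai_rack_kw

print(break_even_kw)  # 320.0 kW
print(racks_worth)    # 3.2 racks
```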
Octoth0rpe 15 hours ago
Thanks for the numbers! Isn't it more likely that the power/heat generated per rack stays constant across each upgrade cycle, and the upgrade simply unlocks a higher amount of service revenue per rack?