p1necone | 4 days ago
Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?
quasse | 4 days ago
Given that steam turbine efficiency depends on the temperature delta between the steam input and the condenser, this is unlikely unless you somehow adapt Nvidia GPUs to run with cooling-loop water at 250 °C+.
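To put rough numbers on that temperature-delta point, here is a Carnot-limit sketch (the 60 °C coolant-loop and 30 °C condenser figures are illustrative assumptions, not values from the thread; real Rankine or organic Rankine plants recover only a fraction of this bound):

```python
# Carnot limit on heat-to-work conversion: eta_max = 1 - T_cold / T_hot (kelvin).
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Upper bound on conversion efficiency between two temperatures given in Celsius."""
    return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

# Typical GPU liquid-cooling loop (~60 C) against a 30 C condenser: roughly a 9% ceiling.
print(f"60 C GPU coolant loop: {carnot_efficiency(60, 30):.1%}")
# A 250 C steam-plant-like loop against the same condenser: roughly a 42% ceiling.
print(f"250 C steam loop:      {carnot_efficiency(250, 30):.1%}")
```

Even the ~9% figure is only an upper bound; a real low-temperature heat-recovery cycle would land well below it.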
pjc50 | 3 days ago
Thermodynamics says no. In fact, you have to spend energy to remove that heat from the cores. (Things might be different if you had some sort of SiC process that let you run a GPU at 500 °C core temperatures; then you could start thinking of meaningful uses for that heat, but you'd still need a river or sea for the cold side, just as nuclear plants do.)
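A back-of-the-envelope sketch of the "spend energy to remove that heat" point, using the standard PUE metric (the 100 MW load and the 1.2 PUE are assumed example figures, not data from the thread):

```python
# PUE (power usage effectiveness) = total facility power / IT power.
# Everything above 1.0 is overhead: cooling, pumps, fans, power-distribution losses.
it_load_mw = 100.0   # assumed GPU/IT load
pue = 1.2            # assumed facility PUE

overhead_mw = it_load_mw * (pue - 1.0)
print(f"IT load:          {it_load_mw:.0f} MW")
print(f"Cooling/overhead: {overhead_mw:.0f} MW spent largely on rejecting the waste heat")
```

So the waste heat isn't free energy sitting there to be harvested; the facility already pays extra power just to move it out of the building.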
distances | 3 days ago
In the Nordics, the waste heat is used for district heating. Having this practical heat sink really favors northern countries for datacenter builds. In addition, you usually get abundant water and lower population density (meaning it's easier to build renewables with excess capacity).
Blackthorn | 4 days ago
No.