dangalf 2 days ago

Technically it is inefficiency. The electricity should be doing computer things; heat is wasted electricity. It's just that there's not much the data centre can do about it.

stouset 2 days ago | parent | next [-]

Even if the computer does perfectly efficient computer things with every Joule, every single one of those Joules ends up as one Joule of waste heat.

If you pull 100W of power out of an electric socket, you are heating your environment at 100W of power completely independent of what you use that electricity for.
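
To put numbers on it, here's a minimal steady-state energy-balance sketch in Python (the component breakdown is invented for illustration):

    # Whatever the machine computes, its electrical draw all ends up as heat.
    socket_power_w = 100.0   # measured at the wall

    # Hypothetical first stops for the electricity (made-up split):
    compute_w    = 70.0      # CPU/GPU transistor switching
    fans_w       = 20.0      # moved air, dissipated by turbulence and friction
    leds_sound_w = 0.1       # light and noise, absorbed by walls as heat
    losses_w     = socket_power_w - compute_w - fans_w - leds_sound_w  # PSU etc.

    # Every pathway terminates as heat in the room:
    heating_power_w = compute_w + fans_w + leds_sound_w + losses_w
    assert abs(heating_power_w - socket_power_w) < 1e-9  # 100 W in, 100 W of heat out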

spyder 2 days ago | parent | next [-]

That's only true for our current computers, not for reversible computing. With reversible computing you can use electricity to perform a calculation and then "push" that electricity back into a battery or a capacitor instead of dumping it into the environment. It's still a huge challenge, but there is a recent promising attempt:

"British reversible computing startup Vaire has demonstrated an adiabatic reversible computing system with net energy recovery"

https://www.eetimes.com/vaire-demos-energy-recovery-with-rev...

https://vaire.co/

Short introduction video to reversible computing:

https://www.youtube.com/watch?v=rVmZTGeIwnc
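
For the curious, the textbook scaling behind adiabatic switching goes roughly like this; a hedged sketch with made-up component values, not Vaire's actual design:

    # Charging a node capacitance C to voltage V through switch resistance R:
    # conventional CMOS dissipates ~C*V^2/2 per switch, no matter how slowly;
    # adiabatic charging over a ramp time T dissipates ~(R*C/T)*C*V^2,
    # and the rest can in principle be recovered by a resonant supply.
    C = 1e-15   # 1 fF node capacitance (assumed)
    V = 0.8     # supply voltage (assumed)
    R = 1e3     # 1 kOhm effective switch resistance (assumed)
    T = 1e-9    # 1 ns charging ramp

    e_conventional = 0.5 * C * V**2          # ~3.2e-16 J dissipated
    e_adiabatic    = (R * C / T) * C * V**2  # ~6.4e-19 J, ~500x less

    print(e_conventional, e_adiabatic)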

flave 2 days ago | parent [-]

Actually pretty cool - I was about to comment “nice perpetual motion machine” but looked into it a bit more and it’s much more interesting than that (well, a real perpetual motion machine would be interesting but…)

Thanks for posting. Pretty cool.

tatjam a day ago | parent [-]

This kind of stuff could trigger the next revolution in computing, as the theoretical energy consumption of computing is pretty insignificant. Imagine if we could make computers with near-zero energy dissipation! A "solid 3D" computer would then become possible, and Moore's law may keep going until we exhaust the new dimension ;)

thegrim000 2 days ago | parent | prev | next [-]

I read it as: the inefficient part isn't the compute efficiency, it's dumping all the resulting heat into the environment without capturing it and using it in some way to generate electricity or do work.

On a related/side note, when there's talk about SETI and Dyson spheres, and detecting them via infrared waste heat, I also don't understand that. Such an alien civilization is seemingly capable of building massive space structures/projects, but then lets the waste heat just pour out into the universe in such insane quantities that we could see it tens or hundreds of light years away? What a waste. Why wouldn't they recover that heat and make use of it instead? And repeat the recovering until the final waste output is too small to bother recovering, at which point we would no longer be able to detect it.

stouset a day ago | parent | next [-]

> but then lets the waste heat just pour out

There is no alternative! If I build a perfect Dyson sphere and capture the energy output of a star, all of that energy will become heat. The equilibrium surface temperature of my Dyson sphere will be (IIRC) the star's effective surface temperature scaled by the fourth root of the ratio of the star's surface area to the sphere's.
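
A quick back-of-the-envelope check of that scaling, assuming an idealized blackbody shell at the radius of Earth's orbit, radiating only outward:

    # Energy balance: star's luminosity in = sphere's thermal radiation out, so
    # T_sphere = T_star * (R_star/R_sphere)**0.5 = T_star * (A_star/A_sphere)**0.25
    T_SUN = 5772.0    # K, effective temperature of the Sun
    R_SUN = 6.957e8   # m, solar radius
    AU    = 1.496e11  # m, sphere radius at Earth's orbit

    t_sphere = T_SUN * (R_SUN / AU) ** 0.5
    print(round(t_sphere))  # ~394 K (about 120 C): the whole sphere glows warm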

"Recovering heat and making use of it" requires a heat differential. You need a cold side and a hot side to use energy. Using that energy causes the cold side to heat and the hot side to cool, until they reach equilibrium. The further the difference, the more usable work you can do. The closer the two sides are, the less work you can do.

Someone else here said it best: waste heat is the graveyard of energy. Once you have used energy, it will become high-entropy, low-grade, diffuse heat which is difficult-to-impossible to extract further work from.

oasisaimlessly 2 days ago | parent | prev [-]

All energy eventually turns into heat, and in the steady state, power in = power out.

There is no way to get rid of heat. It has to go somewhere; otherwise, the temperature of the system will increase without bound.

thegrim000 a day ago | parent [-]

For example, why couldn't you use the waste heat like a power plant? Use it to boil water, to turn turbines, to generate electricity, which gets sent and consumed elsewhere? Adding to the heat wherever the electricity is finally consumed. (Ignoring various losses along the way).

stouset 19 hours ago | parent [-]

“Elsewhere” is still somewhere on the Dyson sphere.

Or if you magically beam 100% of the captured energy somewhere else, now that place gets to deal with shedding the heat from the 1e26+ W of power being consumed. God help the poor planet you aim that ray of death at.

tasuki 2 days ago | parent | prev | next [-]

> every single one of those Joules ends up as one Joule of waste heat.

Yes, it ends up as heat, but with some forethought it could be used to e.g. heat people's homes rather than being wasted.

agumonkey 2 days ago | parent | next [-]

These days it's not rare to have data-center-heated buildings. I guess crypto bros are just not thinking about this, but technically it could be done there too.

KellyCriterion a day ago | parent [-]

There was a startup in the EU which explicitly sold heat from crypto mining to the local energy provider. IIRC it was also here on Hacker News some time ago.

agumonkey a day ago | parent [-]

Qarnot maybe

KellyCriterion a day ago | parent [-]

I meant this team:

https://terahash.space/en/

agumonkey a day ago | parent [-]

oh nice, i didn't know about them

TheSpiceIsLife 2 days ago | parent | prev [-]

You can say that about any waste heat.

In reality, it’s not convenient to move all waste heat to where it’s more needed.

m4rtink 2 days ago | parent | next [-]

Modern industrial scale insulated hot water district heating systems can do dozens of kilometers with the water cooling down only by a degree Celsius.

tremon 2 days ago | parent | prev [-]

It's always more convenient to ignore externalities. That doesn't mean we should be okay with only bottom-of-the-barrel solutions.

robkop 2 days ago | parent | prev | next [-]

Interesting question - how much will end up as sound, or in the ever-smaller tail of things like storing a bit in flash memory?

Workaccount2 2 days ago | parent | next [-]

Heat is the graveyard of energy. Everything that uses energy, or is energy, is actually just energy on its way to the graveyard.

The energy of the universe is a pool of water atop a cliff. Water running off this cliff is used to do stuff (work), and the pool at the bottom is heat.

The "heat death of the universe" refers to this waterfall running dry, and all the energy sitting in this useless pool of "heat".

devsda 2 days ago | parent [-]

Do thermophotovoltaic cells operate on a different kind of heat?

Is it impossible to convert heat into other forms of energy without "consuming" materials, as in the case of steam or geothermal, or even the ones that need a cold body to exploit the thermoelectric effect?

LiamPowell 2 days ago | parent | next [-]

TPVs don't rely solely on the temperature of an object being high; they rely on the two objects on either side having different temperatures. As heat moves[1] from one side to the other, some of the energy from that movement is turned into electricity.

[1]: Technically the movement itself is heat, the objects don't contain heat, rather they contain internal energy, but the two get mixed up more often than not.

supermatt 2 days ago | parent [-]

That movement is effectively “consuming” the differential.

ajuc 2 days ago | parent | prev [-]

What thermal energy sources actually exploit is temperature difference, not heat. And in the end that difference averages out.

phil21 2 days ago | parent | prev | next [-]

Almost none. A long time ago a friend and I did the math for sound, photons (status LEDs), etc., and it was a rounding error, 1% or something silly like that.

And that’s ignoring that sound and photon emissions typically hit a wall or other physical surface and get converted back to heat.

It all ends up as heat in the end; it just depends on where that heat is dumped and whether you need to cool it or not. Most watts end up producing even more than their theoretical watt of heat once you add the power spent on said cooling needs.

There is literally no way around the fact that every watt you burn for compute ends up as a watt of waste heat. The only factor you can control is how many units of compute you can achieve with that same watt.
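
A rough reconstruction of that back-of-envelope math; every figure below is an assumption, not a measurement:

    # Share of a server's power that leaves as sound and light before
    # being reabsorbed as heat anyway.
    server_draw_w = 500.0                # wall power of one noisy server
    sound_w = 1e-12 * 10 ** (75 / 10)    # 75 dB sound power level (re 1 pW) -> ~3e-5 W
    led_w   = 10 * 2e-3                  # ten status LEDs at ~2 mW optical each

    fraction = (sound_w + led_w) / server_draw_w
    print(f"{fraction:.4%}")             # ~0.0040%: far below even a 1% rounding error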

Terr_ 2 days ago | parent [-]

Well, at least until somebody devises a system that transports or projects it so that the heat ends up somewhere not-Earth. It'd still be heating the universe in general, of course, even in the form of sprays of neutrinos.

That reminds me of a sci-fi book, Sundiver by David Brin, where a ship is exploring the sun by firing a "refrigerator laser" to somehow pump away excess heat and balance on the thrust.

mrDmrTmrJ 2 days ago | parent | prev [-]

All sound will end up as heat.

anthonj 2 days ago | parent | prev | next [-]

This violates energy conservation principles. Some power will be "wasted" as heat; some will be used for other work.

stouset a day ago | parent [-]

If I use energy to move a block one foot over, I have performed useful work. But 100% of the energy used to perform that work is either already heat or shortly will be.

usrnm 2 days ago | parent | prev | next [-]

If I turn my fan on and 100% of the electricity is converted to heat, where does the kinetic energy of moving fan blades come from? Even the Trump administration cannot just repeal the law of conservation of energy.

numb7rs 2 days ago | parent | next [-]

Even if most of the energy goes into kinetic energy of the air, that air will lose momentum via turbulence and friction with the surrounding air, which will end up as... heat.

jo909 2 days ago | parent | prev [-]

While spinning, the blades store a minuscule amount of kinetic energy.

After you cut the power, even that small amount ends up as heat through friction (both in the bearing and, mostly, in the air turbulence), and the blades end up in the same zero-energy state: sitting still.

So it is correct that 100% "ends up" as heat.

usrnm 2 days ago | parent [-]

Most of that energy gets transferred to the air that's being moved by the blades, and who knows what that air does eventually. And we're not even talking about the plant-growing light that might be sitting in my room near my house plants, literally creating new life from electricity.

stouset a day ago | parent [-]

> who knows what that air does eventually

We do know what that air does eventually. Given no further inputs of energy, it swirls around generating friction, raising its temperature (heat!) as the currents slow down to nearly nothing.

csomar 2 days ago | parent | prev [-]

Theoretically, if your computation were perfectly energy efficient, you wouldn't need any electricity at all, since the computation itself costs zero energy.

majoe a day ago | parent [-]

That's not correct. For ordinary computers there is Landauer's principle, which gives a theoretical lower limit for the energy needed for computation [0].

I say "ordinary computers" because other comments mentioned "reversible computers" for which this limit doesn't apply.

According to the linked Wikipedia page, this theoretical limit is around a billion times smaller than what current computers use per operation, so you may call me pedantic.

[0]: https://en.wikipedia.org/wiki/Landauer%27s_principle
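
For a concrete sense of the gap, here's the Landauer bound worked out at room temperature, with ~1 pJ per operation as a ballpark guess for current hardware:

    import math

    K_B = 1.380649e-23                    # J/K, Boltzmann constant
    T   = 300.0                           # K, room temperature

    e_landauer = K_B * T * math.log(2)    # minimum energy to erase one bit
    e_cpu_op   = 1e-12                    # ~1 pJ/op, rough modern figure (assumed)

    print(e_landauer)                     # ~2.9e-21 J
    print(e_cpu_op / e_landauer)          # ~3.5e8: hundreds of millions of times the limit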

mr_toad 2 days ago | parent | prev | next [-]

There’s a minimum level of energy consumption (and thus heat) that has to be produced by computation, just because of physics. However, modern computers generate billions of times more heat than this minimum level.

https://en.wikipedia.org/wiki/Landauer's_principle

RhysU 2 days ago | parent [-]

It'd be super fun to take that as an axiom of physics and then see how far upward one could build from it. Far above my skills, though.

ruined 2 days ago | parent | next [-]

it's called the first law of thermodynamics

RhysU a day ago | parent [-]

The first law involves work. The axiom I am thinking of involves information.

UltraSane 2 days ago | parent | prev [-]

The minimum amount of energy needed to compute decreases asymptotically to 0 as the temperature of space goes to 0. This is the reason for a common sci-fi trope where advanced civilizations hibernate for extremely long times so that they can do more computation with the available energy.
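
Since the Landauer bound k_B * T * ln(2) scales linearly with temperature, a colder universe makes every erased bit cheaper; a quick sketch:

    import math

    K_B = 1.380649e-23  # J/K
    for t_kelvin in (300.0, 2.7, 0.01):  # room temp, today's CMB, a far-future sky
        print(t_kelvin, K_B * t_kelvin * math.log(2))
    # 300 K  -> ~2.9e-21 J/bit
    # 2.7 K  -> ~2.6e-23 J/bit, already ~100x cheaper
    # 0.01 K -> ~9.6e-26 J/bit: patience buys computation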

ctmnt 2 days ago | parent [-]

That’s a common trope? Can’t say I’ve run into it. But I’d like to! What are some good examples?

Supermancho 2 days ago | parent | next [-]

In the book Calculating God, a character notes that this is a common civilization-wide choice. Living in virtual reality, rather than trying to expand into the vast expanses of space, is a common trope as much as it's a logical choice, and it neatly explains the Fermi Paradox. In some fiction, like The Matrix, the choice might be forced due to cultural shifts, but the outcome is the same: a relatively sterile, low-energy-state civilization doing pure processing.

ithkuil 2 days ago | parent [-]

I wonder if it's illogical to think that all civilizations must always pick the most logical of the options

yetihehe 2 days ago | parent [-]

Those civilisations that make too many illogical choices probably die off.

ithkuil 2 days ago | parent [-]

True. But it's not a binary choice. All it takes is one sub-optimal choice for the universe to be filled up with von Neumann probes in every star system.

triMichael 2 days ago | parent | prev | next [-]

Kurzgesagt just made a video on it a couple months back: https://www.youtube.com/watch?v=VMm-U2pHrXE

CamperBob2 2 days ago | parent | prev | next [-]

Here you go: https://pastebin.com/raw/SUd5sLRC

And it only cost 0.006 rain forests!

UltraSane 2 days ago | parent | prev [-]

https://en.wikipedia.org/wiki/Aestivation_hypothesis

https://www.youtube.com/watch?v=v9sh9NpL4i8

https://mindmatters.ai/2020/10/researchers-the-aliens-exist-...

https://aleph.se/andart2/space/the-aestivation-hypothesis-po...

geoffschmidt 2 days ago | parent | prev | next [-]

Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question - how many computer things you got done per unit electricity turned into heat.

anon84873628 2 days ago | parent [-]

How many computer things you got done per unit electricity, and how many mechanical things you did with the temperature gradient between the computer and its heat sink.

For example, kinda wasteful to cook eggs with new electrons when you could use the computer heat to help you denature those proteins. Or just put the heat in human living spaces.

(Putting aside how practical that actually is... Which it isn't)

eimrine 2 days ago | parent [-]

Good luck with collecting that heat from air.

anon84873628 2 days ago | parent | prev | next [-]

I think what they mean is that there is no Carnot engine hooked up between the heat source and the sink, which is theoretically something the data center could do.
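
For a sense of how little even a perfect engine could recover, here's the Carnot bound with assumed data-center temperatures:

    # Maximum fraction of waste heat convertible to work: 1 - T_cold/T_hot (kelvin)
    t_hot  = 330.0   # ~57 C, assumed hot-water loop off the racks
    t_cold = 295.0   # ~22 C, assumed ambient sink

    eta_max = 1 - t_cold / t_hot
    print(f"{eta_max:.1%}")  # ~10.6%: low-grade heat yields little work even ideally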

charcircuit 2 days ago | parent | prev | next [-]

The electricity is doing computer things, building bitcoin blocks.

YetAnotherNick 2 days ago | parent | prev | next [-]

No it's not. It would be waste only if there were a high temperature gradient, which is minimized in mining operations through proper cooling.

It's just that computation requires electricity, and almost all of the heat in bitcoin mining comes from computation, technically from changing transistor state.

sixtyj 2 days ago | parent | prev [-]

They could make a second floor with eggs and newborn chickens. /s