tenthirtyam 2 days ago

I really can't help but think there might be a fundamental problem here. If heat loss in dirt is so incredibly slow, and we use a heat pump to extract that heat, then wouldn't it be equally slow to replace the heat extracted? (absent heat injection, i.e. reversing the heat pump's operation in summer)
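The intuition that recovery should be as slow as the loss can be made rough-and-ready with a diffusion length scale, L ~ sqrt(alpha * t). This is a back-of-envelope sketch, not from the thread; the soil thermal diffusivity value is an assumption (typical published values for soil run roughly 0.2e-6 to 1.0e-6 m^2/s):

```python
import math

ALPHA_SOIL = 0.7e-6  # m^2/s -- assumed thermal diffusivity of moist soil

def diffusion_length_m(seconds: float, alpha: float = ALPHA_SOIL) -> float:
    """Characteristic distance heat diffuses in a given time: L ~ sqrt(alpha * t)."""
    return math.sqrt(alpha * seconds)

YEAR_S = 365.25 * 24 * 3600
for label, t in [("1 day", 86400.0), ("1 month", YEAR_S / 12), ("1 year", YEAR_S)]:
    print(f"{label:>8}: ~{diffusion_length_m(t):.1f} m")
```

With these numbers heat only crawls a few meters per year, which is exactly why a borehole that is net-extracted (or net-injected) can drift for decades rather than re-equilibrating each season.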

Has anyone looked at the subsurface ground temperatures after days, weeks, months, even years of heat pump operation?

I do seem to remember seeing one article on the subject showing that after one winter the subsurface temperature had declined enough to materially affect the heat pump's COP. But the timescale didn't extend to multiple years.

edit: found this one: https://pangea.stanford.edu/ERE/db/GeoConf/papers/SGW/2024/M...

In one of the eight cases studied, where heat flow is unidirectional (cooling load only) over a 20-year timescale, the authors find:

  "the mean ground temperature ... increased from 21.87 °C to 26.18 °C, ... . This significant rise could have a potential impact on the performance of the system in the later years of its operational life."
The other seven cases showed weaker or negligible long-term variation.

An additional graph shows COP variation over 15 years and the worst case shows a decline of perhaps 10% (just eyeballing it).
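For a sense of why a ~4 °C ground-temperature rise matters, here is a sketch of the ideal (Carnot) cooling COP at the paper's two mean ground temperatures. The 10 °C evaporator temperature is an assumption for illustration, and real machines achieve only a fraction of the Carnot limit, so the actual decline is smaller than this ideal one:

```python
def carnot_cooling_cop(t_evap_c: float, t_reject_c: float) -> float:
    """Carnot COP for cooling: T_cold / (T_hot - T_cold), in kelvin."""
    t_c = t_evap_c + 273.15
    t_h = t_reject_c + 273.15
    return t_c / (t_h - t_c)

cop_start = carnot_cooling_cop(10.0, 21.87)  # mean ground temp, early years
cop_end = carnot_cooling_cop(10.0, 26.18)    # mean ground temp after 20 years
print(f"ideal COP: {cop_start:.1f} -> {cop_end:.1f} "
      f"({(1 - cop_end / cop_start) * 100:.0f}% lower)")
```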

Surprisingly, some cases showed a long-term improvement in heating COP - presumably the injection of summer heat left the soil warmer than sunshine and natural diffusion alone would?

So my takeaway: "it depends." :-)

pfdietz 2 days ago

There are no heat pumps involved in the OP idea.

tenthirtyam a day ago

Oh! You're right, I hadn't realized. But... wouldn't it be more efficient with a heat pump, exploiting the 2-4x COP? Ok, maybe I get it now: a heat pump wouldn't work at the high temperatures they're achieving. Thx.

pfdietz a day ago

Why are you optimizing for efficiency rather than overall cost?