KronisLV 5 days ago

> I also double-checked if the CPU temperature of about 100 degrees celsius is too high, but no: this Tom’s Hardware article shows even higher temperatures, and Intel specifies a maximum of 110 degrees. So, running at “only” 100 degrees for a few hours should be fine.

I'd say that crashing even at the rated maximum temperature is completely unreasonable! You should be able to run at 100C or whatever the max temperature is for a week non-stop if you damn well please. If you can't, then the value has been chosen wrong by the manufacturer. If the CPU can't handle that, the clock rates should just be dialed back accordingly to maintain stability.
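If you want to see whether a chip is actually dialing back like that on your own machine, here's a minimal sketch, assuming Linux on an Intel CPU that exposes the x86 thermal_throttle counters in sysfs, that just prints how often the kernel has logged a throttle event:

```python
#!/usr/bin/env python3
"""Rough check of whether an Intel CPU has been dialing back clocks at its
thermal limit, by reading the kernel's throttle event counters.
Assumes Linux with the x86 thermal_throttle sysfs interface present."""
from pathlib import Path

def read_count(path: Path) -> int:
    try:
        return int(path.read_text().strip())
    except (OSError, ValueError):
        return 0  # counter not exposed on this machine

for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
    tt = cpu / "thermal_throttle"
    core = read_count(tt / "core_throttle_count")
    pkg = read_count(tt / "package_throttle_count")
    if core or pkg:
        print(f"{cpu.name}: core throttle events={core}, package={pkg}")
```

Non-zero counters just mean the silicon hit its limit and protected itself, which is exactly the behavior you'd expect instead of a crash.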

It's odd to hear about Core Ultra CPUs failing like that, though - I thought that they were supposed to be more power efficient than the 13th and 14th gen, all while not having their stability issues.

That said, I currently have a Ryzen 7 5800X, overclocked with PBO to hit 5 GHz, with per-core negative Curve Optimizer offsets set. There's also an AIO with two fans, and the side panel is off because the case I have is horrible. While gaming, the temps usually don't go past about 82C, but Prime95 or anything else computationally intensive makes the CPU hit and flatten out at 90C. It's odd that modern desktop-class CPUs still bump into thermal limits like that. That's with a pretty decent ambient temperature between 21C and 26C (summer).
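For anyone who wants to watch where their own chip plateaus, here's a rough sketch that dumps the package temperatures from hwmon, assuming Linux with the k10temp driver loaded (as on a 5800X); label names like Tctl/Tdie vary by kernel and board:

```python
#!/usr/bin/env python3
"""Minimal sketch: print Ryzen package temperature(s) from the k10temp hwmon
driver so you can watch where Prime95 or a game plateaus. Assumes Linux with
k10temp loaded; available labels differ between kernels and boards."""
from pathlib import Path

for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
    name = (hwmon / "name").read_text().strip()
    if name != "k10temp":
        continue
    for temp_input in sorted(hwmon.glob("temp*_input")):
        label_file = hwmon / temp_input.name.replace("_input", "_label")
        label = label_file.read_text().strip() if label_file.exists() else temp_input.name
        millideg = int(temp_input.read_text().strip())  # values are in millidegrees C
        print(f"{label}: {millideg / 1000:.1f} C")
```

Run it in a loop next to the stress test and you can see the plateau form in real time.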

williamDafoe 5 days ago | parent

Just FYI, Google runs their data centers at 85 degrees F (about 30 degrees C). I think Google probably knows more than most about how to run Intel CPUs for the longest life and lowest cost per CPU cycle. After all, they are the #5 computer maker on earth. What Intel is doing and what they are recommending is the act of a desperate corporation incapable of designing energy-efficient CPUs, incapable of improving their performance in MIPS per watt. This is a sign of a failed corporation.

Panzer04 4 days ago | parent

Google runs datacenters hot because it's probably cheaper than over-cooling them with AC.

Chips are happy to run at high temperatures; that's not an issue. It's just a tradeoff between expense and performance.

KronisLV 5 days ago | parent

> Just FYI Google runs their data centers at 85 degrees F (about 30 degrees C). I think Google probably knows more about how to run Intel CPUs for longest life and lowest cost per CPU cycle. After all they are the #5 computer maker on earth.

Servers and running things at scale are way different from consumer use cases and the cooling solutions you'll find in the typical desktop tower, especially considering the average budget and tolerance for noise. Regardless, on a desktop chip, even hitting Tjmax shouldn't lead to instability like in the post above, nor should the chip fail.

If they do, then that value was chosen wrong by the manufacturer. The chips should also be clocking back to maintain safe operating temps. Essentially, squeeze out whatever performance is available with a given cooling solution, be it passive (I have some low-TDP AM4 chips with passive Alpine coolers), air coolers, AIOs, or a custom liquid loop.
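A quick way to see whether the chip really is clocking back under a given cooler is to compare each core's current frequency to its advertised maximum while the load is running. A sketch, assuming Linux with the cpufreq sysfs interface:

```python
#!/usr/bin/env python3
"""Compare each core's current frequency against its advertised maximum to see
how much the chip is clocking back under the current cooling solution.
A sketch assuming Linux cpufreq sysfs; run it while the stress load is active."""
from pathlib import Path

cpus = sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*"),
              key=lambda p: int(p.name[3:]))
for cpu in cpus:
    freq_dir = cpu / "cpufreq"
    try:
        cur = int((freq_dir / "scaling_cur_freq").read_text()) / 1000  # kHz -> MHz
        top = int((freq_dir / "cpuinfo_max_freq").read_text()) / 1000
    except OSError:
        continue  # core offline or cpufreq not exposed
    print(f"{cpu.name}: {cur:.0f} MHz of {top:.0f} MHz ({100 * cur / top:.0f}%)")
```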

> What Intel is doing and what they are recommending is the act of a desperate corporation incapable of designing energy-efficient CPUs, incapable of progressing their performance in MIPS per Watt of power.

I don't disagree with this entirely, but the story is increasingly similar with AMD as well: most consumer chip manufacturers are pushing their chips harder and harder out of the factory so they can compete on benchmarks. That's why you hear about people limiting the power envelope to 80-90% of stock and dropping close to 10 degrees C in temperature; similarly, you hear about how hard it is to push chips much past stock when overclocking, because they're already pushed harder than prior generations were.
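For the power-envelope part, on Intel you can do that limiting from software via the Linux powercap (RAPL) interface. A purely illustrative sketch; it needs root, only works where intel-rapl is exposed, and the 0.85 factor is just the ballpark people report, not a recommendation:

```python
#!/usr/bin/env python3
"""Illustrative sketch of capping an Intel package power limit to ~85% of its
current value via the Linux powercap (RAPL) interface. Requires root, only
works where intel-rapl is exposed, and reverts on reboot."""
from pathlib import Path

FACTOR = 0.85
pkg = Path("/sys/class/powercap/intel-rapl:0")       # package 0, if present
limit_file = pkg / "constraint_0_power_limit_uw"     # long-term (PL1) limit, in microwatts

if limit_file.exists():
    current_uw = int(limit_file.read_text().strip())
    new_uw = int(current_uw * FACTOR)
    print(f"PL1: {current_uw / 1e6:.1f} W -> {new_uw / 1e6:.1f} W")
    limit_file.write_text(str(new_uw))  # needs root; not persistent across reboots
else:
    print("intel-rapl powercap interface not found on this system")
```

On AMD the equivalent knob is usually the PPT limit in the BIOS or via vendor tooling rather than RAPL.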

To sum up: Intel should be less delusional about how far they can push the silicon, take the L, and compete with AMD on pricing instead of charging an arm and a leg for chips that burn up. What they did with the Arc GPUs compared to the competition was actually a step in the right direction.