eadmund 3 days ago
> I'd assume that a modern CPU would do the same amount of work with a fraction of energy so that it does not even make economical sense to run such outdated hardware.

There are 8,760 hours in a non-leap year, and electricity in the U.S. averages 12.53 cents per kilowatt-hour [1]. A really power-hungry CPU running full-bore at 500 W for a year would thus burn through about 4,380 kWh, or roughly $550 of electricity. Even if a new machine cut power consumption in half, the savings would be only about 10% of the cost of a new computer, so the payoff date of an upgrade is ten years in the future (ignoring the non-negligible cost and risk of performing the upgrade itself). And of course buying a new computer is a capital expense, while paying for electricity is an operating expense.

1: https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...
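A quick back-of-the-envelope sketch of that math, for anyone who wants to plug in their own numbers. Only the 500 W, 8,760 h, and 12.53 ¢/kWh figures come from the comment; the ~$2,750 replacement cost is just the price implied by the "10% / ten years" framing, not anything stated directly:

```python
# Payback math from the figures above.
HOURS_PER_YEAR = 8_760        # non-leap year
RATE_USD_PER_KWH = 0.1253     # U.S. average retail rate (EIA)
OLD_CPU_WATTS = 500           # power-hungry CPU at full load

annual_kwh = OLD_CPU_WATTS * HOURS_PER_YEAR / 1_000   # 4,380 kWh
annual_cost = annual_kwh * RATE_USD_PER_KWH           # ~$549

# Assumptions: the replacement halves power draw and costs ~$2,750
# (the price implied by "10% of the cost of a new computer").
savings_per_year = annual_cost / 2
new_machine_cost = 2_750
payback_years = new_machine_cost / savings_per_year

print(f"electricity now: ${annual_cost:,.0f}/yr")
print(f"savings: ${savings_per_year:,.0f}/yr -> payback in {payback_years:.1f} years")
```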
wang_li 3 days ago
You can buy a mini PC for less than $550. For $200 on Amazon you can get an N97-based box with 12 GB RAM, 4 cores running at 3 GHz, and a 500 GB SATA SSD. That has to be at least as fast as their current build systems, and it supports the required instructions.
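Running the same electricity sketch for a $200 mini PC makes the point concrete. The ~15 W draw for the N97 box is an assumed figure, not something stated in the comment, but with anything in that ballpark the payback is months rather than years:

```python
# Same electricity math, but replacing the 500 W box with a cheap mini PC.
HOURS_PER_YEAR = 8_760
RATE_USD_PER_KWH = 0.1253

def annual_cost(watts: float) -> float:
    """Yearly electricity cost of a machine running flat out at `watts`."""
    return watts / 1_000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

old_cost = annual_cost(500)    # ~$549/yr
new_cost = annual_cost(15)     # ~$16/yr, assuming ~15 W for the N97 box
savings = old_cost - new_cost  # ~$533/yr

payback_months = 200 / savings * 12   # $200 purchase price
print(f"payback on the $200 mini PC: ~{payback_months:.1f} months")
```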