Der_Einzige | 5 hours ago
The notion that hardware depreciates in this manner is FUD, but it's widely held. I blame Michael Burry of The Big Short fame, who is still perpetuating these claims to the investor community today. There's plenty of retro hardware that should make people pause before assuming hardware slows down even 5% on average after 20 years (it's probably closer to 2%, and I'm being generous). HVAC and power delivery/generation are the major failure factors; if you didn't skimp on those or get defective parts, and you replace failed moving parts (usually fans), your hardware performs basically the same 20 years down the line as it did on day one. Also, running LLMs locally doesn't even induce sustained 100% GPU usage over significant periods for most real use cases (e.g., agentic coding in OpenCode). A quick way to sanity-check that duty-cycle claim is sketched below.
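
You can measure this yourself rather than argue about it. A minimal sketch, assuming an NVIDIA GPU with nvidia-smi on your PATH: it samples utilization once per second while you run a coding session, then reports the average duty cycle when you hit Ctrl-C. The one-second interval is an arbitrary choice for illustration.

    # Sample GPU utilization once per second; print the average on Ctrl-C.
    # Assumes an NVIDIA GPU and that nvidia-smi is available on PATH.
    import subprocess
    import time

    samples = []
    try:
        while True:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=utilization.gpu",
                 "--format=csv,noheader,nounits"],
                text=True,
            )
            # First line = first GPU; extend for multi-GPU boxes as needed.
            samples.append(int(out.strip().splitlines()[0]))
            time.sleep(1)
    except KeyboardInterrupt:
        if samples:
            avg = sum(samples) / len(samples)
            print(f"avg GPU utilization: {avg:.1f}% over {len(samples)} samples")

Run it in one terminal while doing an agentic coding session in another; in my experience the average sits far below 100% because most wall-clock time is spent on tool calls and waiting, not token generation.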
datadrivenangel | 3 hours ago
There are plenty of components that can start failing on hardware. I don't realistically expect occasional LLM usage to materially shorten a laptop's lifespan, but running it 24/7 for AI workloads makes me think I'd get 3 years out of the device instead of 10.
| ||||||||||||||