| ▲ | zamadatix 5 hours ago |
| If you bought a big-ass server for your home 10 years ago, it probably wouldn't have had a GPU/AI accelerator at all. If it did, it would have been something with wimpy compute and VRAM, because you needed the video encoder/decoder for security cameras or the like. I'm not sure that gives much confidence that hardware has slowed down enough to invest in it for decades. Single-core CPU performance has, but that's not really what new things are using. |
|
| ▲ | camdenreslink 5 hours ago | parent | next [-] |
| It really just depends on whether the hardware is "good enough" for its purpose. If the hardware today can locally run whatever models your security cameras need, it will likely still be "good enough" in 10 years. Of course, as with a 10-year-old car or appliance, you will be missing any new features or bells and whistles that have become available in the meantime. |
| |
| ▲ | wtallis 5 hours ago | parent [-] | | I agree; it's important to recognize that there are lots of use cases where computers long since reached "good enough" and aren't really going obsolete anymore. My NAS is about 13 years old, the network switches it connects through are even older, and while 2.5GbE now exists, I have no need to throw out my "good enough" equipment to replace it with something marginally faster or more power efficient. I don't even really need to expand that NAS's storage anytime soon, because my music collection could never come close to filling it, my movie/TV collection isn't growing much anymore due to the shift to streaming, and the volume of other stuff I need to back up from my other computers just isn't growing much over the years. |
|
|
| ▲ | kennywinker 4 hours ago | parent | prev | next [-] |
| You’re kind of undermining your own point. Ten years later, the only thing you’d need to upgrade for your home server might be the GPU, because a new use case emerged. Okay? Spend $500-$1000 on an eGPU. Problem solved. Will that eGPU setup last another ten years? If all it’s doing is processing security video and routing claw-like tasks, then yes. |
| |
| ▲ | zamadatix 3 hours ago | parent [-] | | Not sure I follow why - that the server from 10 years ago would be completely unfit for purpose now shouldn't imply the one you buy today will be the right hardware 10 years from now. Unless you can somehow guarantee that we've reached, in just these last few years, the final set of new requirements we will ever have, the GPUs you buy today will probably be just as irrelevant to the new requirements a decade from now. Of course one can always upgrade components piecewise as requirements change, but I don't see why you need to invest in a big-ass server to do that. It'd be cheaper to go the route everyone has gone for decades at this point: upgrade with normal-sized stuff as needed, and don't try to make an up-front, multi-decade home investment out of it. On the flip side, if you intentionally plan to lock in the capabilities to the kinds of things one can run today, and therefore know you'll never need to upgrade, then you can get whatever sized system makes sense for today's needs. You just need to be really sure you won't be interested in "the next big thing" when it comes, too. |
|
|
| ▲ | majormajor 5 hours ago | parent | prev | next [-] |
| Decades is a long time for hardware, but planning in terms of years seems reasonable soon. The commercial models are "good enough" for a lot of things now, so if that performance makes its way into the on-device space at "home appliance"-level cost (<$5k at the start, basically), I'd expect a lot of stuff to start popping up there. In offices too. Like the PC in the '80s starting to eat into "get a mainframe" or "rent time on a mainframe" uses. |
|
| ▲ | psyclobe 4 hours ago | parent | prev [-] |
| Yeah, but how long do mainframes last? Think of the COBOL systems used in government. There was no reason to update them; they worked forever. Their job is discrete, and they performed it well enough that intensive updating wasn't a requirement. |
| |
| ▲ | icedchai 4 hours ago | parent [-] | | You also need to ask: how much do mainframes cost? They were engineered for backwards compatibility and reliability, with built-in redundancy you don't find in consumer hardware. AI models are changing every other day; I have to rebuild llama.cpp from source regularly. We are nowhere close to a personal "AI mainframe." |
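| (To make that rebuild churn concrete: a minimal Python sketch of the update-and-rebuild loop, assuming a git checkout and the CMake build steps documented in the llama.cpp README; the checkout path is hypothetical.)

    import pathlib
    import subprocess

    # Hypothetical checkout location; adjust to wherever llama.cpp lives.
    REPO = pathlib.Path.home() / "src" / "llama.cpp"

    def run(args, cwd):
        # Echo each command and fail loudly if it errors.
        print("+", " ".join(args))
        subprocess.run(args, cwd=cwd, check=True)

    # Pull the latest sources, then rebuild using the CMake flow
    # documented in the llama.cpp README.
    run(["git", "pull", "--ff-only"], cwd=REPO)
    run(["cmake", "-B", "build", "-DCMAKE_BUILD_TYPE=Release"], cwd=REPO)
    run(["cmake", "--build", "build", "--config", "Release", "-j"], cwd=REPO)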
|