dare944, 4 days ago:
Not sure what your point is. The ability to manufacture silicon chips will only ever be in the hands of a relatively small group of people worldwide, so of course all of us depend on those people and businesses for any form of modern computing. The question isn't how we can live without dependencies; it's how many dependencies we must have, and, for those that aren't strictly necessary, what the benefits (and costs) of breaking them would be.
ndriscoll, 4 days ago (in reply):
It's also a matter of capital versus operational dependency. Intel needs to exist today for me to buy a chip, but my 9-year-old mid-range desktop still works fine and is perfectly snappy today, and I suspect my mini PCs (which draw about as much power as a lightbulb or two, so they could easily be solar/battery powered) will also work fine for at least a decade. I can't imagine needing more computing power than an N100 provides for a home server; mine is already 99% idle. So these things will basically never be obsolete. I suspect the actual chips will last the rest of my life at least, and even if a capacitor fails on the motherboard, the skills to replace capacitors are considerably more common than the skills to fabricate chips, so the machine could be kept running even if CPU manufacturers were to fail somehow (or if new hardware became unusable due to DRM or something).
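For a rough sense of why a lightbulb-scale power draw makes solar/battery operation plausible, here's a minimal back-of-envelope sketch. The wattage, battery capacity, and sun-hour figures below are assumptions chosen for illustration, not measurements from the setup described above:

```python
# Back-of-envelope solar/battery sizing for an N100-class mini PC.
# All figures are assumptions, not measurements:
#   - average draw ~10 W (idle N100 boxes are often lower)
#   - 200 Wh of usable battery capacity
#   - a 50 W panel averaging 4 full-sun-equivalent hours per day

AVG_DRAW_W = 10          # assumed average power draw of the mini PC
BATTERY_WH = 200         # assumed usable battery capacity
PANEL_W = 50             # assumed panel rating
SUN_HOURS_PER_DAY = 4    # assumed full-sun-equivalent hours per day

daily_use_wh = AVG_DRAW_W * 24                  # ~240 Wh consumed per day
daily_harvest_wh = PANEL_W * SUN_HOURS_PER_DAY  # ~200 Wh generated per day
battery_runtime_h = BATTERY_WH / AVG_DRAW_W     # ~20 h of runtime with no sun

print(f"daily use:       {daily_use_wh} Wh")
print(f"daily harvest:   {daily_harvest_wh} Wh")
print(f"battery runtime: {battery_runtime_h:.0f} h")
```

Under these assumed numbers, a modest panel roughly keeps pace with consumption and the battery alone covers most of a day, which is the intuition behind "could easily be solar/battery powered."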