bri3d 5 hours ago

It's different definitions of "easy."

With hardware, you have about a billion validation tests and QA processes, because when you're done, you're done and it had better work. Fixing an "issue" after the fact is very, very expensive, so you want to get rid of them up front. However, this also makes the process more of, to stereotype, an "engineer's engineering" practice. It's very rules-based: if everything follows the rules and passes the tests, it's done. It doesn't matter how "hacky" or "badly architected" or "nasty" the input product is; when it works, it works. And when it's done, it's done.

On the other hand, software is highly human-oriented and subjective, and it's a continuous process. With Linux working the way it does, with an intentionally hostile kernel interface, driver software is even more so. With Linux drivers you basically choose one of three paths: get them upstreamed (a massive undertaking in personality management, but Valve's choice here), maintain them out of tree in perpetuity at enormous cost, since every release will break them (not common), or give up, release a point-in-time snapshot, and ride into the sunset (which is what most people do). I don't really think this is easier than hardware; it's just a different thing.

generativenoise 2 hours ago

From the outside looking in, it really seems like both fields are working around each other in weird ways, somewhat enforced by backwards compatibility and historical path dependence.

The transition from more homogeneous architectures to today's very heterogeneous, distributed architectures has never really been all that well accounted for; we just have lots of abstractions that have been papered over and work for the most part. Power management is the most common place these mismatches seem to surface.

I do wonder if it will ever be economical to "fix" some of these lower-level issues, or if we are stuck on this path-dependent trajectory like the recurrent laryngeal nerve in our bodies.