VyseofArcadia 5 days ago
> The net effect of this is that the software you’re running on your computer is effectively wiping out the last 10-20 years of hardware evolution; in some extreme cases, more like 30 years.

As an industry we need to worry about this more. I get that in business, if you can be less efficient in order to put out more features faster, your dps[0] is higher. But as both a programmer and an end user, I care deeply about efficiency.

It's bad enough when just one application is sucking up resources unnecessarily, but now it's nearly every application, up to and including the OS itself if you are lucky enough to be a Microsoft customer.

The hardware I have sitting on my desk is vastly more powerful than what I was rocking 10-20 years ago, but the user experience seems about the same. No new features have really revolutionized how I use the computer, so from my perspective all we have done is make everything slower in lockstep with hardware advances.

[0] dollars per second
dayvigo 5 days ago
> The hardware I have sitting on my desk is vastly more powerful than what I was rocking 10-20 years ago, but the user experience seems about the same.

Not even. It used to be that when you clicked a button, things happened immediately, instead of a few seconds later while everything freezes up. Text could be entered into fields without inputs getting dropped or playing catch-up. A mysterious unkillable service wouldn't randomly decide to peg a core several times a day. This was all the case even as late as Windows 7.
danpalmer 3 days ago
I understand the attitude, but I think it misses a few aspects.

We have far more isolation between software, and we have cryptography that would have been impractical to compute decades ago, used both at rest and on the wire. All of that comes at significant cost. It might only be a few percent of performance on modern systems, and therefore easy to justify, but it would have been a much higher percentage a few decades ago.

Another thing that's not considered is the scale of data. Yes, software is slower, but it's processing more data. A video file now might be 4K, where decades ago it might have been 240p. It's probably also far more heavily compressed today, to ensure that file size growth wasn't entirely linear. The simple act of playing back a video takes far more processing than it did before.

Lastly, the focus on dynamic languages is often either misinformed or purposely misleading. LLM training is often done in Python, and it's some of the most performance-sensitive work being done at the moment. Of course, that's because the actual training isn't executing in a Python VM. The same is true for much of what gets labeled "dynamic languages": the heavy lifting is done elsewhere, and the actual performance benefit of rewriting the Python bit in C++ or similar would often be minimal. This does vary, of course, but it's not something I see acknowledged in these overly simplified arguments.

Requirements have changed, software has to do far more, and we're kidding ourselves if we think it's comparable. That's not to say we shouldn't reduce waste; we should! But dismissing modern software engineering because of dynamic languages etc. is naive.
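The "heavy lifting is done elsewhere" point can be illustrated with a small sketch. Here NumPy stands in for any compiled backend (the comment's actual example is LLM training frameworks, not NumPy); the point is that the Python-level code is a thin driver, so porting it to C++ would change little:

```python
# Sketch: same computation two ways. In the second, Python only
# dispatches the call; the inner loop runs in NumPy's compiled
# C kernels, which is where nearly all the time goes.
import numpy as np

def sum_of_squares_pure(xs):
    # Every iteration executes inside the Python VM.
    total = 0.0
    for x in xs:
        total += x * x
    return total

def sum_of_squares_native(xs):
    # One Python-level call; the loop itself is compiled code.
    a = np.asarray(xs, dtype=np.float64)
    return float(np.dot(a, a))

data = [1.0, 2.0, 3.0]
assert sum_of_squares_pure(data) == sum_of_squares_native(data) == 14.0
```

For large arrays the native version is typically orders of magnitude faster, and that gap is the compiled kernel's work, not anything a rewrite of the surrounding Python glue could recover.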