jubilanti 11 hours ago

That's a really sorry state of things, then. There's zero trust in software now, in the literal sense. How did we get to a world where you can't trust a client to enforce its own documented behavior? How did it become the user's fault for not applying OS- and hardware-level measures, and not the software vendor's fault when the "Automatic updates" toggle is a no-op?

shimman 10 hours ago | parent | next [-]

MBAs/consultants hijacked the industry, along with an influx of people who consider leetcode alone sufficient for hiring. The past 10 years have seen a major injection of these people into big tech. The resulting mess is predictable, and it'll get worse too, which is why we need to break up these companies and allow better, more efficient companies to take their place rather than letting them subsidize their failures with their monopolies.

EvanAnderson 10 hours ago | parent | prev | next [-]

In an environment where bandwidth utilization costs money I think it's a good belt-and-suspenders approach, regardless of the expected behavior of the clients, to enforce policy at the choke point between expensive and not-expensive.

(I think more networks should be built with default-deny egress policies, personally. It would make data exfiltration more difficult, would give ML algorithms monitoring traffic flows less "noise" to look through, and would likely encourage some efficiency on the part of dependencies.)
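For the sake of concreteness, here's a minimal sketch of what default-deny egress could look like with nftables. The table/chain names, the allowed destination (a TEST-NET-3 placeholder address), and the permitted ports are all assumptions for illustration, not a drop-in policy:

```shell
# Create a table and an output chain whose default policy is drop:
# nothing leaves unless a rule explicitly permits it.
nft add table inet egress
nft add chain inet egress out '{ type filter hook output priority 0; policy drop; }'

# Allow replies on connections we already approved.
nft add rule inet egress out ct state established,related accept

# Permit DNS, and HTTPS only to an approved mirror (placeholder address).
nft add rule inet egress out udp dport 53 accept
nft add rule inet egress out ip daddr 203.0.113.10 tcp dport 443 accept
```

Everything else (including a dependency phoning home to an unexpected host) is simply dropped at the choke point, regardless of what the client claims its behavior is.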

xnyan 10 hours ago | parent | prev | next [-]

Software design is not really my wheelhouse so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients are going to 1) be non-malicious and 2) work exactly as you think they will.

Link saturation would be one of the first things that would come to mind in this situation, and at these speeds QoS would be trivial even for cheap consumer hardware.
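As a sketch of the kind of trivial QoS meant here: with Linux `tc` you can cap bulk traffic (say, update downloads) so it can never saturate the link. The interface name, rates, and destination address below are placeholders chosen for illustration:

```shell
# HTB root qdisc; unclassified traffic falls into class 1:20 (bulk).
tc qdisc add dev eth0 root handle 1: htb default 20
tc class add dev eth0 parent 1: classid 1:1 htb rate 100mbit

# Interactive traffic gets most of the link; bulk is capped at a
# guaranteed 10mbit, allowed to borrow up to 50mbit when idle.
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 90mbit ceil 100mbit prio 0
tc class add dev eth0 parent 1:1 classid 1:20 htb rate 10mbit ceil 50mbit prio 1

# Steer traffic to the update server (placeholder address) into bulk.
tc filter add dev eth0 parent 1: protocol ip u32 \
    match ip dst 203.0.113.10/32 flowid 1:20
```

Even cheap consumer routers expose equivalent knobs in their QoS settings, which is the point: the network shouldn't have to trust the client to throttle itself.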

rafterydj 10 hours ago | parent [-]

Well, on the software design side, there are plenty of scenarios where undocumented behavior crops up on unexpected network interruption. In the example above, Windows can even pre-download updates on metered connections during one time period, then install those updates during another. The customers really can't take the blame for that, IMO.

I think software quality across society has rapidly deteriorated, and it is mostly because of the devaluing of software design. No one expects quality from software, everyone "understands there are bugs", and some like to take advantage of that. And so the Overton window gets pushed in the direction of "broken forever, good luck holding the bag if you use it" rather than the more realistic "occasionally needs to restart IFF you hit an issue, and it takes less than 10 seconds with minimal data loss".

CoastalCoder 10 hours ago | parent | prev | next [-]

> How did it get that we live in a world where you can't trust a client to enforce its own documented behavior?

My guess: a combo of economic incentives and weak legal protections.

I realize that answer applies to so many issues as to be almost not worth saying, but I think it's still true here.

SR2Z 7 hours ago | parent | prev | next [-]

Fair enough, but the fact is that until fairly recently most software wouldn't even pretend to care about conserving bandwidth. I certainly would never expect a desktop OS to do this well, even if MS loves their revenue-generating "bugs."

relaxing 6 hours ago | parent | prev [-]

The world where any unpatched system is a guaranteed botnet.