kaybe 11 hours ago

Oh yeah, I remember how some computer pulled a Windows update over a satellite connection during a research flight (aircraft). That was super expensive, wow. Now Microsoft's servers are blocked at the outgoing link, since you can't reliably stop the downloads on the computer itself and new teams with new computers keep coming in.

xnyan 11 hours ago | parent | next [-]

I'm not letting Microsoft off the hook here, but if you have an expensive metered connection and you're trusting clients (especially a modern personal computer, whatever the operating system) to play nicely with bandwidth, that's 100% on you.

jubilanti 10 hours ago | parent [-]

That's a really sorry state of things, then. There's zero trust in software now, in the literal sense. How did we get to a world where you can't trust a client to enforce its own documented behavior? How did it become the user's fault for not deploying OS- and hardware-level safeguards, rather than the software vendor's fault, when the "Automatic updates" toggle is a no-op?

shimman 10 hours ago | parent | next [-]

MBAs/consultants hijacked the industry, along with an influx of people who consider leetcode alone sufficient for hiring. The past 10 years have seen these people pour into big tech. The resulting mess is predictable, and it'll get worse too, which is why we need to break up these companies and allow better, more efficient companies to take their place rather than letting them subsidize their failures with their monopolies.

EvanAnderson 10 hours ago | parent | prev | next [-]

In an environment where bandwidth utilization costs money, I think it's a good belt-and-suspenders approach, regardless of the expected behavior of the clients, to enforce policy at the choke point between the expensive side and the cheap side.

(Personally, I think more networks should be built with default-deny egress policies. It would make data exfiltration more difficult, give ML algorithms monitoring traffic flows less "noise" to look through, and likely encourage some efficiency on the part of dependencies.)
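For a concrete picture of what default-deny egress looks like, here's a minimal nftables sketch. This is an illustration only: the addresses, ports, and the idea of routing everything through an internal resolver and proxy are assumptions, not a recommendation for any particular network.

```
# Hypothetical nftables sketch: drop all forwarded egress by default
# and allow only explicitly approved destinations. All addresses and
# ports below are placeholders.
table inet egress_policy {
    chain forward {
        type filter hook forward priority 0; policy drop;

        # Allow return traffic for connections we initiated
        ct state established,related accept

        # Allow DNS only to an internal resolver
        ip daddr 10.0.0.53 udp dport 53 accept

        # Allow web traffic only via an approved proxy
        ip daddr 10.0.0.10 tcp dport 3128 accept

        # Everything else falls through to the chain policy (drop);
        # log it first so denied flows are visible
        log prefix "egress-denied: "
    }
}
```

Everything not explicitly allowed, including a surprise OS update fetch, hits the `drop` policy rather than the metered link.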

xnyan 10 hours ago | parent | prev | next [-]

Software design is not really my wheelhouse, so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients will 1) be non-malicious and 2) work exactly as you think they will.

Link saturation would be one of the first things that would come to mind in this situation, and at these speeds QoS would be trivial even for cheap consumer hardware.
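As a sketch of the kind of QoS meant here, the following uses Linux `tc` with HTB classes to keep bulk traffic from saturating a slow uplink. The interface name `sat0`, the 2 Mbit/s link speed, and the class split are all hypothetical placeholders:

```
# Hypothetical tc sketch: cap egress on the satellite-facing interface
# ("sat0" is a placeholder) so bulk downloads can't saturate the link.
tc qdisc add dev sat0 root handle 1: htb default 20

# Hard ceiling for the whole link (assuming a 2 Mbit/s satellite uplink)
tc class add dev sat0 parent 1: classid 1:1 htb rate 2mbit ceil 2mbit

# Interactive/priority traffic: guaranteed share, may borrow up to the ceiling
tc class add dev sat0 parent 1:1 classid 1:10 htb rate 1mbit ceil 2mbit prio 0

# Bulk/default traffic (where an update download would land): capped lower
tc class add dev sat0 parent 1:1 classid 1:20 htb rate 512kbit ceil 1mbit prio 7
```

Unclassified traffic lands in class 1:20 by default, so even a misbehaving client is throttled rather than trusted.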

rafterydj 10 hours ago | parent [-]

Well, on the software design side, there are plenty of scenarios where undocumented behavior crops up on unexpected network interruption. In the example above, Windows can even pre-download updates on a metered connection during one time period, then install them during another. The customers really can't take the blame for that, IMO.

I think software quality has rapidly deteriorated across society, mostly because software design has been devalued. No one expects quality from software, everyone "understands there are bugs", and some like to take advantage of that. And so the Overton window gets pushed toward "broken forever, good luck holding the bag if you use it" rather than the more realistic "occasionally needs to restart, if and only if you hit an issue, and it takes less than 10 seconds with minimal data loss".

CoastalCoder 10 hours ago | parent | prev | next [-]

> How did it get that we live in a world where you can't trust a client to enforce its own documented behavior?

My guess: a combo of economic incentives and weak legal protections.

I realize that answer applies to so many issues as to be almost not worth saying, but I think it's still true here.

SR2Z 7 hours ago | parent | prev | next [-]

Fair enough, but the fact is that until fairly recently most software wouldn't even pretend to care about conserving bandwidth. I certainly would never expect a desktop OS to do this well, even if MS loves their revenue-generating "bugs."

relaxing 6 hours ago | parent | prev [-]

The world where any unpatched system is a guaranteed botnet.

zdragnar 6 hours ago | parent | prev [-]

> since you can't reliably stop the downloads on the computer itself and new teams with new computers keep coming in.

Wi-Fi connection settings in Windows have a "metered connection" option, which disables automatic update downloads. I don't recall exactly when it was introduced, but I had to use it for a year while I was stuck on satellite internet. You can even set data caps and such.

Of course, it's off by default, and I have no idea whether there's a way to provision the connection via enterprise administration to default to on for a particular network (I would assume not), so you'd be stuck hoping everyone who comes in does the right thing.
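One possible workaround, as a hedged sketch: to my knowledge Windows keeps per-media-type default cost values in the registry under `DefaultMediaCost`, where a value of 2 means "metered". Whether this is supported for enterprise provisioning is an open question, and the key is normally owned by TrustedInstaller, so treat this as an assumption to verify rather than a documented procedure:

```
:: Hypothetical sketch (run as admin): mark Wi-Fi and Ethernet as metered
:: by default via the DefaultMediaCost key. You may need to take ownership
:: of the key first; 2 = metered, 1 = unmetered. Verify against current
:: Microsoft documentation before relying on this.
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\DefaultMediaCost" /v WiFi /t REG_DWORD /d 2 /f
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkList\DefaultMediaCost" /v Ethernet /t REG_DWORD /d 2 /f
```

Even if this works, it sets a machine-wide default rather than a per-network one, so it only partially addresses the "every new laptop that walks in" problem.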

boxedemp 6 hours ago | parent [-]

It's a good setting. I've found it gets reset sometimes from Windows updates, so you must remain vigilant.