kortilla | 3 hours ago
Yep, it’s been solved many times, but this is one of the unfortunate side effects of these companies reinventing the wheel every 10 years. More memory, faster CPUs, more disk space, and a constant fast internet connection are all assumed to be there. Unless you force the developers (and QA, if that’s a thing) to use the software with those things constrained, it’s going to suck (one way to do that is sketched below).

There was a blog post, which eludes me now, from about 10 years ago where a developer who lived on a terrible connection detailed how Firefox regressed progressive loading. 95% of the page would be there in an acceptably readable form. The connection would then get interrupted, and rather than putting an error somewhere outside the page and leaving the partially rendered page as-is, it would wipe out the whole thing and show the connection-timed-out page. I don’t think they ever fixed this, despite it being user-hostile behavior for people with poor connections.
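For what it’s worth, a minimal sketch of the “develop under constraint” idea: a local TCP proxy you point the app under test at, which trickles data through in small delayed chunks and sometimes kills the connection mid-transfer. The port, upstream host, and throttle numbers here are all made-up knobs, not any particular tool’s defaults; real teams would more likely reach for something like tc/netem or browser dev-tools throttling.

    import asyncio
    import random

    LISTEN_PORT = 8888               # hypothetical local port the app connects to
    UPSTREAM = ("example.com", 80)   # hypothetical upstream to forward traffic to
    BYTES_PER_CHUNK = 1024
    DELAY_PER_CHUNK = 0.5            # seconds; simulates roughly 2 KB/s throughput
    DROP_PROBABILITY = 0.05          # chance per chunk of the "link" dying

    async def pipe(reader, writer):
        # Forward data in small, delayed chunks; occasionally drop the
        # connection to mimic an unreliable link.
        try:
            while True:
                chunk = await reader.read(BYTES_PER_CHUNK)
                if not chunk:
                    break
                if random.random() < DROP_PROBABILITY:
                    break  # simulate the connection getting interrupted
                writer.write(chunk)
                await writer.drain()
                await asyncio.sleep(DELAY_PER_CHUNK)
        finally:
            writer.close()

    async def handle(client_reader, client_writer):
        # Open a connection to the upstream and shuttle bytes both ways.
        upstream_reader, upstream_writer = await asyncio.open_connection(*UPSTREAM)
        await asyncio.gather(
            pipe(client_reader, upstream_writer),
            pipe(upstream_reader, client_writer),
        )

    async def main():
        server = await asyncio.start_server(handle, "127.0.0.1", LISTEN_PORT)
        async with server:
            await server.serve_forever()

    if __name__ == "__main__":
        asyncio.run(main())

Run it and point the app at 127.0.0.1:8888; pages that arrive 95% complete and then die mid-load, like in the Firefox story above, fall straight out of the drop probability.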