| ▲ | cadamsdotcom 3 days ago |
| [flagged] |
|
| ▲ | getpokedagain 3 days ago | parent | next [-] |
| I'm not sure I understand. Are you implying we should not design our technology around serious edge cases that humans encounter in life? Why wouldn't we target people in crisis when we design crisis management information sites? |
| |
|
| ▲ | netsharc 3 days ago | parent | prev | next [-] |
| It's not really written there, but how about a loading experience that gives you the important information first, then adds the bells and whistles as the JavaScript loads and runs? First make sure the plain-text information arrives, maybe with a simple JPEG when something graphical like a map is needed, and only then load the megabytes of React or Angular to make it all pretty and the map interactive... |
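| To make that concrete, a rough sketch of the idea (the "./interactive-map" module, its initMap() helper, and the "map" element id are made-up stand-ins for whatever heavy mapping bundle a real site would use): |
|
|     // The server ships plain HTML first: the text content plus a static JPEG
|     // inside <div id="map"><img src="/map-static.jpg" alt="Area map"></div>.
|     // Only after that is on screen do we go fetch the heavy interactive bits.
|     async function enhanceMap() {
|       const container = document.getElementById("map");
|       if (!container) return; // the page already works without the upgrade
|       try {
|         // dynamic import: the megabytes of map code load after first paint
|         const { initMap } = await import("./interactive-map"); // hypothetical module
|         initMap(container); // swaps the static JPEG for the live map
|       } catch {
|         // on a flaky connection the import may fail; the static image stays usable
|       }
|     }
|
|     // defer enhancement until the browser is idle so it never blocks the text
|     if ("requestIdleCallback" in window) {
|       requestIdleCallback(() => { enhanceMap(); });
|     } else {
|       window.addEventListener("load", () => { enhanceMap(); });
|     }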
| |
| ▲ | McGlockenshire 3 days ago | parent [-] |
| Just as server side rendering was reinvented from first principles by the current generation, now they have rediscovered progressive enhancement! There might be hope for us yet! |
|
|
| ▲ | brianpan 3 days ago | parent | prev | next [-] |
| "Universal design" or "design for accessibility" will give you lots of examples of constraints that are not "commonly" needed ending up having much wider application and benefiting many other people. Some oft-cited examples are curb cuts (the sloped ramps cut into curbs for sidewalk access) and closed-captioning (useful in noisy bars or at home with a sleeping baby). There are many examples from the web where designing with constraints can lead to broadly more usable sites- from faster loading times (mobile or otherwise) to semantic markup for readers, etc. |
| |
| ▲ | cadamsdotcom 3 days ago | parent [-] |
| Ah, this raises 2 important nuances: |
| - How severe is the impact, and |
| - How close is the default state to the constraint |
| Kerb cuts help everyone. Kids, the elderly, disabled people, and anyone distracted by their phone are all less likely to fall on their face and lose a tooth. Web accessibility takes websites from unusable for disabled people to usable. |
| On the other hand, when a dev puts a website on a diet, it might load in 50ms instead of 200ms for 99.9% of users, and in 2 seconds instead of 2 minutes for the other 0.1%. So a heavy site doesn't meaningfully impact anyone: that edge-case 0.1% will either leave, or stick around and stab the reload button for as long as it takes to get the info they need. |
| As shameful as it is, web perf work has almost zero payoff except at the limit. Anyone sensible therefore has far more to gain by investing in more content or more functionality. |
| ▲ | anonymous908213 3 days ago | parent [-] |
| Google has done Google-scale traffic analysis and determined that even a 100ms delay has a noticeable impact on user retention. If a website takes more than 3 seconds to load, over 50% of visitors will bail. To say that there is no payoff for optimization is categorically incorrect. The incentives are there. Web developers are just, on average, extremely bad at their jobs. The field has been made significantly more accessible than it was in decades past, but the problem with that accessibility is that it lets people with no fundamental understanding of programming kitbash libraries together like Lego and successfully publish websites. They can't optimize even if they try, and the real problem for the rest of us is that they can't secure user data even if they try. |
| ▲ | cadamsdotcom 3 days ago | parent [-] |
| That test was a while ago; it'd be interesting to see if it's still the case and if the results reproduce. But still, let's consider that Google is Google and most websites are just happy to have some traffic. People go to Google expecting it to get them info quickly; on other sites the info is worth waiting an extra second for. At Google scale, a drop in traffic means a massive corresponding drop in revenue, but most websites don't even monetize. They're both websites, but that's all they have in common. |
| ▲ | anonymous908213 3 days ago | parent [-] |
| If you are a hobbyist hosting your own website for fun, sure, whatever. Do what floats your boat; you're under no obligation for your website to meet any kind of standard. The vast majority of web traffic is directed towards websites that are commercial in nature[1], though. Any drop in traffic is a drop in revenue. If you are paid tens or hundreds of thousands of dollars a year to provide a portal wherein people visit your employer's website and give them money (or indirectly give them money via advertisement impressions), and you shrug your shoulders at the idea of 50% of visitors bouncing, you are not good at your job. But hey, at least you'd be in good company, because most web developers are like that, which is why the web is as awful to use as it is. |
| [1] The only website in the top 10 most visited that is not openly commercial is Wikipedia, but it still aggressively monetizes by shaking down its visitors for donations and earns around $200 million a year in revenue. They would certainly notice if 50% or even 10% of their visitors were bouncing too. |
|
| ▲ | McGlockenshire 3 days ago | parent | prev | next [-] |
This is the same attitude that results in modern developers ignoring low-end consumer hardware, locking out a customer base because they aren't rich enough. Get some perspective. Some of us have to live on 500 kbit/s. The modern web is hell, and because it doesn't impact anybody with money, nobody gives a shit. |
|
| ▲ | tomhow 2 days ago | parent | prev | next [-] |
| Please don't be curmudgeonly about others' curmudgeonliness. We're rather hoping for anti-curmudgeonliness on HN. https://news.ycombinator.com/newsguidelines.html |
|
| ▲ | bdcravens 3 days ago | parent | prev [-] |
Every single time GitHub goes down, there's no shortage of gnashing of teeth on HN about how we should all host our own repos and CI servers. |
| |
| ▲ | cadamsdotcom 3 days ago | parent [-] |
| Then people go outside and play. Then GitHub comes back and sins are forgotten. |
|