pdntspa | 3 days ago
Wait, top billing on HN brings in 2 hits/sec of traffic? That is an unbelievably low number, considering how many sites fall over under that pressure.
lucb1e | 3 days ago
Exactly. I think this shows two things quite nicely:

- Very few sites need to cope with more than a handful of hits per second. A regular DSL connection and a desktop PC can host the vast majority of them; you don't need clouds if you don't want them. (Even under variable load: if you need 80% of the systems more than 40% of the time, scaling down is probably not worth the cloud premium.)
- If a site can't handle HN, that's a software limitation. Compare WordPress's insanely slow page generation to simple blog software that generates pages in 5 milliseconds, or even to hosting the blog as static HTML files.

I'd not be surprised if you could serve Wikipedia's page text from something like one Raspberry Pi 5 per country. Not that you'd want to, for reliability and redundancy reasons; plus there's the constant stream of edits to process and templates to (re-)render, and media/blob hosting is a separate beast. Thankfully, most sites are not among the world's ten most popular, and you get away with a lot.
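A quick back-of-envelope sketch of the point above (the 5 ms figure is the comment's example for fast blog software; the ~10 req/s peak is the rough HN-frontpage rate mentioned elsewhere in this thread):

```python
# Back-of-envelope: headroom of a single CPU core against HN-frontpage
# traffic, assuming 5 ms of generation time per page (from the comment
# above) and ~10 requests/sec at peak (rough figure from this thread).

PAGE_GEN_SECONDS = 0.005     # assumed per-page generation time
PEAK_REQUESTS_PER_SEC = 10   # rough HN-frontpage peak

max_req_per_core = 1 / PAGE_GEN_SECONDS              # 200 req/s
headroom = max_req_per_core / PEAK_REQUESTS_PER_SEC  # 20x the peak

print(f"one core handles ~{max_req_per_core:.0f} req/s, "
      f"~{headroom:.0f}x the peak load")
```

Even serialized on a single core, that leaves an order of magnitude of slack before any caching or static generation enters the picture.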
troupo | 3 days ago
At one point I had two pages in the top spot on HN: https://mastodon.nu/@dmitriid/114852056319245427

- 20k peak unique visitors
- 162k peak requests
- 56 GB peak data, but most of that data was cached by Cloudflare
Retr0id | 3 days ago
Closer to 10 at peaks, but a lot of sites are just fragile.