| ▲ | acdha 4 days ago |
| I miss the 90s, too, but these days anyone who wants to deal with current levels of bot traffic will probably find a service like Cloudflare much cheaper than the ops time they’d otherwise spend keeping things up and secure. |
|
| ▲ | immibis 4 days ago | parent [-] |
| You could just, like, not make a website that takes several seconds to handle each request. I let bots hit Gitea 2-3 times per second on a $10/month VPS, and the only actual problem was that it doesn't seem to ever delete zip snapshots, so the disk filled up as snapshot links got clicked. I disabled that feature by making the snapshots folder read-only, and there were no other problems. I mention Gitea because people complain about having to protect Gitea a lot, for some reason. |
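For reference, the read-only trick described above can be sketched like this. This is a hedged illustration, not Gitea documentation: the snapshot path below is an assumption (where Gitea keeps generated archives depends on your data directory configuration), so treat it as a placeholder.

```python
import os
import stat

# Hypothetical path: Gitea keeps generated repo archives under its
# data directory; the exact location depends on your install.
SNAPSHOT_DIR = "/var/lib/gitea/data/repo-archive"

def make_read_only(path: str) -> None:
    """Strip all write bits from a directory so the service can no
    longer create snapshot files there; reads still work."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))
```

After this, snapshot requests fail server-side instead of slowly filling the disk; how gracefully that failure surfaces depends on the Gitea version.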
| |
| ▲ | acdha 3 days ago | parent [-] | | Sure, I’ve been doing that since the 90s. I still pay for hardware and egress, and it turns out that everything has a limit to the amount of traffic it can handle, and bots can easily saturate it. I’ve had sites that were mostly Varnish serving cached content at wire speed go down because the bots saturated the upstream link. | | |
| ▲ | immibis 3 days ago | parent [-] | | I hope 2-3 requests per second is not that limit, or you're fucked. | | |
| ▲ | vntok 3 days ago | parent | next [-] | | It is on a simple WordPress install with the top 4 most-used plugins, when you don't have a caching reverse proxy like Cloudflare to filter bad traffic and serve fully cached pages from POP nodes located near the visitors. The alternative, of course, is to set up a server-side caching layer (like a Redis object cache), which most people who set up a WordPress blog don't have the first idea how to do securely. | |
| ▲ | acdha 3 days ago | parent | prev [-] | | It’s not, but you’re off by 3+ orders of magnitude on the traffic volume and ignoring the cost of serving non-trivial responses. |
|
|
|