herbst 7 hours ago

I get constantly attacked.

Usually it's big actors like Facebook, Azure, and OpenAI who bombard my servers without any respect or logic, and I need to update my access rules constantly to keep them away (using Cloudflare). Sometimes it's clustered traffic, more classic DDoS, from China, Russia, or America; that I can easily filter with the DDoS protection from my hosting provider (which is cheaper than Cloudflare anyway).

What should I use, if not Cloudflare, to block traffic with "complex rules", something strong enough to survive hundreds of concurrent requests from big companies?
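The "complex rules" part is really just automation around banning ranges. A minimal sketch of what I mean, assuming Cloudflare's IP Access Rules API; ZONE_ID, API_TOKEN, and the example range are placeholders:

    # Sketch: block an abusive network range via Cloudflare's
    # IP Access Rules API. ZONE_ID and API_TOKEN are placeholders;
    # the endpoint and payload follow Cloudflare's documented schema.
    import requests

    ZONE_ID = "your-zone-id"      # placeholder
    API_TOKEN = "your-api-token"  # placeholder

    def block_range(cidr: str, note: str) -> None:
        """Create a zone-level block rule for an IP range."""
        resp = requests.post(
            f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}"
            "/firewall/access_rules/rules",
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={
                "mode": "block",
                "configuration": {"target": "ip_range", "value": cidr},
                "notes": note,
            },
            timeout=10,
        )
        resp.raise_for_status()

    block_range("203.0.113.0/24", "aggressive crawler")  # example range

(IPv4 ranges are limited to /16 and /24 in these rules, if I remember right.)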

rsync 4 hours ago | parent | next [-]

“Hundreds of concurrent requests…”

Back in 2001/2002 my personal website was “slashdotted” several times…

… which I learned about after the fact by seeing myself on slashdot.

It was not noticeable as it occurred and my services were not impacted.

So perhaps you need a P3-500 with 64 megabytes of RAM, Apache 1.x, and an old copy of cgi-lib.pl?

herbst 4 hours ago | parent [-]

Concurrent and constant. This is nothing like real traffic, nothing like the good old hug of death.

It seems to find the slowest endpoints (it particularly likes my search and category pages, but sometimes it hammers a single page for an hour), builds up until the site goes to its knees, and then, instead of slowing down, it starts hammering from other IP ranges until you have banned them all. This can go on for hours (or even days) if I don't create new rules to ban it.

It reminds me of a slowloris DoS, but at much larger scale and concurrency.

Sure, if my website didn't have any dynamic content, or millions of database rows, it would be less of an issue :)
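Finding the ranges to ban is mostly log grinding. Roughly this (a sketch; the log path and the nginx combined format, with the client IP as the first field, are assumptions about my setup):

    # Sketch: count requests per /24 prefix in an access log and
    # flag ranges that look like rotating crawler pools.
    # Log path and format (client IP first field) are assumptions.
    import ipaddress
    from collections import Counter

    THRESHOLD = 1000  # requests per range before it gets flagged

    counts = Counter()
    with open("/var/log/nginx/access.log") as log:
        for line in log:
            ip = line.split(" ", 1)[0]
            try:
                net = ipaddress.ip_network(f"{ip}/24", strict=False)
            except ValueError:
                continue  # skip malformed lines
            counts[str(net)] += 1

    for cidr, n in counts.most_common():
        if n < THRESHOLD:
            break  # most_common is sorted, so we can stop here
        print(f"{cidr}\t{n}")  # candidates for a block rule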

rsync 4 hours ago | parent [-]

Genuinely curious: do you run this on single-tenant hardware that you own?

herbst 3 hours ago | parent [-]

No, it's several virtual servers, mostly for simplicity, and because I sleep better at night :)

udev4096 6 hours ago | parent | prev | next [-]

OpenAI's bots are relentless. I used to see a few random requests whenever I requested an LE cert to make a service public, but now it's always "gptbot".
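At least the self-identifying ones are easy to refuse at the app layer. A minimal sketch as WSGI middleware; the blocklist below is illustrative, though "GPTBot" is the token OpenAI documents for its crawler:

    # Sketch: return 403 to self-identified AI crawlers.
    # Only works for bots that send an honest User-Agent;
    # the blocklist is illustrative, not exhaustive.
    BLOCKED_UA = ("gptbot", "ccbot", "claudebot")

    def block_ai_bots(app):
        """WSGI middleware that refuses known crawler user agents."""
        def middleware(environ, start_response):
            ua = environ.get("HTTP_USER_AGENT", "").lower()
            if any(bot in ua for bot in BLOCKED_UA):
                start_response("403 Forbidden",
                               [("Content-Type", "text/plain")])
                return [b"Crawling not permitted.\n"]
            return app(environ, start_response)
        return middleware

For bots that actually honor it, a robots.txt Disallow for the same user agents is the lighter option; the rude ones ignore both, which is where the IP-range bans come in.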

52-6F-62 7 hours ago | parent | prev | next [-]

There are other CDNs out there with less surface area, the corollary being that they are also less of a target.

hat_monger 7 hours ago | parent | prev [-]

The market has spoken: you are not needed.

herbst 7 hours ago | parent | next [-]

Because big companies can't stop constantly looking at my website (to "borrow" my content for their AIs, I guess)? Makes sense.
