s0meON3 8 hours ago

What about using zip bombs?

https://idiallo.com/blog/zipbomb-protection

lavela 8 hours ago | parent | next [-]

"Gzip only provides a compression ratio of a little over 1000: If I want a file that expands to 100 GB, I’ve got to serve a 100 MB asset. Worse, when I tried it, the bots just shrugged it off, with some even coming back for more."

https://maurycyz.com/misc/the_cost_of_trash/#:~:text=throw%2...
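For reference, that ~1000:1 figure is easy to reproduce (a minimal sketch; the zero-filled buffer is just an illustrative stand-in for the bomb payload):

    import gzip

    # Zero-filled buffer as an illustrative stand-in for a bomb payload.
    raw_size = 100 * 1024 * 1024                  # 100 MB uncompressed
    compressed = gzip.compress(b"\0" * raw_size, compresslevel=9)
    print(f"{raw_size:,} -> {len(compressed):,} bytes "
          f"(~{raw_size // len(compressed):,}:1)")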

LunaSea 6 hours ago | parent | next [-]

You could try other compression methods supported by browsers, such as Brotli.

Otherwise, you can also chain compression methods, e.g. "Content-Encoding: gzip, gzip".
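A rough sketch of both ideas, assuming the third-party brotli package (pip install brotli); whether a given bot actually honours Brotli or stacked Content-Encoding values is another question:

    import gzip

    import brotli  # third-party: pip install brotli

    payload = b"\0" * (10 * 1024 * 1024)   # illustrative 10 MB of zeros

    # Brotli at maximum quality, advertised as "Content-Encoding: br".
    br = brotli.compress(payload, quality=11)

    # Chained encoding: gzip the gzip stream; per RFC 9110 the header lists
    # the codings in the order applied: "Content-Encoding: gzip, gzip".
    once = gzip.compress(payload, compresslevel=9)
    twice = gzip.compress(once, compresslevel=9)

    print(f"brotli:     {len(br):,} bytes")
    print(f"gzip:       {len(once):,} bytes")
    print(f"gzip, gzip: {len(twice):,} bytes")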

renegat0x0 7 hours ago | parent | prev [-]

Even I, who don't know much, implemented a workaround.

My web crawler has both a scraping byte limit and a timeout, so zip bombs don't bother me much (see the sketch below).

https://github.com/rumca-js/crawler-buddy

I think garbage blabber would be more effective.
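For anyone wanting the same guard, here is a minimal sketch of capping the decompressed byte count and the read timeout with requests (not the crawler-buddy implementation; the names and limits are made up):

    import requests

    MAX_BYTES = 5 * 1024 * 1024   # hypothetical per-page cap
    TIMEOUT = 10                  # seconds for connect and for each read

    def fetch_capped(url: str) -> bytes | None:
        # Stream the body so a zip bomb can be abandoned part-way through;
        # iter_content yields the *decoded* (decompressed) bytes, which is
        # exactly the size a bomb tries to inflate.
        with requests.get(url, stream=True, timeout=TIMEOUT) as resp:
            resp.raise_for_status()
            body = bytearray()
            for chunk in resp.iter_content(chunk_size=64 * 1024):
                body.extend(chunk)
                if len(body) > MAX_BYTES:
                    return None   # oversize: treat as garbage and move on
            return bytes(body)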