imiric 9 hours ago

It could be slightly more sophisticated than that. Instead of outright blocking an entire IP range, set quotas for individual clients and throttle downloads exponentially: add latency, cap the bandwidth, etc. Whoever is downloading 10,000 copies of the same file in 24 hours will notice when their 10th attempt slows to a crawl.
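A minimal sketch of that kind of per-client throttle (the quota, the window, and the client_id key are all hypothetical choices, and a real service would keep the counters in shared storage rather than in process memory):

    import time
    from collections import defaultdict

    FREE_QUOTA = 5          # downloads per window served at full speed
    WINDOW_SECONDS = 86400  # 24-hour accounting window

    download_counts = defaultdict(int)
    window_start = defaultdict(float)

    def throttle_delay(client_id: str) -> float:
        """Seconds to delay this client's next download.

        The first FREE_QUOTA downloads in the window are unthrottled; after
        that the delay doubles with each request, so a client pulling the
        same file thousands of times slows to a crawl almost immediately.
        """
        now = time.time()
        if now - window_start[client_id] > WINDOW_SECONDS:
            window_start[client_id] = now
            download_counts[client_id] = 0

        download_counts[client_id] += 1
        over = download_counts[client_id] - FREE_QUOTA
        if over <= 0:
            return 0.0
        # Cap the exponent to avoid float overflow; the delay tops out at 10 minutes.
        return min(2.0 ** min(over, 16), 600.0)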

tlb 9 hours ago

It'll still suck for CI users. What you'll find is that occasionally someone else on the same CI server will have recently downloaded the file several times, so when your job runs, your download goes slowly and you hit the CI server's timeout.

detaro 9 hours ago

That's working as intended, then: you should be caching such things. It sucking for companies that don't bother is exactly the point, no?
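For illustration, the kind of caching meant here is roughly the following (the path and function name are made up; in CI the cache directory would be one the runner persists between jobs):

    import hashlib
    import shutil
    import urllib.request
    from pathlib import Path

    # Hypothetical cache location; point this at a directory your CI persists.
    CACHE_DIR = Path.home() / ".cache" / "downloads"

    def fetch_cached(url: str) -> Path:
        """Download url once, then reuse the local copy on later runs."""
        CACHE_DIR.mkdir(parents=True, exist_ok=True)
        cached = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
        if not cached.exists():
            with urllib.request.urlopen(url) as resp, open(cached, "wb") as out:
                shutil.copyfileobj(resp, out)
        return cached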

sltkr 7 hours ago

It's not unreasonable for each customer to maintain a separate cache (for security reasons), so that each of them will download the file once.

Then it only takes one bad user on the same subnet to ruin the experience for everyone else. That sucks, and isn't working as intended, because the intent was to punish only the one abusive user.