imiric 9 hours ago
It could be slightly more sophisticated than that. Instead of outright blocking an entire IP range, set quotas for individual clients and throttle downloads exponentially. Add latency, cap the bandwidth, etc. Whoever is downloading 10,000 copies of the same file in 24 hours will notice when their 10th attempt slows down to a crawl.
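A minimal sketch of that kind of per-client quota with exponentially growing delays (every name and threshold here is an illustrative assumption, not anything from a real server):

    import time
    from collections import defaultdict, deque

    FREE_QUOTA = 20          # downloads per client per window with no penalty
    BASE_DELAY = 0.5         # seconds added at the first over-quota download
    MAX_DELAY = 60.0         # cap: heavy clients are slowed, never hard-blocked
    WINDOW = 24 * 60 * 60    # 24-hour sliding window

    history = defaultdict(deque)  # client key (IP, token, ...) -> recent timestamps

    def throttle_delay(client, now=None):
        """Seconds to delay before serving this client's download."""
        now = time.time() if now is None else now
        hits = history[client]
        while hits and now - hits[0] > WINDOW:   # drop hits outside the window
            hits.popleft()
        hits.append(now)
        over = len(hits) - FREE_QUOTA
        if over <= 0:
            return 0.0
        # Delay doubles with each download past the quota, up to the cap.
        return min(BASE_DELAY * 2 ** (over - 1), MAX_DELAY)

The same counter could just as well drive a bandwidth cap instead of, or alongside, a sleep.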
tlb 9 hours ago | parent
It'll still suck for CI users. What you'll find is that occasionally someone else on the same CI server will have recently downloaded the file several times, and when your job runs, your download will go slowly and you'll hit the CI server timeout.