| ▲ | rossant 11 hours ago |
| Can't the server detect and prevent repeated downloads from the same IP, forcing users to adjust their behavior accordingly? |
|
| ▲ | jbstack 10 hours ago | parent | next [-] |
| See: "Also, when we block an IP range for abuse, innocent third parties can be affected." Although they refer to IP ranges, the same principle applies on a smaller scale to a single IP address: (1) dynamic IP addresses get reallocated, and (2) entire buildings (universities, libraries, hotels, etc.) might share a single IP address. Aside from accidentally affecting innocent users, you also open up the possibility of a DOS attack: the attacker just has to abuse the service from an IP address that he wants to deny access to. |
| |
| ▲ | imiric 9 hours ago | parent [-] |
| More sophisticated client identification can be used to avoid that edge case, e.g. TLS fingerprinting. Fingerprints can be spoofed as well, but if a client is going to that much trouble, it should be treated as hostile. In reality, it's more likely that someone is doing this without realizing the impact they're having. |
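| For example, a JA3-style fingerprint is just a hash over fields of the TLS ClientHello, so clients built on the same TLS stack produce the same value regardless of which IP they connect from. A sketch, assuming the ClientHello fields have already been parsed (the example values are made up):

```python
# Sketch of a JA3-style TLS fingerprint: five ClientHello fields joined with
# commas (values within a field joined with dashes), then hashed.
import hashlib

def ja3_fingerprint(version, ciphers, extensions, curves, point_formats):
    parts = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    raw = ",".join(parts)
    return hashlib.md5(raw.encode()).hexdigest()

# Illustrative values only; real ones come from the parsed handshake.
print(ja3_fingerprint(771, [4865, 4866, 49195], [0, 11, 10], [29, 23, 24], [0]))
```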
|
| ▲ | imiric 9 hours ago | parent | prev [-] |
| It could be slightly more sophisticated than that. Instead of outright blocking an entire IP range, set quotas for individual clients and throttle downloads exponentially. Add latency, cap the bandwidth, etc. Whoever is downloading 10,000 copies of the same file in 24 hours will notice when their 10th attempt slows down to a crawl. |
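| A minimal sketch of that idea (invented names and thresholds): per-client quotas plus an exponentially growing delay instead of a hard block.

```python
# Per-client quota with exponential throttling rather than an outright block.
import time
from collections import defaultdict

FREE_DOWNLOADS = 5          # first few downloads per day run at full speed
BASE_DELAY = 1.0            # seconds of delay added to the first throttled download
counts = defaultdict(int)   # client_id -> downloads in the current 24h window

def throttle(client_id: str) -> float:
    """Return how long to delay this client's next download, in seconds."""
    counts[client_id] += 1
    over = counts[client_id] - FREE_DOWNLOADS
    if over <= 0:
        return 0.0
    # Doubling per extra download: the 10th attempt already waits ~16s,
    # the 20th over four hours -- noticeable long before 10,000 copies.
    return BASE_DELAY * (2 ** (over - 1))

delay = throttle("client-abc")
time.sleep(min(delay, 30))  # combine with latency and bandwidth caps server-side
```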
| |
| ▲ | tlb 9 hours ago | parent [-] |
| It'll still suck for CI users. What you'll find is that occasionally someone else on the same CI server will have recently downloaded the file several times, and when your job runs, your download will go slowly and you'll hit the CI server's timeout. |
| ▲ | detaro 9 hours ago | parent [-] |
| That's working as intended, then: you should be caching such things. It sucking for companies that don't bother is exactly the point, no? |
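| Something as simple as a shared download cache keyed by URL would do it. A sketch, with hypothetical paths and URL, just to show the shape:

```python
# Fetch once, keep a local copy keyed by URL, and reuse it on later CI runs.
import hashlib
import pathlib
import urllib.request

CACHE_DIR = pathlib.Path("/var/cache/ci-downloads")   # persisted between jobs

def cached_fetch(url: str) -> pathlib.Path:
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    path = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
    if not path.exists():             # only the first job hits the upstream server
        urllib.request.urlretrieve(url, path)
    return path

# Hypothetical dependency URL for illustration:
artifact = cached_fetch("https://example.org/big-dependency.tar.gz")
```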
| ▲ | sltkr 7 hours ago | parent [-] |
| It's not unreasonable for each customer to maintain a separate cache (for security reasons), so that each of them will download the file once. Then it only takes one bad user on the same subnet to ruin the experience for everyone else. That sucks, and isn't working as intended, because the intent was to only punish the one abusive user. |