mrweasel | 3 hours ago
Anyone could always scrape the net; then modern scrapers came along with their shitty code and absolutely no respect. The reason so many of us block or throttle scrapers is that they misbehave. They don't back off, they try to bypass caches, and if they crash a site they don't adjust, they just pound it into the ground again the moment it's back up. We managed to talk to one large AI company that didn't really want to fix anything, but told us they'd be fine with us just rate limiting them, as if we somehow owed them anything. They get a stupidly low RPS now, even though we'd let them go faster if they'd just fix their bot. Some sites don't want you scraping at all, but it's their content, their rules. We don't really care, but we have to because of the sheer number and poor quality of the bots we're seeing. In my mind this is a 100% self-imposed problem on the scrapers' part.
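
For contrast, here's roughly what "backing off" looks like from the scraper's side. This is just a minimal sketch in Python using the requests library; the bot name, retry counts, and delays are made up for illustration:

    import time
    import requests

    def polite_get(url, max_retries=5, base_delay=2.0):
        """Fetch url, backing off when the server signals overload."""
        for attempt in range(max_retries):
            resp = requests.get(
                url,
                headers={"User-Agent": "example-bot/1.0"},  # hypothetical bot name
                timeout=30,
            )
            if resp.status_code not in (429, 503):
                return resp  # success, or an error that retrying won't fix
            # Honor Retry-After if present (assuming the numeric-seconds form);
            # otherwise fall back to exponential backoff.
            retry_after = resp.headers.get("Retry-After")
            if retry_after and retry_after.isdigit():
                delay = float(retry_after)
            else:
                delay = base_delay * 2 ** attempt
            time.sleep(delay)
        return None  # give up rather than pound the site into the ground

Honoring Retry-After also means a site that rate limits you is telling you exactly how fast you're allowed to go, which is all we ever asked for.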