watermelon0 | 15 hours ago
Great, now we need caching for something that's seldom (relatively speaking) used by people. Let's not forget that scrapers can be quite stupid. For example, if you have phpBB installed, which by default puts the session ID in a query parameter when cookies are disabled, many scrapers will scrape every URL numerous times, each with a different session ID. Caching doesn't help you there either, since the URLs are unique per visitor.
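(To illustrate the problem: a cache keyed on the raw URL stores a separate copy per session ID. One workaround on the caching side is to normalize the key by dropping the session parameter. A minimal sketch in Python, assuming phpBB's default `sid` query parameter; not actual phpBB or proxy code:)

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def cache_key(url: str, ignored_params=("sid",)) -> str:
        """Build a cache key with session-style query parameters stripped."""
        parts = urlsplit(url)
        query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
                 if k not in ignored_params]
        # Drop the fragment as well; it never reaches the server anyway.
        return urlunsplit((parts.scheme, parts.netloc, parts.path,
                           urlencode(query), ""))

    # Both URLs collapse to the same key, so the cache stores only one copy:
    assert cache_key("https://forum.example/viewtopic.php?t=42&sid=abc123") == \
           cache_key("https://forum.example/viewtopic.php?t=42&sid=def456")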
kimos | 10 hours ago | parent
You're describing a change to the base assumption for software reachable on the internet: "assume every possible unauthenticated URL will be hit basically constantly." Bots used to exist, but they were rare traffic spikes that usually behaved well and could mostly be ignored. No longer.