▲ | fluoridation 6 days ago
> No. A human sees a 10x slowdown.

For the actual request, yes. For the complete experience of using the website, not so much, since a human will take at least several seconds to process the information returned.

> And the scraper paid 1/1000000th of a dollar. (The scraper does not care about latency.)

The point need not be to punish the client, but to throttle it. The scraper may not care about taking longer, but the website's operator may very well care about not being hammered by requests.
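(Aside: a minimal hashcash-style proof-of-work sketch in Python, illustrating the mechanism being debated. The difficulty value and the solve/verify helpers are assumptions for illustration, not any specific site's implementation: the server hands out a random challenge, the client must grind for a nonce whose SHA-256 digest has enough leading zero bits, and the server checks the answer with a single hash.)

    import hashlib
    import os
    import time

    DIFFICULTY_BITS = 18  # assumed demo difficulty; real deployments tune this

    def leading_zero_bits(digest: bytes) -> int:
        # Count the leading zero bits of the digest.
        bits = 0
        for byte in digest:
            if byte == 0:
                bits += 8
                continue
            bits += 8 - byte.bit_length()
            break
        return bits

    def solve(challenge: bytes, difficulty: int = DIFFICULTY_BITS) -> int:
        # Client side: brute-force a nonce until the hash clears the difficulty.
        nonce = 0
        while True:
            digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if leading_zero_bits(digest) >= difficulty:
                return nonce
            nonce += 1

    def verify(challenge: bytes, nonce: int, difficulty: int = DIFFICULTY_BITS) -> bool:
        # Server side: a single hash, so checking answers is effectively free.
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return leading_zero_bits(digest) >= difficulty

    if __name__ == "__main__":
        challenge = os.urandom(16)   # server issues a fresh random challenge
        start = time.time()
        nonce = solve(challenge)     # client burns CPU before the request is served
        print(f"solved in {time.time() - start:.2f}s, valid={verify(challenge, nonce)}")

If the site issues a fresh challenge per request, each request costs the client on the order of 2^difficulty hashes while the operator spends one hash checking it, which is the throttling effect argued for above.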
▲ | avhon1 6 days ago | parent | next [-]
But now I have to wait several seconds before I can even start to process the webpage! It's like the internet suddenly became slow again overnight.
▲ | jsnell 6 days ago | parent | prev [-]
A proof of work challenge does not throttle the scrapers at steady state. All it does is add latency and cost to the first request.
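(Aside: the steady-state point follows from simple amortization, assuming the common design where one solved challenge yields a token or cookie that is accepted for many subsequent requests. The numbers below are made up for illustration.)

    # Back-of-the-envelope amortization with assumed numbers:
    solve_seconds = 1.0          # assumed CPU cost of solving the first challenge
    requests_per_token = 10_000  # assumed number of requests a token stays valid for

    amortized_ms = 1000 * solve_seconds / requests_per_token
    print(f"amortized cost: {amortized_ms:.2f} ms of CPU per request")  # 0.10 ms

Once a scraper holds a valid token, its per-request cost is negligible, which is why the work mostly front-loads latency onto the first request rather than throttling steady-state traffic.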