garganzol 4 hours ago
Nowadays people complain about AI scrapers in the same vein as they complained about search indexers way back when. Just a few years later, people had stopped caring so much about storage access and bandwidth, and started begging search engines to visit their websites — every trick on planet Earth, SEO, and so on. I'm looking forward to the time when everybody suddenly starts to embrace AI indexers and welcome them. History does not repeat itself, but it rhymes.
phyzome 4 hours ago
We already know the solution: one well-behaved, shared scraper could serve all of the AI companies simultaneously. The problem is that they're not doing it.
Guvante 4 hours ago
Except robots.txt was the actual solution to search indexing...
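(For anyone unfamiliar: opting out of AI crawlers via robots.txt looks something like the sketch below. GPTBot and ClaudeBot are the user-agent tokens OpenAI and Anthropic publish for their crawlers; the catch, and arguably the whole point of this thread, is that the file is purely advisory — a badly behaved scraper can simply ignore it.)

```
# Block known AI training crawlers (tokens as published by the vendors)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Allow everyone else, but keep them out of expensive endpoints
User-agent: *
Disallow: /search
Crawl-delay: 10
```

Note that `Crawl-delay` is a de facto extension honored by some crawlers (e.g. Bing) but not part of the original standard, and Google ignores it.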
linkregister 3 hours ago
Search indexing historically has had several orders of magnitude less impact on bandwidth and processing costs for website maintainers. My recommendation is to copy the text of this article and paste it into an LLM to summarize its key points, since it appears you missed the central complaint.
what 1 hour ago
Bad take. Search engines send people to your site; LLMs don't.