orbital-decay 4 hours ago

There are three issues at play:

- AI shops scraping the web to update their datasets without respecting netiquette (or, ironically, sometimes being unable to honor it for every site because of the scale involved).

- People extensively using agents (search, summarizers, autonomous agents, etc.), which are indistinguishable from scraper bots from a website's perspective.

- Agents being both faster and less efficient (more requests per action) than humans.
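On the netiquette point: the baseline courtesy a crawler can offer is honoring robots.txt, including any Crawl-delay. A minimal sketch using Python's stdlib `urllib.robotparser` (the robots.txt content and bot name here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a polite crawler would fetch and honor.
robots_txt = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check permission before each request, and throttle per Crawl-delay.
print(rp.can_fetch("MyBot/1.0", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyBot/1.0", "https://example.com/public/page"))   # True
print(rp.crawl_delay("MyBot/1.0"))                                    # 10
```

Many of the agents in the second bullet skip even this check, which is part of why sites can't tell them apart from bulk scrapers.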