Retr0id | 7 hours ago
I recently un-denylisted AI crawler UAs from my blog's robots.txt, for several reasons:

- My blog is static content and it costs me ~nothing to serve the requests.
- The bots were ignoring robots.txt anyway.
- If there's ultimately a human driving the bot (e.g. someone asking "summarise this article"), I don't mind.
- It's like trying to block search engines. Just as I want my blog to turn up in search results, I want agents etc. to know it exists, too.

My original motivation for denylisting, years ago, was that LLMs were simply not very good, so training-set scrapers seemed like all downside with no upside.
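For anyone unfamiliar, a UA denylist of this kind is just a set of per-user-agent Disallow rules in robots.txt. A minimal sketch (GPTBot and CCBot are real AI-crawler user agents, but I'm not quoting the exact entries from any particular blog):

```
# Block OpenAI's crawler from the whole site
User-agent: GPTBot
Disallow: /

# Block Common Crawl's crawler from the whole site
User-agent: CCBot
Disallow: /
```

Un-denylisting just means deleting those blocks (or switching `Disallow: /` to an empty `Disallow:`), after which compliant crawlers may fetch everything again — though, as noted above, compliance was spotty anyway.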