HWR_14 · 7 hours ago
Why would I want bots to read my blog?
Retr0id · 7 hours ago
I recently un-denylisted AI crawler UAs from my blog's robots.txt, for several reasons:

- My blog is static content and it costs me ~nothing to serve the requests.
- The bots were ignoring robots.txt anyway.
- If there's ultimately a human driving the bot (e.g. someone asking "summarise this article"), I don't mind.
- It's like trying to block search engines. Just as I want my blog to turn up in search results, I want agents etc. to know it exists, too.

My original motivation for denylisting, years ago, was that LLMs were simply not very good, so training-set scrapers seemed like all downside with no upside.
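For context, a denylist of the kind being described usually looks something like the following in robots.txt. This is a hedged sketch: the specific user-agent strings (GPTBot, CCBot) are common, publicly documented AI crawler UAs, not necessarily the ones the commenter had listed.

```
# Illustrative robots.txt denylist for AI crawler user agents.
# The actual UA list from the blog in question is not shown in the thread.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers remain allowed.
User-agent: *
Disallow:
```

Removing the denylist amounts to deleting the per-UA `Disallow: /` groups, leaving only the permissive `User-agent: *` block. As the commenter notes, compliance is voluntary, so this file only affects well-behaved crawlers.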
kibwen · 7 hours ago
Why would bots and the people operating them care what you want?
unglaublich · 7 hours ago
Like it or not, it's now the intermediary layer between you and your users.