fxtentacle 11 hours ago
This is massively counterproductive! They add to the robots.txt file: "As a condition of accessing this website, you agree to abide by the following content signals ...", which means robot operators are now better off never downloading robots.txt at all, because then they know for sure they won't accidentally encounter these conditions. Reading robots.txt now creates a legal risk, so this will make future AI bots less controllable and less restricted.
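For context, the Content Signals addition to robots.txt described in the policy looks roughly like this (the preamble wording is paraphrased here, and the exact signal names and values are as published at contentsignals.org; treat this as a sketch, not the canonical file):

```
# Preamble (paraphrased): "As a condition of accessing this website,
# you agree to abide by the following content signals ..."

User-Agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```

The point of contention is that the binding language lives in the preamble comments, which only a crawler that actually fetches and reads robots.txt would ever see.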
skybrian 11 hours ago | parent
That would be an interesting court case. I'm doubtful that companies will be held to agreements that they never saw and that even their bots didn't explicitly agree to. This isn't even like a shrinkwrap license, where the bot would at least have to press an "I agree" button. Cloudflare's other initiative, where they help websites block bots outright, seems more likely to work. On the other hand, if an LLM sees this text, maybe it will be confused by it.