mrweasel | 3 hours ago
In March 2025, Drew DeVault wrote a blog post called "Please stop externalizing your costs directly into my face"[1]. I think that's a pretty good guess as to why these bots don't care about the frequency of changes: keeping state costs too much. Every run is basically a fresh run, no state stored, every page just fed into the machine anew. At least that's my theory.

The AI companies need a full copy of your page every time they retrain a model. They could store that in their own datacenters, but that's a full copy of the internet, in a market where storage costs are already pretty high. So instead they externalize the storage cost. If you run a website, a public GitLab instance, a Forgejo instance, a wiki, a forum, whatever, you basically function as free offsite storage for the AI companies.

[1] https://drewdevault.com/2025/03/17/2025-03-17-Stop-externali...
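For what it's worth, the mechanism for not refetching unchanged pages has existed in HTTP for decades: a crawler that kept even a tiny bit of state per URL could send conditional requests and get a cheap 304 Not Modified instead of the full page. A minimal sketch of what that state-keeping could look like (hypothetical helper names, not any real crawler's code):

```python
def conditional_headers(cache, url):
    """Build If-None-Match / If-Modified-Since headers from stored validators.

    If we've seen this URL before, the server can answer 304 Not Modified
    instead of re-sending the whole page.
    """
    entry = cache.get(url, {})
    headers = {}
    if "etag" in entry:
        headers["If-None-Match"] = entry["etag"]
    if "last_modified" in entry:
        headers["If-Modified-Since"] = entry["last_modified"]
    return headers


def remember_validators(cache, url, response_headers):
    """Store the ETag / Last-Modified a server returned, for the next crawl."""
    entry = {}
    if "ETag" in response_headers:
        entry["etag"] = response_headers["ETag"]
    if "Last-Modified" in response_headers:
        entry["last_modified"] = response_headers["Last-Modified"]
    if entry:
        cache[url] = entry
```

A few bytes of ETag per URL versus re-downloading the page on every run; the bots described above apparently skip even this.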