pavel_lishin 8 hours ago
> Not because I expect a person in Singapore to shave 200ms off their pageload, but because the next request for that page is more likely to come from a retrieval system than a browser, and the request after that, and the one after that.

Why do I care if I shave 200ms off a crawler's request instead of a human's?
Brybry 8 hours ago
The graphic in the article seems to be the only significant content. Based on it, I think the point is to give requests from bots/scrapers the greatest possible chance of hitting a cache before they reach the blog's origin host: bots hit some layer of Cloudflare first, then Fastly, and only on a Fastly miss do they reach the Ghost blog's server.

To me, that makes a lot of sense if it's self-hosted, but I also thought it was already standard to put a self-hosted blog behind a reverse proxy and cache as much as possible. I'm not a professional web developer, but all the extra caching layers for a static personal blog seem a bit overkill.

Aside from the graphic, the article is a lot of words about engaging with an LLM to fully understand how caching works for their blog hosting and how that let them change their setup for the better. It's hard to follow because it never says what they actually did or why what they did was better.
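As a rough illustration of that reverse-proxy-plus-cache idea, here is a minimal sketch in Go (not the blog author's actual setup): it assumes the blog, e.g. Ghost, listens on localhost:2368, the proxy listens on :8080, and GET responses are kept in memory for a fixed five-minute TTL. In practice this job would normally be done by nginx, Varnish, or a CDN rather than hand-rolled code.

    // Minimal sketch of a caching reverse proxy in front of a self-hosted blog.
    // Assumptions (not from the article): origin at localhost:2368, proxy on
    // :8080, only GETs cached, fixed 5-minute TTL, no cache-size limit.
    package main

    import (
        "log"
        "net/http"
        "net/http/httptest"
        "net/http/httputil"
        "net/url"
        "sync"
        "time"
    )

    type entry struct {
        resp    *httptest.ResponseRecorder // captured origin response
        expires time.Time
    }

    func main() {
        origin, _ := url.Parse("http://localhost:2368") // assumed Ghost origin
        proxy := httputil.NewSingleHostReverseProxy(origin)

        var mu sync.Mutex
        cache := map[string]entry{}
        ttl := 5 * time.Minute

        handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if r.Method != http.MethodGet { // only cache reads; pass writes straight through
                proxy.ServeHTTP(w, r)
                return
            }
            key := r.URL.String()

            mu.Lock()
            e, hit := cache[key]
            mu.Unlock()

            if !hit || time.Now().After(e.expires) {
                // Miss (or stale entry): fetch from the origin once and keep the response.
                rec := httptest.NewRecorder()
                proxy.ServeHTTP(rec, r)
                e = entry{resp: rec, expires: time.Now().Add(ttl)}
                mu.Lock()
                cache[key] = e
                mu.Unlock()
            }

            // Replay the stored response; on a hit the origin is never contacted.
            for k, vs := range e.resp.Header() {
                for _, v := range vs {
                    w.Header().Add(k, v)
                }
            }
            w.WriteHeader(e.resp.Code)
            w.Write(e.resp.Body.Bytes())
        })

        log.Fatal(http.ListenAndServe(":8080", handler))
    }

Pointing traffic at :8080 instead of the origin means repeated requests for the same page, whether from browsers or crawlers, are served from memory and never reach the blog until the entry expires.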
m0rde 8 hours ago
From the post:

> If you care about how your content moves through the world now, including through AI systems, you have to care about caching. Not as a performance optimisation for human browsers, but as infrastructure for machine readership.
rodw 8 hours ago
Page load time can impact index coverage (depth of crawl), freshness (revisit rate), and ranking.