Brybry 8 hours ago
The graphic in the article seems to be the only significant content. Based on it, I think the point is giving requests from bots/scrapers the greatest possible chance of hitting a cache before they reach the blog's origin. Bots hit some layer of Cloudflare first, then Fastly, and only if the response isn't in Fastly do they reach the Ghost blog's server.

This makes a lot of sense for a self-hosted blog, but I thought it was already standard practice to put a self-hosted blog behind a caching reverse proxy and cache as much as possible. I'm not a professional web developer, but all the extra caching layers for a static personal blog seem a bit overkill.

Aside from the graphic, the article is mostly words about working with an LLM to fully understand how caching works for their blog hosting and how that let them improve their setup. It's hard to follow, though, because it never says what they actually changed or why the new setup is better.
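For what I mean by the "standard" setup: a minimal sketch of a caching reverse proxy in front of a self-hosted Ghost instance, using nginx. The domain, cache path, and timings here are placeholder assumptions, not anything from the article (Ghost's default local port is 2368):

```nginx
# Hypothetical example: nginx caching reverse proxy in front of Ghost.
# Cached pages are served from disk without touching the Ghost process.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=blog:10m
                 max_size=1g inactive=60m use_temp_path=off;

server {
    listen 80;
    server_name blog.example.com;  # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:2368;   # Ghost's default local port
        proxy_cache blog;
        proxy_cache_valid 200 301 10m;      # assumed TTL for good responses
        proxy_cache_use_stale error timeout updating;  # serve stale if origin is down
        add_header X-Cache-Status $upstream_cache_status;  # HIT/MISS for debugging
    }
}
```

With something like this in place, most bot/scraper traffic never reaches the blog's actual host, which is the same effect the article's multi-CDN diagram is going for.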