pyman | 5 days ago
I've seen some sites, like Amazon, calculate the probability of a user clicking a link and preload that page. This is called predictive preloading (similar to speculative fetching). It means they load or prepare certain pages or assets before you actually click, based on what you're most likely to do next. What I like about this approach is that it isn't a blind guess like the browser's; it's based on probability and real user behaviour. The downside is the implementation cost. Just wondering if this is something you do too.
jameslk | 5 days ago | parent
For a while, there was a library built to do exactly this: https://github.com/guess-js/guess

You can do this with speculation rules too. Your speculation rules are just prescriptive of what you think the user will navigate to next, based on your own analytics data (or other heuristics).

Ultimately the pros/cons are similar; you just end up with potentially better (or worse) predictions. I suspect it isn't much better than simple heuristics such as whether the cursor is hovering over a link or a link is in the viewport. You'd probably need a lot of data to keep your guesses accurate.

Keep in mind that this only helps with the network piece, not so much the rendering piece. Rendering is often what actually slows down heavy frontends, especially when the largest above-the-fold content you want to display is an image or video.
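For concreteness, here's a minimal sketch (TypeScript) of feeding analytics-driven predictions into speculation rules. The helper name and the predicted URLs are hypothetical stand-ins for whatever your own prediction pipeline produces; the Speculation Rules API itself (a script of type "speculationrules", HTMLScriptElement.supports, and the "eagerness" field) is real.

    // Inject a prefetch speculation rule for the URLs your analytics model
    // predicts the user is most likely to visit next.
    function addPredictedSpeculationRules(predictedUrls: string[]): void {
      // Bail out if the browser doesn't support the Speculation Rules API.
      if (
        !("supports" in HTMLScriptElement) ||
        !HTMLScriptElement.supports("speculationrules")
      ) {
        return;
      }

      const rules = {
        prefetch: [
          {
            urls: predictedUrls,
            // "moderate" lets the browser wait for a hint such as hovering a
            // matching link, rather than fetching immediately on page load.
            eagerness: "moderate",
          },
        ],
      };

      const script = document.createElement("script");
      script.type = "speculationrules";
      script.textContent = JSON.stringify(rules);
      document.head.appendChild(script);
    }

    // Hypothetical example: top two predictions from your analytics data.
    addPredictedSpeculationRules(["/checkout", "/product/recommended"]);

Note that "prefetch" only covers the network piece; swapping it for "prerender" would cover rendering too, at a higher resource cost.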