theZilber | 5 days ago
It is less about the performance issue of loading megabytes in the browser (which is also a problem). It is about the cases where a fetch request takes a noticeable amount of time simply because of server distance, or because the server needs to do some work (SSR) to build the page, sometimes from data fetched from an external API. A desktop app would have to do the same work, fetching all the data it needs from the server, though it can sometimes cache some of that data locally (like the user profile, etc.).

This lets developers load the data on user intent (hover, plus other configurable triggers) instead of when the application loads (slow preload) or when the user clicks (slow response). Even if the target page is 1 byte, the network latency alone makes things feel sluggish. It is low-effort, fast UI with a good opinionated API. One of the reasons I can identify Svelte sites within 5 seconds of visiting a page is that they preload on hover, so navigating between pages feels instant. This is great, and fighting against it seems unreasonable.

But I agree that in other cases, where megabytes of data need to be fetched on navigation, using these features will probably cause more harm than good, unless they are applied with additional intelligent logic (if the features allow such extension).

Edit: I addressed preloading; prerendering is a whole new set of issues which I am less experienced with. Making web apps has become easier, but slow rendering times and other issues are, unfortunately, a case of unmitigated tech debt that comes with making web application building more accessible.
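The preload-on-hover idea described above can be sketched in a few lines. This is a rough illustration, not any framework's actual implementation: `makePrefetcher` and `loadPage` are hypothetical names, and the caching strategy (reuse the in-flight promise so a hover followed by a click triggers only one fetch) is just one reasonable choice.

```typescript
// Hypothetical loader: whatever fetch/render work a navigation needs.
type Loader = (href: string) => Promise<string>;

// Returns a prefetch function that dedupes and caches loads per URL.
// Hovering starts the load early; the eventual click reuses the same
// in-flight (or completed) promise instead of fetching again.
function makePrefetcher(loadPage: Loader) {
  const cache = new Map<string, Promise<string>>();
  return (href: string): Promise<string> => {
    let pending = cache.get(href);
    if (!pending) {
      pending = loadPage(href);
      cache.set(href, pending);
    }
    return pending;
  };
}

// In a browser you would wire this to link intent signals, e.g.:
//   const prefetch = makePrefetcher((href) => fetch(href).then((r) => r.text()));
//   a.addEventListener("mouseenter", () => prefetch(a.href));
//   a.addEventListener("click", async () => render(await prefetch(a.href)));
```

The point of the cache is exactly the latency argument above: the hover buys the network round trip a head start, so by the time the click lands the response is often already there.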