zeroq 4 days ago

SPA is not only about seamless transitions but also about being able to encapsulate a lot of the user journey on the client side, without needing to bother the server too much.

Let me give you an example - one of my biggest gripes about web UX is the fact that in 2025 most shops still require you to fully reload (and refetch) content when you change filters or drill down into a category.

A common use case: you come to a shop, click on "books" (request), then on the "fantasy" subsection (another request), realize the book you're looking for is actually sci-fi, so you go back (request, hopefully cached) and go to "sci-fi" (another request).

It's much better UX when the user downloads the whole catalogue and then applies filters on the client, without having to touch the server until they want to get to the checkout.

But that's a lot of data - you may say - maybe on Amazon, but for most shops you can pack the sections that enable this pattern into fewer kilobytes than a single product photo takes.
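A minimal sketch of that "download once, filter locally" pattern - the `Product` shape and the sample data here are made up for illustration:

```typescript
// Hypothetical minimal catalogue record; a real shop would add price,
// cover URL, etc.
interface Product {
  id: number;
  title: string;
  category: string;
}

// The catalogue (or a section of it) is fetched once up front.
const catalogue: Product[] = [
  { id: 1, title: "Dune", category: "sci-fi" },
  { id: 2, title: "The Hobbit", category: "fantasy" },
  { id: 3, title: "Neuromancer", category: "sci-fi" },
];

// Switching from "fantasy" to "sci-fi" is a pure in-memory filter --
// no server round trip until checkout.
function byCategory(items: Product[], category: string): Product[] {
  return items.filter((p) => p.category === category);
}

const sciFi = byCategory(catalogue, "sci-fi");
```

Going back and forth between categories then costs nothing but an array scan.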

I've been building web apps like that since ca. 2005 and I still can't understand why it's not more common on the web.

da_chicken 4 days ago | parent | next [-]

I don't know, I think the most painful aspect of having to do a full reload is how inefficient the site is. The actual data is a few KB, but the page itself has to download 100 MB and the web browser is burning through a GB of RAM.

Like I don't find Hacker News to be egregious to navigate, and nearly every nav is a reload. It runs fine on my 2008 laptop with 4 GB of RAM.

But I go to DoorDash on the same device, and it takes 30s to load up a list of 50 items. They give you a countdown for a double dash, and I genuinely don't think it's possible to order some curry and get a 6 pack of soda in less than 3 minutes. And 2.5 minutes is waiting for it to render enough to give me the interface. Almost none of it is a full reload.

nine_k 4 days ago | parent | next [-]

An SPA can be lean and fast. React is the prevailing Web framework today? Preact is like 5 KiB of code.

What makes SPAs unwieldy is not the technology but the lack of desire to optimize. It loads fine on yesteryear's MacBook Air? Enough, ship it.

I very well remember heavy, slow-loading websites of, say, year 2000, without any SPA stuff, even though lightweight, quick-loading pages existed even then, in the epoch of dial-up internet. It's not the technology, it's the desire to cram in more, and to ship faster, with least thinking involved.

thaumasiotes 4 days ago | parent [-]

> I very well remember heavy, slow-loading websites of, say, year 2000, without any SPA stuff, even though lightweight, quick-loading pages existed even then, in the epoch of dial-up internet.

Sure, lightweight, quick-loading pages existed, but sometimes you want to see a picture.

nine_k 4 days ago | parent [-]

Google appeared in 1998. It was very noticeable how fast it loaded compared to competitors like AltaVista and Yahoo. None of them featured large photos.

This was visible not only on a 33600 phone connection at home, but also on a megabit connection at work, because, shockingly, how fast your backend is also plays a major role.

Izkata 4 days ago | parent [-]

Speaking of speed, don't forget when Gmail and Google Maps first came out: with the data copied locally, they could separate the UI from server requests for a lot of interactions, so much of the UI was actually instant and you could keep doing things while the requests were handled in the background. A lot of modern software seems to have missed this; instead of figuring out whether that's even possible, it just shows a loading spinner.
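The pattern being described - update local state immediately, sync in the background, roll back on failure - can be sketched roughly like this (all names here are hypothetical, `sendToServer` stands in for a real network call):

```typescript
// Optimistic-update sketch: the local state changes immediately so the
// UI can re-render at once; the server sync happens in the background.
type Labels = Set<string>;

async function applyLabel(
  labels: Labels,
  label: string,
  sendToServer: (label: string) => Promise<void>
): Promise<void> {
  labels.add(label); // instant: no spinner for the common case
  try {
    await sendToServer(label); // background sync
  } catch {
    labels.delete(label); // roll back if the server rejects it
  }
}
```

The caller re-renders from `labels` right away; the request only becomes visible to the user in the rare case it fails.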

cosmic_cheese 4 days ago | parent | prev | next [-]

Yeah, the enemy isn’t the need to reload, it’s reloading taking a long time due to too much garbage that’s not the content the user is interested in having to come down off the wire and render. A site that requires an occasional split second reload is always going to be preferred to a behemoth that doesn’t need reloading but has me staring at blank screens and loading spinners half the time I’m using it.

Aurornis 4 days ago | parent | prev | next [-]

> But I go to DoorDash on the same device, and it takes 30s to load up a list of 50 items.

> And 2.5 minutes is waiting for it to render enough to give me the interface.

I have a very old MacBook Air (one of the painfully slow ones) that I use for development reference for what a slow machine looks like.

I just tried clicking around DoorDash and didn’t see anything nearly this bad. Not even close.

Every time there’s a Hacker News thread about how slow websites are, there are dozens of comments like this claiming extremely large latency numbers. I can’t tell how much of it is exaggeration for effect, or if some people have configurations with weird quirks that make them abnormally slow.

I suspect it’s a lot of this:

> on my 2008 laptop with 4 GB of RAM.

Sorry, but I don’t think it makes sense for companies to optimize their websites for computers that are nearly two decades old and don’t even have enough RAM to run a modern operating system.

The intersection between people who spend money on an expensive luxury service like food delivery to their door and people who won't spend the cost of a couple of DoorDash deliveries to upgrade from a 2008 laptop to a 2018 laptop in 2025 is negligibly small.

mym1990 4 days ago | parent | prev | next [-]

Hmm, I would say comparing Hacker News to DoorDash is not exactly apples to apples. There may also be ulterior motives to make a website slow (or at least not optimized) if the company wants to funnel people towards a phone app.

dzhiurgis 4 days ago | parent | prev [-]

Gmail takes 3s to load. And HN is a website, not an app.

MYEUHD 4 days ago | parent | next [-]

That's the discussion being had.

HN would've been considered an app if it had been built as an SPA.

paulryanrogers 4 days ago | parent | prev | next [-]

> Gmail takes 3s to load

On a 2008 device, in 2025? On a mobile connection?

xyzsparetimexyz 4 days ago | parent | prev [-]

How is HN not an app? All the content is user generated. Everything is interactive. What's the difference?

dzhiurgis 4 days ago | parent | next [-]

Most of the time spent is content consumption (viewing HTML documents) rather than interaction (which is only two simple actions).

da_chicken 4 days ago | parent | next [-]

Navigating to a thread isn't much different than navigating to a store.

Upvoting isn't much different than adding to the cart.

Payment isn't much different than adding a comment.

I think the interaction levels are incredibly similar. The primary difference is the amount of images displayed, but that really isn't a significant aspect of web design or Internet traffic in 2025.

dzhiurgis 3 days ago | parent [-]

When you put it like that, it makes me think DoorDash shouldn't be an app in the first place.

xyzsparetimexyz 4 days ago | parent | prev [-]

I suppose that's valid. Feels fairly arbitrary though

layer8 4 days ago | parent | prev [-]

Document-centric, form-driven websites aren’t traditionally called apps. The Wikipedia website isn’t an app. Web forums aren’t apps. “App” implies that essential parts of the interaction logic are driven by client-side code (JS, not HTML) that couldn’t similarly be implemented by HTML forms.

chuckadams 4 days ago | parent [-]

Sure Wikipedia is an app, as long as you think of static websites as a subset of web apps. One might say that makes the term meaningless; I say it's a deliberately vague term because people don't want or need to get cornered by fine distinctions. Otherwise it's endless quibbling over whether things like search do or don't count toward a site being a "real" app.

layer8 3 days ago | parent [-]

Your definition of the term simply doesn’t match common usage.

mirkodrummer 4 days ago | parent | prev | next [-]

Thing is - and I believe it's a valuable counterpoint - if I shift-click on a link, like the sci-fi category, to open it in a new tab (a very common thing people do), a multi-page application adds zero work; in an SPA you have to manage that. And if the link doesn't exist and categories can only be accessed through a select input, the UX isn't that great.

paffdragon 4 days ago | parent | next [-]

This is so annoying when SPAs break browser functionality like "open link in new tab". Even when it works, it often has to reload a bunch of JS again, which makes things veeery slow. This is why I really don't like Linear; I often open issues in separate tabs, which is a pain every time - the browser freezes for seconds just to open a link in a tab...

nevertoolate 4 days ago | parent [-]

I think it was not designed for your use case. It is conveniently running in the browser but not a collection of links to html pages. The people quarreling in this comment section seemingly ignore that life is complex, web browsers are complex, web development is complex. Maybe you should think about linear as a desktop app and call it a day :)

paffdragon 3 days ago | parent [-]

My use case is being productive at my work. If it was not designed for that, then I'm not sure what it is for. It comes up fairly often in team chats: it's a resource hog with ridiculously high memory usage, poor browser integration, syntax highlighting that selects random languages... I'm happy that it fits your desktop-app use case, but there are also other people who see this as a limitation and a sign of poor design. Web development is complex, but acknowledging that there are poor web apps and better web apps is part of that too. And from my experience and in my opinion (also shared in my team), Linear is an example of the bloated web app that is making the web worse for us, which is why I brought it up above.

EDIT: the parent I was replying to talked about the issue of opening links in new tabs and your answer - don't do it, use it like a desktop app - is basically the problem we are having with some of these web apps.

brulard 4 days ago | parent | prev [-]

Modern SPA frameworks give you this by default. You can deep link and get the specific page with all the parameters/filters etc applied server rendered in remix/react-router, next, sveltekit and most other frameworks. I agree that too many apps have basic things like CMD+click broken

Zanfa 4 days ago | parent | prev | next [-]

Please no. Whenever I see an online store built as an SPA catalogue I shudder, because after browsing a bit it usually breaks in some weird state. And it resets to somewhere random should you hit back, refresh, or try to send a link to somebody.

lblume 4 days ago | parent [-]

Yes, SPAs should store simple user state in the URL as well.
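A minimal sketch of what that looks like - serialize the filter state into the query string and parse it back out on load (the filter fields here are made up; in a browser you'd pair this with `history.replaceState`):

```typescript
// Round-trip simple filter state through a query string so that
// back, refresh, and sharing a link all work. Field names are illustrative.
interface Filters {
  category: string;
  inStock: boolean;
}

function filtersToQuery(f: Filters): string {
  const params = new URLSearchParams();
  params.set("category", f.category);
  params.set("inStock", String(f.inStock));
  return params.toString();
}

function filtersFromQuery(query: string): Filters {
  const params = new URLSearchParams(query);
  return {
    category: params.get("category") ?? "all",
    inStock: params.get("inStock") === "true",
  };
}

// In a browser you would mirror every filter change into the URL, e.g.:
//   history.replaceState(null, "", "?" + filtersToQuery(current));
```

On page load you read the state back with `filtersFromQuery(location.search)`, so a deep link or a refresh lands on the same view.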

SenHeng 4 days ago | parent | prev | next [-]

> It's much better UX when the user downloads the whole catalogue and then applies filters on the client, without having to touch the server until they want to get to the checkout.

This is what we[0] do too. We have a single JSON with over a thousand BOMs that's loaded directly into the browser. Previously we loaded the inventory data via an API, as is usually expected. The fact that there's even an API meant requiring progress and loading bars, handling API-unavailability scenarios, etc.

Having it all as a single preloaded JSON meant that all of the above goes away. Response is instantaneous.

[0]: https://chubic.com

throwaway7783 4 days ago | parent | prev | next [-]

HTMX (and similar) solves a lot of this. With SPAs as built today, we end up building two apps: one frontend and one backend. I'd rather build most of it on the server side and add some dumb interactivity on the client (show/hide, collapse/expand, effects). There is still a place for SPAs though.

naet 4 days ago | parent [-]

HTMX does the opposite of this, it requires many more round trips to the server instead of using client side JS to do work.

recursivedoubts 4 days ago | parent | next [-]

htmx does not require many more round trips to the server, front end scripting is perfectly compatible with htmx:

https://hypermedia.systems/client-side-scripting/

in addition to native html features like <details>, etc.

htmx can often decrease the number of trips to a server because in the hypermedia model you are encouraged to deliver all the content for a UI in one fell swoop, rather than in a series of chatty JSON requests that may be made due to opaque reactive hooks.

aquariusDue 4 days ago | parent | prev | next [-]

I find Datastar to be a better replacement for HTMX, especially now that it can also do plain requests instead of Server-Sent Events. You also don't need Alpine.js combined with HTMX anymore.

chuckadams 4 days ago | parent [-]

First time I've heard of Datastar. Not sure what to make of it yet, but the video on data-star.dev is certainly one of the cutest things I've seen all year!

princevegeta89 4 days ago | parent | prev | next [-]

Many more round trips to the server are okay - it's the server, after all, and it's easy to scale.

goatlover 4 days ago | parent | prev | next [-]

Does it matter how many round trips are made to the server if they're fast enough to be seamless?

throwaway7783 4 days ago | parent | prev [-]

I meant for the SPA-like experience.

crazygringo 4 days ago | parent | prev | next [-]

Generally speaking, companies don't want you to download their entire catalog. They don't want competitors to be able to analyze it easily like that.

And if a store is selling books, it might have hundreds of thousands of them. No, it's not a good experience to transfer all that to the client, with all the bandwidth and memory usage that entails.

zeroq 4 days ago | parent [-]

That's a really weak argument.

If it's on their website, competitors can write a simple crawler and create that catalog anyway.

And you don't have to send every single field you have in your database. Once the user selects a category, you can send metadata that enables the client to scaffold the UI, then cache the rest while the user interacts with the site.

Barnes & Noble - according to their FAQ - has 1 million unique items in their catalog. But they also have tons of different categories. A single book cover weighs around 30 KB.

I'll leave it as an exercise to figure out how much data you can fit into 30 KB to make a usable filtering system.

btw: opening their front page downloads 12.1MB already.

crazygringo 4 days ago | parent | next [-]

> I'll leave it as an exercise to figure out how much data you can fit into 30 KB to make a usable filtering system.

Into 30 KB? That's just 300 items at 100 bytes each. So not a lot?
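A quick back-of-envelope supports this (the record shape is made up): even a stripped-down JSON record runs tens of bytes, so 30 KB uncompressed holds a few hundred items, nowhere near a million:

```typescript
// How many compact catalogue records fit in 30 KB? The fields here are
// a hypothetical bare minimum; real records (author, URL, flags) are bigger.
const items = Array.from({ length: 300 }, (_, i) => ({
  id: i,
  title: `Book #${i}`,
  category: i % 10,
  priceCents: 1099 + i,
}));

// These records are pure ASCII, so string length == byte length here.
const bytes = JSON.stringify(items).length;
const bytesPerItem = Math.round(bytes / items.length);
// ~18 KB total, ~60 bytes per item before compression.
```

Gzip would shave this down considerably, but the order of magnitude stands.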

what 4 days ago | parent | prev [-]

>I'll leave it as an exercise to figure out how much data you can fit into 30 KB to make a usable filtering system.

Not even 1 bit per item in the Barnes & Noble catalog? So not much.

kerkeslager 4 days ago | parent | prev | next [-]

> SPA is not only about seamless transitions but also about being able to encapsulate a lot of the user journey on the client side, without needing to bother the server too much.

True, but as a user, I don't want you encapsulating my journey. You can wax poetic about hypothetical book categories, but the reality of SPAs is that they break back buttons, have terrible accessibility anti-patterns, attempt to control my attention, and expose my computer to all your bad security practices. SPAs usually contain half-assed implementations of half the features that ship standard in a modern browser, and the only motivation for all that effort is to make the site more favorable to the server's owner.

When a site is implemented with simple HTML and careful CSS, I can configure it to my needs in the browser quite easily. That's a site that favors me, not your opaque blob nonsense.

uxcolumbo 3 days ago | parent [-]

Why is this being downvoted?

I agree: as a user, make it fast and don't break the web. I see so many SPAs with bad usability, like the back button not working.

A shopping site doesn't need to be an SPA.

A drawing app, 3D modeller, collaborative whiteboard, video chat - those types of apps fit the SPA model.

ec109685 4 days ago | parent | prev | next [-]

In almost all cases, the back swipe in an SPA resets you to the top of the page, navigating out of the app and back in doesn't work, etc. It's really hard to build a multi-page SPA that feels good.

zeroq 4 days ago | parent | next [-]

It's funny you've mentioned that.

It reminded me of the time when I joined Wikia (now Fandom), back in I think 2006. One of the first things that landed on my desk was (I can't recall the context) deep linking.

And I remember being completely flabbergasted, because I came from a Flash/games background, and for us that problem had been completely solved for at least four years at that point (asual's SWFAddress package). I felt kind of stupid having to introduce the concept to engineers much more senior than I was at the time.

dzhiurgis 4 days ago | parent | prev [-]

Never thought about scroll position (though the SPA I've built recently does it OK, I think). How do you solve it?

throwawaylaptop 4 days ago | parent | next [-]

I'm a self-taught PHP/jQuery/Bootstrap guy with a small SaaS. I handle page scroll position by literally saving it into some session-data cookie, and when you go back I check where your scroll was and restore it for you. I'm not a genius or particularly skilled... but I cared, so I did it.

PaulHoule 4 days ago | parent [-]

I’ve done it like that. It’s not a lot of code.
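For reference, a sketch of this save-and-restore approach. Storage is injected here so the logic isn't tied to a browser; in practice you'd pass `window.sessionStorage` (all wiring in the comments is hypothetical):

```typescript
// Save scroll position per URL on navigation away, restore it on return.
// KVStore matches the subset of the Web Storage API we need.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

class ScrollMemory {
  constructor(private store: KVStore) {}

  save(url: string, scrollY: number): void {
    this.store.setItem("scroll:" + url, String(scrollY));
  }

  restore(url: string): number {
    const saved = this.store.getItem("scroll:" + url);
    return saved === null ? 0 : Number(saved);
  }
}

// Hypothetical browser wiring:
//   const memory = new ScrollMemory(window.sessionStorage);
//   window.addEventListener("pagehide", () =>
//     memory.save(location.pathname, window.scrollY));
//   window.addEventListener("pageshow", () =>
//     window.scrollTo(0, memory.restore(location.pathname)));
```

Keying on the path means each page remembers its own position independently.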

cyco130 4 days ago | parent | prev | next [-]

I wrote up some of my thoughts on this some years ago. The library described at the end is now fairly out of date, but the ideas and suggestions are still good, I think.

https://dev.to/cyco130/how-to-get-client-side-navigation-rig...

fleebee 4 days ago | parent | prev [-]

Depends on what you're using for routing.

In TanStack Router it's a boolean you set when creating the router. The documentation nicely lays out what's being done under the hood, and it's quite a bit.[1] I wouldn't try that at home.

In React Router you just chuck a <ScrollRestoration /> somewhere in your DOM.[2]

[1]: https://tanstack.com/router/v1/docs/framework/react/guide/sc...

[2]: https://reactrouter.com/6.30.1/components/scroll-restoration

ThatPlayer 4 days ago | parent | prev | next [-]

In a similar vein, a few of my hobby projects are hosted on static web servers. So instead of rendering everything out into individual pages - which would've been tens of thousands of pages - I have a JSON file that gets rendered by the client in an SPA. I've even used GitHub Pages for this.

I'm playing around with a newer version that uses an SQLite database instead. SQLite officially has WASM builds, and the database file is already built to be separated into pages. With HTTP range requests, I can grab only the pages I need to fulfill any query.

SQLite full-text search even works! Though I'm hesitant to call that a success, because for shorter searches you do end up grabbing the entire FTS table. It might be better to download the entire database and build the FTS table locally.
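The page math behind that trick is simple. A sketch, assuming the common 4096-byte page size (the actual size is stored in the database header, as a big-endian 16-bit value at byte offset 16):

```typescript
// SQLite numbers pages from 1; page N occupies bytes
// [(N-1)*pageSize, N*pageSize) of the database file. To fetch a run of
// pages over HTTP, translate page numbers into a Range header.
const PAGE_SIZE = 4096; // assumed; read the real value from the file header

function pageRangeHeader(firstPage: number, lastPage: number): string {
  const start = (firstPage - 1) * PAGE_SIZE;
  const end = lastPage * PAGE_SIZE - 1; // HTTP byte ranges are inclusive
  return `bytes=${start}-${end}`;
}

// Hypothetical fetch of pages 3-4 of the file:
//   fetch("/catalog.sqlite", { headers: { Range: pageRangeHeader(3, 4) } });
```

The server just needs to support range requests (most static hosts do), and the client-side VFS maps page reads onto these fetches.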

brulard 4 days ago | parent | prev | next [-]

This wouldn't work for 90% of apps out there. While I would love that approach, it has so many problems that in practice it never worked out for me. First, data changes: either you edit it, or you need updates from the server applied to your copy. This is quite complex (although there are some solutions, I've had little luck making them work reliably). Second, you don't want to share the whole catalog in one request. Third, in most cases there is a lot of data; it's not uncommon to have tens of thousands of items, and for each you likely need some kilobytes. Fourth, you are downloading the whole catalog even for deep links that might not care about anything but a tiny fraction of that data.

isleyaardvark 3 days ago | parent [-]

In my experience, state is the devil, and the approach of filtering things on the client moves a bunch of state management from the back end to the browser, which is the worst place to handle state.

ndriscoll 4 days ago | parent | prev | next [-]

You can do that with checkbox selectors, e.g. see [0]. Note that I don't really do frontend and I asked ChatGPT to vibe-code it, so this may not be the best way to do it.

[0] https://html-preview.github.io/?url=https://gist.githubuserc...

danielscrubs 4 days ago | parent | prev [-]

Pjax was the goat.