mpweiher 3 days ago

> a) developers don't take any time to optimize, lazy load, cache, minimize dependencies...

> (This is partly on React, or may be on the culture around React that has made all of this normal and acceptable.)

Yes, that, too. But you are forgetting that React makes all that optimizing work necessary in the first place.

Networks are fast. Machines are crazy fast. Almost 30 years ago I was doing on-line adaptation of PostScript print files: some form input, then re-rendering the PostScript with the updated form values. Basically instantaneous.

branko_d 3 days ago | parent | next [-]

> Networks are fast.

Well, it depends on what you mean by “fast”: bandwidth or latency? While bandwidth has improved enormously over the years, latency… not so much. And it never will improve much, for the simple reason that the speed of light is finite.

Most of the slowness seems to come from treating latency as something that doesn’t matter (because testing is done on a local, low-latency network) or as something that will improve radically over time because bandwidth did (it will not).

Unfortunately, React wants to be both a rendering library and a state manager, tying state to the component lifetime, which encourages cascaded fetching - exactly the kind of workload that is most sensitive to latency.
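
To illustrate the waterfall this produces, here is a minimal sketch in TypeScript/React (the component names and endpoints are hypothetical, purely for illustration): each component only starts its fetch after it mounts, so a parent-child chain pays one full network round trip per level.

    import React, { useEffect, useState } from "react";

    // Hypothetical endpoints; each fetch costs at least one network round trip.
    function UserProfile({ userId }: { userId: string }) {
      const [user, setUser] = useState<{ name: string; teamId: string } | null>(null);

      useEffect(() => {
        // Round trip #1 starts only after <UserProfile> has mounted.
        fetch(`/api/users/${userId}`)
          .then((res) => res.json())
          .then(setUser);
      }, [userId]);

      if (!user) return <p>Loading user…</p>;
      // <TeamInfo> mounts only once the user data has arrived, so its fetch
      // cannot even begin before round trip #1 completes.
      return (
        <div>
          <h2>{user.name}</h2>
          <TeamInfo teamId={user.teamId} />
        </div>
      );
    }

    function TeamInfo({ teamId }: { teamId: string }) {
      const [team, setTeam] = useState<{ name: string } | null>(null);

      useEffect(() => {
        // Round trip #2: serialized behind round trip #1 by the component tree.
        fetch(`/api/teams/${teamId}`)
          .then((res) => res.json())
          .then(setTeam);
      }, [teamId]);

      return <p>{team ? team.name : "Loading team…"}</p>;
    }

On a mobile connection with a 200 ms round-trip time, those two serialized fetches cost at least 400 ms before any useful content renders, regardless of bandwidth; hoisting both requests to a route-level loader, or issuing them in parallel, collapses the chain back to roughly one round trip.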

nicce 3 days ago | parent | prev | next [-]

> Yes, that, too. But you are forgetting that React makes all that optimizing work necessary in the first place.

Isn't runtime state optimization the only responsibility of React? It's a library. The rest falls to Vite, Deno et al.

tmpz22 3 days ago | parent | prev [-]

Low-powered Android devices are a thing. Networks outside of metro areas in the US, the EU, and parts of Asia are also a thing.

Check out Google Maps: there's more to the world than your open office.

HappMacDonald 3 days ago | parent | next [-]

His point isn't "network/hardware is fast, so let's be inefficient"; it's the opposite: "network/hardware is fast, so why is the page still slow?" On lower-powered devices and slower networks it's even more vital to author lean applications and web pages, but "things are slow even when the hardware and network are fast" is a simple canary telling us we are swimming in problems.

troupo 3 days ago | parent | prev | next [-]

1. Even those low-powered Android devices are basically supercomputers

2. The Javascript bloat hurts those devices immensely. See "Performance Inequality Gap 2024" https://infrequently.org/2024/01/performance-inequality-gap-...

mpweiher 3 days ago | parent | prev [-]

How would you spec such a "lower powered" Android device?

panstromek a day ago | parent [-]

Alex Russell has written a lot about this and posts yearly updates based on the state of the phone market. You can pick a median, P75, or P95 device based on his analysis and set performance targets against that.

https://infrequently.org/2024/01/performance-inequality-gap-...

I did it the simple way and bought the first item in the "sort by cheapest" smartphone list. That's the Alcatel 1, and it's extremely underpowered. It's maybe a bit overkill, but if something runs on that device, it will run amazingly on anything else.
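
If buying a bottom-of-the-market handset isn't practical, a rough software stand-in is to throttle desktop Chrome. Here's a minimal sketch assuming Node with Puppeteer and Chrome DevTools Protocol throttling; the 6x CPU slowdown and the 3G-ish network numbers are illustrative guesses, not a calibrated match for any particular phone.

    import puppeteer from "puppeteer";

    async function main() {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      const client = await page.target().createCDPSession();

      // Slow the CPU down by an illustrative factor to stand in for a low-end SoC.
      await client.send("Emulation.setCPUThrottlingRate", { rate: 6 });

      // Roughly 3G-like conditions: ~400 kbit/s each way, ~400 ms added latency.
      await client.send("Network.enable");
      await client.send("Network.emulateNetworkConditions", {
        offline: false,
        latency: 400,                          // ms of added round-trip latency
        downloadThroughput: (400 * 1024) / 8,  // bytes per second
        uploadThroughput: (400 * 1024) / 8,
      });

      const start = Date.now();
      await page.goto("https://example.com", { waitUntil: "load" });
      console.log(`Load took ${Date.now() - start} ms under throttling`);

      await browser.close();
    }

    main().catch(console.error);

It's only an approximation - it doesn't capture thermal throttling, memory pressure, or the slow flash storage of a genuinely cheap phone - but it's enough to catch the worst regressions before they reach real users.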

mpweiher 8 hours ago | parent [-]

Hmm... that's a cool writeup, but not really what I was looking for. Anyway, let's take the phone configuration he mentions:

"The A51 featured eight slow cores (4x2.3 GHz Cortex-A73 and 4x1.7 GHz Cortex-A53) on a 10nm process"

Looking at Wikipedia, it also has at least 4 GB of RAM and comes with 4G Internet.

The Alcatel 1 also seems to have at least a 1 GHz CPU and at least a gigabyte of RAM.

I also had a look at the Samsung Galaxy Watch. The lowest spec I could find was a 1 GHz dual core with 768 MB RAM (Bluetooth model) or 1.5 GB (LTE model).

The machine I was doing the web-based PostScript rendering on was a PowerMac G3: a single-core 32-bit processor running at 266 MHz with 192 MB of RAM. The connection was early DSL, 768 kbit/s down and, I think, 128 kbit/s up.

I did not do any heroic optimizations, it was fast "as-is".

So I think my point stands: modern computers, including low-end smartphones and watches, are incredibly powerful and fast, and so are the networks.

If your tech stack manages to bring that hardware to its knees for basic UI rendering, and requires a lot of optimization effort just to run reasonably, then there is something fundamentally wrong with your tech stack.

panstromek 7 hours ago | parent [-]

> If your tech stack manages to bring that hardware to its knees for basic UI rendering, and requires a lot of optimization effort just to run reasonably, then there is something fundamentally wrong with your tech stack

Yeah, I think this is the problem, but the hard part is that it's largely outside of your control on the web. The Alcatel 1 is technically a supercomputer, but it can't even run its own UI properly. I optimize my websites on it, and while they're laggy, they sometimes run faster than the Chrome UI that displays them; it's crazy.

Running the system plus the browser is already way too much, and there's almost no perf budget left for the website. It doesn't help that web tech is inherently sub-optimal in many ways, so you're already pessimized on all fronts. Even a simple baseline page with almost no content is laggy on this device.

mpweiher 6 hours ago | parent [-]

> it's largely outside of your control on the web.

On the web or in the JS ecosystem?

I agree that browsers by themselves are monster applications, usually duplicating (at least) the entire OS they are running on, and usually doing it worse.

However, most browsers are perfectly capable of extremely snappy rendering and interactions, even on low-powered devices.

Let's remember that WWW.app was developed on a NeXT Cube, 25 MHz 68040, probably in the 16-64 MB RAM range (min was 8, but I am assuming it was more than the min), and that was plenty snappy.