giamma 3 hours ago

It used to be like that, computer had limited resources and desktop environments were light. Then at some point RAM became less and less of an issue, and everything started to get bigger and less efficient.

Could anyone summarize why desktop Windows/macOS now needs so much more RAM than in the past? Is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, etc.?

I believe it's the desktop environment that is greedy, because one can easily run a Linux server on a Raspberry Pi with very limited RAM — but is that really the case?

zozbot234 2 hours ago | parent | next [-]

The web browser is the biggest RAM hog these days as far as low-end usage goes. The browser UI/chrome itself can take many hundreds of megabytes to render, and that's before even loading any website. It's becoming hard to browse even very "light" sites like Wikipedia on anything less than a 4GB system at a bare minimum.

marhee 2 hours ago | parent | prev | next [-]

> Could anyone summarize why desktop Windows/macOS now needs so much more RAM than in the past

Just a single retina screen buffer, assuming something like 2500 by 2500 pixels at 4 bytes per pixel, is already 25MB for a single buffer. Then you want double buffering, but also a per-window buffer, since you don't want to force redraws 60 times per second, and we want to drag windows around while showing their contents, not a wireframe. As you can see, just that adds up quickly. And that's only the draw buffers — not to mention all the different fonts in simultaneous use, the images being shown, etc.

(Of course, screen buffers are typically stored in VRAM once drawn. But you need to draw first, which happens at least in part on the CPU.)
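The arithmetic above can be sketched in a few lines. The window count is an assumed figure, and real windows are usually smaller than the full screen, so treat this as an upper bound:

```python
# Back-of-envelope draw-buffer sizes, using the figures from the comment above.
width, height = 2500, 2500      # "retina"-class resolution (assumed)
bytes_per_pixel = 4             # RGBA, 8 bits per channel

single_buffer = width * height * bytes_per_pixel
print(single_buffer / 1e6)      # 25.0 -> ~25 MB per buffer

# Double buffering for the screen, plus one backing buffer per window
# (assuming, pessimistically, full-screen-sized windows):
windows = 10                    # assumed number of open windows
total = 2 * single_buffer + windows * single_buffer
print(total / 1e6)              # 300.0 -> ~300 MB just for draw buffers
```

Even with smaller windows, a handful of open apps easily puts draw buffers alone into the hundreds of megabytes.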

torginus 21 minutes ago | parent | next [-]

Per-window double buffering is actively harmful: it means you're effectively triple buffering, since a frame goes window buffer -> composite buffer -> screen. And that's with perfect timing — even that much latency is actively unpleasant when typing or moving the mouse.

If you get the timing right, there should be no need for double-buffering individual windows.
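A rough sense of the stakes, assuming one frame of delay per buffer hop at a 60 Hz refresh (a simplification — real compositors vary and can overlap hops):

```python
# Toy worst-case latency estimate for the buffering chains described above.
refresh_hz = 60
frame_ms = 1000 / refresh_hz     # ~16.7 ms per frame

hops_triple = 3                  # window buffer -> composite buffer -> screen
hops_double = 2                  # composite buffer -> screen only

print(hops_triple * frame_ms)    # 50.0 -> ~50 ms worst case
print(hops_double * frame_ms)    # ~33 ms worst case
```

That extra frame of delay is the difference the parent comment is complaining about: it sits on top of every keystroke and mouse movement.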

zozbot234 2 hours ago | parent | prev [-]

You don't need to do all of this, though. You could just do arbitrary rendering using GPU compute, and only store a highly-compressed representation on the CPU.

marhee 38 minutes ago | parent [-]

Yes, but then the GPU needs that amount of RAM, so it's fairer to look at the sum of RAM + VRAM requirements. With compressed representations you trade CPU cycles for RAM. And saving laptop battery is better served by copious amounts of RAM (since it's cheap).

flohofwoe 2 hours ago | parent | prev | next [-]

> is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, etc.?

...all of those, and more? New software is only optimized until it is not outright annoying to use on current hardware. It's always been like that, which is why there are old jokes like:

    "What Andy giveth, Bill taketh away."

    "Software is like a gas, it expands to consume all available hardware resources."

    "Software gets slower faster than hardware gets faster"

...etc., etc. Variations of those "laws" are as old as computing.

Sometimes there are short periods where the hardware pulls ahead for a few years of bliss (for instance the ARM Macs), but the software quickly catches up, and soon everything feels as slow as ever (or worse).

That also means that the easiest way to a slick computing experience is to run old software on new hardware ;)

creshal 2 hours ago | parent [-]

Indeed. Much of a modern Linux desktop runs inside one of multiple not-very-well-optimized JS engines. GNOME uses JS for various desktop interactions, and all major desktops run a different JS engine, as a different user, to evaluate polkit authorizations (so exactly zero RAM can be shared between those engines, even if they were identical, which they aren't). Then half your interactions with GUI tools happen inside browser engines, either directly in a browser or indirectly via Electron. (And typically each Electron tool bundles its own slightly different version of Electron, so even if they all run under the same user, each is fully independent.)

Or you can ignore all that nonsense and run openbox and native tools.

torginus 30 minutes ago | parent | next [-]

It's baffling why they chose it. I remember there being memory leaks because GObject uses a reference-counted model: cycles that went from GObject into JS and back were impossible to collect.

They hacked around this with heuristics, but they never really solved the issue.

They should've stuck with a scripting language like Lua, which has strong support for embedding.
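The cycle problem described above can be modeled with a toy refcounting sketch. The `Counted` class and names below are illustrative, not GObject's actual API; Python's own garbage collector would break such cycles, but a GObject refcount and a JS heap tracing each other cannot:

```python
# Toy model: under pure reference counting, objects in a cycle keep each
# other's count above zero, so neither side is ever freed.

class Counted:
    def __init__(self, name):
        self.name = name
        self.refcount = 1        # one external owner holds a reference
        self.ref = None          # strong reference to another object

    def point_to(self, other):
        self.ref = other
        other.refcount += 1

    def release(self):           # the external owner drops its reference
        self.refcount -= 1
        return self.refcount == 0   # freeable only if the count hits zero

gobject = Counted("GObject")
jswrap = Counted("JSWrapper")
gobject.point_to(jswrap)         # e.g. a signal handler holding a JS closure
jswrap.point_to(gobject)         # the JS closure captures the GObject

# Both external owners go away, but the cycle keeps both counts at 1:
print(gobject.release(), jswrap.release())   # False False -> both leaked
```

A tracing collector that could see both heaps at once would detect that the cycle is unreachable; two independent runtimes, each only counting references, cannot.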

burner420042 an hour ago | parent | prev | next [-]

A month with CrunchBang++ (a really nice distribution based on Openbox) and you'll appreciate how quick and well put together Openbox and text-based config files are.

zozbot234 2 hours ago | parent | prev [-]

COSMIC is gaining ground as a JS-free alternative to current desktops, so hopefully you won't be limited to openbox and such.

creshal an hour ago | parent [-]

Openbox isn't limiting me, Wayland still has no advantages for what I do with desktops.

roywashere 3 hours ago | parent | prev | next [-]

I am wondering if, with memory and storage prices skyrocketing, there will be more effort put into making computing use fewer resources?

t-3 an hour ago | parent [-]

Unlikely. If you can't afford RAM, how can you afford the SaaS contracts that keep devs employed?

anonnon 2 hours ago | parent | prev [-]

These days they typically also need GPU acceleration, and that can be an even bigger bottleneck, with drivers often not supporting older cards.