Lerc 8 hours ago

I know 2GB isn't very heavy in OS terms these days, but it's still enough to hold nearly 350 uncompressed 1080p 24-bit images.

There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
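A back-of-envelope check of that figure (assuming 3 bytes per pixel, 24-bit color, and no row padding):

```python
image_bytes = 1920 * 1080 * 3      # one uncompressed 1080p frame: 6,220,800 bytes (~5.9 MiB)
total_bytes = 2 * 1024**3          # 2 GiB
print(total_bytes // image_bytes)  # → 345
```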

array_key_first 2 hours ago | parent | next [-]

A lot of it is applications being optimized for higher-memory devices. RAM is completely worthless if it's not used, so ideally your software should run close to the maximum RAM usage your device allows. Of course, the software developer doesn't necessarily know what device you will be using, or how much other software will be running, so they aim for averages.

For example, Java applications will claim much more memory than they need for the heap. Most of that memory will sit unused, but the headroom is what keeps the application fast. If you've ever run a Java app at a consistent 90% heap usage, you know it grinds to an absolute halt with constant collection.
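A toy illustration of why that headroom matters (the numbers are made up and real GC behavior is far more complex): with a fixed allocation rate, the time between collections shrinks as live data approaches the heap ceiling.

```python
# Toy model, not a real GC: assume a fixed rate of garbage production and
# that each collection frees everything except the live set.
heap_mib = 1024     # hypothetical heap cap (in the spirit of -Xmx)
alloc_rate = 100    # assumed MiB/s of garbage produced

for live_fraction in (0.5, 0.75, 0.9):
    headroom = heap_mib * (1 - live_fraction)  # free space after a GC
    seconds_between_gcs = headroom / alloc_rate
    print(f"{live_fraction:.0%} full -> a collection every {seconds_between_gcs:.1f}s")
```

At 50% occupancy collections are seconds apart; at 90% they come roughly every second, and the app spends most of its time collecting.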

The same is true for caching techniques. Reading from storage is slow, so it often makes sense to put stuff in RAM even if you're not using it very often.

senfiaj 8 hours ago | parent | prev | next [-]

I also believe that this memory usage could be decreased significantly, but I don't know by how much (or how much of it is worth it). Some RAM usage is genuinely useful, such as caching or things related to graphics. Some of it is cumulative bloat in applications, caused by developers not caring much or by duplicated libraries.

But I remember that in 2016, Fedora GNOME consumed about 1.6GB of RAM on my PC, which had 2GB. Considering that a decade later the standard Ubuntu GNOME consumes only about 400MB more, and that my new laptop has 16GB of RAM (the system may use more RAM when more is installed), I think the increase over a decade is not that bad. I thought it would be much worse.

jonhohle 7 hours ago | parent | next [-]

But why that much? The first computer I bought had 192MB of RAM, and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all of the transparency effects ran great. Office worked fine, as did Visual Studio and 1024x768 gaming (I know that's quite a step down from 1080p).

What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

Someone 5 hours ago | parent | next [-]

> and I ran a 1600x1200 desktop with 24-bit color

> What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

It’s not a factor of ten, but a 4K monitor has about four times as many pixels. Cached font bitmaps scale with that, photos take more memory, etc.
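The pixel math for that comparison:

```python
old = 1600 * 1200   # 1,920,000 pixels
new = 3840 * 2160   # 8,294,400 pixels (4K UHD)
print(new / old)    # → 4.32
```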

> When Windows 2000 came out

In those times, when part of a window became uncovered, the OS would ask the application to redraw that part. Nowadays, the OS knows what’s there because it keeps the pixels around, so it can bitblit the pixels in.
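As a rough sketch of what those retained window buffers cost (the window size and count are assumptions, not measurements):

```python
# Assumed scenario: the compositor retains one 32-bit pixel buffer per window.
per_window = 1920 * 1080 * 4         # one 1080p window's backing store, in bytes
windows = 10                         # assumed number of open windows
print(windows * per_window / 2**20)  # → 79.1015625, i.e. ~80 MiB just for backing stores
```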

Again, not a factor of ten, but it contributes.

The number of background processes likely also increased, and chances are you used to run fewer applications at the same time. Your handful of terminals may be a bit fuller now than it was back then.

Neither of those really explains why you need gigabytes of RAM nowadays, though; then again, they don't explain why Windows 2000 needed what it needed in its day, either.

The main real reason is “because we can afford to”.

senfiaj 7 hours ago | parent | prev | next [-]

Partly because we have more layers of abstraction. As an extreme example: when you open a tiny, sub-1KB HTML file in any modern browser, the tab's memory consumption will still be on the order of tens, if not hundreds, of megabytes. This is because the browser has to load and initialize its entire huge runtime environment (JS, DOM, CSS, graphics, etc.), even though that tiny HTML file uses only a tiny fraction of the browser's features.

Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).

Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.

Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...

tosti 5 hours ago | parent | prev | next [-]

Two programmers sat at a table: a youngster, and an older guy with a large beard. The old guy was asked: "You. Yeah, you. Why the heck did you need 64K of RAM?" The old man replied: "To land on the moon!" Then the youngster was asked: "And you, why oh why did you need 4 gigs?" The youngster replied: "To run MS Word!"

winrid 7 hours ago | parent | prev [-]

Higher res icons probably add a couple hundred megs alone

KronisLV 8 hours ago | parent | prev [-]

I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros!

abenga 6 hours ago | parent [-]

GNOME 50 and its auxiliary services on my machine use maybe 400MB.

The culprit is browsers, mostly.

adgjlsfhk1 8 hours ago | parent | prev [-]

Well, to start, you likely have two screen-sized buffers, one for the current frame and one for the next. The primary code cost is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically.
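A rough estimate of those two buffers (the 4K resolution and 32-bit pixel format are assumptions):

```python
# Two full-screen framebuffers (current + next frame), 4K, 4 bytes/pixel.
w, h, bytes_per_pixel = 3840, 2160, 4
total = 2 * w * h * bytes_per_pixel  # 66,355,200 bytes
print(total / 2**20)                 # → 63.28125, so ~63 MiB for just these two buffers
```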

Lerc 8 hours ago | parent [-]

How often do you plug in a new device without a flurry of disk activity occurring?