system2 13 hours ago

I have multiple apps using 300 GB+ PostgreSQL databases. Some queries need a lot of RAM, and I provision symmetrically sized NVMe swap as well. An average Joe with gaming needs won't need more than 64 GB for a long time, but for a database, RAM requirements grow as the data grows. I doubt my situation is relatable to many.

Ekaros 13 hours ago | parent [-]

I understand servers. But why does the average user actually need more than 2 or 4 GB? What actual data is in memory at any one time?

parrellel 8 hours ago | parent [-]

Where have you seen 4 GB cut it in the last decade? 2 GB was already enough to make Vista chug back in 2007.

I've got old Linux boxes that feel fine with a couple gigs of DDR3, but I can't think of anywhere outside of that where it would be acceptable.

Ekaros 8 hours ago | parent [-]

My entire question is: why can't whatever users actually do on computers work in 2 GB of RAM? What is the true reason we are in a state where that is, for some reason, not possible?

2 GB is a huge amount of information. Surely it should be enough for almost all normal users, but for some reason it is not.

vee-kay 3 hours ago | parent | next [-]

Quick: list your favorite software and tell us how many GB of disk space each uses after installation and how many GB of RAM it consumes when running.

You will find most of your favorite programs struggle badly with 2-4 GB of RAM, even on Linux.
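A quick way to run that check yourself on a Linux box (a sketch; `--sort` is a GNU `ps` option, and column output varies between implementations):

```shell
# Ten processes with the largest resident memory footprint (RSS, in KB),
# plus one line for the column header:
ps -eo rss,comm --sort=-rss | head -n 11

# Overall picture: total, used, and available RAM:
free -h
```

RSS (resident set size) is the portion of a process actually held in physical RAM, which is the figure that matters for this argument.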

Over the years most software (even on mobile) has become bloated and slow, partly due to "new features" (even if most people don't need them) and partly because of a symbiosis with the hardware manufacturers: who would buy an expensive CPU, more RAM, larger SSDs, or bigger displays if no software needed all that extra oomph in performance, bandwidth, and fidelity?

bloppe 6 hours ago | parent | prev | next [-]

One potential reason: now that CPU clock speed is plateauing, parallelism is the main way to juice performance. Many apps try to take advantage of it by running N processes for N cores. For instance, my 22-core machine will use all 22 cores in parallel by default for builds with modern build systems. That's compiling ~22 files at once, using ~5x as much RAM as the 4-core machines of 15 years ago, all else being equal. As parallelism increases further, expect your builds to use even more memory.
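That scaling can be sketched with a back-of-the-envelope calculation; the 512 MB per compile job here is an assumed, illustrative working set, not a measurement:

```shell
# Modern build systems (make -j, ninja, cargo) default to one job per core.
cores="$(nproc)"

# Assumed per-compiler-process working set, in MB (illustrative only):
mb_per_job=512

# Peak build RAM grows roughly linearly with the number of parallel jobs:
peak_mb="$((cores * mb_per_job))"
echo "${cores} parallel jobs -> ~${peak_mb} MB peak build RAM"
```

On the commenter's 22-core machine this rough model gives ~11 GB just for the compilers, versus ~2 GB on a 4-core machine from 15 years ago, all else being equal.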

parrellel 7 hours ago | parent | prev [-]

Ah! Yes, I agree.