| ▲ | senfiaj 9 hours ago |
| From my understanding this is an official statement, not a benchmark result. > The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory. So it is more about third-party software than the OS or desktop environment. Actually, nowadays it's recommended to have 8+ GB of RAM regardless of OS. I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs: it's about 2 GB of 16 GB total RAM. 26.04 LTS might have higher RAM usage, but it seems unlikely that it will get anywhere close to 6 GB. |
|
| ▲ | HauntingPin 8 hours ago | parent | next [-] |
| Also, the Windows 11 requirements are ludicrous. https://www.microsoft.com/en-us/windows/windows-11-specifica... 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay. |
| |
| ▲ | mpyne 6 hours ago | parent | next [-] | | > 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay. Not okay as soon as you throw on the first security tool, lol. I work in an enterprise environment with Win 11 where 16 GB is maxed out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. This is even with compressed memory paging turned on. | |
| ▲ | winrid 7 hours ago | parent | prev [-] | | Win11 IoT runs great on 4 GB, if that matters :) I have a few machines in the field running it and my Java app, usually with over a gig still free. |
|
|
| ▲ | mpol 8 hours ago | parent | prev | next [-] |
| It's not just the applications; the installer doesn't even start with 1 GiB of memory. With 2 GiB of memory it does start. You could (well, I would :) ) blame it on the GNOME desktop, but it is very different from what I would have expected. I just tested this with the 25.10 desktop, default GNOME. With 24.04 LTS it doesn't even start with 2 GiB. |
| |
| ▲ | senfiaj 4 hours ago | parent [-] | | So, you mean that with 2 GiB of RAM the installer starts on 25.10 but not on 24.04? What about being able to install and then boot the installed Ubuntu? |
|
|
| ▲ | CoolGuySteve 8 hours ago | parent | prev | next [-] |
| No, because as far as we know 26.04 won't enable zswap or zram, whereas Windows and macOS both have memory compression of some sort. So Ubuntu will use significantly more memory for most tasks when facing memory pressure. Apparently it's still in discussion, but it's April now, so that seems unlikely. Kind of weird how controversial this is, considering DOS had QEMM386 way back in 1987. |
| |
| ▲ | cogman10 8 hours ago | parent | next [-] | | Zswap is a no-brainer. I have to wonder why the hesitancy. | |
| ▲ | bzzzt 7 hours ago | parent | prev [-] | | QEMM386 for DOS did not have a memory compression feature. Only one of the later versions for Windows 3.1 did. | | |
| ▲ | roryirvine 6 hours ago | parent [-] | | CPUs really weren't up to the job in the pre-Pentium/PowerPC world. Back then, zip files took an appreciable number of seconds to decompress, and there was a market for JPEG viewers written in hand-optimised assembly. That's why SoftRAM gained infamy: they discovered during testing that swapping was so much faster than compression that the released version simply doubled the Windows swap file size and didn't actually compress RAM at all, despite their claims (and they ended up being sued into oblivion as a result). Over on the Mac, RAM Doubler really did do compression, but it a) ran like treacle on the 030, b) needed a bunch of kernel hacks, so it had compatibility issues with exactly the sort of "clever" software that needed the most RAM, and c) PowerMac users tended to have enough RAM anyway. Disk compression programs (DiskDoubler, Stacker, DoubleSpace et al.) were a bit more successful. ISTR that Microsoft managed to infringe on Stacker's patents (or maybe even the copyright?) in MS-DOS 6.2, and had to hastily release DOS 6.22 with a rewritten version, free of charge, as a result. Disk compression also benefited from the general reduction in HDD latency happening at roughly the same time. |
|
|
|
| ▲ | panarky 8 hours ago | parent | prev | next [-] |
| If you run Windows 11 with Microsoft Teams and Microsoft Outlook on a 4GB machine you're gonna have a bad day. |
|
| ▲ | Lerc 9 hours ago | parent | prev | next [-] |
| I know 2GB isn't very heavy in OS terms these days, but it's still enough to hold nearly 350 uncompressed 1080p 24-bit images. There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for. |
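That "nearly 350" figure checks out; a quick back-of-the-envelope sketch (assuming 1920x1080 at 3 bytes per pixel and a 2 GiB budget):

```python
# How many uncompressed 1080p 24-bit images fit in 2 GiB?
width, height, bytes_per_pixel = 1920, 1080, 3  # 24-bit colour
image_bytes = width * height * bytes_per_pixel  # 6,220,800 bytes (~5.9 MiB)
budget = 2 * 1024**3                            # 2 GiB
print(budget // image_bytes)                    # → 345
```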
| |
| ▲ | array_key_first 2 hours ago | parent | next [-] | | A lot of it is optimizing applications for higher-memory devices. RAM is completely worthless if it's not used, so ideally you should be running your software with close to maximum RAM usage for your device. Of course, the software developer doesn't necessarily know what device you will be using, or how much other software will be running, so they aim for averages. For example, Java applications will claim much more memory than they need for the heap. Most of that memory will be unused, but it's necessary to have a faster running application. If you've ever run a Java app at consistently 90% heap usage, you know it grinds to an absolute halt with constant collection. The same is true for caching techniques. Reading from storage is slow, so it often makes sense to put stuff in RAM even if you're not using it very often. | |
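The caching trade-off described above can be sketched in a few lines of Python; the `read_record` function and its 10 ms latency are made up for illustration, standing in for any slow storage read:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)     # spend RAM to avoid repeated slow reads
def read_record(key: str) -> str:
    time.sleep(0.01)         # stand-in for storage latency
    return f"record:{key}"

start = time.perf_counter()
for _ in range(100):
    read_record("hot-key")   # only the first call pays the latency
elapsed = time.perf_counter() - start
assert elapsed < 0.5         # 100 lookups cost roughly one real read
```

The memory is "wasted" in the sense that the cached entries may never be needed again, but as long as it would otherwise sit idle, trading it for fewer storage reads is usually a win.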
| ▲ | senfiaj 8 hours ago | parent | prev | next [-] | | I also believe that this memory usage could be decreased significantly, but I don't know by how much (or how much it's worth). Some RAM usage is useful, such as caching or things related to graphics. Some is cumulative bloat in applications, caused by not caring much or by duplicated libraries. But I remember that in 2016, Fedora GNOME consumed about 1.6 GB of RAM on my PC with 2 GB of RAM. Considering that a decade later the standard Ubuntu GNOME consumes only 400 MB more, and that my new laptop has 16 GB of RAM (the system may use more RAM when more is installed), I think the increase is not that bad for a decade. I thought it would be much worse. | |
| ▲ | jonhohle 7 hours ago | parent | next [-] | | But why that much? The first computer I bought had 192 MB of RAM, and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all of the transparency effects ran great. Office worked fine, Visual Studio too, and 1024x768 gaming (I know that's quite a step down from 1080p). What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor? | |
| ▲ | Someone 5 hours ago | parent | next [-] | | > and I ran a 1600x1200 desktop with 24-bit color > What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor? It’s not a factor of ten, but a 4K monitor has about four times as many pixels. Cached font bitmaps scale with that, photos take more memory, etc. > When Windows 2000 came out In those times, when part of a window became uncovered, the OS would ask the application to redraw that part. Nowadays, the OS knows what’s there because it keeps the pixels around, so it can bitblit the pixels in. Again, not a factor of ten, but it contributes. The number of background processes likely also increased, and chances are you used to run fewer applications at the same time. Your handful of terminals may be a bit fuller now than it was back then. Neither of those really explain why you need gigabytes of RAM nowadays, though, but they didn’t explain why Windows 2000 needed whatever it needed at its time, either. The main real reason is “because we can afford to”. | |
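The pixel arithmetic behind that point, as a rough sketch (assuming 24-bit colour on the old 1600x1200 desktop and 32-bit colour on a 4K display):

```python
# Single framebuffer: 1600x1200 at 24-bit vs. 4K at 32-bit
old = 1600 * 1200 * 3       # 5,760,000 bytes (~5.5 MiB)
new = 3840 * 2160 * 4       # 33,177,600 bytes (~31.6 MiB)
print(round(new / old, 1))  # → 5.8
```

So per-buffer memory grows by roughly a factor of six, which supports the point: significant, but still short of a factor of ten.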
| ▲ | senfiaj 7 hours ago | parent | prev | next [-] | | Partly because we have more layers of abstraction. Just an extreme example: when you open a tiny < 1KB HTML file in any modern browser, the tab's memory consumption will still be on the order of tens, if not hundreds, of megabytes. This is because the browser has to load and initialize its huge runtime environment (JS, DOM, CSS, graphics, etc.) even though that tiny HTML file might use only a tiny fraction of the browser's features. Partly because increased RAM usage can sometimes improve execution speed, smoothness, or security (caching, browser tab isolation). Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time. Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-... | |
| ▲ | tosti 5 hours ago | parent | prev | next [-] | | Two programmers sat at a table, one a youngster, the other an older guy with a large beard. The old guy was asked: "You. Yeah, you. Why the heck did you need 64K of RAM?" The old man replied: "To land on the moon!" Then the youngster was asked: "And you, why oh why did you need 4 gigs?" The youngster replied: "To run MS Word!" | |
| ▲ | winrid 7 hours ago | parent | prev [-] | | Higher res icons probably add a couple hundred megs alone |
| |
| ▲ | KronisLV 8 hours ago | parent | prev [-] | | I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros! | | |
| ▲ | abenga 6 hours ago | parent [-] | | GNOME 50 and its auxiliary services on my machine use maybe 400 MB. The culprit is browsers, mostly. |
|
| |
| ▲ | adgjlsfhk1 8 hours ago | parent | prev [-] | | Well, to start, you likely have two screen-sized buffers for the current and next frame. The primary code portion is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically. | |
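For scale, a rough sketch of what those two buffers cost; the 4K resolution and 4 bytes per pixel are assumptions for illustration, not from the comment:

```python
# Two full-screen buffers (current + next frame) at 4K, 4 bytes/pixel
buffer_bytes = 3840 * 2160 * 4      # 33,177,600 bytes per buffer
total_mib = 2 * buffer_bytes / 2**20
print(round(total_mib))             # → 63
```

Tens of megabytes, in other words, so the buffers alone don't explain gigabytes of usage.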
| ▲ | Lerc 8 hours ago | parent [-] | | How often do you plug in a new device without a flurry of disk activity occurring? |
|
|
|
| ▲ | rdsubhas 3 hours ago | parent | prev [-] |
| That's subjective, and I would be more comfortable if it were called recommended memory, not minimum memory. "Minimum memory", as this change uses it, sets a completely different expectation. |