| ▲ | mikkupikku 3 days ago |
| To be fair, 8GB of RAM is huge. I don't know, maybe I'm stuck in the early 00s, but even 2 GB of RAM still seems extravagant; I remember when that was an exotic amount of RAM for dedicated gamers playing extremely high-fidelity games, so for a mere web server 8 GB almost seems like absurd overkill. I still feel a tinge of shame whenever I see any software of my own using more than a few hundred megabytes. What a waste. |
|
| ▲ | aduty 3 days ago | parent | next [-] |
| I remember when 16 MB was considered a lot. Then again, I also remember when graphics acceleration was considered optional. |
| |
| ▲ | Koshkin 2 days ago | parent | next [-] | | Well, graphics acceleration is still considered optional on servers :) | |
| ▲ | fsagx 2 days ago | parent | prev [-] | | I asked my dad for the 16kB expansion. He said no. |
|
|
| ▲ | seethishat 3 days ago | parent | prev | next [-] |
| The major difference here is that this is intended for multiple users, not one person. Imagine 5,000 users all using the device at the same time. The memory, open file handles, network connections, etc. for many users at once add up. |
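As a back-of-envelope sketch of how that adds up (the per-connection figures below are illustrative assumptions, not measurements of any real server):

```python
# Rough arithmetic: per-user state at 5,000 concurrent users.
# Both per-user figures are assumed for illustration.
users = 5_000
kb_socket = 50    # kernel socket buffers + TLS state per connection (assumed)
kb_session = 200  # application-level session data per user (assumed)

total_mb = users * (kb_socket + kb_session) / 1024
print(f"~{total_mb:.0f} MB just for connection and session state")
```

Even with modest per-user figures like these, concurrent state alone lands over a gigabyte, before the application, caches, or the OS itself.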
| |
| ▲ | dhosek 2 days ago | parent [-] | | The IBM mainframe that I used at UIC in the 80s had 64 MEGA bytes of RAM and about double the users. |
|
|
| ▲ | giis 3 days ago | parent | prev | next [-] |
| Until a few days ago my server was using 8GB; as a cost-cutting measure it's been running on a 4GB server for the last week or so. :) |
|
| ▲ | nkrisc 2 days ago | parent | prev | next [-] |
| Depends entirely on what you're doing. 8GB of RAM is nowhere near enough for 3D texturing workflows, for example, where you can have many different 4K textures cached in memory. For other things, 8GB is probably a lot. |
|
| ▲ | andrewstuart 3 days ago | parent | prev | next [-] |
| 64K was huge when the Commodore 64 came along. |
| |
| ▲ | Twirrim 2 days ago | parent | next [-] | | I barely used or remember the ZX81 my folks had, with its amazing 1KB of memory. It had a 16K expansion module you could plug into the back, which apparently made a big difference, but it also didn't have the greatest connection: you could easily dislodge it just by typing on the keyboard. I do remember my father coming up with various ways to try to secure it. The ZX Spectrum that followed, with its huge 48K of RAM, was night and day; the programs were so much more complicated. Even echo on Linux these days takes 38K of disk space and a baseline of 13K of memory to execute, before whatever is required to hold the message you're repeating. | | |
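Those last figures are easy to sanity-check yourself; a rough sketch for a typical Linux box with GNU coreutils (paths and exact numbers vary by distro, libc, and build flags):

```shell
# On-disk size of the standalone echo binary (the shell builtin is separate).
stat -c '%s bytes on disk' /bin/echo

# Peak resident memory while it runs, if GNU time is installed at /usr/bin/time
# (the shell's built-in `time` keyword does not support -f):
if command -v /usr/bin/time >/dev/null; then
    /usr/bin/time -f '%M KB max RSS' /bin/echo hello
fi
```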
| ▲ | Lio 2 days ago | parent [-] | | Fixing the 16K RAM pack makes an appearance in the Micro Men film: https://youtu.be/XXBxV6-zamM?t=1694 RAM was so tight on those 8-bit machines that many games used tricks like hiding things inside the viewable area of the screen to eke out just a little bit more. |
| |
| ▲ | Lio 2 days ago | parent | prev [-] | | Not sure why the downvotes; this is true. If you only had 16, 32, or 48K, then 64K seemed like a lot. Hell, RAM size was so important that they named machines after it. |
|
|
| ▲ | weird-eye-issue 2 days ago | parent | prev [-] |
| Recently I had a laggy browser tab and I checked and it literally was using 7.6 GB of RAM. |
| |
| ▲ | Koshkin 2 days ago | parent [-] | | Quite often clients were more powerful than servers. Hell, at one point a CPU embedded in a printer could be faster than, say, an 8088. An X server (running on the client side) often required a more powerful machine than one running X clients (i.e. the server). A web browser is no exception. |
|