hsbauauvhabzb 3 hours ago

I don’t know how resolution maps to ram in x11 but I assume at least one byte per pixel. Based on that assumption, there’s no chance you’d even be able to power a 4k monitor with 8mb of ram, let alone the rest of the system.

PaulRobinson an hour ago | parent | next [-]

Correct, 4k is very modern by these standards. But then I'm old, so perhaps it's all about perspective.

Back in the days when computers had 8MB of RAM to handle all that MS-DOS and Windows 3.1 goodness, we were still in VGA [0] and SVGA [1] territory, and the graphics cards (sorry, integrated graphics on the motherboard?! You're living in the future there, that's years away!) had their own RAM to support those resolutions and colour depths.

Of course, this is all for PCs. By the mid-1990s you could get a SPARCstation 5 [2] with a 24" Sun-branded Sony Trinitron monitor that was rather more capable.

[0] Maxed out at 640 x 480 in 16 colours, chosen from an 18-bit (262,144-colour) palette

[1] The "S" is for Super: 1280 x 1024 with 256 colours!

[2] https://en.wikipedia.org/wiki/SPARCstation_5
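To get a feel for why those modes fit in a card's dedicated video RAM, here's a rough back-of-the-envelope sketch (the function name is just for illustration):

```python
# Rough framebuffer sizes for the footnoted modes above.
# 4 bits per pixel encodes 16 colours; 8 bits encodes 256.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

vga = framebuffer_bytes(640, 480, 4)      # VGA, 16 colours
svga = framebuffer_bytes(1280, 1024, 8)   # SVGA, 256 colours

print(f"VGA  640x480 @ 16 colours:    {vga // 1024} KB")    # 150 KB
print(f"SVGA 1280x1024 @ 256 colours: {svga // 1024} KB")   # 1280 KB
```

Both fit in the 1-2MB of VRAM a period card shipped with, without touching system memory.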

p_l 2 hours ago | parent | prev | next [-]

This was the main driver of VGA memory size for a time - if you spent the money on a 2MB card instead of a 1MB one, you could have higher resolution or bit depth.

If you had a big enough framebuffer in your display adapter, though, X11 could display more than your main RAM could support - the classic design allowed the X server to draw directly on framebuffer memory (much as GDI did).

direwolf20 an hour ago | parent | prev | next [-]

X11 was designed to support bit depths down to 1 bit per pixel.

argsnd 3 hours ago | parent | prev [-]

Presumably every pixel is 32 bits rather than just 8. So the count starts at 33.2MB just for the display.
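For concreteness, assuming a 3840x2160 "4K" panel, the arithmetic behind that figure looks like this (decimal megabytes, to match the 33.2MB above):

```python
# Framebuffer size in decimal megabytes for a given mode.
def framebuffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1e6

# Even at 1 byte per pixel, 4K alone exceeds an 8MB machine's entire RAM:
print(framebuffer_mb(3840, 2160, 1))  # 8.2944
# At 32 bits (4 bytes) per pixel:
print(framebuffer_mb(3840, 2160, 4))  # 33.1776
```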

stavros 3 hours ago | parent | next [-]

It is now, but back then it was 1 byte per pixel, with typical resolutions being 800x600. There were high-color modes, but for a while it was rare to have hardware good enough for them.

cout 2 hours ago | parent [-]

I have run X11 in 16-color and 256-color mode, but it was not fun. The palette would get swapped when changing windows, which was quite disorienting. Hardware that could do 16-bit color was common by the late 90s.

p_l 2 hours ago | parent | next [-]

Fun thing - SGI specifically used 256-color mode a lot, to reduce memory usage even on 24-bit outputs. So long as you used the defaults of their Motif fork, anything that didn't specifically request more colors got a 256-color visual, which was then composited in hardware.

actionfromafar 2 hours ago | parent | prev [-]

Much better to stick to 1 bit per pixel. :-)

Like on the Sun SPARCstation ELC. No confusing colors or shades.

zozbot234 2 hours ago | parent | next [-]

1bpp (at low resolution) is still relevant today on epaper screens, though some of them now allow for shades of grey or even color.

t-3 an hour ago | parent [-]

Most aren't all that low-res either... 300 dpi is standard.

b112 2 hours ago | parent | prev [-]

But what if it's a UTF8 bit? Then it'd be 2 bits.

Which proves time travel exists, all those "two bits" references in old Westerns.

hsbauauvhabzb 2 hours ago | parent | prev [-]

Damn pixel bit-depth bloat!