VerifiedReports 6 hours ago

No, it isn't. Making your entire screen dark for all content isn't a solution for a dumb GUI color scheme.

"Back in the day, light mode wasn’t called “light mode”. It was just the way that computers were, we didn’t really think about turning everything light or dark. Sure, some applications were often dark (photo editors, IDEs, terminals) but everything else was light, and that was fine."

Several incorrect statements there. "Back in the day," computers displayed white text on a dark background (usually blue) out of the box, because that was deemed the most legible. The opposite was called "inverse." The Atari 8-bit and Commodore 64 computers (and possibly others) even had dedicated keys that toggled between regular and inverse text, and their manuals use exactly that term.

Word even had a checkbox labeled "Blue background, white text." It wasn't removed until 2007, concurrent with lots of other UI regressions in Windows. Microsoft also removed the color-scheme editor from Windows, with which people had been able to set up global color schemes (including "dark" ones) since 1991.

When people finally realized how dumb it is to read dark text off the surface of a glaring light bulb all day, companies had to run around slapping hard-coded "dark modes" onto everything... after abandoning better solutions (user-defined system-wide color schemes) that had existed since the early '90s on every platform except the vaunted Mac.

So how did we end up suffering through decades of inverse GUIs? I've always attributed it to

1. The "desktop publishing" fad of the late '80s / early '90s, which sought to make the screen analogous to a piece of paper.

2. The Mac, which imitated Xerox's GUI, which was inverse. Possibly related to #1.

3. Windows defaulting to an inverse scheme (although it provided a way to easily change the global scheme), as it imitated the Mac.
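
For what it's worth, the modern web at least lets an app honor whatever scheme the OS reports instead of hard-coding one. A minimal sketch in TypeScript, using the standard matchMedia / prefers-color-scheme browser APIs; the applyScheme helper is hypothetical:

    // Honor the user's system-wide preference instead of hard-coding a scheme.
    // applyScheme is a hypothetical helper; here it tags the root element so
    // CSS can style against [data-scheme="dark"] / [data-scheme="light"].
    function applyScheme(scheme: 'light' | 'dark'): void {
      document.documentElement.dataset.scheme = scheme;
    }

    const query = window.matchMedia('(prefers-color-scheme: dark)');
    applyScheme(query.matches ? 'dark' : 'light');

    // Follow live changes when the user flips the OS-level setting.
    query.addEventListener('change', (e) => applyScheme(e.matches ? 'dark' : 'light'));

It's a pale shadow of a real system-wide color-scheme editor, but it's what we have.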

cosmic_cheese 5 hours ago

> after abandoning better solutions (user-defined system-wide color schemes) that had existed since the early '90s on every platform except the vaunted Mac

Even classic Mac OS (pre-OS X) had thousands of third-party themes via the very popular extension Kaleidoscope and, later, the built-in Appearance Manager. Kaleidoscope schemes especially ran the gamut, with looks ranging from clones of other OSes to green-on-black “Hollywood hacker” to Star Trek LCARS to shiny chrome to a pair of blue jeans. A great number of those themes were dark.

The loss of user control over appearance like that is tragic.

orbital-decay 5 hours ago

>"Back in the day," computers displayed white text on a dark background (usually a blue background) out of the box. This was deemed the most legible.

That just prevented CRT degradation, and a dark background also meant less ghosting and flickering, especially since most CRTs in the home-computer era were terrible home TVs, and the CRTs of the mainframe era were equally bad. The saturated blue background was absolutely insufferable: after long sessions in NC and Borland software, I had ghosting and shifted color perception for minutes. I loathe it to this day, just like the garish CGA colors that were an assault on my eyes.

The '80s and '90s had a general concept of the desktop, with windows as paper documents, because the first real use case for personal computers and workstations was assisting office work.

Funny how you call the normal light scheme inverted. IIRC, PC text/graphics modes used that term for dark backgrounds.

VerifiedReports 3 hours ago

It was not a screen-saver measure. Less ghosting? Most likely. But the fact remains that reading text off a light bulb blasting in your face all day sucks, and once upon a time people knew that... but "forgot" it when vendors shoved inverse color schemes on them by default.

"80's and 90's had a general concept of a desktop with windows as paper documents"

Yes, I noted that, but the analogy to a piece of paper fails because paper does not EMIT light.

Everyone with sufficient computing experience calls "light" schemes inverted. This was even documented in instruction manuals from the early PC era: https://imgur.com/a/aLV8tn0

orbital-decay an hour ago

Anyone with experience remembers that any 60 Hz CRT is a flickering mess, especially computer monitors, which used shorter-persistence phosphors, and that any old TV had terrible burn-in. That's why you wanted to reduce the number of bright pixels on it. That's not a legibility thing.

A display is only a light bulb if you make it one by setting it against a poorly lit environment. There's no difference between reflected and emitted light; what you actually need is much better lighting in the room, so that your display doesn't stand out at a brightness level that provides sufficient contrast (and because working in a poorly lit room is unhealthy anyway).

Moreover, a light scheme in a well-lit environment is less eye-straining, because your pupils contract and adapt to the light. If you're using a dark display against a dark background, your eyes adapt to the dark and then you're hit with the bright text. If you want to display more than just text, dark mode becomes a problem because most of the content (e.g. pictures, videos) is not largely dark.

tl;dr avoid excessive contrast and flickering. Everything else is individual eyesight differences, opinions, and snake oil.
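
"Excessive contrast" is quantifiable, by the way. A minimal TypeScript sketch of the WCAG 2.x contrast-ratio formula (relative luminance of the lighter color plus 0.05, over that of the darker plus 0.05); the function names are mine:

    // WCAG 2.x contrast ratio between two sRGB colors given as 0-255 triples.
    // Linearize each channel, take the luma-weighted sum, then put the
    // lighter luminance over the darker one, each offset by 0.05.
    type RGB = [number, number, number];

    function luminance([r, g, b]: RGB): number {
      const lin = (c: number) => {
        const s = c / 255;
        return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
      };
      return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
    }

    function contrastRatio(a: RGB, b: RGB): number {
      const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
      return (hi + 0.05) / (lo + 0.05);
    }

    contrastRatio([0, 0, 0], [255, 255, 255]);    // 21 -- pure black on white, the maximum
    contrastRatio([40, 40, 40], [224, 224, 224]); // roughly 11 -- softer grays

Pure black-on-white is the 21:1 maximum; pulling either end toward gray is exactly "avoid excessive contrast" in numbers.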