weinzierl 3 hours ago

OP has a good point, but I rather wish we'd skipped the 90s and picked up again much later. I like to think of the 8-bit era as an early bronze age of computing: lots of things went right and were done right.

16-bit, to me, is the dark ages. Lots of confusion; not much good came out of it, technologically or aesthetically. God, everything was ugly. Maybe all the trials and tribulations were necessary for what was about to come, but I like to believe they weren't.

32-bit to me is the golden age and 64-bit is platinum.

If you offered me a time machine to go back, I'd surely say "No, thank you!" There hasn't been a better time than now, but if you forced me at gunpoint, I'd pick the 80s over the 90s any time.

saulpw 2 hours ago | parent | next

I agree that 32-bit is the golden age, but 64-bit is enterprise bloat. I personally would pick 1995 over 2005, but I think 2005 was a lot better than 2015 in terms of interfaces.

leptons 2 hours ago | parent | prev

There was plenty of amazing stuff going on with computing in the '90s. You just had to know where to look. Do you consider the 68000 CPU to be 16-bit, 32-bit, or both?

weinzierl 2 hours ago | parent

The 68000 was introduced in 1979, to me it is part of the 80s.

But you are making a good point. Maybe the distinction between 8/16/32/64-bit isn't really helpful. I loved the Amiga but I loathed all the XT/AT segmented memory bitplaned VGA 16-bit stuff. That to me is the deep dark ages.

einr 35 minutes ago | parent

> I loved the Amiga but I loathed all the XT/AT segmented memory bitplaned VGA 16-bit stuff. That to me is the deep dark ages.

I get the sentiment, but I have to nitpick the details ;)

VGA isn't bitplaned. It's chunky -- to put a pixel on the screen in 320x200x8 VGA mode 13h you literally just write a single byte to the VGA memory. The Amiga, on the other hand, does use planar graphics.

(Maybe you're thinking of EGA, which is planar and a pain to program for)