justin66 5 days ago

This is a good comment. I guess I don't attach the same weight to the CISC/RISC thing you do, but I agree that the big "the real problem was x" pronouncements are insufficient to explain what really happened. (But they are nevertheless quite interesting when they come from the parties involved!)

simonh 5 days ago | parent | next [-]

Arguably there is one company that did manage to make the transition from 8-bit hobby home computing and gaming: Apple.

The Apple II was a contemporary of, in fact a predecessor to, all these systems. So, what did they do right, or what went right for them, that enabled them to make it? I suspect it was the penetration of the Apple II into education and business that helped make it possible, but suppose Steve Jobs had been in charge at Commodore or Atari?

badc0ffee 5 days ago | parent | next [-]

I've thought about this a bit, and what I can come up with is that Apple had the clear lead in the first wave of home PCs in the late 70s (the others being the Commodore PET and the TRS-80 Model I), and maintained it. The Apple II had bitmap graphics and colour built-in, a very fast and relatively cheap disk add-on, and well-thought-out expandability. You didn't need to buy a sidecar unit; just throw a card in an empty slot. Importantly, it also worked with inexpensive TVs and monochrome monitors that you could purchase separately. The hardware was also high quality - it had a nice keyboard, and a switching power supply that didn't get hot.

Fast forward a few years, and the Apple II was still very usable and competitive, with RAM expansion options up to 128k, higher res graphics, and 80 column text, while still supporting the same software.

One other thing is that the Apple II was wildly profitable. It had no custom chips, just cleverly used commodity chips and some ROMs. This includes the fast and cheap disk system.

justin66 5 days ago | parent | prev | next [-]

> I suspect it was the penetration of the Apple II into education and business that helped make it possible

I don't know how much it moved the needle but it was astonishing how much schools and home users - parents whose kids used the machines at school - were willing to pay for an Apple II well after it was a technically obsolete machine. It definitely helped them to some extent.

(don't get me wrong, I love those machines in my bones, but they were pretty overpriced after a while)

flenserboy 5 days ago | parent [-]

Here's a guess: text was sharp on an Apple II with a decent monitor, and the font shapes were good. No matter how good the graphics were on the C64 & Ataris, their text always looked blocky & amateurish in comparison. Tandy did better on this front, but it wasn't enough for them. I'm pretty sure this is the same reason the Amiga & the ST didn't make more inroads — people looked at them alongside the Mac & technical considerations were quickly forgotten. It's funny to me that this hasn't changed all that much — Windows font rendering looks awful to me, & I'll always pick a Mac or Linux box to use instead if there's a choice, just so I don't have to put up with the fonts. This wasn't always the case — the old system character sets used under DOS were pleasant to use.

justin66 5 days ago | parent [-]

Font shapes.

mrandish 5 days ago | parent | prev | next [-]

Excellent question and one I already touched on in a sister reply before I saw your post. https://news.ycombinator.com/item?id=43722230

Apple is indeed an extraordinary outlier (as is Jobs). If you look into the history of Apple's Gil Amelio days, its near-death, and Steve's return, it was, IMHO, a remarkable series of fortunate miracles coinciding to allow Steve to brilliantly save the company when it was only weeks away from death. Jobs calling Bill Gates and convincing him to quickly invest $150M in Apple averted disaster potentially by a matter of days. And Gates only did that because MSFT was facing antitrust action from the Justice Dept and needed Apple to survive as an example that Wintel still had some competition. Apple's survival in that period is the closest call I think the industry has ever seen.

To answer your last question, Jobs was undoubtedly brilliant, but it took every ounce of that brilliance AND some crazy good luck for Apple to survive. Ultimately, it was Jobs plus flukes, so no, Jobs without the flukes wouldn't have changed anything at Atari or Commodore. Even on its deathbed, Apple had a much better brand, distribution, market potential and talent than Atari or Commodore ever did. Plus Steve had his hand-picked entrepreneurial team from NeXT with him. The situations at Atari and Commodore were just much weaker in every way, so I don't think any single superhero, no matter how super, could have saved them.

bsder 5 days ago | parent | prev [-]

> So, what did they do right, or what went right for them, that enabled them to make it?

Desktop publishing.

The Macintosh/LaserWriter cash cow absolutely dominated desktop publishing for a very, very, very long time.

This gave Apple access to enterprise accounts that other computer companies did not have.

mrandish 5 days ago | parent | prev [-]

> I guess I don't attach the same weight to the CISC/RISC thing you do

I certainly didn't appreciate the impact of CISC vs RISC architecture at the time. I understood the conceptual difference between them at a high level but didn't get why that caused Motorola to decide they couldn't keep scaling beyond the 68060. As a user and fan of the 68030 and 040, I just didn't understand why they'd walk away from, arguably, the second most popular flagship computer ISA of the era. And they actually told computer manufacturers that the 68060 would be the end of the 68K line more than a year before the 68060 even shipped. I was like, WTF? They weren't even done with the latest, greatest chip when they decided to kill the whole line, leaving all that fantastic software without any upgrade path to the future.

Only later did I gain a deeper appreciation for the impact. A few key things informed me:

* My high-level understanding of CISC vs RISC wasn't complete back then. In the late 80s there was a lot of debate among CPU ISA designers on the relative merits of CISC and RISC - and that even extended to the definitions of the terms (which were fuzzy). A good example is this legendary comp.arch Usenet discussion: https://yarchive.net/comp/risc_definition.html. Only when I dove into those debates in the last 10 years did I really start to get the larger picture.

* The part that mattered most between RISC and CISC wasn't that RISC had fewer instructions than CISC (although it usually did), it was that those instructions were much less complex AND that the addressing modes were much less complex. This meant that, in general, RISC ISAs were easier to decode because instruction and operand lengths tended to be fixed. That also had a bunch of downstream effects, generally making RISC-ish ISAs easier to pipeline deeply, easier to branch predict, easier to speculatively execute, etc. These are all things that let you put extra gates to work speeding up execution.

* I was a huge fan of the 68K's insanely powerful addressing modes, which let savvy assembly language programmers pack huge functionality into fewer instructions: indirection on both the input and the output, a bunch of flags, and pre/post operations like decrement/increment. Programmers not only called 68K addressing modes powerful but also things like "orthogonal" and even "elegant." But all those varying instruction lengths, with up to 14 different addressing modes plus even more optional flags modifying behavior before and/or after, also created a complexity explosion for CPU architects trying to implement the most powerful new optimizations. That's one big reason the 68060 was over a year late to market. It was only dual-issue, but even that triggered unexpected avalanches of design complexity.

* Both Intel and Motorola realized the only way to keep increasing performance while maintaining software compatibility with their old CISC ISAs was to (basically) make future processors RISC cores running an extra hardware layer that emulates the CISC ISA (a minimal sketch of the idea follows this list). It was both hard and costly in terms of gates and performance. Intel's lead in fab process helped them hide that performance cost and keep showing generational net speed increases as they navigated the transition. Motorola realized they'd probably have a generation or two of CPUs that weren't meaningfully faster until they bridged the gap and came out the other side.
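To make that last bullet concrete, here's a minimal C sketch of the cracking idea. Everything in it is invented for illustration - the micro-op names, fields, and the "temp register" are not any real Motorola or Intel design - but it shows the shape of the trick: one CISC-style read-modify-write instruction becomes three fixed-format, RISC-like micro-ops that a simpler core can pipeline.

    #include <stdio.h>

    /* Invented micro-op format: fixed shape, simple fields. Real 68060 or
       Pentium Pro internals are far more involved; this only shows the idea. */
    typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;

    typedef struct {
        uop_kind kind;
        int dst;    /* destination register              */
        int src;    /* source register                   */
        int addr;   /* register holding a memory address */
    } uop;

    #define TEMP_REG 32  /* hidden temporary, invisible to the programmer */

    /* Crack a CISC-style "add src_reg into the memory word addressed by
       addr_reg" (one instruction, read-modify-write) into three micro-ops.
       Each micro-op does one simple thing, so the core can pipeline and
       reorder them like native RISC instructions. */
    static int crack_add_to_mem(int addr_reg, int src_reg, uop out[]) {
        out[0] = (uop){ .kind = UOP_LOAD,  .dst = TEMP_REG, .addr = addr_reg };
        out[1] = (uop){ .kind = UOP_ADD,   .dst = TEMP_REG, .src = src_reg   };
        out[2] = (uop){ .kind = UOP_STORE, .src = TEMP_REG, .addr = addr_reg };
        return 3;
    }

    int main(void) {
        static const char *names[] = { "LOAD", "ADD", "STORE" };
        uop seq[3];
        int n = crack_add_to_mem(0, 1, seq);  /* roughly: ADD D1 -> (A0) */
        for (int i = 0; i < n; i++)
            printf("uop %d: %-5s dst=%d src=%d addr=%d\n",
                   i, names[seq[i].kind], seq[i].dst, seq[i].src, seq[i].addr);
        return 0;
    }

The flip side is exactly what the addressing-modes bullet describes: a memory-indirect, pre/post-increment 68K instruction can expand into a long, variable micro-op sequence, and the decoder can't even locate the next instruction until it has parsed enough of the current one to know its length.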

There's a lot more, but I'm certainly not a domain expert in CPU design and am already summarizing my non-expert understanding of expert debates, so I'll leave it there. But it's pretty fascinating stuff. Both Intel and Moto realized that pure RISC implementations would probably beat them soon. Each company responded differently. Intel made the RISC-emulating-CISC approach to ISA compatibility (broadly speaking) work well enough to survive the transition. Motorola decided it was too risky (probably correctly, given their fab technology and corporate resources) and instead chose to break with the past and partner with IBM in moving to PowerPC. For Atari, Commodore, Apple et al., this was a planetary-level asteroid impact. If developers and customers lose all software compatibility with your new products, moving to your next generation is not much different from moving to another platform entirely. Only Apple managed to survive (and even they almost didn't). Arguably, they only treaded water with great design and marketing until saved by the iPod.

I should also mention there was another huge asteroid for vertically integrated non-Wintel computer platforms right behind the CISC/RISC asteroid. In the early to mid 90s, Moore's Law scaling was letting desktop computers improve rapidly by growing dramatically more complex, and winning on every separate front was becoming more than any one company could do. On the Wintel side, the market solved this complexity by dividing the problem among ecosystems of companies: one ecosystem competed to make the CPU and chipset (Intel, NEC, Cyrix, AMD), another made the OS (Windows, OS/2), another competed to make the best graphics, and yet another competed on sound (Creative, Yamaha, Ensoniq, etc). It would have required a truly extraordinary company to compete against all that with a custom vertically integrated computer. There was no way a Commodore or Atari could survive that onslaught. The game changed from company vs company to ecosystem vs ecosystem. And that next asteroid wiped out even stronger, better-capitalized companies that were on pure RISC architectures (Sun, SGI, Apollo, etc).

zozbot234 5 days ago | parent [-]

> But all those varying instruction lengths, with up to 14 different addressing modes plus even more optional flags modifying behavior before and/or after, also created a complexity explosion for CPU architects trying to implement the most powerful new optimizations

You certainly see the impact of the "don't add too many instructions/flags" style of design even today in something like RISC-V, which doesn't even use condition codes (an unexpected source of complexity in an ISA spec, since every instruction must define exactly how it affects or does not affect each of several condition codes - RISC-V has none of that), expects you to use instruction fusion in larger implementations, and defines "compressed" instructions as a mere shorthand for existing full-length instructions in order to simplify decode. ARM64 has made different choices on all of these, and it will be quite interesting to see how they compare in real-world scenarios at the higher end of performance.
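The "mere shorthand" point is concrete enough to show in a few lines of C. This sketch expands one RVC form, C.ADDI, into the 32-bit ADDI it is defined to be equivalent to, with field positions per the published RISC-V spec; it's only a sketch of the idea, so the other C.* forms and the reserved/hint cases are omitted.

    #include <stdint.h>
    #include <stdio.h>

    /* Expand the 16-bit RISC-V compressed C.ADDI into the 32-bit ADDI it
       abbreviates (C.ADDI rd, imm == ADDI rd, rd, imm). Because every
       compressed instruction maps 1:1 onto a full-length one like this,
       the expansion can live entirely in the decoder - the rest of the
       core never needs to know compressed instructions exist. */
    static uint32_t expand_c_addi(uint16_t c) {
        uint32_t rd  = (c >> 7) & 0x1f;          /* rd/rs1   = inst[11:7] */
        int32_t  imm = ((c >> 2) & 0x1f)         /* imm[4:0] = inst[6:2]  */
                     | (((c >> 12) & 1) << 5);   /* imm[5]   = inst[12]   */
        if (imm & 0x20)                          /* sign-extend 6-bit imm */
            imm -= 0x40;
        /* I-type ADDI: imm[11:0] | rs1 | funct3=000 | rd | opcode=0010011 */
        return ((uint32_t)(imm & 0xfff) << 20) | (rd << 15) | (rd << 7) | 0x13;
    }

    int main(void) {
        /* c.addi a0, -4 (a0 = x10): funct3=000, imm[5]=1, rd=10,
           imm[4:0]=0b11100, quadrant=01 */
        uint16_t c = (1u << 12) | (10u << 7) | (0x1cu << 2) | 0x1;
        printf("0x%04x -> 0x%08x\n", c, expand_c_addi(c)); /* 0xffc50513 */
        return 0;
    }

Condition codes are the opposite side of the same trade: rather than every instruction's definition having to say which flags it sets, RISC-V folds the comparison into the branch itself (beq, blt, etc.), which is part of what keeps each instruction's spec so small.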