TMWNN 12 hours ago
> The article handwaves over why the chip wasn't a success, which makes my first thought of "how much did each chip cost" all the more relevant.

That's being polite. Attributing the chip's failure to AT&T buying NCR is ridiculous; that happened in 1991. Here's a rundown of what actually happened:

* After the divestiture, AT&T is finally allowed, from 1984, to build and sell computers. (This is also why Unix was not a commercial AT&T product until then.) Everyone, inside and outside AT&T, thinks Ma Bell will immediately be an IBM-level player, armed with Bell Labs research and Western Electric engineering. One of many, many articles conveying what everyone then expects/anticipates/fears: <https://archive.org/details/microsystems_84_06/page/n121/mod...> If anyone can turn Unix into the robust mainstream operating system (a real market opportunity, given that IBM is still playing with the toy DOS, and DEC and the other minicomputer companies are still in denial about the PC's potential), it's AT&T.

* AT&T immediately rolls out a series of superminicomputers (the 3B series) based on existing products Western Digital has made for years for internal AT&T use (and using the Bellmac CPU) and, at the lower end, the 6300 (an Olivetti-built PC clone) and the UNIX PC (a Convergent-built Unix workstation). All are gigantic duds because, despite superb engineering and field-tested products, AT&T has never had to compete with anyone to sell anything.

* After further fumbling, AT&T buys NCR to jumpstart itself into the industry. It gives up five years later and NCR becomes independent again.

* The end.

> This is such an uplifting story until you think about how the 8086 is just about to wipe it off of the map.

People today have this idea that Intel was the dominant semiconductor company of the 1980s, and that that's why IBM chose it as the CPU supplier for the PC. Not at all. Intel was then just one of many competing vendors, with nothing in particular differentiating it from Motorola, Zilog, MOS, Western Digital, Fairchild, etc. The 8088's chief virtue was that it was readily available at a reasonable price; had the PC launched a little later, IBM would probably have gone with the 68000, which Intel's own engineers agreed was far superior to the 8086/8088 and 80286. Binary compatibility with those chips was not even in the initial plan for the 80386, so loathed by everyone (including, again, Intel's own people) was their segmented memory model (and misfeatures like the broken A20 line); only during its design, as the PC installed base grew like crazy, did Intel realize that customers wanted to keep running their software. That's why the 80386 supports both segmented memory (including a virtual-8086 mode for backward compatibility) and a flat model. And that flat memory model wasn't put in for OS/2 or Windows NT; it was put in for Unix.
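The segmented-model pain mentioned above is easy to see in the address arithmetic itself. A real-mode 8086 forms a 20-bit physical address as segment*16 + offset, so many segment:offset pairs alias the same byte, and addresses past 1 MB silently wrap around — the wraparound some software came to depend on, which is what made gating the A20 line such a mess on later CPUs. A minimal sketch (the helper name here is mine, not from the comment):

```python
def real_mode_addr(segment: int, offset: int, a20_enabled: bool = False) -> int:
    """Real-mode x86 address arithmetic: physical = segment*16 + offset."""
    addr = (segment << 4) + offset
    if not a20_enabled:
        # An 8086 has only 20 address lines, so addresses wrap at 1 MB.
        # Later CPUs wrap too when the A20 line is gated off for compatibility.
        addr &= 0xFFFFF
    return addr

# Different segment:offset pairs alias the same physical byte:
assert real_mode_addr(0x1234, 0x0005) == real_mode_addr(0x1000, 0x2345) == 0x12345

# FFFF:0010 wraps to 0 on an 8086, but reaches 0x100000 with A20 enabled:
assert real_mode_addr(0xFFFF, 0x0010) == 0x00000
assert real_mode_addr(0xFFFF, 0x0010, a20_enabled=True) == 0x100000
```

The flat model the 80386 added does away with all of this: one contiguous linear address space, which is what Unix (and later 32-bit OSes) actually wanted.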
pinewurst 32 minutes ago | parent
I think you mean 'Western Electric' rather than 'Western Digital'.
crb3 11 hours ago | parent
> The 8088's chief virtue was that it was readily available at a reasonable price;

That, and it had a compatible suite of peripheral chips, while the M68K didn't... something I vaguely recall an Intel FAE gloating about soon after: "And we're going to keep it that way."