WalterBright 5 hours ago

I've always thought the 8087 was a marvelous bit of engineering. I never understood why it didn't get much respect in the software business.

For example, when Microsoft was making Win64, I caught wind that they were not going to save the x87 state during a context switch, which would have made use of the x87 impractical with Win64. I got upset about that, and contacted Microsoft and convinced them to support it.

But the deprecation of the x87 continued, as Microsoft C did not provide an 80 bit real type.

Back in the late 80's, Zortech C/C++ was the first compiler to fully implement NaN in the C math library.

jdsully 2 hours ago | parent | next [-]

Excel needed the x87 as well, since they cared about maintaining the 80-bit precision in some places to get exactly the same recalc results. So they would most likely have fixed it eventually.

kstrauser 4 hours ago | parent | prev | next [-]

I’d agree that the engineering was brilliant (but 68882 gang represent!). Its ISA was so un-x86-like, though, as it was basically an RPN calculator. X86 had devs manipulating registers. X87 had them pushing operands and running ops that implicitly popped them and pushed the result back on the stack.

That’s not better or worse, just different. However, I can imagine devs of the days saying hey, uh, Intel, can we do math the same way we do everything else? (Which TBH is how you’d end up with an opcode for a hardware-accelerated bubble sort or something, because Intel sure does love them some baroque ISAs.)
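
For anyone who hasn't seen it, the stack style looks roughly like this — a hypothetical sketch in NASM-style syntax computing (a + b) * c, with made-up memory labels a, b, c, r:

```asm
; push operands, then run ops that implicitly pop them and push the result
fld   dword [a]      ; stack: a
fld   dword [b]      ; stack: b, a
faddp                ; pop twice, push the sum      -> stack: a+b
fld   dword [c]      ; stack: c, a+b
fmulp                ; pop twice, push the product  -> stack: (a+b)*c
fstp  dword [r]      ; pop the result into memory   -> stack: empty
```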

jamesfinlayson 2 hours ago | parent | next [-]

> Its ISA was so un-x86-like, though, as it was basically an RPN calculator

Yeah I remember when I first came across floating point stuff when trying to reverse engineer some assembly - I wasn't expecting something stack-based.

WalterBright 3 hours ago | parent | prev [-]

Eh, as far as compiler backends go, the RPN stack was worse.

I thought the X86_64 instruction set was a giant kludge-fest, so I was looking forward to implementing the AArch64 code generator. Turns out it is just as kludgy, but at right angles. For example, all the wacky ways of simply loading a constant into a register!
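
A hypothetical sketch (GAS-style syntax) of a few of those ways, just to illustrate:

```asm
// 16-bit chunks: movz zeroes the register, movk patches in more bits
movz  x0, #0x1234                  // x0 = 0x1234
movz  x0, #0x1234, lsl #16         // x0 = 0x12340000
movk  x0, #0x5678                  // keep other bits, patch bits 0-15

// logical immediates: only repeating bit patterns can be encoded this way
mov   x0, #0xFF00FF00FF00FF00      // assembles as ORR with a bitmask immediate

// anything else: load from a literal pool
ldr   x0, =0x123456789ABCDEF0      // assembler pseudo-instruction
```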

mschaef 3 hours ago | parent | prev | next [-]

What do you mean by respect? Here's a layperson's perspective, at least.

Up through the 486 (with its built-in x87), the x87 was always a niche product. You had to know about it, need it, buy it, and install it. This was over and above buying a PC in the first place. So, definitionally, it was relegated to the periphery of the industry. Most people didn't even know the x87 was a possibility. (I distinctly remember a PC World article having to explain why there was an empty socket next to the 8088 socket in the IBM PC.)

However, in the periphery where it mattered, it gained acceptance within a matter of a few years of being available. Lotus 1-2-3, AutoCAD, and many compilers (including yours, IIRC) had support for x87 early on. I would argue that this is one of the better examples of marginal hardware being appropriately supported.

The other argument I'd make is that (thanks to William Kahan), the 8087 was the first real attempt at IEEE-754 support in hardware. Given that IEEE-754 is still the standard, I'd suggest that the x87's place in history is secure. While we may not be executing x87 opcodes, our floating point data is still in a format first used in the x87. (Not the 80-bit type, but do we really care? If the 80-bit type were truly important, I'd have thought that in the intervening 45 years, there'd have been a material attempt to bring it back. Instead, what we have is a push towards narrower floating point types used in GPGPU, etc.... fp8 and fp16, sure... fp80, not so much.)

WalterBright 2 hours ago | parent [-]

> What do you mean by respect?

The disinterest programmers have in using 80 bit arithmetic.

A bit of background: I wrote my own numerical analysis programs when I worked at Boeing. The biggest issue I had was accumulation of rounding errors. More bits would put off the cliff where the results turned into gibberish.

I know there are techniques to minimize this problem. But they aren't simple or obvious. It's easier to go to higher precision. After all, you have the chip in your computer.

Cold_Miserable 4 hours ago | parent | prev [-]

x87 should have been killed off. It would have forced lazy game developers to use SSE around the 2005 era.

WalterBright 2 hours ago | parent [-]

Game floating point precision doesn't matter much - speed does. But if you're doing numerical analysis, precision does matter.