klodolph 8 hours ago

You can find a lot of discussion about what the minimum specs for Quake are. Famously, it needs a decent FPU, and the Pentium was a convenient early CPU with a decent built-in FPU. It was significantly faster than a 486.

…But people have managed to run Quake on the 486.

And the myth people tell about Quake is that it killed Cyrix, because Quake performance on Cyrix was subpar. But was that true? And if it was true, was that because the Cyrix was slower than a Pentium, or was it because the Quake code had assembly that was hand-optimized for the Pentium FPU pipeline?

Anyway. “Most simple computer that could run Quake” is probably going to include a decent FPU. If you are implementing something on an FPGA, you can probably get somewhere around 200 MHz clock anyway. At which point you can run Quake II.

AbanoubRodolf 2 hours ago | parent | next [-]

The Cyrix story is actually well-documented. Quake's software renderer used hand-optimized x86 assembly with FPU instruction sequences specifically tuned for the Pentium's pipeline. Cyrix processors had a different FPU execution pipeline that stalled on those specific instruction orderings — the issue wasn't raw FPU performance, it was that the Pentium-optimized code ran slower on Cyrix than straightforward C code would have. It was hand-optimization that made things worse, not better, on a competitor's hardware.

The timing was brutal for Cyrix. This was right when "Intel Inside" was becoming a meaningful consumer brand signal, and game benchmarks were becoming the primary way consumers evaluated CPU purchases. Quake wasn't just a game, it was the benchmark everyone ran at CompUSA to compare machines. Being demonstrably worse at Quake, regardless of the cause, was a marketing catastrophe.

The real floor for running Quake is basically "does it have a hardware FPU." The 486 DX (with FPU) could do it at low resolution and low framerate. The 486 SX (no FPU, software float emulation) was genuinely painful. The Pentium was the first CPU where it actually felt good.

jasonwatkinspdx 6 hours ago | parent | prev | next [-]

My perspective from being a teen doing LAN party stuff at the time: Quake ran slow on them, but it was far from the only thing that ran slow. Cyrix was well understood to be the value brand, fine for general office apps and such but not up to more demanding computing, and known for random compatibility issues here and there.

Ultimately what killed Cyrix is they just couldn't offer enough of a discount vs intel to matter, especially with all the lock in stuff intel was doing with Dell, Gateway, etc.

Intel Inside was a successful marketing campaign as well. If you were around back then I bet you can imagine the jingle/chord immediately.

polpo 7 hours ago | parent | prev | next [-]

I had a Cyrix 6x86 when Quake first came out. My disappointment at how poorly Quake ran on it was significant, especially because pretty much every other game at the time ran well on the Cyrix. The FPU performance in Quake was doubly handicapped on the Cyrix: not only was its FPU slower than the Pentium's to begin with, Quake's code was indeed hand-optimized for the Pentium's FPU pipeline. Fabien Sanglard's writeup of Michael Abrash's optimizations for Quake goes into great detail: https://fabiensanglard.net/quake_asm_optimizations/

Foobar8568 2 hours ago | parent | prev | next [-]

Quake ran like shit on a 486 DX-33, a few fps at best.

NooneAtAll3 8 hours ago | parent | prev [-]

can it be rewritten to use fixed point arithmetic instead?

ndepoel an hour ago | parent | next [-]

Yes but also no. The problem with fixed-point arithmetic is a lack of dynamic range compared to floating point. Floats are great at representing both large numbers with limited precision and small numbers with high precision, but with fixed point you have to make a choice based on which kind of number you're trying to represent. That means using a mixture of 8.24, 16.16 and 24.8 fixed-point types (and appropriate conversions) depending on the context of the calculations you're doing.

It's possible to write a game engine with that limitation, but there's no easy natural conversion from Quake's judicious use of floats to a fully fixed-point codebase. You'd have to redesign and rewrite the entire engine from scratch, basically.

klodolph 7 hours ago | parent | prev | next [-]

I want to look at this from a different perspective… a single-precision floating-point multiply is pretty simple, no? 24x24 bit multiply, which is about half as many gates as a 32x32 bit multiply.

Maybe I would prefer to rip out the integer multiplication unit first, before ripping out the FPU.

Narishma 8 hours ago | parent | prev | next [-]

The PS1 doesn't have an FPU but got a version of Quake 2, so it's possible. That said, it was somewhat different from the PC version, so it could be argued that it's not the same game.

klodolph 8 hours ago | parent [-]

The PS1 version definitely has its own engine, which is not just a port of the Quake 2 engine to the Playstation, but a new engine.

jasonwatkinspdx 6 hours ago | parent | next [-]

I can't speak on Quake, but I was a level designer on the failed effort to port Unreal to PSX.

My understanding from talking to the coders at the time was that Unreal's software renderer was a huge advantage as a starting point. They were able to reuse a lot of the portal rendering stuff as setup on the R3K CPU, but none of the rasterization. That had to go to the graphics core, which was a post-setup 2D engine that, in addition to the usual sprites, could do tris and quads.

We had a budget of about 3k polygons post clipping, and having two enemies on screen would burn about half of that. The other huge limit was that the texture cache was tiny, so we couldn't do lightmaps. Our lighting was baked in at the vertex level and it just was what it was.

There's a bit more info here: https://www.terrygreer.com/unrealpsx.html

I imagine the situation with Quake was comparable. The BSP stuff would carry right over, but I can't imagine they got proper lightmapping working at the time. They'd also need some sort of solution for overdraw, as Quake's PVS was a lot looser than Unreal's portal clipping.

ndepoel 41 minutes ago | parent | prev [-]

The PS1 version uses a custom engine based on technology built for the game Shadow Master, the previous title by Hammerhead Studios. It was a technical tour de force for the original PlayStation.

rasz 3 hours ago | parent | prev [-]

Sure, but then you need a CPU that is twice as fast :). The PlayStation did it by pushing geometry calculations to the GTE.