mrweasel 3 hours ago

It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered (or matters) that much.

At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.

mizzack an hour ago | parent | next [-]

Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.

https://en.wikipedia.org/wiki/S3_Texture_Compression
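For context, S3TC's most common variant (DXT1) packs each 4×4 block of texels into 64 bits: two RGB565 endpoint colors plus a 2-bit palette index per texel, a fixed 6:1 ratio versus 24-bit RGB. A minimal decoder sketch in Python (not from the thread; function names are illustrative):

```python
import struct

def rgb565_to_rgb888(v):
    r, g, b = (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F
    # Replicate high bits into the low bits to expand to 8 bits per channel.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 block into a 4x4 grid of (r, g, b) texels."""
    c0_raw, c1_raw, bits = struct.unpack("<HHI", block)
    c0, c1 = rgb565_to_rgb888(c0_raw), rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Four-color mode: two extra colors interpolated at 1/3 and 2/3.
        palette = [
            c0, c1,
            tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
            tuple((a + 2 * b) // 3 for a, b in zip(c0, c1)),
        ]
    else:
        # Three-color mode: midpoint color plus "transparent black".
        palette = [
            c0, c1,
            tuple((a + b) // 2 for a, b in zip(c0, c1)),
            (0, 0, 0),
        ]
    # One 2-bit palette index per texel, row-major, low bits first.
    return [[palette[(bits >> 2 * (4 * row + col)) & 0b11] for col in range(4)]
            for row in range(4)]
```

The key trick is that the two interpolated colors are derived in hardware from the endpoints, so only 4 bytes of index data are needed per 16 texels.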

aruametello 5 minutes ago | parent [-]

+1 to that. Seeing Unreal Tournament with the add-on compressed texture pack for the first time was a real WOW moment.

cubefox 5 minutes ago | parent | prev | next [-]

This is an ad from a viral marketing company, and everyone here is falling for it.

whizzter 2 hours ago | parent | prev | next [-]

Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RTX performance, so compared to the 2000 series they're far more useful today.

PunchyHamster an hour ago | parent | prev | next [-]

Matrox G200 GPUs came integrated with servers for absolute ages, well past the 2010s.

formerly_proven 3 hours ago | parent | prev [-]

The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.

mrweasel 2 hours ago | parent [-]

Like virtualized NICs pretending to be an NE2000? That's interesting, do you know why they'd use a G200 and not something like an older ATI chip?

bluedino a few seconds ago | parent | next [-]

Drivers, probably.

formerly_proven 2 hours ago | parent | prev [-]

It probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the I/O features needed to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding that.