| ▲ | martinald 2 days ago |
| We have had a decade of performance stagnation. Compare the PS1 with the PS3 (just over 10 years apart):
PS1: ~0.03 GFLOPS (approximate, given it didn't really do FLOPS per se)
PS3: ~230 GFLOPS
Nearly 1000x faster. Now compare the PS4 with the PS5 Pro (also just over 10 years apart):
PS4: ~2 TFLOPS
PS5 Pro: ~33.5 TFLOPS
A bit over 10x faster. So the speed of improvement has fallen dramatically. Arguably you could say the real drop in optimization happened in that PS1 -> PS3 era - everything went from hand-optimized assembly code to running (generally) higher-level languages and using abstracted graphics frameworks like DirectX and OpenGL. Just no one noticed because we had 1000x the compute to make up for it :) Consoles/games got hit hard first by crypto and now by AI needing GPUs. I suspect if it wasn't for that we'd have vastly cheaper and vastly faster gaming GPUs, but when you were making boatloads of cash off crypto miners and then AI I suspect the rate of progress fell dramatically, for gaming at least (most of the innovation I suspect went more into high VRAM/memory controllers and datacentre-scale interconnects). |
|
| ▲ | SlowTao 2 days ago | parent | next [-] |
| It is not just GPU performance; it is that visually things are already very refined. A ten-times leap in performance doesn't really show as ten times the visual spectacle like it used to. Like all this path tracing/ray tracing stuff: yes, it is very cool and can add to a scene, but most people can barely tell it is there unless you show it side by side. And it takes a lot of compute to do. We are polishing an already very polished rock. |
| |
| ▲ | martinald 2 days ago | parent [-] | | Yes, but in the PS1 days we were gaining 1000x in compute performance per decade. I agree that 10x doesn't move much, but that's sort of my point - what could be done with 1000x? |
|
|
| ▲ | cosmic_cheese 2 days ago | parent | prev | next [-] |
| Yeah, there's been a drop-off for sure. Clearly it hasn't been steep enough to stop game studios from leaning on hardware gains anyway, though. One potential forcing factor may be the rise of iGPUs, which have become powerful enough to play many titles well while remaining dramatically more affordable than their discrete counterparts (and sometimes without the crippling VRAM limits to boot), as well as the growing sector of PC handhelds like the Steam Deck. It's not difficult to imagine that iGPUs will come to dominate the PC gaming sphere, and if that happens it'll be financial suicide to not make sure your game plays reasonably well on such hardware. |
| |
| ▲ | martinald 2 days ago | parent [-] | | I get the (perhaps mistaken) impression that the biggest problem game developers have is making and managing absolutely enormous amounts of art assets at high resolution (textures, models, etc). Each time you increase resolution, from 576p to 720p to 1080p and now 4K+, you need a huge step up in the visual fidelity of all your assets, otherwise it looks poor. And given that most of these assets are human-made (well, until very recently), this requires more and more artists. So I wonder if game studios are now more like art studios with a bit of programming bolted on; before, with lower-res graphics, you maybe had one artist for 10 programmers, and now it is flipped the other way. I feel that at some point over the past ~decade we hit an "organisational" wall with this, and very, very few studios can successfully manage teams of hundreds (thousands?) of artists effectively? | | |
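As a rough illustration of the scaling pressure described in the comment above, here is a back-of-the-envelope sketch in Python. The 576p dimensions (~1024x576 for widescreen SD) are an assumption, and asset budgets don't scale exactly with screen pixels, but the trend is similar:

    # Back-of-the-envelope: how the on-screen pixel budget grows across display
    # generations, which roughly tracks the fidelity expected of art assets.
    resolutions = {
        "576p": (1024, 576),   # widescreen SD (assumed dimensions)
        "720p": (1280, 720),
        "1080p": (1920, 1080),
        "4K": (3840, 2160),
    }

    base = resolutions["576p"][0] * resolutions["576p"][1]
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.1f}x the pixels of 576p)")

    # 576p: 0.6 MP (1.0x the pixels of 576p)
    # 720p: 0.9 MP (1.6x the pixels of 576p)
    # 1080p: 2.1 MP (3.5x the pixels of 576p)
    # 4K: 8.3 MP (14.1x the pixels of 576p)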
▲ | MindSpunk 2 days ago | parent | next [-] | | This hits the nail pretty close to the head. I work on an in-house AAA engine used by a number of different games. It's very expensive to produce art assets at the quality expected now. Many AAA engines' number-one focus isn't "performance at all costs", it's "how do we most efficiently let artists build their vision". And efficiency isn't runtime performance; efficiency is how much time it takes for an artist to create something. Performance is only a goal insofar as it frees artists from being limited by it. > So I wonder if game studios are now more like art studios with a bit of programming bolted on. Not quite, but the ratio is heavily in favor of artists compared to 'the old days'. Programming is still a huge part of what we do. It's still a deeply technical field, but often "programming workflows" are a lower priority than "artist workflows" in AAA engines, because art time is more expensive than programmer time given the huge number of artists working on any one project compared to programmers. Just go look at the credits for any recent AAA game. Look at how many artist positions there are compared to programmer positions and it becomes pretty clear. | | |
▲ | kasool a day ago | parent [-] | | Just to add to this, from a former colleague of mine who currently works as a graphics programmer at a UE5 studio: most graphics programmers are essentially tech support for artists nowadays. In an age where much of AAA is about making the biggest, most cinematic, most beautiful game, your artists and game content designers are the center of your production pipeline. It used to be that the technology tended to drive the art. Nowadays the art drives the tech. We only need to look at all the advertised features of UE5 to see that. Nanite lets artists spend less time tweaking LODs and optimizing meshes, as well as flattening the cost of rendering small triangles. Lumen gives us realtime global illumination everywhere so artists don't have to spend a million hours baking multiple lightmaps. Megalights lifts restrictions on the number of dynamic lights and shadows a lighting artist can place in a scene. The new Nanite foliage shown off in The Witcher 4 allows foliage artists to go ham with modeling their trees. |
| |
▲ | cosmic_cheese 2 days ago | parent | prev | next [-] | | That depends a lot on art direction and stylization. Highly stylized games scale up to high resolutions shockingly well even with less detailed, lower-resolution models and textures. Breath of the Wild is one good example that looks great by modern standards at high resolutions, and there are many others that manage to look a lot less dated than they are with similarly cartoony styles. If "realistic" graphics are the objective, though, then yes, better displays pose serious problems. Personally I think it's probably better to avoid art styles that age like milk, or to go for a pseudo-realistic direction that is reasonably true to life while mixing in just enough stylization to scale well and not look dated at record speeds. Japanese studios seem pretty good at this. | |
▲ | spookie 2 days ago | parent | prev [-] | | Yeah, it's flipped. Overall, it has meant studios are more and more dependent on third-party software (and thus license fees), it has led to game engine consolidation, and it causes serious attrition when attempting to make something those game engines weren't built for (non-PBR pipelines come to mind). It's no wonder nothing comes out in a playable state. |
|
|
|
| ▲ | PoshBreeze 11 hours ago | parent | prev | next [-] |
| > Arguably you could say the real drop in optimization happened in that PS1 -> PS3 era - everything went from hand-optimized assembly code to running (generally) higher-level languages and using abstracted graphics frameworks like DirectX and OpenGL. Just no one noticed because we had 1000x the compute to make up for it :)
Maybe / kind of. Consoles like the PS1/N64 were not running hand-optimised assembly code; the 8-bit and 16-bit machines were. As for DirectX/OpenGL/Glide: they actually massively improved performance over running stuff on the CPU. You only ran stuff with software rendering if you had a really low-performance GPU. Just look at Quake running in software vs Glide - the frame rate easily doubles on a Pentium-based system.
> Consoles/games got hit hard first by crypto and now by AI needing GPUs. I suspect if it wasn't for that we'd have vastly cheaper and vastly faster gaming GPUs, but when you were making boatloads of cash off crypto miners and then AI I suspect the rate of progress fell dramatically, for gaming at least (most of the innovation I suspect went more into high VRAM/memory controllers and datacentre-scale interconnects).
The PC graphics card market got hit hard by those. Console markets were largely unaffected.
There are many reasons why performance has stagnated. One of them, I would argue, is the use of the Unreal 4/5 engine. Every game that runs either of these engines has significant performance issues. Just look at Star Wars Jedi: Survivor and the previous game, Star Wars Jedi: Fallen Order. Both run poorly even on a well-spec'd PC, and even run poorly on my PS5. It doesn't really matter, though, as Jedi: Survivor sold well and I think Fallen Order did too.
The PS5 is basically a fixed PS4 (I've owned both). Sony put a lot of effort into reducing loading times on the PS5. Loading times on the PS4 were painful and far longer than on the PS3 (even for games loading from Blu-ray). This was something Sony focused on: every presentation about the PS5 talked about the new NVMe drive, the external drive, and the requirements for it.
The other reason is that the level of graphical fidelity achieved from the mid-2000s to the early 2010s is good enough. A lot of the reason some games age worse than others is the art style rather than the graphical fidelity. Many of the highest-earning games don't have state-of-the-art graphics, e.g. Fortnite prints cash and its graphics are pretty bad IMO. Performance and graphics just aren't the focus anymore. They don't really sell games like they used to. |
|
| ▲ | Dylan16807 2 days ago | parent | prev | next [-] |
| You divided 230 by .03 wrong - that comes out to roughly 7,700x, not 1,000x - but you underestimated the PS1 by a lot anyway. The CPU does 30 MIPS, but the geometry engine does another 60 MIPS, and the GPU fills 30 or 60 million pixels per second with multiple calculations each. |
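For reference, a quick sanity check of the ratios being discussed, using the rough GFLOPS figures quoted upthread (a sketch in Python; the inputs are the approximate numbers from the comments, not official specs):

    # Generation-over-generation ratios, using the approximate figures quoted upthread.
    ps1, ps3 = 0.03, 230.0            # GFLOPS (the PS1 figure is a rough estimate)
    ps4, ps5_pro = 2_000.0, 33_500.0  # GFLOPS (~2 TFLOPS and ~33.5 TFLOPS)

    print(f"PS1 -> PS3: {ps3 / ps1:,.0f}x")         # ~7,667x, not ~1,000x
    print(f"PS4 -> PS5 Pro: {ps5_pro / ps4:.2f}x")  # 16.75x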
| |
| ▲ | deaddodo 2 days ago | parent [-] | | Not to mention that few developers were doing hand-optimized assembly by the time of the PSX. They were certainly hand-optimizing models and the 3D pipeline (with some assembler tuning), but C and SDKs were well in use by that point. Even Naughty Dog went with their own Lisp-based engine, rather than ASM, for optimization. | | |
| ▲ | dmbaggett 18 hours ago | parent | next [-] | | I don't know about other developers at the time, but we had quite a lot of hand-written assembly code in the Crash games. The background and foreground renderers were all written in assembly by hand, as was the octree-based collision detection system. (Source: me; I wrote them.) And this thread comes full circle: Mark Cerny actually significantly improved the performance of my original version of the Crash collision detection R3000 code. His work on this code finally made it fast enough, so it's a really good thing he was around to help out. Getting the collision detection code correct and fast enough took over 9 months - it was very difficult on the PS1 hardware, and ended up requiring use of the weird 2K static RAM scratchpad Sony included in place of the (removed) floating-point unit. GOOL was mainly used for creature control logic and other stuff that didn't have to be optimized so heavily to be feasible. Being able to use a Lisp dialect for a bunch of the code in the game saved us a ton of time. The modern analogue would be writing most of the code in Python but incorporating C extensions when necessary for performance. Andy made GOAL (the successor Lisp to GOOL) much more low-level, and it indeed allowed coding essentially at the assembly level (albeit with Lispy syntax). But GOOL wasn't like this. | |
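For readers unfamiliar with the pattern described above (a high-level language for gameplay logic, native code for the hot paths), here is a minimal, hypothetical sketch using Python and ctypes. It only illustrates the wiring of the modern analogue, and has nothing to do with how GOOL itself was implemented; the function name is made up for illustration:

    # Hypothetical sketch: "gameplay logic" stays in a high-level language, while the
    # numeric inner call is handed off to a native (C) routine via ctypes.
    import ctypes
    import ctypes.util

    # Load the system C math library as a stand-in for a hand-optimized native routine.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrtf.restype = ctypes.c_float
    libm.sqrtf.argtypes = [ctypes.c_float]

    def creature_distance(ax: float, ay: float, bx: float, by: float) -> float:
        # Control flow and game logic live in the high-level language...
        dx, dy = ax - bx, ay - by
        # ...while the math call drops into native code.
        return libm.sqrtf(dx * dx + dy * dy)

    print(creature_distance(0.0, 0.0, 3.0, 4.0))  # 5.0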
| ▲ | deaddodo 3 hours ago | parent [-] | | I've never seen the Crash source code, so I was making my statements based on second-hand knowledge. So thanks for that clarification. I do think it's worth pointing out that Naughty Dog and Insomniac were two companies well known for making highly optimized software for the PSX, so that's probably not a standard most other companies matched. Additionally, I have written my own PSX software as well as reviewed plenty of contemporaneous PSX software. While many titles have some bit of assembler, it's usually specifically around the graphics pipeline; about 90+% of the code is C. This is in line with interviews from developers at the time as well. The point wasn't that ASM wasn't used at all (in fact, I specifically acknowledged it in my original post); it was that the PSX was in an era past the time when entire codebases were hand-massaged/tuned assembler (e.g. "the 16-bit era" and before). |
| |
| ▲ | p_l 2 days ago | parent | prev [-] | | Naughty Dog's GOAL was PS2-specific and essentially chock-full of what would be called intrinsics these days, which let you interleave individual assembly instructions, particularly for the crazy coprocessor setup of the Emotion Engine. My understanding is that the mental model of programming in the PS2 era was originally still very assembly-like outside of a few places (like Naughty Dog), and that GTA3 on PS2 made possibly its biggest impact by showing that's not necessary. | | |
| ▲ | deaddodo 2 days ago | parent [-] | | If by "mental model" you mean "low-level" programming, sure. But you might as well conflate "religion" with "Southern Baptist Protestantism" then. You're working with the same building blocks, but the programming style is drastically different. The vast majority of PSX games were done completely in C, period. Some had small bits of asm here and there, but so do occasional modern C/C++ apps. To your last point, before there was GOAL there was GOOL (from the horse's mouth itself): https://all-things-andy-gavin.com/tag/lisp-programming/ And it was used across all of Naughty Dog's PSX library. | | |
| ▲ | p_l a day ago | parent [-] | | The quote I recall reading long ago summarized the semi-official guidance as "write C like you write ASM". Because outside of ports from PC, a large number of console game developers at the time had most of their experience programming earlier consoles, which involved a lot more assembly-level coding. GTA3 proved that a "PC-style" engine was good enough despite the Emotion Engine design. It didn't help that the PS2 was very much oriented towards assembly coding at a pretty low level, because getting the most out of the hardware involved writing code for the multiple coprocessors to work somewhat in sync - which, at least for GOAL, was done by implementing special support for writing the assembly code in line with the rest of the code (because IIRC not all the assembly involved was executed from the same instruction stream). As for GOOL, it was the more classic approach (used by ND on the PS3 and newer consoles too) of a core engine in C and a "scripting" language on top to drive gameplay. | | |
| ▲ | deaddodo a day ago | parent [-] | | > The quote I recall reading long ago summarized the semi-official guidance as "write C like you write ASM". You could read that in pretty much any book about C until the mid-00s. C was called a "portable assembler" for the longest time because it went against the grain of ALGOL, Fortran, Pascal, etc. by encouraging the use of pointers and staying close to the machine. That's also why its main remaining stronghold is embedded development these days. I've written C on the PSX, using contemporaneous SDKs and tooling, and I've reviewed source code from games of the time. There's nothing assembler-like about it, at least not more so than any systems development done then or today. If you don't believe me, there are plenty of retail PSX games that accidentally released their own source code that you can review yourself: https://www.retroreversing.com/source-code/retail-console-so... You're just arguing for the sake of arguing at this point and, I feel, being intellectually dishonest. Believe what you'd like to believe, or massage the facts how you like; I'm not interested in chasing goal (heh) posts. |
|
|
|
|
|
|
| ▲ | imtringued a day ago | parent | prev [-] |
| > I suspect if it wasn't for that we'd have vastly cheaper and vastly faster gaming GPUs
This feels very out of touch, since AMD's latest GPU series is specialized for gaming only, to the point where they sell variants with 8GB of VRAM, which is becoming a bit tight if you want to play modern games. |
| |
| ▲ | martinald a day ago | parent [-] | | Yes, but AMD also has an enterprise line of AI cards to protect. And regardless, if Nvidia weren't also making bank selling AI GPUs, we'd have seen them add more gaming performance, which would have forced AMD to as well, etc. |
|