bigyabai 6 hours ago

It's a great option to have. Once you reach the 2-7ms frame time territory, you're approaching the CPU bottleneck for many game engines even on the fastest hardware. For newer titles like GTA VI, framegen might be the only reliable path to 120+ FPS without pinning all of your cores.

Framegen is also a good fit for low-end hardware like the Steam Deck, which can hit 30 or 45 FPS in stuff like Elden Ring but is far from the 90Hz maximum of the OLED model's panel. For a handheld, trading a bit of 720p visual clarity for locked 90Hz gameplay is a solid trade if you can get it working.

Borealid 6 hours ago | parent [-]

Would you say a game is running at 90fps if, 45 times per second, two frames are produced, the second of which is a linear interpolation of the frame before and after it?

How about if the two frames are 100% identical?

Does either of these situations differ substantially from what is being discussed, wherein the render pipeline can only produce a new render 45 times per second?

Incipient 5 hours ago | parent | next [-]

My understanding is that frame generation uses motion vectors to (slightly?) adjust the scene to produce a "highly plausible" next frame to drop in before the following "real" frame.
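Roughly, the idea can be sketched like this (a toy one-scanline example in Python, not how DLSS or FSR actually work; real implementations also handle occlusion, shading changes, and so on): take the last rendered frame plus the engine's per-pixel motion vectors, and shift pixels partway along those vectors to synthesize the in-between frame.

```python
def warp_scanline(frame, motion, t=0.5):
    """Toy motion-vector warp of one scanline of pixels.

    frame:  list of brightness values
    motion: per-pixel horizontal velocity, in pixels per real frame
    t:      how far toward the next real frame to extrapolate (0..1)
    """
    out = [0.0] * len(frame)
    for x, v in enumerate(frame):
        tx = round(x + t * motion[x])       # where this pixel lands
        if 0 <= tx < len(frame):
            out[tx] = max(out[tx], v)       # brightest source wins on collisions
    return out

# A bright pixel moving right at 2 px per real frame: at t=0.5 it has
# moved 1 px, so the generated frame shows it halfway to its next position.
line = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
vel  = [0,   0,   2,   0,   0,   0]
print(warp_scanline(line, vel))  # [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]
```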

I've only seen videos, so from a somewhat unrealistic perspective, it seems like an acceptable compromise for low end hardware in particular.

Boosting 120hz to 240hz admittedly seems silly.

Borealid 4 hours ago | parent [-]

My comment isn't denigrating frame generation, which can be useful.

It's pointing out the absurdity of calling "45fps plus 1-for-1 frame generation" "90fps" in any sense. It's not, and you aren't hitting a 90Hz refresh-rate target with it any more than you were without it. In point of fact, it lowers real FPS, because frame generation consumes resources that would otherwise have been available to the render pipeline.

I wish reviewers in particular would stop saying e.g. "120fps with DLSS FG enabled" and instead call out the original render rate. It makes the discourse very confusing.

close04 4 hours ago | parent | prev [-]

> the second of which is a linear interpolation of the frame before and after it

If I understand what you describe, this is generating a frame "in the past", an average between 2 frames you already generated, so not very useful? If you already have frames #1 and #2, you want to guess frame #3, not generate frame #1.5.

The higher the "real frame" rate, the smaller the differences from one to the next. This makes it easier to predict those differences, and "hide" a bad prediction. On the other hand if you have 10FPS you have to "guess" 100ms worth of changes to the frame which is a lot to guess or hide if the algorithm gets it wrong.
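To make the interpolation-vs-extrapolation distinction concrete, here's a toy per-pixel sketch in Python (real frame generation is far more sophisticated): interpolation averages two frames you already have, which means holding frame #2 back and adding latency; extrapolation continues the #1 → #2 trend to guess frame #3, adding no latency but going wrong wherever motion isn't linear.

```python
def interpolate(f1, f2):
    # Frame "1.5": average of two frames you already rendered. Adds latency,
    # because frame 2 must be held back until frame 1.5 has been displayed.
    return [(a + b) / 2 for a, b in zip(f1, f2)]

def extrapolate(f1, f2):
    # Frame "3": continue the per-pixel trend of frame 1 -> frame 2 into the
    # future. No added latency, but wrong wherever motion isn't linear.
    return [b + (b - a) for a, b in zip(f1, f2)]

# First pixel is brightening linearly (0 -> 10); second pixel is static.
f1, f2 = [0.0, 8.0], [10.0, 8.0]
print(interpolate(f1, f2))  # [5.0, 8.0]
print(extrapolate(f1, f2))  # [20.0, 8.0]
```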

Borealid 4 hours ago | parent [-]

I chose the two scenarios I did to illustrate that "frames per second" is clearly not meant to be measured in terms of times the display refreshed, but rather in terms of times content was actually rendered by the game engine.

In my opinion it is quite difficult to provide a definition of "fps" that counts 45-fps-native-with-frame-doubling as 90 but doesn't also count either of the ludicrous examples I presented as 90.

close04 3 hours ago | parent [-]

I understand now, but I think any full frame that comes out of the GPU frame buffer is a frame, whether it was actually rendered or generated by some algorithm. Even in the silly "I duplicate each frame" example, you are outputting that number of FPS. If you stand still in a game and nothing changes in the frame, you're still counting all those practically identical frames.

A measure for "FPS effectiveness" sounds interesting. Like how much detail, changes, information can you discretely convey per second relative to what the game is continuously generating.

A Nyquist of sorts. Are you just duplicating samples? Are you sampling a high frequency signal (fast motion in the game) at high enough rate (lots of discrete FPS)?
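One toy way to formalize that "FPS effectiveness" idea (a hypothetical metric of my own, sketched in Python): scale the display rate by the fraction of frame-to-frame transitions that actually carry new information.

```python
def effective_fps(frames, display_hz):
    # Hypothetical "FPS effectiveness": the display rate scaled by the
    # fraction of frame-to-frame transitions that differ at all.
    # Duplicated frames convey no new information, so they don't count.
    if len(frames) < 2:
        return 0.0
    changed = sum(1 for a, b in zip(frames, frames[1:]) if a != b)
    return display_hz * changed / (len(frames) - 1)

# 45 unique renders, each displayed twice on a 90 Hz panel: the duplicates
# add nothing, so effectiveness stays near the native 45, not 90.
shown = [f for f in range(45) for _ in range(2)]
print(round(effective_fps(shown, 90), 1))  # 44.5
```

A smarter version would compare frame *content* (how much each frame differs) rather than a binary changed/unchanged test, but the shape of the metric is the same.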

Borealid 3 hours ago | parent [-]

I would say the correct missing metric is similarity to what would have been rendered had the GPU kept up.

"90fps at 95% fidelity" is a meaningful way to describe performance. AFAIK nobody measures this when discussing XeSS or DLSS or FSR.
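A back-of-the-envelope version of such a fidelity score might look like the following Python sketch (my own construction: it uses mean absolute pixel error against a ground-truth render, where a real benchmark would use a perceptual metric such as SSIM).

```python
def fidelity(generated, reference, scale=255.0):
    # Hypothetical fidelity score: mean per-pixel agreement between a
    # generated frame and the frame the engine would have rendered.
    # 1.0 means identical; lower means the guess diverged from the truth.
    err = sum(abs(g - r) for g, r in zip(generated, reference)) / len(reference)
    return 1.0 - err / scale

# A generated frame that's slightly off versus the ground-truth render:
gen = [100, 102, 98]
ref = [100, 100, 100]
print(f"{fidelity(gen, ref):.1%}")  # 99.5%
```

Reporting "90fps at 99.5% fidelity" would then capture both how many frames reach the panel and how honest those frames are.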