try_the_bass 5 days ago

Out of curiosity, why are such high fps numbers desirable? Maybe I don't understand how displays work, but how does having fps > refresh rate work? Aren't many of those frames just wasted?

hamdingers 5 days ago | parent | next [-]

If you have a 60Hz display and the game is locked to 60fps, when you take an action it may take up to 16.67 milliseconds for that action to register. If the game is running at 500fps, it registers within 2 milliseconds, even though you won't see the action for up to 16.67 milliseconds later. At extremely high levels of competition, this matters.

Also, there are 540Hz displays.
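A back-of-the-envelope sketch of that arithmetic (a simplified model that assumes input is sampled once per rendered frame; the numbers are illustrative):

```python
# Worst-case age of an input sample by the time the display shows it.
# Simplified model: input waits up to one frame interval to be sampled,
# then up to one refresh interval before scanout.

def worst_case_input_age_ms(fps: float, refresh_hz: float) -> float:
    frame_ms = 1000.0 / fps        # how long input can wait to be sampled
    refresh_ms = 1000.0 / refresh_hz  # how long a finished frame can wait on screen
    return frame_ms + refresh_ms

print(worst_case_input_age_ms(60, 60))   # locked to the display: ~33.3 ms
print(worst_case_input_age_ms(500, 60))  # input at most 2 ms stale: ~18.7 ms
```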

badsectoracula 4 days ago | parent | next [-]

> even though you won't see the action for up to 16.67 milliseconds later

Note that this is only the case if you have vsync enabled. Without vsync you will see the action (or some reaction, anyway) ~2ms later instead of ~16.67ms later, just not as a full frame. If the screen changes are big this will manifest as screen tearing, though whether that bothers you is a matter of personal preference.

Personally i always disable vsync, even on my high refresh rate monitor, as i like having the fastest feedback possible (i do not even run a desktop compositor because of that) and i do not mind screen tearing (though tearing is much less visible on a high refresh monitor than on a 60Hz one).

try_the_bass 4 days ago | parent | prev | next [-]

> If the game is running at 500fps, it registers within 2 milliseconds, even though you won't see the action for up to 16.67 milliseconds later.

Okay I think I follow this, but I think I'd frame it a little differently. I guess it makes more sense to me if I think about your statement as "the frame I'm seeing is only 2ms old, instead of 16.67ms old". I'm still not seeing the action for 16.67ms since the last frame I saw, but I'm seeing a frame that was produced _much_ more recently than 16.67ms ago.

Thanks for the explanation, it helps!

mattmanser 4 days ago | parent [-]

This is a lot like high-fidelity audio equipment, or extreme coffee preparation: a waste of time for most people.

I used to play CS:Go at a pretty high level (MGE - LE depending on free time), putting me in the top 10%. Same with Overwatch.

Most of the time you're not dying in a clutch situation where you both pulled the trigger at the same instant. What usually happens is: you missed, they didn't.

I never bothered with any of that stuff; it doesn't make a meaningful difference unless you're in the top 1%.

But there's a huge number of people who play these games who THINK it does. The reason they're losing isn't because of 2ms command registrations, it's because they made a mistake and want to blame something else.

vintermann 4 days ago | parent [-]

I'm sure that's true, but low latency can just plain feel good. I don't play FPSes at all, and I can totally understand how low latency helps the feeling of being in control. Chasing high refresh rates and low latency seems a lot more reasonable to me than chasing high resolution.

gf000 4 days ago | parent | prev | next [-]

A game doesn't necessarily have to process input at the same rate as it displays frames, does it?

hamdingers 4 days ago | parent | next [-]

That's correct, and the most competitive multiplayer games tend to have fixed tick rates on the server, but the higher FPS is still beneficial (again, theoretically for all but the highest level of competition) because your client side inputs are sampled more frequently and your rendered frames are at most a couple ms old.

adastra22 4 days ago | parent [-]

I think you're missing the point. The game could be processing input and doing a state update at 1000Hz, while still rendering a mere 60fps. There doesn't have to be any correlation whatsoever between frame rate and input processing. Furthermore, this would actually have less latency because there won't be a pipeline of frame buffers being worked on.

Tying the input loop to the render loop is a totally arbitrary decision that the game industry is needlessly perpetuating.
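A minimal sketch of that decoupling, the classic fixed-timestep pattern (the rates and function names are illustrative, not from any particular engine):

```python
import time

SIM_HZ = 1000            # input/state update rate
SIM_DT = 1.0 / SIM_HZ    # one simulation tick, in seconds

def run(num_frames, sim_step, render, now=time.perf_counter):
    """Advance the simulation in fixed SIM_DT ticks while rendering
    as often as the display (or GPU) allows."""
    acc = 0.0
    last = now()
    for _ in range(num_frames):
        t = now()
        acc += t - last
        last = t
        # Catch the simulation up to wall-clock time in fixed steps,
        # independent of how long the frame took to draw.
        while acc >= SIM_DT:
            sim_step()
            acc -= SIM_DT
        render()
```

With a 16.7 ms frame time this runs roughly 16-17 simulation ticks per rendered frame; at 500 fps it runs 2 ticks per frame, but the tick rate itself never changes.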

hamdingers 4 days ago | parent | next [-]

No, I'm explaining how most games work in practice.

You're right that a game could be made to work that way. I'm not aware of one, and I don't have exhaustive knowledge, so it wouldn't surprise me if examples exist - but that was not the question.

adastra22 4 days ago | parent [-]

I would not be at all surprised if there are examples out there, although I don't know of them. Tying the game state to the render loop is a decision made very deep in the game engine, so you'd have to do extensive modifications to get any of the mainstream engines to do something else. Not worth the effort.

But a greenfield codebase shouldn't be perpetuating this mistake.

whstl 4 days ago | parent | next [-]

That's a super interesting discussion.

On most modern engines there is already a fixed-step update that runs at a fixed rate to make physics calculations deterministic, so this independence is possible.

However, while it is technically possible to run the state updates at a higher frequency, this isn't done in practice because the rendering part wouldn't be able to consume that extra precision anyway.

That's mainly because the game state kinda needs to remain locked while: 1) rendering a frame, to avoid visual artifacts (eg: the character and its weapon rendered at different places because the weapon started rendering after a state change) or even crashes (due to reading partially modified data); 2) fixed-step physics updates are being applied; and 3) any kind of work is happening in different threads (common in high-FPS games).

You could technically copy the game state functional-style whenever it needs to be used, but the benefits would be minimal: input/state changes are extremely fast compared to anything else. Doing this "too early" can even cause input lag. So the simple solution is just to apply state changes at the beginning of the loop, at the last possible moment before that data is processed.

Source: worked professionally with games in a past life and been in a lot of those discussions!
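A toy sketch of the copy-the-state idea mentioned above (the names and structure are hypothetical, not from any real engine): the renderer takes a consistent snapshot while the simulation keeps mutating the live state.

```python
import copy
import threading

class GameState:
    def __init__(self):
        self.player_pos = (0.0, 0.0)
        self.weapon_pos = (0.0, 0.0)  # should always track the player

class StateBuffer:
    """Simulation mutates the live state under a lock; the renderer
    grabs a deep-copied snapshot so it never sees a half-applied update."""
    def __init__(self, state):
        self._live = state
        self._lock = threading.Lock()

    def update(self, mutate):
        with self._lock:
            mutate(self._live)

    def snapshot(self):
        with self._lock:
            return copy.deepcopy(self._live)

buf = StateBuffer(GameState())
buf.update(lambda s: setattr(s, "player_pos", (1.0, 2.0)))
frame_state = buf.snapshot()  # the renderer works from this frozen copy
buf.update(lambda s: setattr(s, "player_pos", (9.0, 9.0)))  # sim keeps going
```

The copy cost is exactly the trade-off the comment above describes: paying it every tick buys you little unless the renderer can actually consume the extra precision.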

vintermann 4 days ago | parent | prev [-]

I can give an example. I'd heard that Super Meat Boy was hard, and it was, but it turned out that if you ran it at the 60Hz it was designed for instead of 75Hz, it was considerably easier. At 120Hz it was unplayable.

You kind of understand how the game loop is tied to the refresh rate in games like this, though. Practicing "pixel perfect" jumps must be challenging if the engine updates aren't necessarily in sync with what goes on on screen. And in the really old days (when platformers were invented!) there was no real alternative to having the engine in sync with the screen.

adastra22 4 days ago | parent [-]

In the model I am describing there would be whole game state updates on every tick cycle, completely decoupling the frame rate from the response latency and prediction steps.

fizzynut 4 days ago | parent | prev [-]

Doing that will increase input latency, not decrease it.

There are many tick rates that happen at the same time in a game, but generally grabbing the latest input at the last possible moment before updating the camera position/rotation is the best way to reduce latency.

It doesn't matter if you're processing input at 1000hz if the rendered output is going to have 16ms of latency embedded in it. If you can render the game in 1ms then the image generated has 1ms of latency embedded in to it.

In a magical ideal world where you know how long a frame will take to render, you could schedule it to execute at a specific time to minimise input latency, but that introduces other problems: the prediction is very vulnerable to jitter, and software scheduling is itself jittery.

Keyframe 4 days ago | parent | prev | next [-]

A game has to process the input, but it also has to update the "world" (which might involve separate processing like physics) and then render it, both visually and as audio. With network and server updates in between, things get even more complex. Input-to-screen lag and latency is a hardcore topic; I've been diving into it on and off for the past few years. One thing that would be really sweet of the hardware/OS/driver guys would be a way to know when a frame was actually displayed. There's no such thing available yet, to my knowledge.

nemomarx 4 days ago | parent | prev | next [-]

It doesn't, and well-programmed games won't be tied to fps that way. I'm not sure anything past 300 fps plausibly matters for Overwatch, even with the best monitor available.

4 days ago | parent [-]
[deleted]
4 days ago | parent | prev [-]
[deleted]
iwontberude 4 days ago | parent | prev [-]

Yeah it felt like I got caught in an early 2000s timewarp for a second. It was nice.

Jhsto 4 days ago | parent | prev | next [-]

You want your minimum FPS to be your refresh rate. You won't notice when you're over it, but you likely will if you go below it.

In Counter-Strike, smoke grenades used to (and still do, to an extent) dip your FPS into a slideshow. You want to ensure your opponent can't exploit these things.

rkoten 5 days ago | parent | prev | next [-]

Not OP but I got quite a bit of experience with this playing competitive FPS for a decade. You're right that refresh rate sets the physical truth of it, e.g. 180 FPS on a 160 Hz monitor won't give you much advantage over 160 FPS, if any. However, reaching full multiples of your refresh rate in FPS - 320 in this instance, 480, and so on - will, and not only in theory: you'll feel it subjectively too. I get ~500-600 FPS in Counter-Strike and I have my FPS capped to 480 to get the most out of my current hardware (160 Hz). Getting a 240 Hz monitor would make it smoother. Upgrading the PC to get more multiples would too.

sznio 4 days ago | parent | prev | next [-]

If you're not using V-sync, if a new frame is rendered while the previous one wasn't fully displayed yet, it gets swapped to the fresher one half-way through. This causes ugly screen tearing, but makes the game more responsive. You won't see the whole screen update at once, but like 1/5th of it will react instantly.
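Back-of-the-envelope for that "1/5th": with vsync off and a steady frame rate, roughly refresh_rate/fps of the screen height belongs to each frame between tear lines (a rough model that ignores jitter):

```python
def band_fraction(fps: float, refresh_hz: float) -> float:
    """Approximate fraction of the screen each frame occupies between
    tear lines when rendering faster than the display refreshes."""
    return refresh_hz / fps

print(f"{band_fraction(300, 60):.0%}")  # 300 fps on a 60 Hz panel: about a fifth
```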

I used to do that until I switched to Wayland which forces vsync. It felt so unresponsive that I bought a 165hz display as a solution to that.

omnimus 4 days ago | parent | prev | next [-]

To a certain extent, for online games it can be an advantage (at least it feels like it to me). AFAIK the server updates state between players at some (tick) rate; when you have FPS above the tick rate, the game interpolates between the states. The issue is that frames and networking might not be constantly synced, so you are juggling between fps, screen refresh rate, ping and tick rate. In other words, the more frames you have, the higher the chance you will "get lucky" with the latency of the game.

aleph_minus_one 4 days ago | parent | prev | next [-]

> Out of curiosity, why are such high fps numbers desirable? Maybe I don't understand how displays work, but how does having fps > refresh rate work? Aren't many of those frames just wasted?

The reason is triple buffering:

> https://en.wikipedia.org/w/index.php?title=Multiple_bufferin...

I just quote the central relevant sentences of this section:

"For frames that are completed much faster than interval between refreshes, it is possible to replace a back buffers' frames with newer iterations multiple times before copying. This means frames may be written to the back buffer that are never used at all before being overwritten by successive frames."

TACIXAT 5 days ago | parent | prev | next [-]

I run a 500hz monitor. Generally, you want your FPS to match your refresh rate.

try_the_bass 4 days ago | parent [-]

Huh, I didn't know those existed now. I think the last time I was shopping for a monitor, 144Hz was the new hotness.

Things have come a long way since then!

marcosdumay 4 days ago | parent | prev | next [-]

Tying the input and simulation rates to the screen refresh rate is an old "best practice" that is still used in some games. In fact, a long time ago it was even an actual good practice.

shric 5 days ago | parent | prev | next [-]

I think it was just to show that the performance is comparable to Windows, implying that it also will be fine for games/settings where fps is in the range that does matter.

cwillu 4 days ago | parent | prev [-]

osu (music beat-clicking game) has a built-in screen frequency a/b test, and despite running on a 60hz screen I can reliably pass that test up to 240hz. It's not just having 60 frames ready per second, it's what's in those frames.

try_the_bass 4 days ago | parent [-]

I don't understand how this works, I guess? If your screen is 60Hz, you're drawing four frames for every one that ends up getting displayed. You won't even see the other three, right? If you can't see the frames, what difference does what's in them make?

[E] Answered my own question elsewhere: the difference is the "freshness" of the frame. Higher frame rates mean the frame you do end up seeing was produced more recently than the last frame you actually saw

4 days ago | parent | next [-]
[deleted]
layer8 4 days ago | parent | prev [-]

Also, your input gets registered faster (happens earlier) in the game world.

try_the_bass 4 days ago | parent [-]

I don't think I understand this part.

Why does the rate at which frames are rendered (by the GPU?) relate to the speed at which input is registered?

[E] Ah, I think another comment [1] up in a different branch of this thread answered this for me

[1] https://news.ycombinator.com/item?id=45794453