How to Implement an FPS Counter (vplesko.com)
41 points by vplesko 3 days ago | 4 comments
ivanjermakov 14 minutes ago | parent | next [-]

It all boils down to what an FPS counter is supposed to show. In my games I track three delta-time indicators: the 100%, low 1%, and low 0.1% averages over a 10-second rolling window. That helps spot dropped frames and stutters.
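A sketch of what such indicators could look like (the class and method names here are mine, not the commenter's): the plain average of all frame times in a rolling 10-second window, plus the averages of the worst 1% and worst 0.1% of samples, which expose stutters that a single average hides.

```python
from collections import deque

class FrameStats:
    """Rolling window of frame delta times (in seconds).

    Reports the plain average plus the average of the worst 1% and
    worst 0.1% of frames ("low 1%" / "low 0.1%").
    """
    def __init__(self, window_seconds=10.0):
        self.window = window_seconds
        self.samples = deque()  # (timestamp, delta) pairs

    def add(self, timestamp, delta):
        self.samples.append((timestamp, delta))
        # Evict samples older than the rolling window.
        while self.samples and timestamp - self.samples[0][0] > self.window:
            self.samples.popleft()

    def _worst_avg(self, fraction):
        deltas = sorted((d for _, d in self.samples), reverse=True)
        n = max(1, int(len(deltas) * fraction))
        worst = deltas[:n]
        return sum(worst) / len(worst)

    def report(self):
        deltas = [d for _, d in self.samples]
        return {
            "avg": sum(deltas) / len(deltas),
            "low1": self._worst_avg(0.01),
            "low01": self._worst_avg(0.001),
        }
```

The "low" metrics are averages of the slowest samples, so a handful of 50 ms hitches in an otherwise smooth 60 Hz stream shows up clearly in `low1`/`low01` while barely moving `avg`.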

g7r 20 minutes ago | parent | prev | next [-]

Technically, the methods with a queue drop up to an entire frame at the beginning of the window. Depending on how the averageProcessingTime() function is implemented, this can mean either faster recovery after a single heavy frame (if it divides by the sum of the durations of the frames in the window) or slightly lower than actual values overall (if it just divides by the duration of the window).

But that's just the nerd in me talking. The article is great!

flohofwoe 20 minutes ago | parent | prev | next [-]

...and don't just smooth your measured frame duration for displaying the FPS; also use it as the actual frame time for your animations and game-logic timing, to prevent micro-stutter.

The measured frame duration will jitter by up to 1 or even 2 milliseconds for various 'external' reasons, even when your per-frame work fits comfortably into the vsync interval.

What you are measuring is basically the time distance between when the operating system decides to schedule your per-frame workload. But OS schedulers (usually) don't know about vsync, and they don't care about being one or two milliseconds late.

E.g. if the last frame was a 'long' frame, but the current frame will be 'short' because of scheduling jitter, you'll overshoot and introduce visible micro-stuttering.

The measurement jitter can have other causes too; e.g. in web browsers, all time sources have had reduced precision since Spectre/Meltdown. Thankfully the 'precision jitter' goes both ways, and averaging/filtering over enough frames gives you back the exact refresh interval (e.g. 8.333 or 16.667 milliseconds).
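A quick simulation of that effect (the clock-coarsening model below is purely illustrative, not how any particular browser implements it): individual measured deltas jump around by a millisecond or more, but because the error is symmetric, the mean converges back to the true refresh interval.

```python
import random

REFRESH = 1 / 60  # true vsync interval, ~16.667 ms

def simulate_measured_deltas(n_frames, precision=0.001, jitter=0.0005, seed=1):
    """Frame timestamps as a coarsened clock might report them:
    true vsync times, plus scheduling jitter, rounded to `precision`
    seconds. Returns the frame-to-frame deltas."""
    rng = random.Random(seed)
    deltas, prev = [], 0.0
    for k in range(1, n_frames + 1):
        t = k * REFRESH + rng.uniform(-jitter, jitter)
        q = round(t / precision) * precision  # quantized timestamp
        deltas.append(q - prev)
        prev = q
    return deltas

deltas = simulate_measured_deltas(600)
average = sum(deltas) / len(deltas)
# Each delta is only millisecond-accurate, yet `average` lands on
# the true refresh interval to within microseconds.
```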

On some 3D APIs you can also query the 'presentation timestamp', but so far I only found the timestamp provided by CAMetalLayer on macOS and iOS to be completely jitter-free.

For this smoothing/filtering purpose, I found an EMA (Exponential Moving Average) more useful than a simple sliding-window average (which I used before in sokol_app.h). A properly tuned EMA filter reacts more quickly and less harshly to frame-duration changes (like moving the render window to a display with a different refresh rate), and it's also easier to implement since it doesn't require a ring buffer of previous frame durations.
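A minimal EMA frame timer in this spirit (the `alpha` value and class name are my choices; sokol_app.h's actual implementation may differ):

```python
class EmaFrameTimer:
    """Exponential moving average of frame duration.

    `alpha` trades smoothing against responsiveness: with ~0.1, a
    step change in frame time (e.g. moving to a display with a
    different refresh rate) is mostly absorbed within a few dozen
    frames. State is a single float; no ring buffer needed.
    """
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.avg = None

    def update(self, frame_duration):
        if self.avg is None:
            self.avg = frame_duration  # seed with the first sample
        else:
            # Nudge the average toward the new sample by `alpha`.
            self.avg += self.alpha * (frame_duration - self.avg)
        return self.avg

    def fps(self):
        return 1.0 / self.avg if self.avg else 0.0
```

Feeding `update()` the raw measured delta each frame and using the returned smoothed value for both the FPS display and the animation timestep is the usage the comment above argues for.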

TL;DR: frame timing for games is a surprisingly complex topic.

Also see the most influential blog post on the topic (IIRC the post is quite a bit older than 2018 but has been re-hosted):

https://medium.com/@alen.ladavac/the-elusive-frame-timing-16...

mfgadv99 31 minutes ago | parent | prev [-]

[dead]