52 points by vplesko 3 days ago | 6 comments
NL807 1 minute ago
FPS based on the last frame is good for seeing load spikes, particularly periodic ones, if you graph them.

FPS based on the median of a moving window is good if you want perceived frame rate, which rejects extreme outliers.

FPS based on the average of a moving window is good if you want statistical mean frame rate.
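A minimal sketch of the three readouts above, assuming per-frame durations in seconds are fed into a fixed-size moving window (the window length and function names are made up):

```python
from collections import deque
from statistics import median

WINDOW = 60  # moving window over the last 60 frame durations
durations = deque(maxlen=WINDOW)

def record_frame(dt):
    durations.append(dt)

def fps_last_frame():
    # Noisy, but individual load spikes show up when graphed.
    return 1.0 / durations[-1]

def fps_median():
    # Rejects extreme outliers -> closer to perceived frame rate.
    return 1.0 / median(durations)

def fps_mean():
    # Statistical mean frame rate over the window.
    return len(durations) / sum(durations)
```

Feeding 59 frames at 1/60 s and one 0.1 s spike shows the difference: the last-frame readout drops to 10, the median stays at 60, and the mean lands in between.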

nasretdinov 2 minutes ago
Ideally you'd want to measure _perceived_ performance of the game by players, which would probably depend on the _lowest_ "fps" value during the specific interval. I've seen some games change the colour of the fps counter based on whether or not there were significant FPS _dips_ below the one-second average. So e.g. you might be able to render 100 frames in a specific second, but if one frame took 0.1s and the others took the rest, then for users it'll feel like the game plays at 10fps at that point, even though the actual number of frames rendered is much higher.
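The dip-aware counter described above could look something like this: colour the readout by the worst frame in the interval rather than by the average alone (the threshold and names are assumptions, not from any particular game):

```python
def fps_readout(frame_durations):
    """frame_durations: durations (in seconds) of the frames rendered
    during the last one-second interval."""
    avg_fps = len(frame_durations) / sum(frame_durations)
    worst_fps = 1.0 / max(frame_durations)  # perceived floor
    # Flag a significant dip below the one-second average.
    colour = "red" if worst_fps < 0.5 * avg_fps else "green"
    return avg_fps, worst_fps, colour
```

With the example from the comment (100 frames in one second, one of them taking 0.1 s), the average reads 100 FPS while the perceived floor is 10 FPS, so the counter would flag it.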
ivanjermakov 1 hour ago
It all boils down to what the FPS counter is supposed to show. In my games I make three delta-time indicators: 100%, low 1%, and low 0.1%, averaged over a 10s rolling window. It helps with spotting dropped frames and stutters.
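One way to compute those low-percentile indicators from a rolling window of frame times (the window is kept as a plain list here; this is a sketch, not the commenter's actual code):

```python
def low_percent_fps(frame_times, percent):
    """Average FPS over the slowest `percent` of frames in the window."""
    worst = sorted(frame_times, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * percent / 100))
    slowest = worst[:n]
    return len(slowest) / sum(slowest)

# 100% is the plain average; low 1% / low 0.1% isolate the stutters:
#   avg_fps = low_percent_fps(window, 100)
#   low1    = low_percent_fps(window, 1)
#   low01   = low_percent_fps(window, 0.1)
```

A window of 999 frames at 1/60 s plus one 0.1 s frame averages about 60 FPS, but low 1% reads 40 and low 0.1% reads 10, which is exactly the stutter signal.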
g7r 1 hour ago
Technically, the methods with a queue drop up to an entire frame at the beginning of the window. Depending on how the averageProcessingTime() function is implemented, this can mean either faster recovery after a single heavy frame (if it divides by the sum of the durations of the frames in the window) or slightly lower than actual values overall (if it just divides by the duration of the window).
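The two averageProcessingTime() variants described above might be sketched like this (averageProcessingTime is the article's name; both bodies here are assumptions about its possible implementations):

```python
def avg_by_frame_sum(frame_durations):
    # Divide by the sum of the frames actually in the queue:
    # recovers faster after a single heavy frame.
    return sum(frame_durations) / len(frame_durations)

def avg_by_window_duration(frame_durations, window_seconds=1.0):
    # Divide by the fixed window duration: if up to one frame was
    # dropped at the start of the window, the average reads slightly
    # high, i.e. the FPS reads slightly low.
    return window_seconds / len(frame_durations)
```

With one frame dropped from a one-second window of 60 FPS frames (59 entries left), the first variant still reports 1/60 s per frame while the second reports 1/59 s.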

But that's just the nerd in me talking. The article is great!

flohofwoe 1 hour ago
...and don't just smooth your measured frame duration for displaying the FPS (or better: frame duration in milliseconds), but also use it as actual frame time for your animations and game logic timing to prevent micro-stutter.

The measured frame duration will have jitter of up to 1 or even 2 milliseconds for various 'external reasons' even when your per-frame work fits comfortably into the vsync interval every single frame. Using an extremely precise timer unfortunately doesn't help much, it will just very precisely measure the externally introduced jitter which your code has absolutely no control over :)

What you are measuring is basically the time distance between when the operating system decides to schedule your per-frame workload. But OS schedulers (usually) don't know about vsync, and they don't care about being one or two milliseconds late, and this may introduce micro-stutter when naively using a measured frame time directly for 'driving the game logic'.

For instance if the previous frame was a 'long' frame, but the current frame will be 'short' because of scheduling jitter, you'll overshoot and introduce visible micro-stuttering, because the rendered frames will still be displayed at the fixed vsync-interval (I'm conveniently ignoring vsync-off or variable-refresh-rate scenarios).

The measurement jitter may be caused by other reasons too, e.g. on web browsers all time sources have reduced precision since Spectre/Meltdown, but thankfully the resulting jitter goes both ways and averaging/filtering over enough frames gives you back the exact refresh interval (for instance 8.333 or 16.667 milliseconds even when the time source only has millisecond precision).

On some 3D APIs you can also query the 'presentation timestamp', but so far I only found the timestamp provided by CADisplayLink on macOS and iOS to be completely jitter-free.

I also found an EMA filter (Exponential Moving Average) more useful than a simple sliding window average (which I used before in sokol_app.h). A properly tuned EMA filter reacts quicker and 'less harshly' to frame duration changes (like moving the render window to a display with a different refresh rate), and it also has less implementation complexity because it doesn't require a ring buffer of previous frame durations.
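A minimal sketch of such an EMA filter over measured frame durations (the smoothing factor is an assumption; it needs tuning per use case):

```python
class FrameTimer:
    def __init__(self, alpha=0.1):
        self.alpha = alpha    # higher = reacts faster, smooths less
        self.smoothed = None  # smoothed frame duration in seconds

    def update(self, measured_dt):
        if self.smoothed is None:
            self.smoothed = measured_dt
        else:
            # No ring buffer needed: blend the new sample into the
            # running estimate.
            self.smoothed += self.alpha * (measured_dt - self.smoothed)
        return self.smoothed
```

The smoothed value, not the raw measurement, would then be used as the delta time for animations and game logic, which is what suppresses the micro-stutter described above.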

TL;DR: proper frame timing for games is a surprisingly complex topic because desktop operating systems are usually not "tuned" for game workloads.

Also see the "classic" blog post about frame timing jitter:

https://medium.com/@alen.ladavac/the-elusive-frame-timing-16...
