# How are average FPS calculated?

#### zigzag

##### New Member
There is no link on the average FPS page in reviews that describes how the average is calculated.

I'm asking because FPS is a non-linear scale, and calculating the arithmetic mean (e.g. avg = (fps1 + fps2 + fps3)/3) will skew results towards high FPS numbers.

It's better to convert from FPS to frame time (frame_time = 1/FPS) for the average calculation and then convert the result back to FPS (FPS = 1/frame_time).

Let's say that the results for three games are 30, 60 and 200 FPS.

Calculation done using FPS: (30+60+200)/3 = 96.7 FPS
Calculation done using frame time: 1/((1/30+1/60+1/200)/3) = 54.5 FPS
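To make the two calculations concrete, here is a small sketch (Python is just an illustration choice; `statistics` is in the standard library):

```python
from statistics import fmean, harmonic_mean

fps = [30, 60, 200]  # per-game benchmark results

# Arithmetic mean of the FPS values
avg_fps = fmean(fps)

# Average the frame times, then convert back to FPS
frame_times = [1 / f for f in fps]
avg_via_frame_times = 1 / fmean(frame_times)

print(round(avg_fps, 1))              # 96.7
print(round(avg_via_frame_times, 1))  # 54.5
```

Averaging frame times is mathematically the harmonic mean of the FPS values, so `harmonic_mean(fps)` returns the same 54.5.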

#### W1zzard

Staff member
> will skew results towards high FPS numbers.

That's correct. I'm doing it this way to keep things simple for the average reader.

The relative performance numbers are what you are looking for. The performance of each game is normalized so that the tested card is "100".

Arithmetic mean is the correct mean to use for this case.

For example, let's change 200 FPS to 100,000 FPS in your calculations:

(30+60+100000)/3 = 33363.3 FPS
1/((1/30+1/60+1/100000)/3) = 59.988 FPS

So which one is better now?

What you did with averaging frame times is actually called the harmonic mean. You are just taking the harmonic mean of those FPS numbers (30, 60, 200).

The harmonic mean mitigates the impact of large outliers and increases the effect of the smallest values. So you are actually skewing results towards low FPS numbers when you do that.

Basically, when you take the arithmetic mean of 'frames per second' values in games, you assume play time is constant in every game (thus we say 'per second'). That makes sense, since you don't want to assume people spend more time on some games than others. It's as if you play every game for an hour, add all the frames back to back, count the total frames, then divide that number by how many hours you played in total.
But when you take the arithmetic mean of 'seconds per frame' (= frame time), you assume the total number of frames is constant in every game. So now you are playing every game until you see a total of (let's say) 1000 frames. Naturally, you play lower-FPS games longer than higher-FPS games to reach that 1000-frame target. Then you add all the frame times back to back, sum the total time, and divide that total time by the total number of frames. Because you spent more time on low-FPS games, your average frame time would be biased towards those games.

The harmonic mean is used in places like calculating average speed, because you spend less time travelling the same distance if you go faster. Therefore the impact of high speeds is actually smaller than that of low speeds. E.g., if you drive from A to B at 100 km/h and from B to A at 50 km/h, your average speed wouldn't be (100 + 50)/2 = 75 km/h. It would be 1/((1/50+1/100)/2) = 66.67 km/h. It is closer to 50 km/h because you spent more time driving at 50 km/h.
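The driving example can be verified with the standard library (a quick sketch, not part of the original post):

```python
from statistics import fmean, harmonic_mean

speeds = [100, 50]  # km/h for two legs of equal distance

print(fmean(speeds))                    # 75.0 (naive average, wrong here)
print(round(harmonic_mean(speeds), 2))  # 66.67 (true average speed)
```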

#### zigzag

##### New Member
> The relative performance numbers are what you are looking for. Performance of each game is normalized, so that the tested card is "100"

This affects relative performance numbers as well.
GPU1 (baseline): 30, 60 and 200 FPS
GPU2: 40, 70, 210 FPS

Using FPS:
GPU1 = (30+60+200)/3 = 96.7 FPS
GPU2 = (40+70+210)/3 = 106.7 FPS
GPU2 Relative (normalize avg.) = 106.7/96.7 = 1.10 = 110%
GPU2 Relative (normalize each game before avg. calculation) = (40/30+70/60+210/200)/3 = 1.18 = 118%

Using frame times:
GPU1 = 1/((1/30+1/60+1/200)/3) = 54.5 FPS
GPU2 = 1/((1/40+1/70+1/210)/3) = 68.1 FPS
GPU2 Relative (normalize avg.) = 68.1/54.5 = 1.25 = 125%
GPU2 Relative (normalize each game before avg. calculation) = 1/(((1/40)/(1/30)+(1/70)/(1/60)+(1/210)/(1/200))/3) = 1.17 = 117%

EDIT: I missed the fact that you normalize the performance of each game. Fixed the calculations.
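The two relative-performance calculations above can be sketched in Python (a toy reproduction of the numbers in this post, not TPU's actual pipeline):

```python
from statistics import fmean

gpu1 = [30, 60, 200]  # baseline FPS per game
gpu2 = [40, 70, 210]

# Normalize each game against the baseline, then average the ratios
rel_per_game = fmean(b / a for a, b in zip(gpu1, gpu2))

# Average frame times per GPU, then take the ratio of the averages
def avg_frame_time(fps):
    return fmean(1 / f for f in fps)

rel_from_avg = avg_frame_time(gpu1) / avg_frame_time(gpu2)

print(round(rel_per_game, 2))  # 1.18
print(round(rel_from_avg, 2))  # 1.25
```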

> Arithmetic mean is the correct mean to use for this case.
>
> For example, let's change 200 FPS to 100,000 FPS in your calculations:
>
> (30+60+100000)/3 = 33363.3 FPS
> 1/((1/30+1/60+1/100000)/3) = 59.988 FPS
>
> So which one is better now?
>
> What you did with averaging frame times is actually called the harmonic mean. You are just taking the harmonic mean of those FPS numbers (30, 60, 200).
>
> The harmonic mean mitigates the impact of large outliers and increases the effect of the smallest values. So you are actually skewing results towards low FPS numbers when you do that.

The second (59.988 FPS) is much more meaningful to the player. For the gameplay experience it doesn't matter whether the third game runs at 100,000 or 200,000 FPS, but it does matter whether the first game runs at 30 or 60 FPS. The second number better approximates the player's experience.

Yes, the arithmetic mean of frame times equals the harmonic mean of FPS. The thing is that frame time is a better measure of performance than FPS (as explained in the OpenGL Wiki from the Khronos Group).

So I agree with you that the arithmetic mean should be used, just not the arithmetic mean of FPS, but of frame times.


#### W1zzard

Staff member
> relative performance numbers

Relative performance uses a different formula, as mentioned in my earlier post.

#### zigzag

##### New Member
> Relative performance uses a different formula as mentioned in my earlier post

Sorry, I missed that the performance of each game is normalized. I have edited the post. Is the new calculation correct?

Although, normalizing the average of frame times produces a more meaningful result, as it gives more weight to games with low FPS, which normalizing the performance of each game doesn't.

Simplified example with just two games:

| | Game 1 | Game 2 | Relative performance - method 1 (performance of each game is normalized) | Relative performance - method 2 (average is normalized) |
| --- | --- | --- | --- | --- |
| GPU1 | 30 FPS | 200 FPS | 100% | 100% |
| GPU2 | 30 FPS | 400 FPS | (30/30 + 400/200)/2 = 1.5 = 150% | 1/(((1/30+1/400)/2)/((1/30+1/200)/2)) = 1.07 = 107% |
| GPU3 | 60 FPS | 200 FPS | (60/30 + 200/200)/2 = 1.5 = 150% | 1/(((1/60+1/200)/2)/((1/30+1/200)/2)) = 1.77 = 177% |
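A short Python sketch reproducing both methods for the two-game example (the function names are mine, for illustration only):

```python
from statistics import fmean

baseline = [30, 200]  # GPU1 FPS in Game 1 and Game 2

def method1(fps):
    """Normalize each game against the baseline, then average the ratios."""
    return fmean(f / b for f, b in zip(fps, baseline))

def method2(fps):
    """Average the frame times first, then normalize the averages."""
    avg_ft = lambda xs: fmean(1 / x for x in xs)
    return avg_ft(baseline) / avg_ft(fps)

for name, fps in [("GPU2", [30, 400]), ("GPU3", [60, 200])]:
    print(name, round(method1(fps), 2), round(method2(fps), 2))
# GPU2 1.5 1.07
# GPU3 1.5 1.77
```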

Method 1 makes it look as if GPU2 and GPU3 will give the player an equal experience regarding performance.

#### W1zzard

Staff member
> as it gives more weight to games with low FPS

Why would you want to do that?

> will skew results towards high FPS numbers.

If you want to introduce weighted averages, there are other things to take into account too, like the popularity of the games, their age, graphics quality, penalties for sub-30 and sub-60 FPS, no ultrawide support, Denuvo, power consumption, noise, and bonus scoring for underdogs like AMD and Intel.

#### zigzag

##### New Member
I'm not talking about weighted averages. Each method of reducing a set of data to a single number has specific properties, even the method you are using. A universal statistical method that is best in every scenario doesn't exist. Your method just has a different bias than the one I am proposing. You have to figure out which single number best represents the set of data you are analyzing.

Why do you think the current method produces a value that better represents the performance of a GPU over multiple games? In statistics, you always need to interpret the result. An FPS number by itself is meaningless; you have to interpret it. E.g. 20 FPS → the game is unplayable; 50 FPS → the game is playable, but not smooth; 150 FPS → the game is playable and smooth; 200 FPS → the game is playable and smooth.

You can see that the interpretation of the real-world experience in the last two examples didn't change, even though the number did. I hope that answers your question of why you should choose a method with a different bias than your current method's. You can see a similar example for relative performance in my previous post. One bias produces numbers that are easier to map to a real-world experience than the numbers the other bias produces.

But it's perfectly fine if you are not interested in this. I was just trying to help.

#### bug

@zigzag Average is a mathematical term with a well-established formula. Why do you feel the need to mess with that?

If you want a statistical approach, then you compute the average and sigma, discard everything that falls outside 3*sigma, and rinse and repeat until there is nothing left to discard. That only works if you can prove a normal(ish) distribution of FPS, but I think most games would fit.

#### zigzag

##### New Member
> @zigzag Average is a mathematical term with a well-established formula. Why do you feel the need to mess with that?
>
> If you want a statistical approach, then you compute the average and sigma. Discard everything that falls outside 3*sigma, rinse and repeat until there is nothing left to discard. Only works if you can prove a normal(ish) distribution of fps, but I think most games would fit.
Where do you see me messing with the average? My proposal is just to calculate the average of frame times instead of frames per second, as frame time is considered a better measure of performance. You can find the explanation of why in the OpenGL Wiki link from the Khronos Group in my second post.

#### bug

> Where do you see I'm messing with average? My proposal is just to calculate average of frame times instead of frames per second as it is considered a better measure of performance. You can find explanation of why in the OpenGL Wiki link from the Khronos Group in my second post.
See the title of this thread. You may want to reword it for clarity.

The link you mention describes how frame time is a better metric for measuring application performance and optimizing code paths. That's hardly relevant for those who are looking to see if they can hit their monitor's (max) refresh rate.

#### zigzag

##### New Member
> See the title of this thread. You may want to reword it for clarity.
>
> The link you mention describes how frame time is a better metric for measuring application performance and optimizing code paths. Hardly relevant for those that are looking to see if they can hit their monitor's (max) refresh rate.
I don't understand what is wrong with it. The title asks how it is calculated. I asked because, when reading reviews, I noticed that the average FPS graphs give a different impression from the per-game FPS graphs. There is no explanation of how these numbers are calculated, so I asked.

I remembered reading about the problems that arise when using FPS in performance calculations. I suspected this might be the reason for the weird graphs, so I mentioned it.

It turned out my suspicion was correct. Due to the considerable bias of the currently used calculation method, the average FPS graphs in the reviews are practically lying to those who are looking to see if they can hit their monitor's (max) refresh rate. How is that not an issue?


#### W1zzard

Staff member
> are practically lying to those who are looking to see if they can hit their monitor's (max) refresh rate.
This describes OP's concern:

He's looking for "Can it do 4K120?", looks at the dotted line which says "146", and concludes "yes".

Then he looks at the bars, many of which appear lower than 120. His brain says "the average should be the red line".

Then he looks at Doom Eternal's 389567398 FPS and comes to the conclusion that TPU is lying to him.

Edit:

Avg: 145.6
Harmonic Mean (OP's proposal): 128
Geometric Mean: 135.6

None of these give the desired answer "lower than 120 FPS"
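The per-game data behind these numbers isn't listed here, but the relationship between the three means can be illustrated with the 30/60/200 FPS example from earlier in the thread; all three are in Python's statistics module:

```python
from statistics import fmean, geometric_mean, harmonic_mean

fps = [30, 60, 200]

print(round(fmean(fps), 1))           # 96.7 (arithmetic mean)
print(round(harmonic_mean(fps), 1))   # 54.5 (OP's proposal)
print(round(geometric_mean(fps), 1))  # 71.1
```

For positive values the ordering harmonic ≤ geometric ≤ arithmetic always holds, which is why the geometric mean lands between the other two here as well.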

But adding more averages just makes things very complicated for people who barely know how to read a chart


#### bug

@W1zzard Yes, but that's because users don't understand "average" (i.e. math). To me, deceptive would be slapping "avg fps" on a graph that shows something else.
I've also described the statistical method of discarding outliers when you want a more meaningful "average" (sorry, I don't recall if that number also has a name), but I've never seen it used in video game reviews. I really, really don't see a problem here.
Maybe just throw the standard deviation next to the mean; that will tell people which card outputs more stable FPS numbers.
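A minimal sketch of that suggestion, reusing the thread's 30/60/200 FPS example (`statistics.stdev` is the sample standard deviation):

```python
from statistics import fmean, stdev

fps = [30, 60, 200]
print(f"{fmean(fps):.1f} ± {stdev(fps):.1f} FPS")  # 96.7 ± 90.7 FPS
```

A spread that large immediately signals how much the mean alone hides.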

#### zigzag

##### New Member
> Avg: 145.6
> Harmonic Mean (OP's proposal): 128
> Geometric Mean: 135.6
>
> None of these give the desired answer "lower than 120 FPS"
>
> But they just make things very complicated for people who barely know how to read a chart

I have to point out that my proposal was not a harmonic mean calculation, but an average (of frame times). Yes, due to the relationship between these two units (FPS and frame time), the average of frame times gives the same result as the harmonic mean of FPS. Why am I even bringing this up if they produce exactly the same result? Because you can also calculate the harmonic mean of frame times and the geometric mean of frame times. These two might even give the desired answer of "lower than 120 FPS" in your example. EDIT: Actually, they won't give you a lower value. Due to the relationship between the units, all the methods are interrelated. The harmonic mean of frame times is equal to the average of FPS, and the geometric mean of frame times is equal to the geometric mean of FPS.

It's not complicated for someone who barely knows how to read a chart, because he only looks at pretty pictures and, as you said, his brain says "the average should be the red line". So a lower line will unconsciously feel more natural, even though he doesn't understand how it is calculated and probably doesn't even care.

But for those who are interested in the details, you could include a link in reviews to a page describing the method. If you include pretty pictures (like the ones you posted above) that show why the more advanced calculation is used, they will understand, even if they might not grasp the math part.

Frame time is essentially the latency of a frame's calculation. If we take the numbers from adilazimdegilx's example, we have three latencies: 33.33 ms, 16.67 ms and 0.01 ms. One way of calculating says that the average latency is 0.03 ms, and the other that it's 16.67 ms. Which one feels more correct and useful?

An FPS number is easier for the reader to interpret than milliseconds, so results should be presented in FPS. However, that doesn't mean FPS is also the best unit for doing calculations. Calculations should be done in frame times and the results converted back to FPS.

EDIT: Explaining that you are calculating the average of frame times and converting the result back to FPS is not really THAT much more complicated. I don't understand why you feel it needs to be explained as "harmonic mean of FPS". The average of frames per second is the "harmonic mean of frame times", so either method can be made to sound daunting.

EDIT 2: Another thing: tools for measuring average FPS in one game are already calculating the average of frame times (or the harmonic mean of FPS, as you like to call it):

> Ingame benchmark tools, OCAT, FRAPS, PresentMon, CapFrameX - all these tools ultimately give the user FPS values, but these are not the values that are being measured.
> What is measured are the time intervals between the individual images: the frametimes.
> The performance metrics get calculated using the raw frametime data converted to FPS.

So you are already using the average-of-frame-times (harmonic mean of FPS) method in your process.
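As a sketch of that pipeline with hypothetical captured frame times: total frames divided by total time is exactly the arithmetic mean of the frame times, converted back to FPS.

```python
from statistics import fmean

# Hypothetical raw frame times in seconds, as a capture tool would record them
frame_times = [0.016, 0.017, 0.033, 0.016, 0.018]

avg_fps = len(frame_times) / sum(frame_times)  # total frames / total time
print(round(avg_fps, 1))  # 50.0

# Same number as averaging the frame times and inverting
assert abs(avg_fps - 1 / fmean(frame_times)) < 1e-9
```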

Why do you think it's better to calculate the average performance of multiple games differently from how benchmarking tools calculate the average performance of a single game?

EDIT 3: Since the measurements are done in frame times, frame time is the base unit and FPS is a derived unit. So the average of FPS is an average of derived values, which in turn is equal to the harmonic mean of the base measurements.


#### zigzag

##### New Member
I've been reading a bit more on the topic. What is currently shown in reviews is not an "average FPS", but actually an "average of FPSes". FPS is a rate; the clue is right there in the name, "... per second" (or frame rate). When you want to calculate an "average rate" from rates (like FPS), you have to use the harmonic mean to get the correct result. Using the arithmetic mean on rates gives you an "average of rates", thus producing an incorrect result when you want an "average rate".

TL;DR:
• Arithmetic Mean of FPS = "average of FPSes" (i.e. average of rates) - currently incorrectly labeled in reviews as "Average FPS".
• Harmonic Mean of FPS = "average FPS" (i.e. average rate).
All this complexity can be avoided if the base measurement unit (frame time) is used for all calculations and the results are converted to FPS at the end.


#### mrpaco

Talking about this, it would be pretty nice if, in the future, you could also start adding graphs of the frame times / frame pacing in game reviews. They are the ones that tell the whole picture, especially about stuttering in games and how the player is going to feel it while playing.

A 60 fps game might look like just the bare minimum, and you might see 90 fps on the other side and think it has better optimization. But if the 90 fps game has shit frame pacing, and the 0.1% lows are not just in specific parts of the game or one scene but across the whole benchmark, you really know that game is shit and will feel like shit, versus a 60 fps game with a 0.1% low of 45 fps that only lasts 2 seconds of a scene in a 30-second benchmark or so.

#### kapone32

> talking about this, it would be pretty nice if in the future, you could start adding also graphics of the frametimes / framepacing in game reviews. They are the ones that tells the whole picture of specially stuttering in games and how the player is gonna feel it playing
All of this was available on screen using the AMD overlay. With the Anti-Lag+ issue it has since been removed, but I look forward to its return. In terms of smoothness, as long as you are in the FreeSync range I doubt you would see stutters. As an example, there is a game called Viking: Battle for Asgard that was ported to PC without unlocking the 30 FPS cap. Even with all the horsepower in my PC, that game is not in any way enjoyable. Compare that to Mount and Blade from 2013-14: if you turn everything up and play it in 4K, it looks better than any image you see in screenshots, but because the FPS is in the FreeSync range and above, it is a smooth experience.