
Relative performance - what does it mean?

DUS

New Member
Joined
Sep 30, 2018
What's the formula that estimates the relative performance of GPU X to be 110% better than Y?
Thanks.
 
Actually I'd be interested in knowing this as well. Does the chart show relative performance in games @1080p for instance, an avg of all 3 resolutions (with multiple games) or what?
 
[attached screenshot: 1920x1080 relative performance chart]


@R0H1T as the title says "1920x1080"

For every single game test (at 1080p in this example), calculate the percentage of each card relative to the "100% card", i.e. scale all bars uniformly so that the "100%" bar has a length of 100.
Then calculate the geometric mean over all these tests.
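W1zzard's two steps can be sketched in a few lines of Python (the fps numbers and card names here are hypothetical, purely to illustrate the normalize-then-geomean idea):

```python
import math

# Hypothetical 1080p results: {game: {card: avg fps}}
results = {
    "Game A": {"Card X": 120.0, "Card Y": 60.0},
    "Game B": {"Card X": 45.0, "Card Y": 50.0},
}
BASELINE = "Card X"  # the "100% card" that all bars are scaled against

def relative_performance(card: str) -> float:
    # Step 1: express each per-game result as a ratio to the baseline card
    ratios = [fps[card] / fps[BASELINE] for fps in results.values()]
    # Step 2: geometric mean of the ratios, so a 200 fps title doesn't
    # outweigh a 40 fps title the way a plain fps average would
    geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    return geomean * 100

print(round(relative_performance("Card Y"), 1))  # 74.5
```

Note how averaging the raw fps instead (120+45 vs 60+50) would let the high-fps Game A dominate, which is exactly the weighting problem the geometric mean avoids.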
 
@W1zzard, so it's the relative change in FPS @1080p always for the same settings and same drivers?
Edit: Even across newer GPU generations?
 
It's in percent (%). Averaging fps without normalizing would give high-fps titles too much weight over the other titles.

Test system and driver info is included in all reviews
 
Relative means "in relation to one another". So if the 1080 Ti is 100% and the 1070 is 70%, the 1070 is 70% as fast as the 1080 Ti ... and the 1080 Ti is 1.43 times (100/70) as fast as the 1070.
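As a quick sanity check of that arithmetic (hypothetical scores):

```python
# Relative performance scores from a hypothetical chart
rp_1080ti, rp_1070 = 100, 70

print(rp_1070 / rp_1080ti)            # 0.7  -> the 1070 is 70% as fast
print(round(rp_1080ti / rp_1070, 2))  # 1.43 -> the 1080 Ti is 1.43x as fast
```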
 
My question concerns test parameters rather than how percentages are calculated.
Just so it's clear- Relative Performance does not reflect benchmarks that are driver, patch & settings agnostic, right?
That is to say, excluding GPUs that were benchmarked together, there is no data from which to tell, for say a 160% RP difference between a GTX 960 and a 970 (supposing the tests were done at a significant time interval), how much of it is down to optimized patches, updated drivers or even slightly different game settings. Is this correct?
In other words- do these test parameters change between differently timed benchmarks, specifically between different GPU generations, and if so, how are these fluctuations accounted for?
 
If you check the "Test Setup" page, the driver version used for all compared cards as well as the CPU clocks, Windows version and applied updates, etc. are all given. Here is an example: https://www.techpowerup.com/reviews/MSI/RX_580_Mech_2/6.html

And how are those differences translated into raw numbers, such as the ones shown in the RP charts? The RP chart is there, with the numbers, neatly showing precise percentages, not ~10% estimations, so how is it done in this regard?
 
Differences? The point of that setup page is to show the standard test configuration that all the compared cards were tested in. All of those cards you see in the chart were run using xxx.xx driver version and whichever mentioned update of Windows, etc. If you see a GTX 970 and a GTX 1070 in the same chart and the test setup page says they were using the same driver, well then it would seem to be a pretty direct comparison and therefore no estimation is required for performance results. The only gray area I can see in this regard is if any weighting is done in the overall percentage results due to particular games favoring AMD or NVIDIA architectures. If that doesn't answer your inquiry, then I'm not really sure what the question is.
 
Relative Performance does not reflect benchmarks that are drivers, patches & settings agnostic, right?
Relative performance is specific to the test setup obviously, because it's based on real data, not estimates.

The relative performance values do change between rebenches, due to drivers, patches, test scene change, test duration change, non-GPU hardware changes.

Not 100% sure what exactly you are asking
 
@Beertintedgoggles, the RP chart compares ALL GPUs from ALL benchmarks.


[attached screenshot: GPU Database relative performance chart]


If the test platform itself changes from test to test, then some numeric compensation needs to take place in order to list all of the GPUs the way they are currently listed in the RP chart. If the site compared apples to apples and the platform were static, it would simply be a matter of listing raw FPS results and the comparison would be clear. However, that is not the case, so the strength of the RP chart would be in the ability to quantify and compensate for the changes in testing platform.
If no such formula is in place (something along the lines of: Score = game FPS at release date x platform hardware multiplier x platform OS version multiplier x GPU driver multiplier x game patch multiplier x game settings multiplier) then it would seem that the Relative Performance chart is not only superfluous but downright misleading.
 
platforms are static
the platform is static for all results in a given review

ability to quantify and compensate for the changes in testing platform.
What you are looking for isn't possible, not even in theory. Game performance is a function of many factors, none of which can be removed entirely.

Or are you looking for a theoretical FLOPs rating? Which doesn't represent real-life performance at all
 
What you are looking for isn't possible, not even in theory. Game performance is a function of many factors, none of which can be removed entirely.

I was merely looking for an explanation of what is in place already, mind. It's still no clearer to me how that baffling chart is made in the first place, but I'm clear on the fact that it represents something other than what one might assume it does, and I would suggest, very very humbly, that a big fat asterisk should be added to that chart.
 
Oh.... the GPU Database. Sorry, I had assumed you were talking about the reviews (which I guess the database results are based on, but I can see where those might be based upon different driver versions and updates). Yeah, it's an attempt to compare multiple cards and generations, but maybe not the best / most accurate chart due to those limitations.
 
My question concerns test parameters rather than how percentages are calculated.
Just so it's clear- Relative Performance does not reflect benchmarks that are driver, patch & settings agnostic, right?
That is to say, excluding GPUs that were benchmarked together, there is no data from which to tell, for say a 160% RP difference between a GTX 960 and a 970 (supposing the tests were done at a significant time interval), how much of it is down to optimized patches, updated drivers or even slightly different game settings. Is this correct?
In other words- do these test parameters change between differently timed benchmarks, specifically between different GPU generations, and if so, how are these fluctuations accounted for?

Yes, the test parameters will change ... but if ya careful, it shouldn't matter much. Let's say that in any given test the Whoopdedoo 2000 is 20% faster than the Whoopdedoo 1800. But the review you are looking at is for the MSI Whoopdedoo 2000 Gaming X. So in the new review for the MSI card, you see:

MSI Whoopdedoo 2000 Gaming X gets a 100% score.
Reference Whoopdedoo 2000 gets a 95% score.
Reference Whoopdedoo 1800 gets a 79% score.

So from that data, we can easily note that "outta the box":

a) The reference Whoopdedoo 2000 averaged 20.3% faster (95/79) than the Whoopdedoo 1800
b) The MSI Whoopdedoo 2000 Gaming X averaged 5.3% faster (100/95) than the reference Whoopdedoo 2000

But ya don't know... how much faster is the MSI Whoopdedoo 2000 Gaming X compared to the MSI Whoopdedoo 1800 Gaming X when both are overclocked. To find that out....

The MSI Whoopdedoo 2000 Gaming X article will show, some pages later, that in the overclock test game the reference card got 60 fps, while the MSI card got 64 fps (+6.7%) outta the box and 68.4 fps (+14%) manually overclocked.

Now we go back to the MSI Whoopdedoo 1800 Gaming X article from 2 years ago and we see that (with a different overclock test game) the reference card got 70.0 fps, while the MSI card got 75.6 fps (+8.0%) outta the box and 88.9 fps (+27%) manually overclocked.

We can still relate the two cards:

Reference Whoopdedoo 2000 score of 95% x 1.14 = 108.3 for the MSI Gaming version manually overclocked
Reference Whoopdedoo 1800 score of 79% x 1.27 = 100.3 for the MSI Gaming version manually overclocked

Reference Whoopdedoo 2000 score of 95% x 1.067 = 101.4 for the MSI Gaming version outta the box
Reference Whoopdedoo 1800 score of 79% x 1.080 = 85.3 for the MSI Gaming version outta the box

Looking at the above numbers, an upgrade would be pretty hard to justify outta the box... and the extra OC headroom of the older generation compared to the current one makes an upgrade significantly less attractive.

I hope I explained the process simply enough. The logic is sound, but there are always variables; I wouldn't expect them to be significant tho.
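The chaining described above can be written out in a few lines (all scores and gain multipliers are the hypothetical Whoopdedoo figures from this post, not real review data):

```python
# Reference-card scores from the newer review's relative performance chart
ref_2000 = 95.0  # Reference Whoopdedoo 2000
ref_1800 = 79.0  # Reference Whoopdedoo 1800

# Gains of each MSI Gaming X over its own reference card, taken from the
# overclocking page of each card's review (different test games, so rough)
gain_2000_oob, gain_2000_oc = 1.067, 1.14  # +6.7% stock, +14% manual OC
gain_1800_oob, gain_1800_oc = 1.080, 1.27  # +8.0% stock, +27% manual OC

print(round(ref_2000 * gain_2000_oc, 1))   # 108.3  MSI 2000 Gaming X, OC
print(round(ref_1800 * gain_1800_oc, 1))   # 100.3  MSI 1800 Gaming X, OC
print(round(ref_2000 * gain_2000_oob, 1))  # 101.4  MSI 2000 Gaming X, stock
print(round(ref_1800 * gain_1800_oob, 1))  # 85.3   MSI 1800 Gaming X, stock
```

The design choice here is that each MSI card is scaled by its gain over its *own* reference card, which is what lets results from two reviews done years apart be chained onto one scale.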
 
Maybe I'm reading more into their inquiry than the OP meant, but I think the point they are making is: what if the Whoopdedoo 2000 gained a performance increase with a driver update? Now the percentage difference no longer aligns with the previous findings, and that makes the MSI Whoopdedoo 1800 Gaming X article from 2 years ago no longer a straight comparison. How would one go about weighing which relative performance numbers to use in the comparison now? That's why I came to the conclusion that the GPU Database relative performance graphs are nice to look at but a rather poor substitute for an actual in-depth review where all the cards are on an equal footing (drivers, updates, patches, etc.).
 
Now we go back to the MSI Whoopdedoo 1800 Gaming X article from 2 years ago and we see that (with a different overclock test game) the reference card got 70.0 fps, while the MSI card got 75.6 fps (+8.0%) outta the box and 88.9 fps (+27%) manually overclocked.
To state the obvious, let me just say that as a simple consumer, and not some hardcore gamer who follows daily GPU-related news, I'd gravitate towards the place that can easily show me the value of a purchase or upgrade and help me make the best informed choice without being sent to dig around in old articles. At least not until my preference is down to 2-3 options from the list, at which point further details may help inform the choice. Plus, it saves me the ridiculous mental guesswork of comparing apples to oranges to penguins and trying to put that in percentages.
That's the value of a good relative performance chart for a guy like me.

Even so, as it is, latest-gen benchmark results only tell me 'these cards are all within a 200% performance range of each other, and there's one additional penguin between each of them and your GPU from two generations ago, which is absent from this benchmark'.
At best it's indicative, at worst useless.

...I think the point they are making is: what if the Whoopdedoo 2000 gained a performance increase with a driver update? Now the percentage difference no longer aligns with the previous findings, and that makes the MSI Whoopdedoo 1800 Gaming X article from 2 years ago no longer a straight comparison.
That's it.
Excluding OC, there are practically zero unknowns when GPUs are benchmarked purely on performance values. Everything can be accounted for (no build quality, no warranty, no aesthetics, no ergonomics or convenience, not even human-factor parameters, just pure, cold, raw numbers), so no holy grail for any site that wants a claim of authority in its field.
 
You want buying a GPU made easy? Okay. Buy an MSI "Gaming" or Asus "Strix". Both run cool and quiet, both are known for their build quality, both have a "no questions asked" 3-year warranty, and both look good without RGB.
your GPU from two generations ago, which is absent from this benchmark'.
I don't think you have a clue as to how much work goes into a review.
To state the obvious, let me just say that as a simple consumer, and not some hardcore gamer who follows daily GPU-related news, I'd gravitate towards the place that can easily show me the value of a purchase or upgrade and help me make the best informed choice without being sent to dig around in old articles.
One of the things we have to learn in life is to take the advice given when asked. If the "hardcore gamer that follows daily GPU-related news" tells you to go read, it's because there are so many subjective aspects to GPUs that they want you to figure out what matters to you.
 
I don't get what people are bitching about. W1zz generally has multiple generations of GPUs in the article, and they are regularly rebenched with newer drivers so that the data remains consistent. No one else reviews the same number of GPUs or rebenches as many as TechPowerUp does. Considering rebenching takes place periodically with new drivers, the performance delta seldom changes more than 1-3% overall from what I have seen. Even then, since some games improve and others fall back performance-wise between drivers, it ends up being a wash.
 
To state the obvious, let me just say that as a simple consumer, and not some hardcore gamer who follows daily GPU-related news, I'd gravitate towards the place that can easily show me the value of a purchase or upgrade and help me make the best informed choice without being sent to dig around in old articles. At least not until my preference is down to 2-3 options from the list, at which point further details may help inform the choice. Plus, it saves me the ridiculous mental guesswork of comparing apples to oranges to penguins and trying to put that in percentages.
That's the value of a good relative performance chart for a guy like me.

Even so, as it is, latest-gen benchmark results only tell me 'these cards are all within a 200% performance range of each other, and there's one additional penguin between each of them and your GPU from two generations ago, which is absent from this benchmark'.
At best it's indicative, at worst useless.


That's it.
Excluding OC, there are practically zero unknowns when GPUs are benchmarked purely on performance values. Everything can be accounted for (no build quality, no warranty, no aesthetics, no ergonomics or convenience, not even human-factor parameters, just pure, cold, raw numbers), so no holy grail for any site that wants a claim of authority in its field.

Relative performance charts are for establishing a hierarchy of GPUs, not for being some sort of exact science. Like you've said: indicative. And not 'at best' but just that: indicative. Each entry in the list is an indicator of its performance relative to the rest.

Apart from theoretically possible changes in performance, the reality is most cards get released and re-benched, yet I seriously cannot recall ONE card that has actually switched places in a GPU hierarchy chart. And if they do, we're not talking about stock vs stock but stock versus OC, which is irrelevant to this discussion altogether. Very recently, a TPU member tested how much performance changed across a range of Nvidia drivers on Kepler. Conclusion: nearly within the margin of error for most individual games, and negligible when averaged across all of them.

You're making far too much of that RP chart; it was never intended to be a tool that allows you to avoid any sort of reading. If you want info, you'll need to get into it, it's that simple. If you want a ballpark idea, the RP chart is there for you and does a perfect job at that. Your example of a simple consumer that wants something is an example of a lazy consumer that wants everything handed to them on a silver platter. And if I'm honest, we need LESS of those people, not to accommodate and encourage more of them.

TL;DR: stop obsessing over it
 