
Is CPU game benchmarking methodology (currently) flawed?

Make that 5 threads where this video popped up... Jesus people...

Anyway, my take... again... low-res, low-setting, fast-GPU testing BY ITSELF is NOT a good way to show CPU differences. It will show those differences, but they are clearly exaggerated due to the unrealistic testing environment. In other words... what knucklehead runs a 1070 or better below 1080p on low settings without AA?

It would be more interesting to see the differences at nominal/normal settings like 1080p/2560x1440/4K UHD, because that's where these systems actually run. You can't simply extrapolate the percentage difference as the resolution goes up; it will typically shrink as settings and resolution increase in most cases. So if a CPU holds back a title by 10% at 1280x1024, it will hold it back by less at higher resolutions.
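To put rough numbers on that point (purely illustrative figures I'm assuming here, not anything measured): frame rate is roughly capped by whichever of the CPU or GPU limit is lower, so a fixed CPU gap only shows fully when the GPU is taken out of the equation.

```python
# Illustrative sketch with assumed round numbers, not benchmark data.
# Effective FPS is roughly capped by the slower of the CPU and GPU limits.
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_a, cpu_b = 200, 180          # hypothetical CPU-limited FPS; CPU A is ~11% faster
gpu_limits = {                   # hypothetical GPU-limited FPS per resolution/setting
    "1280x1024 low":  400,
    "1920x1080 high": 150,
    "3840x2160 high":  60,
}

for res, gpu_fps in gpu_limits.items():
    a = effective_fps(cpu_a, gpu_fps)
    b = effective_fps(cpu_b, gpu_fps)
    print(f"{res}: CPU A {a} fps, CPU B {b} fps, gap {100 * (a - b) / b:.0f}%")
```

Under those assumptions the ~11% CPU gap is fully visible only in the low-res, low-setting run, and reads as roughly 0% at the settings people actually play at.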

For those reasons, extremely low-res testing with low settings is an abhorrent way, IMO, to capture that data...
 
How about we address the topic and not have a go at each other over the "perceived intentions" of anyone's posts?

That out of the way, I have sat here for most of the day thinking about this, and while I do see some merit to what is shown in the video, my biggest issue is that the content creator is slamming the media for not being mind readers, fortune tellers, or in any way able to sort out what the future holds for a certain segment of the market. IMHO, that is asking a bit too much of reviewers. While many will make an educated guess based on past trends, that is all it really is: a guess!

To me, "games" are made for the masses, just like any other part of the PC world! Cases are generally made to sit to the right of the user, mice are either right handed or ambidextrous, and games are tuned for Intel as they are the market leader currently! In the end of the video we hear the guy say it is on AMD to make the right moves for Ryzen to pick up steam and have a real go at Intel when it comes to gaming, and it took the 8350/8370 five years to bridge that gap. As a writer, I would find it hard to tell my readers to buy something "in hopes" it will get better! While most of us hope the best for AMD and Ryzen, we are sitting here on our hands waiting for AMD to get the masses to adjust to what they made. It very well may come to fruition that games will become more adjusted to the new architecture and the way in which AMD wants the CPU to work, but the reality is, it may take that same five year span to accomplish this.

I do not think the current testing methodology is flawed in general. Of course, there is always room for change, but basing all of your information on a five-year span of improvements to come seems a bit wishy-washy to me as a solid reason to suggest people spend their money in that direction at this point in the game, based on hopes, wants, and dreams. Reviewers (should) base their conclusions on facts, and currently that is what I am seeing. Yes, Ryzen wins in many areas, but until games are made to conform to what AMD is offering now, Intel has the upper hand for gamers, which is why it is being explained as such.

This is just my take on what I saw in the video and what I have seen in reviews. It is not gospel in any way, but at the same time, we have seen promises fail to become reality in the past, and have seen Intel playing heavy-handed to avoid accommodating the competition. Without first-hand control of any of these aspects, and while I applaud AMD for what they have developed, I am not into buying things based on "what may happen in the future," nor would I recommend products based on that angle of logic.
 
Dudes, give the poor unpaid babysitter a break, will you.

I wouldn't mind all this bitching if he were a paid babysitter, but he's unpaid. Might as well rename him as Unpaid Shitter.
 
I think it's interesting that, despite being tagged in the first post, the one best suited to put this thread to bed has not responded.
That, to me, indicates the relevance of this thread, or the lack thereof.
 

Well technically, should we not be calling in Cadaveca since he is reviewing the CPUs?
 
For a CPU to come from being over 10% behind to a 10% lead is, IMO, reason enough to warrant a test to ascertain the validity of the current test model.

The reason it took 5 years is that the difference in performance between the GPUs is so massive. In every one of the reviews he used for his video, the tests were done with what was the fastest GPU at the time (from a 680 Ti to a 1080, if I'm not mistaken).

Obviously, nobody can predict the future, and I 100% concur that reviewers should base their conclusions on facts. But what if the method used to arrive at said facts is flawed? Wouldn't you like to know that?
 
Prove to me it is. Nobody here, nor the creator of the video, has done that. It is easy to show a flaw once time has passed, drivers have changed, and parts have been swapped out, but how does that help you predict the future in your testing now?

Point of fact, even the video creator states it: games are not coded to work well, or to their best ability, with Ryzen. So it is not on the reviewers or the testing; it is on AMD for releasing a product that is not made for today's environment. Yes, it may have the potential to be king down the road, but I may have a rainbow-colored elephant fall from my butt too!
 
@sneekypeet Great post #28 there. :) I'm so not into buying stuff on some unlikely promise that doesn't pan out, either.
 
... That's why I asked for this to be tested in the original post ...
 
How though? I mean, let's get down to brass tacks. You are expecting someone else, in this instance W1zzard, to know how to do this! I am not in any way demeaning his intelligence, but if he had a way to show this sort of thing, I think he would have already implemented it, or had a discussion with Dave about showing it in TPU's CPU reviews too!
 
Well, personally, for a start I'd like to see 1080p games tested on normal or high settings, and only certain benchmark software used for these low-GPU-load CPU runs on a GTX 1080. That would be fair, yet remove some unnecessary and unrepresentative testing while retaining that type of testing's core values.
Imho.
 
How though? I mean, let's get down to brass tacks. You are expecting someone else, in this instance W1zzard, to know how to do this! I am not in any way demeaning his intelligence, but if he had a way to show this sort of thing, I think he would have already implemented it, or had a discussion with Dave about showing it in TPU's CPU reviews too!

I thought he could test whether what Adored got by checking reviews over a span of 5 years (keep in mind that hardware was different across the reviews, as well as driver versions) is reproducible, by keeping as much of the same hardware and software as possible across the 2 CPUs being tested, to minimize the number of variables.

- If the findings are similar, that should prove the model is flawed, because the theory is that dropping in a much faster card later should not change the placing of the CPUs in the tests; as such, a new way to test a gaming CPU must be found.

- If the findings contradict it, then that proves the model is right, and this argument ends here.
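For what it's worth, here is a minimal sketch of what such a controlled re-test could look like. Everything in it is a hypothetical placeholder (the CPU and GPU classes, the settings, and the idea of pinning drivers and game builds), not a plan W1zzard or anyone else has committed to:

```python
# Hypothetical test matrix: hold everything constant except CPU and GPU,
# so a change in CPU ranking can only come from the GPU swap.
cpus = ["older quad-core (2500K class)", "older eight-core (FX-8350 class)"]
gpus = ["mid-range card of that era", "current flagship card"]
settings = ["720p low (CPU-bound)", "1080p high", "1440p high"]
constants = {
    "driver_version": "one fixed version for every run",
    "game_builds": "same patch level for every run",
    "ram_storage_cooling": "identical across both platforms",
}

runs = [(c, g, s) for c in cpus for g in gpus for s in settings]
print(f"{len(runs)} runs per game")  # 2 x 2 x 3 = 12 runs per game
```

If the older eight-core overtook the quad-core only when the flagship card was dropped in, under otherwise identical conditions, that would reproduce Adored's finding without the driver and patch noise of five years of separate reviews.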
 
IMHO, limiting testing to only certain tests is not "fair". If you have a killer product, it should lead in all environments, not just those which may show favor in one direction or another. I would prefer to throw the whole gamut of tests at a product to see its merits overall.

While the idea is sound, I do not believe W1zz was using AMD CPUs for any of his video card testing (GTX 680 and after), and our CPU reviews here were hit and miss until Dave took over and made a go of it. TBH, I am not even sure W1zz would have the parts you think he does; otherwise his house would be impossible to move around in ;)
 
That never crossed my mind, in all honesty ... you may ofc be absolutely right!
 
An i7-7700K leads in one kind of environment and a 6950X in others, so neither one is "killer" by that definition. I just think the testing should be fair and representative: all games were made on Intel, compiled for Intel, and scheduled correctly on a four-year-old platform, and none of this low-setting 1080p (500+ FPS) gaming is representative of what people do with their PC.
Plus, since benchmark workloads remain stable yet are updated to support new hardware more frequently than some tested games, they could be said to show the CPU's bottleneck more consistently. I did say keep that type of testing, but only in the benchmarks.
All games would still get 1080p tests, as they already do, but at least at normal settings.
 
So you are saying that the environments which favor your theory are being left out of the mix in the reviews of Ryzen?
 
For the record, while I think a CPU's framerate testing should be done at low settings to see what it's truly capable of, it should also be tested at the typical resolutions and quality settings that gamers use, like we see today. Basically, I want to add another layer of testing, not replace any testing.
 
What matters is whether a game is smooth, not choppy, when it's in max chaos, whether SP or MP. That's all I care about.
 
For the record, while I think a CPU's framerate testing should be done at low settings to see what it's truly capable of, it should also be tested at the typical resolutions and quality settings that gamers use, like we see today. Basically, I want to add another layer of testing, not replace any testing.
Same: one set just to find CPU limits and another at playable settings.
 
Well technically, should we not be calling in Cadaveca since he is reviewing the CPUs?
Oi vey! You know, if I had my way, anything related to this "Adored" guy would be blasted off the internet post-haste. Anyone that admits to faking an accent in videos for effect isn't worth the time of anyone here.

But I will say this, when it comes to CPU testing (and this applies to testing ANYTHING):

If you want to provide your readers with real tangible information, you show them results of a product that were generated when the product is used as intended, and you do not create artificial scenarios. When you create artificial scenarios, you create artificial results. So, testing a modern CPU at low resolutions is stupid. It shows an artificial and meaningless result.

You don't write reviews based on how well a motherboard can float on water. You don't test GPUs using SuperPi. You don't modify monitor resolutions outside of native. To test a CPU, you match it with appropriate hardware (i.e., a high-end CPU gets a high-end GPU, memory, cooling, and board), you run tests of varying workloads, and you rely on CPU-focused benchmarks to show differences between CPUs, not GPU-focused ones, which is what games are. Yes, you test games, and then you accept the results for what they are, and don't try to find situations that might show something different. A high-end CPU in a high-end system doesn't get matched with a low-resolution monitor, so modifying settings to show such things shows (to me) that your agenda is not showing truth, but opinion.
 
Shouldn't minimum frame rates and frame latency always be better with the faster-threaded processor, regardless of resolution?

I watched the video, and it seems he is using two main arguments to reach his conclusion that the AMD chip is better.

One was the BF1 example: lower CPU utilization on the AMD chip means more cores is better, and therefore the CPU is faster than the Intel. If that is the case, why was the Intel faster for gaming in his BF1 example?
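As an aside on that utilization argument (assumed, illustrative per-core numbers below, not figures from the video): overall CPU utilization is an average across all cores, so a lower percentage on a chip with more cores doesn't by itself mean the game runs faster; what usually matters is how hard the busiest thread is pushed.

```python
# Assumed per-core loads for a game whose main thread is the bottleneck,
# run on a 4-core and on an 8-core CPU. Illustrative numbers only.
four_core  = [95, 70, 60, 55]                    # % load per core
eight_core = [90, 55, 45, 40, 20, 15, 10, 5]     # % load per core

for name, cores in (("4-core", four_core), ("8-core", eight_core)):
    avg = sum(cores) / len(cores)
    print(f"{name}: overall {avg:.0f}% utilization, busiest core {max(cores)}%")
# The 8-core shows ~35% overall vs ~70% on the 4-core, yet both chips have
# their busiest core near the limit, so the lower overall percentage alone
# doesn't say which one delivers more FPS.
```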

And his first example in the video: we buy the best graphics card available each year after buying a 2500K and an 8350, and 5 years later the 8350 runs faster than the 2500K with the latest graphics card.
How realistic is this scenario going to be? Most people buy the graphics card and the CPU at the same time, and 5 years later expect both to be completely outdated. Second, I find it really hard to believe that the 8350 runs a Titan X (Pascal) better than a 2500K. What is the benchmark being used? Are the same benchmarks tested on each graphics card? What made the Titan X (Pascal) perform so much worse on the 2500K when the 980 Ti ran better on the 2500K? Has anyone else run this test to duplicate the result, or should we just take this guy at face value from his simplistic bar chart?

This whole thing reminds me of Tek Syndicate in 2012 with their bizarre 8350 vs 2500K video that was so popular with AMD fanboys, where they presented the CPU with a workload entirely unrealistic for most users that managed to make the 8350 come out ahead, with the conclusion that this made the 8350 better. Edit:
Who even uses XSplit when OBS exists?

Here are his three main data sources for his conclusion:
(Three attached screenshots: point1.jpg, point2.jpg, point3.jpg)


Oi vey! You know, if I had my way, anything related to this "Adored" guy would be blasted off the internet post-haste. Anyone that admits to faking an accent in videos for effect isn't worth the time of anyone here.

But I will say this, when it comes to CPU testing (and this applies to testing ANYTHING):

If you want to provide your readers with real tangible information, you show them results of a product that were generated when the product is used as intended, and you do not create artificial scenarios. When you create artificial scenarios, you create artificial results. So, testing a modern CPU at low resolutions is stupid. It shows an artificial and meaningless result.

You don't write reviews based on how well a motherboard can float on water. You don't test GPUs using SuperPi. You don't modify monitor resolutions outside of native. To test a CPU, you match it with appropriate hardware (i.e., a high-end CPU gets a high-end GPU, memory, cooling, and board), you run tests of varying workloads, and you rely on CPU-focused benchmarks to show differences between CPUs, not GPU-focused ones, which is what games are. Yes, you test games, and then you accept the results for what they are, and don't try to find situations that might show something different. A high-end CPU in a high-end system doesn't get matched with a low-resolution monitor, so modifying settings to show such things shows (to me) that your agenda is not showing truth, but opinion.

Is he using a fake Scottish accent? If so, that says it all.
 
I'd look for a seal-clapping meme but can't be assed.
And I can't be assed having the same argument again, tbh.
It's old and boring.
 
I'd say it's not completely flawed, but it is a bit. In the end, seeing the future for yourself instead of forcing it with low resolutions is what made the difference. Reality is worth more than a low-resolution simulation. The FX-8350 certainly has more power than the 2500K, but only if it's used well, and exactly that took time. It's more on par with the 2600K than the 2500K.

BTW, about the off-topic blather:
He said he talks in a more exaggerated accent in his videos, not that he "fakes" anything. Also, it's of no importance or consequence to the topic; haters gonna hate anyway.
 