
Is CPU game benchmarking methodology (currently) flawed?

You, me, and everybody else knows how we, the people, play our games ... we set everything to ultra and the resolution to native, sometimes we curse and swear at the fps dips, then adjust the settings only as much as needed :)
In other words, you're not understanding me right, so let me put it the other way ... using a high resolution when benching CPUs in games is OK (it applies) when the game is CPU-hungry enough that you get a meaningful value range for your graph (for example, if you don't like differences that are fractions of a frame) ... if you want a wider value range for your graph in those couple of games (that's when it doesn't apply), you lessen the burden on the GPU and get a less compressed graph. That's fine, because you are trying to analyze relative CPU performance executing that particular game's code. It's done on a game-to-game basis, IMHO, to get a more readable graph.
Now, the argument that we should always and exclusively test CPUs in games at UHD resolution is reasonable only to the extent that lowering the resolution somehow lowers the amount of work the CPU has to do inside a frame. Does it? It's a valid question, because games usually adjust the level-of-detail system at higher resolutions, but LOD is all about geometry, not draw calls. Tough one.
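Either way, here's a toy sketch of why lowering the resolution decompresses a CPU graph at all (my own simplified model with made-up numbers, not any real engine's pipeline): the CPU cost per frame stays fixed, the GPU cost scales with pixel count, and the slower stage sets the frame rate.

```python
# Toy bottleneck model (my own sketch with made-up numbers, not a real engine):
# the CPU cost per frame is fixed, the GPU cost scales with pixel count, and
# whichever stage is slower sets the frame rate.

def fps(cpu_ms, gpu_ms_at_1440p, res_scale=1.0):
    """res_scale = pixel count relative to 1440p (720p is roughly 0.25x)."""
    gpu_ms = gpu_ms_at_1440p * res_scale  # only the GPU stage shrinks with res
    return 1000.0 / max(cpu_ms, gpu_ms)  # the slower stage paces the frame

# Two hypothetical CPUs, 25% apart in per-frame cost, same hypothetical GPU:
for cpu_ms in (8.0, 10.0):
    print(f"CPU at {cpu_ms} ms/frame -> "
          f"1440p: {fps(cpu_ms, 12.0):.0f} fps, "
          f"720p: {fps(cpu_ms, 12.0, res_scale=0.25):.0f} fps")
# Both CPUs land on ~83 fps at 1440p (the 12 ms GPU wall hides the gap),
# while 720p shows 125 vs 100 fps: the graph decompresses.
```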
Thanks for taking another stab at it. I do think I understood already. :)

I simply disagree with your assertion. I don't want a more readable graph because of an unrealistic testing environment. I want to know what the results are at the resolution and settings most people play at, not an 'exaggerated to show results' situation using high-end cards at below-1080p res and low settings. Show me what it's like on ultra with AA at 1080p/1440p/4K (no AA at 4K is fine with me since it isn't needed)!!!! That's where we all play!
 
It's not flawed per se, but the conclusions that people draw from the results are.
If anyone really thinks low-res testing is THE defining test that is going to tell them the exact difference in gaming performance between CPUs, they're quite simply wrong and always have been.
 
I don't want a more readable graph because of an unrealistic testing environment.
I don't mind both methods; I read them differently ... one graph is to see what performance you can expect in a real-life scenario with your CPU compared to others (to see the true minimum CPU requirement), while the low-res graph is there to correctly rank the CPUs for the particular game (simply avoiding two closely matched CPUs sharing a rank).
 
The lower graph doesn't show you anything. Again, you can't extrapolate those performance differences to a higher setting. It's literally of zero value unless you have that abhorrent setup (high-end card, low res, low settings), as the performance differences are different at a higher resolution and appropriate settings.

Oh well, we agree to disagree. :) :toast:
 
The lower graph doesn't show you anything.
When the higher-res graph shows no difference in fps between SKUs, the lower-res graph is exactly what does.
[attached graphs: The Division at 1440p and at 1080p]
A 720p run would show a bigger spread and some proper ranking.

Your disagreement is a powerful one :laugh:
 
It shows nothing because there is nothing to be seen at that res and those settings... where people play. You lower the res, and suddenly you've made a difference appear. It doesn't rank squat where people play. You are MAKING an issue where there isn't one, just to show a difference??? That's... asinine (to me).

And your disagreement is just as powerful. So we should just smile and back away. Let's show this forum how to have a civil disagreement. :pimp:
 
One thing I wonder is how Ryzen performs in StarCraft II, since that is the only game I've personally played where the processor really mattered. All these commonly benchmarked AAA FPS games really don't care about the processor. With a slow processor, the battles chug enough to make you lose rank. It's the only game where an overclock from stock to 4.8 GHz on my IVB made a difference for me, or where I noticed a stark difference between my old Phenom II and my current i7.

As far as game benchmarking goes, maybe instead of low res, run the resolutions most people run at, 1080p, 1440p, etc., and play games that require a fast processor, such as SC2, Total War or Arma 3.

The only thing that really matters is minimum FPS.
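For what it's worth, here's a rough sketch of how a minimum-fps figure can be pulled from a frame-time log (the frame times below are made up; many reviewers report a 1% low, the average of the slowest 1% of frames, instead of the single worst frame):

```python
# Rough sketch: pulling "minimum fps" out of a frame-time log. The frame times
# below are made up; many reviews report the 1% low (average of the slowest 1%
# of frames) rather than the single worst frame, to dodge one-off stutters.
frame_times_ms = [16.1, 16.4, 15.9, 33.0, 16.2, 16.0, 17.1, 16.3, 16.5, 16.2]

fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)  # slowest frame first
worst_slice = fps_per_frame[: max(1, len(fps_per_frame) // 100)]  # slowest 1%

print(f"average fps: {len(frame_times_ms) * 1000.0 / sum(frame_times_ms):.0f}")
print(f"minimum fps: {fps_per_frame[0]:.0f}")
print(f"1% low fps:  {sum(worst_slice) / len(worst_slice):.0f}")
```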
 
I think this topic can be closed now, because someone has tested this and found the 2500K to be faster in all but one title (of the 16 tested):


It seems the methodology still holds; as such, there's no point in keeping this topic open, I think.
 
Wait a sec, when did the Scottish dude ever say that the FX is faster than that i5? What he said, and what still holds, is that over time the difference between the CPUs is getting narrower instead of wider, the latter being the underlying assumption of low-res benchmarking.
 
Wait a sec, when did the Scottish dude ever say that the FX is faster than that i5? What he said, and what still holds, is that over time the difference between the CPUs is getting narrower instead of wider, the latter being the underlying assumption of low-res benchmarking.

He also said that the more recent review was putting the FX-8370 ahead, when clearly it's not: it's one thing for the lead to shrink, but it's quite a bit more for it to be reversed. And keep in mind I'm talking about 10%+ in either direction, which is what was being argued. If it went from 1-2% behind to ahead, then I could see it happening, but not from over 10% behind to over 10% ahead.
 
Thanks for taking another stab at it. I do think I understood already. :)

I simply disagree with your assertion. I don't want a more readable graph because of an unrealistic testing environment. I want to know what the results are at the resolution and settings most people play at, not an 'exaggerated to show results' situation using high-end cards at below-1080p res and low settings. Show me what it's like on ultra with AA at 1080p/1440p/4K (no AA at 4K is fine with me since it isn't needed)!!!! That's where we all play!

Then you should read a GPU review though. The point of the CPU review is to remove the GPU constraints. There's a small handful of reviewers who do test on 'most common' system setups, if I recall, but they are extremely rare and I honestly don't pay them any mind, because they don't tell me anything new compared to a 'normal' CPU review.

Bottom line, this is a non-issue, especially because the only key difference between 'gaming' CPUs is clock speed, making it easy to deduce the performance of your own rig. And Ryzen always clocks to just about 4 GHz, so that's even simpler... We also see that, between low- and higher-resolution testing, CPU performance scales linearly with GPU load.
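As a toy illustration of that deduction (a back-of-the-envelope assumption of mine, not anything from a review: a fully CPU-bound game on the same architecture scaling roughly linearly with core clock, all numbers hypothetical):

```python
# Toy version of "deduce your own rig's performance from clock speed".
# Big assumption (mine, not the reviewer's): a fully CPU-bound game on the
# same architecture scales roughly linearly with core clock. Numbers are
# hypothetical.
reviewed_fps = 120.0      # hypothetical CPU-bound result from a review
reviewed_clock_ghz = 4.0  # clock of the reviewed chip
my_clock_ghz = 3.6        # my own chip, same architecture, lower clock

estimate = reviewed_fps * my_clock_ghz / reviewed_clock_ghz
print(f"rough estimate for my rig: {estimate:.0f} fps")  # ~108 fps
```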

Ah look what I found here!

https://nl.hardware.info/reviews/72...euwe-videokaart-oude-processor-testresultaten
 
Then you should read a GPU review though. The point of the CPU review is to remove the GPU constraints.
So, you too prefer a completely unrealistic test environment to CREATE a situation which at worst does not exist, and at best exists but isn't proportional, at a higher resolution or setting where people actually play their games. Let's not forget it's a below-1080p res with a high-end card being used to create the differences at low res, too. ;)

I mean, who cares if the CPU is holding things back at resolutions and settings we don't play at, right? Let's conjure up a test to show a difference... it makes zero sense to me to create a fake situation where it happens, when it doesn't happen at all, or as much, at a higher res. The data is pretty useless to me.
 
So, you too prefer a completely unrealistic test environment to CREATE a situation which at worst does not exist, and at best exists but isn't proportional, at a higher resolution or setting?

I mean, who cares if the CPU is holding things back at resolutions and settings we don't play at. It makes zero sense to create a fake situation where it happens

I do prefer a clean testing environment to a dirty one, yes. It may sound strange to you, but apples-to-apples comparisons do matter.

Gaming loads change all the time; even GPU drivers can create a different test result (such as when NVIDIA introduced Shader Cache on Kepler). When you put GPU load into play in a CPU review, you also enter the realm of GPU driver branches, game engine quirks and all that other fun stuff... that we essentially uncover in GPU reviews. This makes it much more difficult to gauge whether a performance issue is related to the CPU or the GPU.

Ryzen is a case in point: it is the 1080p testing that reveals its shortcomings, not the 4K or 1440p tests. It is precisely the fact that reviewers test the lower resolutions that enables them to draw sensible conclusions.

The only debate worthy of consideration is whether CPU reviewers should bump the lowest res up from 720p to 1080p, IMO.
 
That's the thing... it is NOT apples to apples. You are fabricating an environment to exaggerate a difference at a resolution and settings most people don't play at. Add to that the horrifically lopsided system for the settings and res used to create the difference, and you are left with a pile of data you can't extrapolate up... useless data unless you play at that low res and settings and have a high-end system.

See previous posts... you are saying the same thing the other dude is, and I simply don't agree. :)
 
That's the thing... it is NOT apples to apples. You are fabricating an environment to exaggerate a difference at a resolution and settings most people don't play at. Add to that the horrifically lopsided system for the settings and res used to create the difference, and you are left with a pile of data you can't extrapolate up... useless data unless you play at that low res and settings and have a high-end system.

See previous posts... you are saying the same thing the other dude is, and I simply don't agree. :)

I've played my fair share of MMOs on less-than-midrange GPUs, and my stance on this is directly related to that.

Gaming loads NEVER put the burden on the CPU until you go online and add a bunch of other players, network-related CPU load, latency and all that other fun stuff. Back in the day I played at 720p, and there was no CPU money could buy that removed that bottleneck in, for example, games like Guild Wars 2. Meanwhile the GPU was 'chillin'. On the other side of the fence, with offline and low-CPU-load games, the load shifts to the GPU, and with it the bottleneck, making CPU performance irrelevant.

Even today there are still plenty of reasons to fabricate a heavy-CPU-load gaming environment to see how a CPU stacks up against others, because there are gaming loads where you want the best CPU money can buy and where the GPU load is laughable.

Perhaps you remember the high-player-count maps in BF3, a scenario where almost all gamers were CPU-limited at 1080p with powerful GPUs.

If you still can't see the point, let's just leave it here and agree to disagree :)
 
But it doesn't scale... so the data is useless. Well, it's good for that low res and low settings with a high-end GPU, where nobody plays. Nothing more.

Anyhoo... I digress. I'm just saying the same things over and over. :)
 
Let's show this forum how to have a civil disagreement.
There I was thinking we did have a civil disagreement, yet my post came across as asinine. Damn it, that sucks ... oh well
 
I've played my fair share of MMOs on less-than-midrange GPUs, and my stance on this is directly related to that.

Gaming loads NEVER put the burden on the CPU until you go online and add a bunch of other players, network-related CPU load, latency and all that other fun stuff. Back in the day I played at 720p, and there was no CPU money could buy that removed that bottleneck in, for example, games like Guild Wars 2. Meanwhile the GPU was 'chillin'. On the other side of the fence, with offline and low-CPU-load games, the load shifts to the GPU, and with it the bottleneck, making CPU performance irrelevant.

Even today there are still plenty of reasons to fabricate a heavy-CPU-load gaming environment to see how a CPU stacks up against others, because there are gaming loads where you want the best CPU money can buy and where the GPU load is laughable.

Perhaps you remember the high-player-count maps in BF3, a scenario where almost all gamers were CPU-limited at 1080p with powerful GPUs.

If you still can't see the point, let's just leave it here and agree to disagree :)
I would actually love to see Ryzen tested with Black Desert Online, because that's one of the very few games that taxes both the CPU and the GPU very heavily.
 
There I was thinking we did have a civil disagreement, yet my post came across as asinine. Damn it, that sucks ... oh well
@BiggieShady

NO NO!!! You (we) were good the whole time!!!! That was a genuine kudos and agree-to-disagree moment... same with Vayara. Don't take 'asinine' personally... it's the idea, not you, that I found to be that way. It's just my opinion. :)
 
I don't mind testing any CPU & GPU configuration, but if people test CPUs & GPUs at 640x320... all I can say is... come on guys, maybe use 1080p at least? Intel still has a nice advantage, but AMD could be nowhere, and for good reason Zen is here. I'm not sure people really want competition; I think most want just one CPU and one GPU maker, since it's easier. Everyone says they want competition, but the truth is... nobody really wants it. Why? Because gamers are frustrated people. I don't know a gamer who talks about atmosphere, or gameplay mechanics, or a nice soundtrack, or, even more important, a nice story. Every test, every gamer talks about one thing: FPS, and how many a particular system can push. Ask a gamer about an interesting game with a nice story or a nice atmosphere and you'll get silence. Talk about how many FPS his system can push and OMG, we have a debate.

I have been an AMD fan since I bought my first computer, an AMD 486 DX 100. I am not a person who always wants the most expensive system, the most expensive car or the biggest house in the world. I am not really competitive in gaming either; I don't really care if I am the best player in the world or the worst, but I do care about understanding the story or the idea behind the game I am playing (no matter the game). I started with Dizzy and Target Renegade and continued with Lands of Lore, Dungeon Keeper, Outpost II and Dune II (still the best game in my opinion, even though I liked Dune too, because it had an awesome story and atmosphere). I have played hundreds of games over the years, like Chaser, Half-Life, StarCraft and Syberia, just to name some nice series. That doesn't mean I wasn't playing Doom or Hexen or Duke Nukem or any number of other games. But I prefer to play games, so I am not really into the graphics; I care far more about the story and the atmosphere than the 150% accent on graphics you young people focus on :P Don't get angry with me; I read books that you might not even have heard of. Just imagine talking about the Ender's Game saga, or the Empire series, or, well, the Dune series?
I don't think you people have ever heard of Star Trek: Armada, or Star Trek: Borg, or Star Trek: Bridge Commander :P Hell, I can name way, way more games that you've never heard of before lol :P There are sooo many good games that you have missed, because most of you are way, way too focused on something that's not really that important.

To put it simply, it's the same with the IT industry today. You don't need 4K to play a game; 1080p is perfect. Ultra or high settings have no meaning or purpose; they're just frustration and graphics porn for some. Games should be about story, atmosphere and music, not about FPS and mindless gameplay. Same with CPUs and GPUs: quality over quantity. There's no need for 8 cores since Windows can't use them properly, and there's no need for more than 60 FPS since you can't see any difference. 3D sucks anyway lol :P

So for me... it's about gaming. I don't want to spend 5000 euros just to play shit games in 4K; I want to play good and amazing games in 1080p. Just that! Too bad I am the only one in the whole world who wants that loooool :P
 
NO NO!!! You (we) were good the whole time!!!! That was a genuine kudos and agree-to-disagree moment... same with Vayara. Don't take 'asinine' personally... it's the idea, not you, that I found to be that way. It's just my opinion. :)
It's all right; articles shouldn't be presenting only low-res CPU graphs anyway ... that could be seen (by some) as making a difference where there isn't one in real life. But I say, the more graphs the merrier.
 
I don't mind testing any CPU & GPU configuration, but if people test CPUs & GPUs at 640x320... all I can say is... come on guys, maybe use 1080p at least? Intel still has a nice advantage, but AMD could be nowhere, and for good reason Zen is here. I'm not sure people really want competition; I think most want just one CPU and one GPU maker, since it's easier. Everyone says they want competition, but the truth is... nobody really wants it. Why? Because gamers are frustrated people. I don't know a gamer who talks about atmosphere, or gameplay mechanics, or a nice soundtrack, or, even more important, a nice story. Every test, every gamer talks about one thing: FPS, and how many a particular system can push. Ask a gamer about an interesting game with a nice story or a nice atmosphere and you'll get silence. Talk about how many FPS his system can push and OMG, we have a debate.

I have been an AMD fan since I bought my first computer, an AMD 486 DX 100. I am not a person who always wants the most expensive system, the most expensive car or the biggest house in the world. I am not really competitive in gaming either; I don't really care if I am the best player in the world or the worst, but I do care about understanding the story or the idea behind the game I am playing (no matter the game). I started with Dizzy and Target Renegade and continued with Lands of Lore, Dungeon Keeper, Outpost II and Dune II (still the best game in my opinion, even though I liked Dune too, because it had an awesome story and atmosphere). I have played hundreds of games over the years, like Chaser, Half-Life, StarCraft and Syberia, just to name some nice series. That doesn't mean I wasn't playing Doom or Hexen or Duke Nukem or any number of other games. But I prefer to play games, so I am not really into the graphics; I care far more about the story and the atmosphere than the 150% accent on graphics you young people focus on :P Don't get angry with me; I read books that you might not even have heard of. Just imagine talking about the Ender's Game saga, or the Empire series, or, well, the Dune series?
I don't think you people have ever heard of Star Trek: Armada, or Star Trek: Borg, or Star Trek: Bridge Commander :P Hell, I can name way, way more games that you've never heard of before lol :P There are sooo many good games that you have missed, because most of you are way, way too focused on something that's not really that important.

To put it simply, it's the same with the IT industry today. You don't need 4K to play a game; 1080p is perfect. Ultra or high settings have no meaning or purpose; they're just frustration and graphics porn for some. Games should be about story, atmosphere and music, not about FPS and mindless gameplay. Same with CPUs and GPUs: quality over quantity. There's no need for 8 cores since Windows can't use them properly, and there's no need for more than 60 FPS since you can't see any difference. 3D sucks anyway lol :P

So for me... it's about gaming. I don't want to spend 5000 euros just to play shit games in 4K; I want to play good and amazing games in 1080p. Just that! Too bad I am the only one in the whole world who wants that loooool :P

Haha, I can see exactly what you mean with regard to gamers and gaming, man.

The funny thing is, the 'console peasants' get to talk a lot about their games and the actual gameplay, because that is all they have. Losers ;) They're ACTUALLY GAMING, HAHA. Too bad they have only five exclusive games, and the rest just runs better on our PCs anyway, so screw them - I have all of those games in my Steam backlog anyway, where they'll probably sit, never to be looked at again! HA!

*starts up another Heaven bench*
/sarcasm

It's so true, though. The best content within PC gaming, and discussion of the actual gameplay, has to be sought out in game-specific forums and subreddits; you really have to be deep inside a community to find it, and it is only there that you get a real view of a game and its health (in the case of online games, mostly). PC gamers are just too diverse NOT to resort to talking about hardware; it's like talking about the weather when you meet a random dude anywhere - it's that safe haven so you don't end up looking at each other silently like the nerds we are.
 
I am just glad that Hardware Unboxed finally nailed the coffin shut in the never-ending Core i5-2500K vs. FX-8350 gaming debate and put the FX-8350 in its proper, shameful recycle-bin place! And what a place it is: lower gaming results than the 60-euro Pentium G4560! Sorry guys, I just could not resist. And I think Ryzen processors are awesome! Every AMD FX owner should kill themselves now! :D
 
I am just glad that Hardware Unboxed finally nailed the coffin shut in the never-ending Core i5-2500K vs. FX-8350 gaming debate and put the FX-8350 in its proper, shameful recycle-bin place! And what a place it is: lower gaming results than the 60-euro Pentium G4560! Sorry guys, I just could not resist. And I think Ryzen processors are awesome! Every AMD FX owner should kill themselves now! :D
I don't think so, here comes the answer:
You guys were simply too fast. This guy is smarter than you thought, and he didn't really make a mistake; I guess people simply didn't understand his logic. It's a fact: the FX-8350 is faster now than at release, and it's probably faster than the 2500K when properly utilized. But then again, this is just versus a 2500K at *stock*, which means nothing. Who bought a K CPU, which cost extra, and didn't overclock it? Yeah. That said, the FX-8350 doesn't have the slightest chance once the 2500K is overclocked, even slightly; the difference just increases with higher clocks. In the end, all that information was good for just one thing: was the FX-8350 a futuristic CPU ahead of its time? Yes. Is it better now than at release? Probably, but it's still not good enough. An overclocked 2500K is barely good enough, and the FX-8350 isn't.
 