Is it? The review(s) cover games (among other things?) and the performance of said CPUs across a bunch of them... it's pretty clear to me what the purpose of this thread is and why everyone is clinging to the scope of gaming... because that is what was asked of us...
From the first post...
Did I misunderstand? Seems to me it's a review of CPUs with a GTX 1080 and games. I don't know whether it's a round-up of results with this config or just setting the framework for game testing of new CPUs, though...
I give up. You're entirely right about 'typical gaming scenarios' and the fact that a 1080p result is not a 720p result. But that only gives me new things to wonder about...
When it comes to gaming, what matters FAR more than whatever res you test at is the selection of benchmarks. I would love to see CPU-heavy games, and not 20 different kinds of third- and first-person semi-RPG shooter junk that is GPU-limited by design. Because in the end, it's that shooter junk you'd *situationally* play at low res/lowest detail levels - which once again underlines the importance of the lower resolution. You say knucklehead, I say competitive gaming / high-refresh gaming. But apparently we don't want to hit that market, yet we do want to hit every resolution above it. Mkay.
In addition: what the opposition to 720p is actually saying is self-contradictory. How is 1080p at low detail settings indicative of real-world performance, when apparently CPU load changes drastically if we increase detail levels? It does with resolution, right... (even though in the vast majority of engines the CPU load is very, very similar across all resolutions and scales with FPS, but apparently that's wrong, even though it's precisely what we see confirmed in literally every GPU review - look at how the 1080 Ti caps out (hits a CPU wall) at 1080p and 'nuff said).
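To put that bottleneck logic in rough terms - a minimal sketch with made-up numbers, not anything taken from the actual reviews - the frame rate you observe is capped by whichever of the two, CPU or GPU, runs out of headroom first. Dropping the resolution strips away the GPU ceiling and exposes the CPU ceiling, which is the whole point of 720p testing:

```python
# Rough bottleneck model: observed FPS ~= min(CPU-limited FPS, GPU-limited FPS).
# All numbers below are invented purely for illustration.

def observed_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The slower of the two pipelines dictates the frame rate."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 140.0  # hypothetical CPU ceiling; barely moves with resolution
gpu_limits = {"720p": 300.0, "1080p": 160.0, "1440p": 95.0, "4K": 45.0}  # drops fast

for res, gpu_limit in gpu_limits.items():
    fps = observed_fps(cpu_limit, gpu_limit)
    bound = "CPU-bound" if fps == cpu_limit else "GPU-bound"
    print(f"{res}: {fps:.0f} FPS ({bound})")

# In this toy example the CPU wall (140 FPS) shows up at 720p and 1080p,
# while at 1440p and 4K the GPU ceiling hides any difference between CPUs.
```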
Why is 1080p LOW perceived to be some sort of relevant benchmark when 720p LOW isn't? I would dare say there are more people on shitty laptops with a 1366x768 res than there are full-blown rigs with a dedicated GPU gaming at 1080p/low. The same argument goes for 1440p/low. 4K @ low I can fully understand; that is an interesting bench to do, because it's something people would realistically touch on.
To underline the laptop POV:
http://www.laptopmag.com/articles/laptop-screen-resolution-ripoff
But any realistic laptop buyer knows this.
All in all, there is so much contradiction here...
So... what does that tell us about gaming, which is what this testing is for? How is that actually useful at a higher res where the results aren't the same?
Help me understand the point...
Relative maximum performance: maximum achievable FPS.
It answers questions such as:
- How many FPS could I gain by upgrading from my last CPU, regardless of GPU (it could be integrated for all we know)?
- It lets you compare against results from way, way back.
- Is it even worth buying a 240 Hz panel?
- Can I hit 120 FPS reliably with CPU XYZ?
- What is the difference between gaming @ 720p on an old system versus 1080p on a new one (not everybody, TPU visitors included, is rich)?
And most importantly: Can it run Crysis? (I kid you not, Crysis 3 might prove surprising @ 720p)
Suffice it to say there are many arguments to be made for 720p; it would do people well to think about this in the scope of a wide range of readers, not just their own (enthusiast) situation.