Originally Posted by BigMack70
+1 to the # of people who don't understand that my point has nothing to do with hardware
You have to churn out ~4.5x the pixels per second at 1080p60 as at 720p30 (1920x1080x60 vs 1280x720x30). That's basic math.
Now, I understand (and have stated from very early on in this argument) that when you actually look at how things perform in the real world, this breaks down and is not linear. But there are all sorts of reasons for that, and none of them change the fact that the raw pixel ratio between 1080p60 and 720p30 is ~4.5x.
Why does it take dozens of posts to explain such a stupidly over-simple point? Go read the thread - I was making a picky point because of a lack of clarity in one of mailman's posts, and you guys have taken it to a whole other level.
Why doesn't performance drop to ~1/4.5 when you up the pixel rate by ~4.5x?
Because it's not that simple -.-
That doesn't take into account the various hardware tweaks, lossless and lossy compression (in hardware, and in software via drivers), and a million other things: for one, per-frame work like geometry and CPU overhead doesn't grow with resolution, so the cost doesn't scale 1:1 with pixel count. You've based this argument on something you see as simple and obvious without actually checking it yourself.
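To put actual numbers on the raw-throughput side of this, here's a quick sanity check (a minimal Python sketch; pixel_rate is just a throwaway helper I made up, and it deliberately models only raw pixel counts, none of the hardware effects above):

# Raw pixel throughput for a display mode: width * height * frames per second.
# This captures only the "basic math" side of the argument; it says nothing
# about how a real GPU scales, since per-frame costs like geometry and CPU
# work don't grow with resolution.
def pixel_rate(width: int, height: int, fps: int) -> int:
    return width * height * fps

rate_1080p60 = pixel_rate(1920, 1080, 60)   # 124,416,000 px/s
rate_720p30  = pixel_rate(1280, 720, 30)    #  27,648,000 px/s
print(rate_1080p60 / rate_720p30)           # 4.5

So the raw ratio really is 4.5x, and the whole point is that measured frame rates won't track that number, for exactly the reasons above.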
God, this thread is really full of over-simplified arguments, one after another after another...