Discussion in 'General Hardware' started by wolf2009, May 16, 2008.
Read this brilliant article here
Leave your comments, guys. This proves quads aren't bad for gaming in any way.
It also means that Core 2 Duos aren't outdated at all, given that the normal gaming resolution is 1280x1024. Quads only do marginally better, and dual cores are nowhere near as obsolete as some think... and mine is at 3.7GHz.
That's why I stay in the low-end range and overclock.
If only good, cheap hard drive research were done. That's what's holding back PCs.
Notice that the E8400 is moving hand in hand with the Q9450. That surprises me, as I thought the Q9450 was much superior at gaming too.
It would have been rather clever of the reviewer IF he had focused more on clock-for-clock comparisons between the Duos and Quads WHEN THE GPU WASN'T BOTTLENECKED. Really, what is the point of running benchmarks with the GPU maxed out so that it is always the bottleneck? You get no insight.
If he had run the GPU at medium AA and AF settings, where the GPU wasn't choking on its own breath, then we WOULD have seen much wider differentiation in CPU behaviour between Duos, Quads, and MHz.
Bad review IMO.
(Normally the 3dguru reviews are excellent)
I don't think you'll find anyone who said they were bad, but they don't perform THAT much better than a dualie in gaming, not enough to justify the cost increase.
An E8400 at 4GHz will win against a Q6600 at 3.6GHz, unless it's one of those games like World in Conflict that takes advantage of more than 2 cores.
A dualie is still the sweet spot for price/performance.
Good article. Intel still holds the crown.
I was converting AVI files to MPEG last night while playing COD4 at the same time; with the quad, CPU load was at 65-70%.
Yes, he used a high-end GPU... but IMO he dialed up too many effects. Too many of the charts are completely flat. No CPU differentiation. Why? Because even the lowest CPUs are faster than what the GPU can render. He really needed to choose lower FSAA and AF multipliers (or whatever other graphics feature is causing the problem). I was also surprised to see the bottleneck even at 1024x768. Perhaps the nVidia card can't cope with so many shaders. I don't know where the problem is... only that there is a lot of time wasted benchmarking CPUs when the GPU is already the bottleneck.
There is an article on Tom's which kind of says any CPU around 2.6GHz or above is good enough for gaming; anything more doesn't produce much better results. Once you get to about 3GHz, that's about it; more cores or faster speeds are wasted.
A stock Q6600 at 2.4GHz, even though it has four cores, doesn't do as well as a stock dual-core E6750 at 2.66GHz.
I think it was done from a practical standpoint more than anything else, for your average gamer. I think these results are more reflective of the real-world conditions your typical person would most likely face.
Even though I am currently happy with my overclocked E6850, I personally will be getting a Q9450 or Q9550 as soon as they are readily available here in Denmark; the first batch was for those who preordered a gazillion years ago.
You seem to have missed one of the major points of the review...
The GPU bottleneck WAS what he was trying to show. One of his points behind the review was to show that in most games, you are going to jack the graphics up to the GPU's limit, not the CPU's limit. The lower resolutions were included to show the differences between the CPUs, but the higher ones were included to show that the GPU is still the major bottleneck, so it really doesn't matter anyway. If you look, all the low-end non-GPU-bound tests have framerates WAY beyond playable.
Good to see there's still plenty of life in C2Ds. At the highest resolutions, the GPU is much more important than the CPU.
Yay, I love my C2D!
Well, I love my Q6600... lol... just right for my needs.
I love my E6400... I never knew Core 2s were so awesome till I got mine. This thing totally rocks! And the overclocking is utterly insane: 3.7GHz so far, and I'm still testing! And it's on air! That's totally unheard of for AMD.
Exactly. He was trying to show that after a certain point you hit a wall where performance across the board is equal, and any benefit from having more CPU cores or more GHz is null and void, because performance can't surpass what the GPU is capable of.
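The "wall" argument above boils down to simple arithmetic: the frame rate you actually see is capped by whichever component is slower. Here's a toy sketch of that reasoning (all the frame-rate numbers are made up purely for illustration, not taken from the review):

```python
# Toy model of the GPU-bottleneck argument from this thread: the delivered
# frame rate is whichever limit is lower, CPU or GPU. Numbers are hypothetical.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate the whole system can actually deliver."""
    return min(cpu_fps, gpu_fps)

# Suppose the GPU tops out at 60 fps at high settings (assumed figure).
# A faster CPU only helps while the CPU is the slower component.
gpu_limit = 60.0
for cpu_limit in (50.0, 80.0, 120.0):  # e.g. slow dual, fast dual, quad
    print(f"CPU limit {cpu_limit:>5} fps -> delivered {delivered_fps(cpu_limit, gpu_limit)} fps")
```

Once the CPU limit passes the GPU limit, every CPU produces the same chart bar, which is exactly why the maxed-out benchmarks look flat.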
The only games that don't seem to be as affected until you hit the super-high resolutions (1920x1200, 2560x1600) are games that seem to be more optimized for multi-core processors (i.e. ETQW); but at extreme resolutions, you're still GPU bottlenecked.