Friday, March 16th 2012

GTX 680 Generally Faster Than HD 7970: New Benchmarks

For skeptics who refuse to believe the randomly-sourced, picture-starved bar graphs of the GeForce GTX 680, here is the first set of benchmarks run by a third party (neither NVIDIA nor one of its AIC partners). This [p]reviewer from HKEPC has pictures to back up his benchmarks. The GeForce GTX 680 was pitted against a Radeon HD 7970 and a previous-generation GeForce GTX 580. The test bed consisted of an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at its stock frequency), an ASUS Rampage IV Extreme motherboard, 8 GB (4x 2 GB) of GeIL EVO 2 DDR3-2200 MHz quad-channel memory, a Corsair AX1200W PSU, and Windows 7 x64.

Benchmarks included 3DMark 11 (Performance preset), Battlefield 3, Batman: Arkham City, Call of Duty: Modern Warfare 3, Lost Planet 2, and Unigine Heaven (version not mentioned). All tests were run at a constant resolution of 1920x1080, with 8x MSAA enabled on some tests (as noted in the graphs).

More graphs follow.

Source: HKEPC

273 Comments on GTX 680 Generally Faster Than HD 7970: New Benchmarks

#76
HTC
There's a good thing and a bad thing coming out of this (if true).

The good thing: AMD will be forced to lower their prices on current offerings.

The bad thing: nVidia will probably sell these at the price AMD is selling theirs right now, and charge even more for the top cards (assuming this is the mid-range card).

I really was hoping for a tighter race here. I don't care who wins, as long as the winning margin is small enough that more aggressive pricing is warranted.

This is bad ... :(
Posted on Reply
#77
N3M3515
HTCThere's a good thing and a bad thing coming out of this (if true).

The good thing: AMD will be forced to lower their prices on current offerings.

The bad thing: nVidia will probably sell these at the price AMD is selling theirs right now, and charge even more for the top cards (assuming this is the mid-range card).

I really was hoping for a tighter race here. I don't care who wins, as long as the winning margin is small enough that more aggressive pricing is warranted.

This is bad ... :(
That's so true man...
Posted on Reply
#78
amdftw
I've got a sure source saying the GPU-Z clocks are totally wrong...
The GTX 680 runs at different clocks...
Posted on Reply
#79
sanadanosa
Capitan HarlockI want to see a real bench, GTX 680 vs. 7970 at the same core and memory MHz; testing an OC card against a stock one is auto-fanboyism
880 MHz HD 6970 vs. 732 MHz GTX 570: is that an overclocked HD 6970???
Posted on Reply
#80
Frizz
Please be true please be true please be true. I want AMD's 7970 to free fall price-wise.
Posted on Reply
#81
the54thvoid
Intoxicated Moderator


This is the correct picture from the source... (not killing the 7970): www.hkepc.com/7672/page/6#view

Just realised...

772 -> 1006 = 30% core clock increase

4008 -> 6008 = 50% memory clock increase

54.2 -> 72.2 fps = 33% fps boost

Now, given the clocks are 30% higher and there is likely an AA algorithm at work, I've just had a terrible dawning: "it's not a super performer at all". The 28nm process simply allows for faster clocks, and with some clever tech helping out the AA (which isn't a bad thing), it takes the biscuit.

But I'm just thinking: if those 1 GHz core clocks and that memory speed are true, then it's simply a logical performance increase based on clocks, not architecture.
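
For anyone who wants to double-check the arithmetic in post #81, here is a minimal Python sketch; the clock and fps figures are the ones quoted from the HKEPC BF3 graph, and nothing beyond them is assumed:

```python
# Sanity-check of the GTX 580 -> GTX 680 deltas quoted in post #81.

def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from old to new."""
    return (new / old - 1.0) * 100.0

core = pct_increase(772, 1006)      # core clock, MHz
memory = pct_increase(4008, 6008)   # effective memory clock, MHz
fps = pct_increase(54.2, 72.2)      # Battlefield 3 average fps

print(f"core clock:   +{core:.1f}%")    # +30.3%
print(f"memory clock: +{memory:.1f}%")  # +49.9%
print(f"fps:          +{fps:.1f}%")     # +33.2%
```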
Posted on Reply
#82
Benetanegia
N3M3515Man, I guess it's also irrelevant if the name is GK104. If it performs like a 7970 or better, AND there isn't any higher-performing nvidia product, AND it's priced equal to AMD's high-end part, then IT IS high-end, FFS.

There isn't any other nvidia card, is there? So, until nvidia releases a higher performer, the GTX 680 IS THE F** high-end, period.

It doesn't matter that nvidia did an amazing job with GK110; the fact of the matter is that IT DOES NOT EXIST yet, and when it is released AMD will have an answer. Maybe not a better performer, maybe, but it won't be like nvidia releasing that super duper hyper mega ultra uber monster GK110 while AMD only has the 7970 as its highest performer (single GPU). By that time AMD may already have the 8970, for all we know.
Yeah, that's called adjusting to the market, but it does not change the fact that this is Nvidia's mid-range GPU, and it WAS going to be priced at $300 before they realised they could price it higher.

We need to differentiate between a high-end card/SKU and a high-end chip. It's not the same thing. AMD's X2 cards were high-end cards made of two mid-range GPUs. That is what their "small die" strategy is about. Apparently now Nvidia will do the same, except with one GPU lol.
Posted on Reply
#83
m1dg3t
GC_PaNzerFINI think people always have too high expectations. The chip is quite a bit smaller than Tahiti, so it is a great feat if it even gets on par with HD 7970 performance. Like others, I'm interested to see if it is also better in perf/W than Tahiti.
I never have expectations of anything :laugh: I've been around enough GFX card releases to know that 99% of the time it's nothing special. Maybe every 3 - 5 years there's cause to get excited, because you KNOW the tech/performance is going to be massive, but between cards 1 - 3 "series" apart the differences aren't great enough for me to get my panties wet :laugh:

I was only expecting more of it because of all the "great" HYPE this card got! IF Nvidia did cut out the top-performing card and send out its mid-range chip in its place, that is a TOTAL low blow to the customers. If they have/had a monster they should have put it on the shelves. Lord knows they always charge top $$$, so you might as well get top performance. And when I say "top performance" I don't mean 5% - 10% more :o
Posted on Reply
#84
thematrix606
the54thvoidThis is the correct picture from the source... (not killing the 7970): www.hkepc.com/7672/page/6#view

Just realised...

772 -> 1006 = 30% core clock increase

4008 -> 6008 = 50% memory clock increase

54.2 -> 72.2 fps = 33% fps boost

Now, given the clocks are 30% higher and there is likely an AA algorithm at work, I've just had a terrible dawning: "it's not a super performer at all". The 28nm process simply allows for faster clocks, and with some clever tech helping out the AA (which isn't a bad thing), it takes the biscuit.

But I'm just thinking: if those 1 GHz core clocks and that memory speed are true, then it's simply a logical performance increase based on clocks, not architecture.
A 7970 can't reach that FPS in BF3 on average. Impossible.
Posted on Reply
#85
jpierce55
BenetanegiaYeah, that's called adjusting to the market, but it does not change the fact that this is Nvidia's mid-range GPU, and it WAS going to be priced at $300 before they realised they could price it higher.

We need to differentiate between a high-end card/SKU and a high-end chip. It's not the same thing. AMD's X2 cards were high-end cards made of two mid-range GPUs. That is what their "small die" strategy is about. Apparently now Nvidia will do the same, except with one GPU lol.
Yet nobody knows for 100% certain whether it WAS intended to be the high end. They MAY be leaving GK100 for the X2-style card. Or maybe they called it 104 to get fanboys excited about how the mid-range card kicked AMD's butt so hard that they didn't even need to release the big one :rolleyes:
Posted on Reply
#86
m1dg3t
jpierce55Or maybe they called it 104 to get fanboys excited about how the mid-range card kicked AMD's butt so hard that they didn't even need to release the big one :rolleyes:
Ah, the ol' bait 'n' switch! Wouldn't be the first time (or the last) for shady tactics, Nvidia-style :ohwell:
Posted on Reply
#87
N3M3515
BenetanegiaYeah, that's called adjusting to the market, but it does not change the fact that this is Nvidia's mid-range GPU, and it WAS going to be priced at $300 before they realised they could price it higher.

We need to differentiate between a high-end card/SKU and a high-end chip. It's not the same thing. AMD's X2 cards were high-end cards made of two mid-range GPUs. That is what their "small die" strategy is about. Apparently now Nvidia will do the same, except with one GPU lol.
Doesn't matter man: "WAS" is past tense. The GTX 680 is still high-end.
Posted on Reply
#88
the54thvoid
Intoxicated Moderator
3DMark11 Extreme - 52.3%
3DMark11 Performance - 41.4%
BF3 - 33%
Batman AC - 35.7%
COD MW3 - 10%
Heaven Benchmark - 33.2%
Lost Planet 2 - 34.4%

This is how much faster it is than the GTX 580. Not bad at all for the 'mainstream card'. But again, those clock speeds make a big difference.

Given that a 14% clock increase on a GTX 580 Lightning resulted in a 10% higher 3DMark11 Performance score in this review (www.guru3d.com/article/msi-geforce-gtx-580-lightning-review/20), you could argue that an X% clock increase gives roughly a 0.7X% increase in performance.

So, with nothing else considered, the architecture explains about 30% of the improvement while the higher clocks make up about 70%. Or in layman's terms, more than 2/3 of the extra fps from the GTX 680 is due to higher clocks.

These are simple percentages based on the figures, not opinions and not conjecture. I know folk will argue, but I don't care; to me, the clocks make the biggest impact.

Need a good W1zz review to make it all clear.
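
Post #88's clock-versus-architecture split can be reproduced with a few lines of Python. This is only a sketch of the poster's back-of-the-envelope model: the ~0.7 scaling factor is his inference from the Guru3D Lightning review, and the BF3 figures come from the HKEPC graphs; treat it as an illustration, not a measurement:

```python
# Decompose the GTX 680's BF3 gain over the GTX 580 into a "clocks" part
# and an "everything else" (architecture, AA tech) part, using the ~0.7
# perf-per-clock scaling factor assumed in post #88.

CLOCK_SCALING = 0.7  # assumption: 1% more core clock -> ~0.7% more performance

core_gain = (1006 / 772 - 1) * 100    # ~30.3% higher core clock
total_gain = (72.2 / 54.2 - 1) * 100  # ~33.2% higher BF3 fps

clock_part = core_gain * CLOCK_SCALING  # gain attributable to clock speed
arch_part = total_gain - clock_part     # remainder attributed to architecture

print(f"total fps gain:       +{total_gain:.1f}%")
print(f"from clocks:          +{clock_part:.1f}% ({clock_part / total_gain:.0%} of the gain)")
print(f"from everything else: +{arch_part:.1f}% ({arch_part / total_gain:.0%} of the gain)")
```

With those inputs, clocks account for roughly two thirds of the measured gain, in line with the post's estimate.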
Posted on Reply
#89
m1dg3t
the54thvoidNeed a good W1zz review to make it all clear.
Exactly! So stop this:
the54thvoid3DMark11 Extreme - 52.3%
3DMark11 Performance - 41.4%
BF3 - 33%
Batman AC - 35.7%
COD MW3 - 10%
Heaven Benchmark - 33.2%
Lost Planet 2 - 34.4%

This is how much faster it is than the GTX 580. Not bad at all for the 'mainstream card'. But again, those clock speeds make a big difference.

Given that a 14% clock increase on a GTX 580 Lightning resulted in a 10% higher 3DMark11 Performance score in this review (www.guru3d.com/article/msi-geforce-gtx-580-lightning-review/20), you could argue that an X% clock increase gives roughly a 0.7X% increase in performance.

So, with nothing else considered, the architecture explains about 30% of the improvement while the higher clocks make up about 70%. Or in layman's terms, more than 2/3 of the extra fps from the GTX 680 is due to higher clocks.

These are simple percentages based on the figures, not opinions and not conjecture. I know folk will argue, but I don't care; to me, the clocks make the biggest impact.
Thanks for another post :)
Posted on Reply
#90
the54thvoid
Intoxicated Moderator
I like numbers...
And I like reviews...
It gives me the right to post things and then demand validation of my thoughts.

Thanks for giving me an excuse to reply!! :laugh:

Okay, next question for Btarunr and W1zz.

WHEN IS THE FRICKIN' NDA UP?
Posted on Reply
#91
Benetanegia
jpierce55Yet nobody knows for 100% certain whether it WAS intended to be the high end. They MAY be leaving GK100 for the X2-style card. Or maybe they called it 104 to get fanboys excited about how the mid-range card kicked AMD's butt so hard that they didn't even need to release the big one :rolleyes:
Sure, you can find all the "if"s and "maybe"s you want, but for people who have nothing to fear, common sense prevails.

a) It looks like a mid-range chip and is named like a mid-range chip, but it was all part of a covert plan to confuse people (and badly lose the high end to AMD, had Tahiti performed as it should have -> comparably as well as Pitcairn).

b) It looks like a mid-range chip and is named like a mid-range chip because it IS a mid-range chip.

Occam's razor == b) ;)
N3M3515Doesn't matter man: "WAS" is past tense. The GTX 680 is still high-end.
We are discussing different things. I'm not discussing which price segment it belongs to now, but which chip in the Kepler line this is. Some people now even pretend that GK100/GK110 never existed and never will, but it does exist and will be released. Just because it may come several months later does not change the fact that it will come, that it will be Kepler, and that it will be bigger than GK104. Because a faster/bigger Kepler chip is going to be released, and was always planned to be released, GK104 is NOT, never was, and never will be a high-end chip. It cannot be high-end when there's something bigger on top of it. Period.
Posted on Reply
#92
HTC
the54thvoidI like numbers...
And I like reviews...
It gives me the right to post things and then demand validation of my thoughts.

Thanks for giving me an excuse to reply!! :laugh:

Okay, next question for Btarunr and W1zz.

WHEN IS THE FRICKIN' NDA UP?
That information is under NDA.
Posted on Reply
#94
m1dg3t
the54thvoidI like numbers...
And I like reviews...
It gives me the right to post things and then demand validation of my thoughts.

Thanks for giving me an excuse to reply!! :laugh:

Okay, next question for Btarunr and W1zz.

WHEN IS THE FRICKIN' NDA UP?
:roll:

NDA lift is the 22nd of March IIRC, and that's further reason why I don't FULLY believe/trust any of these released "stats" ;)
Hayder MasterWhat about if we run the 7970 at the same clocks??
If both cards were at the same clocks, I think the 7970 would just barely pass it (1% - 2%) or they would be dead even.
Posted on Reply
#95
btarunr
Editor & Senior Moderator
Fixed the Battlefield 3 slide.
Posted on Reply
#96
Benetanegia
the54thvoid3DMark11 Extreme - 52.3%
3DMark11 Performance - 41.4%
BF3 - 33%
Batman AC - 35.7%
COD MW3 - 10%
Heaven Benchmark - 33.2%
Lost Planet 2 - 34.4%

This is how much faster it is than the GTX 580. Not bad at all for the 'mainstream card'. But again, those clock speeds make a big difference.

Given that a 14% clock increase on a GTX 580 Lightning resulted in a 10% higher 3DMark11 Performance score in this review (www.guru3d.com/article/msi-geforce-gtx-580-lightning-review/20), you could argue that an X% clock increase gives roughly a 0.7X% increase in performance.

So, with nothing else considered, the architecture explains about 30% of the improvement while the higher clocks make up about 70%. Or in layman's terms, more than 2/3 of the extra fps from the GTX 680 is due to higher clocks.

These are simple percentages based on the figures, not opinions and not conjecture. I know folk will argue, but I don't care; to me, the clocks make the biggest impact.

Need a good W1zz review to make it all clear.
So basically, higher clocks are achieved with fairy dust and have nothing to do with architecture.

Hmm, that's weird, because isn't Nvidia the one with the fairy girl and AMD the one with Ruby? How come AMD got that fairy dust first, and for so many generations?!??! :mad::cool:
Posted on Reply
#97
the54thvoid
Intoxicated Moderator
Benetanegiaa) It looks like a mid-range chip and is named like a mid-range chip, but it was all part of a covert plan to confuse people (and badly lose the high end to AMD, had Tahiti performed as it should have -> comparably as well as Pitcairn).

b) It looks like a mid-range chip and is named like a mid-range chip because it IS a mid-range chip.
Yup.

GF 104 = GTX 460
GF 100 = oven, I mean GTX 480
GF 114 = GTX 560
GF 110 = GTX 580
GK 104 = GTX 660... whoops, GTX 680
GK 100 or 110 = GTX 680... whoops, GTX 780?
BenetanegiaSo basically, higher clocks are achieved with fairy dust and have nothing to do with architecture.

Hmm, that's weird, because isn't Nvidia the one with the fairy girl and AMD the one with Ruby? How come AMD got that fairy dust first, and for so many generations?!??!
Always so feisty, Ben :rolleyes: Process shrinks allow faster clocks due to all the power hoo-ha that happens at smaller scales. I can't explain it in tech terms, but I know it's a facet of process shrinking, up to a logical point. I know the architecture is different, but if 40 nm had allowed 1 GHz clocks, we'd be seeing far smaller improvements.

It's all moot anyway. I'm looking forward to seeing the pricing, and then I can decide what to get until GK100/110 comes out :cool:
Posted on Reply
#98
Hayder_Master
m1dg3tIf both cards were at the same clocks, I think the 7970 would just barely pass it (1% - 2%) or they would be dead even.
And if both of them were pushed to max overclock without a voltage increase, I think the 7970 would rock.
Posted on Reply
#99
btarunr
Editor & Senior Moderator
the54thvoidOkay, next question for Btarunr and W1zz.

WHEN IS THE FRICKIN' NDA UP?
According to 3DCenter.org, it's 22nd March.

Posted on Reply
#100
BlackOmega
Pathetic. It barely beats a 7970. But there's something awry with the results. The 7970 should've scored higher than 7700, especially with the CPU clocked as high as it is.
Posted on Reply