Monday, March 19th 2012

GeForce GTX 680 SLI Performance Surfaces

NVIDIA's big GeForce GTX 680 launch is just around the corner, but performance figures are already trickling in. Last week, we were treated to a wide range of benchmarks covering a single GeForce GTX 680. Today, VR-Zone posted a performance preview of the GeForce GTX 680 in a 2-way SLI configuration. A set of two GTX 680 cards was put through 3DMark 11 in its Entry, Performance, and eXtreme presets. It should be noted that the GTX 680 cards were clocked at 1150 MHz core and 1803 MHz (7.21 GHz effective) memory.

In the Entry preset, the GTX 680 2-way SLI setup scored E22878; it scored P16860 in the Performance preset, and X6243 in eXtreme. A 2-way SLI pairing of GTX 680 cards should be fit for gaming at 2560x1440/1600 resolutions. The rest of the test bench consisted of an Intel Core i7-3930K six-core processor clocked at 5.00 GHz, 16 GB of quad-channel DDR3-2133 memory, and an ASUS ROG Rampage IV Extreme motherboard.
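For readers wondering how a 1803 MHz memory clock becomes a "7.2 GHz effective" figure, here is a minimal sketch, assuming the usual GDDR5 convention of quoting an effective data rate four times the memory clock (the helper function name is our own, just for illustration):

```python
# GDDR5 transfers data on both edges of two clock signals, so the
# "effective" data rate marketed by vendors is 4x the memory clock.
def gddr5_effective_mhz(memory_clock_mhz: int) -> int:
    return memory_clock_mhz * 4

print(gddr5_effective_mhz(1803))  # 7212 MHz, i.e. ~7.2 GHz effective
```

The same arithmetic gives the GTX 680's rumored stock rate: 1502 MHz becomes 6008 MHz effective.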

Source: VR-Zone Chinese
Add your own comment

43 Comments on GeForce GTX 680 SLI Performance Surfaces

#1
the54thvoid
by: 20mmrain
"Look everybody! I just made my own video card brand! Call it 20mm's video cards.... The bad news is I was going to bring out a card 50 times more powerful than either the GTX 680 or HD 7970, but I was so disappointed with AMD's and NVIDIA's performance that I came out with one only at the same speed. Don't worry, it is the (WTF104) chip, which was supposed to be my low-end chip. But if you keep waiting I promise I will bring out that other 50 times more powerful chip later!!!"

*Cough BS Cough*
I'll have one! Does the new CBA WTF104 waterblock work on it? :D
Posted on Reply
#2
Dancop
Moderate OC...no voltage adjustments!

Everything enabled...especially things like tessellation!
Nobody knows if it's disabled in his screenshot!
Poor performance!
Posted on Reply
#3
Dj-ElectriC
Couldn't you find a more spam-ridden image sharing site? Next time use imgur, bro.

And I wouldn't call it poor; it looks good for what it's meant to be in the first place, hence the name GK104.
Posted on Reply
#4
Mindweaver
Moderato®™
by: W1zzard
huh what? i didn't have anything to do with this. what makes you think so ?
Your sarcasm meter is low today. :p hehehe I think what he is saying is that he only wants to see your review, and nothing else until then.. hehehe I won't believe any of this until I read your review either, but that doesn't mean I don't want to see other stuff. :toast: It's like trying to make everybody happy all the time.. Just not going to happen.. ehehehe:toast:
Posted on Reply
#5
Dancop
by: Dj-ElectriC
Couldn't you find a more spam-ridden image sharing site? Next time use imgur, bro.

And I wouldn't call it poor; it looks good for what it's meant to be in the first place, hence the name GK104.
Sorry bro...it's my favorite picture hoster...
Posted on Reply
#6
btarunr
Editor & Senior Moderator
by: W1zzard
huh what? i didn't have anything to do with this. what makes you think so ?
He simply means he doesn't like/trust benches other than yours. You must recommend Dubstep (things like Justin Bieber Skrillex remix) to him, for the lulz.
Posted on Reply
#7
ramcoza
:laugh:

To the people who are bashing NVIDIA: (what I comment here is according to rumors. I know that rumors can be fake or real, but I do not care; I don't want one of those anyway :cool:)

"NVIDIA needs an overclocked card to compete against the 7970"
Really? Then go back to the last-generation cards, where the GTX 570's core clock was 732 MHz and its memory clock was 3.8 GHz. There, AMD needed a higher-clocked card, the HD 6970 (880 MHz core, 5.5 GHz memory), to barely beat it, let alone the GTX 580. That is according to the same useless arguments going on everywhere under every upcoming NVIDIA card rumor post. AMD fanboys did not talk about the higher clocks AMD's cards had at that time, even though they had almost 150 MHz higher core clocks and 1.7 GHz higher memory clocks than NVIDIA's. But now they are screaming because the rumored GTX 680 has only about an 80 MHz core clock and 0.5 GHz memory clock advantage over the 7970. What ridiculous minds you all have. The most ridiculous thing is that they are fighting over something that doesn't even exist yet. No one knows the actual specs, performance, or power consumption. Then why? I think they could not bear the unofficial leaks. They even know those leaks can be fake, yet they couldn't take it. Funny enough.. :roll:

"NVIDIA overclocked a mid-end card and priced it at high-end level"
Another funny thought. First of all, according to rumors, the reference base clock is 1006 MHz, which means it is not an overclocked card. We could only consider it overclocked under one circumstance: if it could not be overclocked any further to match the performance of an OCed 7970. Furthermore, if you think its performance is merely what you consider mid-end, then you have to bash AMD for these high prices, not NVIDIA, because the 7970's performance is also on par with the GTX 680's (a mid-end performance card, according to you), if not lower (according to rumor, again). Moreover, AMD released the 7970 first and fixed the price for it. So why would you think NVIDIA has to price this card at $300? If you need this level of performance on a $300 card, simply petition AMD to lower the 7970's price to $300. I have a funny feeling that the people screaming under such posts are the ones who want an AMD 7970 for a $300 - $400 price tag and cannot afford one at the current price. My advice to you is to simply buy a Pitcairn or last-gen card from your beloved AMD. If you are still thinking that NVIDIA has to price this card below $400 (even $500) so that AMD will drop prices on the Tahiti cards and you can buy one, please erase those thoughts, because that will only happen in your dreams, if the rumored GTX 680's (or whatever they call it) performance is correct as they said. ;)
Posted on Reply
#8
badtaylorx
Quite frankly I'm more excited about the GK104 than I am the GK110....the x60 line has been a notorious over-achiever..... the only thing that sux is that they're selling it @ GK110 prices :banghead:
Posted on Reply
#9
OneCool
I don't know if anyone noticed, but there is one pissed off wiener dog in this thread.
Posted on Reply
#10
Fairlady-z
I am set with my CrossFire 7970 setup and my EK blocks come in today, but the GF is now wanting to dump her two MSI GTX 460 Cyclones for a 680 lol....guess I will be playing with a new card again :D

For whatever reason she thinks Borderlands 2 won't run on her cards... she is on crack.
Posted on Reply
#11
Dj-ElectriC
Try to explain to her that UE3 + cel shading = a 9600GT feels like a boss
Posted on Reply
#12
Fairlady-z
by: Dj-ElectriC
Try to explain to her that UE3 + cel shading = a 9600GT feels like a boss
She claims she is a noob, but seems to always want to trump my system. lol she is lucky she is hot lol.
Posted on Reply
#13
TurdFergasun
by: W1zzard
huh what? i didn't have anything to do with this. what makes you think so ?
i'm not saying you did, i was stating that none of this information is relevant in any way until your review is out.:D
Posted on Reply
#14
Prima.Vera
Can we get a proper review now please? Getting tired of those "leaked" tests & stuff... :nutkick:
Posted on Reply
#15
HumanSmoke
by: badtaylorx
quite frankly im more exited about the gk104 than i am the gk110....
So you should be:
1. The GTX 680 launch is imminent; the GK110 is not.
2. If AMD/NVIDIA deem this level of performance worthy of $550, what price tag is the GK110 likely to attract if it offers 50% more?
2a. I doubt that the GK110, even at high (for a gaming card) prices, would be sold in quantity as a gaming card. Historically, NVIDIA's large-die GPU is aimed at HPC and the pro market (Tesla & Quadro), where the ASPs are MUCH higher, and NVIDIA likely already has a sizeable chunk of production earmarked, if this is anything to go by.
Posted on Reply
#16
theeldest
Hey Wiz,

Would it be possible to start pushing higher resolutions? My feeling is that the 7970 gets a bigger performance lead over the 7870 in Eyefinity setups (three 1920x1200 displays give about 69% more pixels than a single 2560x1600). And with prices dropping on monitors, I think more and more members of these forums actually run 3 monitors vs. single super hi-res displays.

Also, as NVIDIA and AMD will both natively support 3-monitor setups on a single card, I think it would fit nicely in the reviews.
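The pixel math behind that comparison can be sketched quickly (the helper function is hypothetical, just to check the numbers):

```python
def pixels(width: int, height: int) -> int:
    """Total pixel count for one display of the given resolution."""
    return width * height

# Three 1920x1200 panels in surround vs. a single 2560x1600 panel
surround = 3 * pixels(1920, 1200)   # 6,912,000 pixels
single = pixels(2560, 1600)         # 4,096,000 pixels
print(f"{surround / single - 1:.0%} more pixels")  # ~69% more
```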
Posted on Reply
#17
Red_or_Dead
by: W1zzard
huh what? i didn't have anything to do with this.
i think that was his point
Posted on Reply
#18
OneCool
People just do not understand sarcasm AT ALL.

No sense of humor either.


sad :shadedshu
Posted on Reply