Monday, March 19th 2012

GIGABYTE GeForce GTX 680 Graphics Card Pictured

Christmas came early for Overclock.net forum member "ironman86", who showed off his swanky new GeForce GTX 680 graphics card, branded by GIGABYTE (model: GV-N680D5-2GD-B). The card sticks to the NVIDIA reference design by the book, except of course for a sci-fi sticker on the cooler shroud and fan: a futuristic art piece carrying the GIGABYTE and GTX 680 markings (the "8" is obscured by the protective plastic film). The card indeed draws power from two 6-pin power connectors, settling the power-connector debate once and for all, since this is the first picture of a retail-channel GTX 680.
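For readers keeping score at home, the connector layout itself bounds the board power. A quick back-of-the-envelope check (a sketch in Python; the 75 W figures are the standard PCI-SIG limits, and the 195 W TDP is the number circulating in the rumor mill, not a confirmed spec):

```python
# Power ceiling implied by the connector layout (standard PCIe limits).
SLOT_W = 75          # PCIe x16 slot supplies up to 75 W
SIX_PIN_W = 75       # each 6-pin PEG connector supplies up to 75 W
RUMORED_TDP_W = 195  # TDP figure from the rumor mill (unconfirmed)

ceiling_w = SLOT_W + 2 * SIX_PIN_W
print(f"Max board power with 2x 6-pin: {ceiling_w} W")              # 225 W
print(f"Headroom over rumored TDP: {ceiling_w - RUMORED_TDP_W} W")  # 30 W
```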

Source: Overclock.net

88 Comments on GIGABYTE GeForce GTX 680 Graphics Card Pictured

#1
Bjorn_Of_Iceland
That i3 bench has the i3 bottlenecking it, I guess... I can get 53-something FPS with this 775 quad and the GTX 580.
Posted on Reply
#4
Delta6326
I think the 7970 and 680 are going to be neck and neck.
Posted on Reply
#5
MxPhenom 216
Corsair Fanboy
by: amdftw
Confirmed! The stock clock is 706/1502 MHz.
That way it is easy for the GTX 680 to hit a 195 W TDP.
CONFIRMED!

......what?

Where in the article does it say the stock clock is 706 MHz on the core lol! :laugh:


EDIT: I will definitely be getting one of these cards around summer time.
Posted on Reply
#6
15th Warlock
by: st.bone
DOES IT SUPPORT 4K and DirectX 11.1?

Yes it does matter, coz if it doesn't support 4K I won't buy it, and if it doesn't do DX11.1 I still won't buy it; I will get an AMD Radeon HD 7xxx, which does support all this. So basically, to me, if it doesn't support these features it's a worthless card, and generally speaking it's not fast, coz without the same features it would just be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the NVIDIA 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.
by: st.bone
4K or 3K I do not intend for gaming, mostly for photo editing
I don't know if you're trolling... :wtf::shadedshu
Posted on Reply
#7
amdftw
by: nvidiaintelftw
CONFIRMED!

......what?

Where in the article does it say the stock clock is 706 MHz on the core lol! :laugh:


EDIT: I will definitely be getting one of these cards around summer time.
Can you read this?
The GPU-Z capture.
Posted on Reply
#8
MxPhenom 216
Corsair Fanboy
by: 15th Warlock
I don't know if you're trolling... :wtf::shadedshu
God, yeah, wtf.

DX10.1 didn't do shit for ATI cards. The GTX 260 Core 216 was still beating them.
Posted on Reply
#9
MxPhenom 216
Corsair Fanboy
by: amdftw
Can you read this?
So then, according to some other stuff that has been released, that means they are capable of a 63% overclock.

That guy's i3 is holding the 680 back a bit, but it still beat another guy's 7970 at 706 MHz, unless the dynamic clock stuff took it to 1006 MHz. Either way, once drivers are better (since the minimum FPS seems a bit low) I think this card will take off.
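For what it's worth, the arithmetic behind those two numbers (a sketch; 706 MHz is the clock from the leaked GPU-Z shot and 1006 MHz is the rumored dynamic clock, neither confirmed):

```python
base_mhz = 706      # clock reported by the leaked GPU-Z shot
dynamic_mhz = 1006  # rumored dynamic clock from this thread

# If the dynamic clock alone kicks in, the bump is about 42%...
boost_pct = (dynamic_mhz - base_mhz) / base_mhz * 100
print(f"706 -> 1006 MHz is a {boost_pct:.0f}% bump")

# ...while the claimed 63% overclock from 706 MHz would land near 1151 MHz.
oc_mhz = base_mhz * 1.63
print(f"A 63% OC from 706 MHz would be ~{oc_mhz:.0f} MHz")
```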
Posted on Reply
#10
okidna
I notice that the driver version is different.
300.99 in the earlier SLI leak and 300.65 on this Gigabyte retail card.
Posted on Reply
#11
semantics
by: okidna
I notice that the driver version is different.
300.99 in the earlier SLI leak and 300.65 on this Gigabyte retail card.
Not that surprising that a retail box would have old drivers on a new card that will likely go through several driver updates in the months following release. Gotta pre-package that stuff weeks in advance.
Posted on Reply
#13
amdftw
by: Crap Daddy
That version of GPU-Z DOES NOT SUPPORT KEPLER. So we don't really know.
So you cannot read...
He said the NV control panel showed the same clocks!
Posted on Reply
#14
Crap Daddy
by: amdftw
So you cannot read...
He said the NV control panel showed the same clocks!
OK, wise guy, look here, same forum, same thread, another apparent owner of a GTX 680 a little later:

http://i43.tinypic.com/2am93r.jpg

The right GPU-Z version.

http://i41.tinypic.com/34etiu0.jpg
Posted on Reply
#15
Protagonist
by: Crap Daddy
OK, wise guy, look here, same forum, same thread, another apparent owner of a GTX 680 a little later:

http://i43.tinypic.com/2am93r.jpg

The right GPU-Z version.

http://i41.tinypic.com/34etiu0.jpg
Thanks for this, at least my mind is at peace knowing it supports DX11.1 according to GPU-Z 0.6.0. Now just to confirm whether it supports 4K; if so, I might still go NVIDIA for my next GPU upgrade, though the AMD Radeon HD 7970 seems appealing.
Posted on Reply
#16
Crap Daddy
I really don't think that's important right now. Things are really heating up over at Overclockers.
Look at the Unigine result and compare.
Posted on Reply
#17
ChristTheGreat
There is one guy showing the card scoring about 2000 in the Heaven benchmark... and another guy showed a GTX 580 SLI score of 2200...
Posted on Reply
#18
MxPhenom 216
Corsair Fanboy
by: Crap Daddy
I really don't think that's important right now. Things are really heating up over at Overclockers.
Look at the Unigine result and compare.
linky. I want to watch/read it.
Posted on Reply
#20
Crap Daddy
So GPU-Z reads: 294 mm² and release date: Mar. 22, 2012.

Soon...
Posted on Reply
#21
Casecutter
by: nvidiaintelftw
So then, according to some other stuff that has been released, that means they are capable of a 63% overclock.
Yep! That's what the whole Adaptive V-Sync (also what was termed dynamic profiles) does... it manipulates the GPU/memory clocks, and stimulates or restrains sections of CUDA cores as needed, to keep frame-rate transitions "fluid and/or organic." Basically, when a 3D scene starts dropping the frame rate below the monitor refresh rate (or pushing it above), it augments or limits those resources, trying to stay close to 60 FPS. As the scene loads the GPU, the chip/card provides exactly as much energy as each scene requires for smooth frame rates. So within milliseconds the card dynamically juggles various resources against preset profiles.

What's nice is that NVIDIA no longer needs to supply a cooling system built to handle constant spikes at, say, a 40-50% top overclock; its thermal (BTU) capacity can be scaled back, because the profiles will chiefly maintain the lowest clock the 3D load and required FPS allow. A 40-50% boost might last a few seconds and then drop back down. That permits a theoretically higher max TDP, because the card turns the heat up that high only infrequently. The only downside is that the conventional overclocking that enthusiasts are used to may no longer be there: with Adaptive V-Sync disabled, the chip runs a consistent 160 W TDP, OC'd at, say, 775 MHz.
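For what it's worth, the behavior described above boils down to a small feedback loop. A minimal sketch (purely illustrative; the clock range, 60 FPS target, and step size are assumptions drawn from this thread, not NVIDIA's actual algorithm):

```python
# Illustrative dynamic-clock governor; all constants are assumptions.
BASE_MHZ, MAX_MHZ = 706, 1150  # rumored base clock and OC ceiling
TARGET_FPS = 60                # assumed monitor refresh rate
STEP_MHZ = 13                  # arbitrary adjustment granularity

def adjust_clock(current_mhz: int, measured_fps: float) -> int:
    """Nudge the GPU clock toward the FPS target, within clock limits."""
    if measured_fps < TARGET_FPS:
        return min(current_mhz + STEP_MHZ, MAX_MHZ)   # boost when below target
    if measured_fps > TARGET_FPS:
        return max(current_mhz - STEP_MHZ, BASE_MHZ)  # back off to save heat
    return current_mhz

# Example: a demanding scene drops the frame rate, the clock ratchets up.
clock = BASE_MHZ
for fps in (45, 50, 55, 58, 62, 60):
    clock = adjust_clock(clock, fps)
    print(f"fps={fps:>2} -> clock={clock} MHz")
```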
Posted on Reply
#23
alexsubri
I'm still worried about this... it's 256-bit and not 384-bit, it's 2 GB and not 3 GB, so if you were to run NVIDIA Surround (NVIDIA's answer to Eyefinity) you won't get the best bang for your buck. For a price 15% higher than a 7970's, it had better damn well do better than a 7970! I just don't see the comparison that much here... BTW, what's with the 6-pin connectors?
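The bus-width worry is really a memory-bandwidth worry, and it's easy to put numbers on it. A quick comparison (the HD 7970's 5.5 Gbps GDDR5 is its published spec; the 6 Gbps for the GTX 680 is the rumored figure, not confirmed):

```python
def mem_bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak GDDR5 bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_bits / 8 * data_rate_gbps

print(mem_bandwidth_gb_s(256, 6.0))  # GTX 680 (rumored): 192.0 GB/s
print(mem_bandwidth_gb_s(384, 5.5))  # HD 7970 (published): 264.0 GB/s
```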
Posted on Reply
#24
[H]@RD5TUFF
by: alexsubri
I'm still worried about this... it's 256-bit and not 384-bit, it's 2 GB and not 3 GB, so if you were to run NVIDIA Surround (NVIDIA's answer to Eyefinity) you won't get the best bang for your buck. For a price 15% higher than a 7970's, it had better damn well do better than a 7970! I just don't see the comparison that much here... BTW, what's with the 6-pin connectors? http://www.techpowerup.com/forums/attachment.php?attachmentid=46289&stc=1&d=1332212566
It's not fugly at all. Also, why would you care how the PCI-E power ports look? You will never see them when the card is plugged in; that's a pretty lame gripe... but to each their own, I suppose.
Posted on Reply