Monday, March 19th 2012

GIGABYTE GeForce GTX 680 Graphics Card Pictured

Christmas came early for Overclock.net forum member "ironman86", who showed off his swanky new GeForce GTX 680 graphics card, branded by GIGABYTE (model: GV-N680D5-2GD-B). The card sticks to the NVIDIA reference design by the book, except of course for a sci-fi sticker on the cooler shroud and fan: a futuristic art piece with GIGABYTE and GTX 680 branding (the "8" obscured by the protective plastic film). The card indeed draws power from two 6-pin power connectors, settling the power-connector debate once and for all, since this is the first picture of a retail-channel GTX 680.
Source: Overclock.net

88 Comments on GIGABYTE GeForce GTX 680 Graphics Card Pictured

#76
MxPhenom 216
ASIC Engineer
jaredpace: 7970 OC at 1005/1500 (default CCC)
img20.imageshack.us/img20/2950/heaven2012032008491694.jpg

GTX680 stock at 1006/1500 with boost
img853.imageshack.us/img853/4920/680m.jpg

7970 OC at 1005/1500 (with AMD Optimized Tessellation selected in CCC)
img706.imageshack.us/img706/1745/heaven2012032009023656.jpg

www.xtremesystems.org/forums/showthread.php?277763-%93Kepler%94-Nvidia-GeForce-GTX-780&p=5071228&viewfull=1#post5071228
AMD Optimized Tessellation? :laugh:
Posted on Reply
#78
alexsubri
[H]@RD5TUFF: It's not fugly at all. Also, why would you care how the PCI-E power ports look? You will never see them when the card is plugged in; that's a pretty lame gripe... but to each their own, I suppose.
I'm more into Graphical Design :)
Posted on Reply
#80
semantics
Benchmarks put unrealistic and nonsensical amounts of load on GPU tasks; who would have known :D
Posted on Reply
#81
Rahmat Sofyan
...
TRWOV: Imagine if that was an actual CCFL :D
hahaha...LOL, it'll increase the heat I guess :laugh:
Posted on Reply
#82
CrAsHnBuRnXp
st.bone: DOES IT SUPPORT 4K and DirectX 11.1?
Was the bold necessary?
The new piece of information is that Kepler will support PCIe Gen 3 and DirectX 11.1, so it’s all set for Windows 8.
Source

Simple Google search. First result yielded that. Not sure what you mean with 4K though.
st.bone: DOES IT SUPPORT 4K and DirectX 11.1?

Yes, it does matter, because if it doesn't support 4K I won't buy it, and if it doesn't support DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which does support all of this. Basically, to me, if it doesn't support these features it's a worthless card, and generally speaking it's not fast; without those features it would just be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the NVIDIA 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.
OK, now this just sounds to me like someone who is brain dead and just wants the latest and greatest, and thinks that if it has a .1 difference it must be way better than the regular non-.1 version, because they really have no tech knowledge to begin with.
Kind of like how the NVIDIA 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.
And even though those cards didn't have that .1, didn't NVIDIA still whoop their ass?
Posted on Reply
#83
ValenOne
CrAsHnBuRnXp: And even though those cards didn't have that .1, didn't NVIDIA still whoop their ass?
NV's DX10.x kitbash has some DX10.1-like features.
Posted on Reply
#84
micropage7

I dunno, what's the point of placing the connectors like that?
Posted on Reply
#86
HuLkY
The design SUCKS! What the hell is this "GeForce GTX" thing? At least bring back the old GTX 280 glowing LED :(, and no backplate?!

In EVGA We TRUST
Posted on Reply
#87
dj-electric
People must understand that although the GTX 680 is a high-end card, its replacement will come relatively fast. It's just NVIDIA's quick answer to the HD 7900.
Don't get bummed about not getting high-end looks on this card.
"In EVGA We TRUST"? Makes no sense.
I will trust whoever gives me a good product.
Posted on Reply
#88
Casecutter
micropage7: I dunno, what's the point of placing the connectors like that?
Probably has to do with freeing up real estate on the PCB and traces. The card has the construction of $250-300 worth of parts and components, but with that you get firmware/software optimizations that aim to keep gameplay at 60 FPS... right?

Can we continue using the normal maximum/average/minimum way of calculating performance when 60 FPS is always the target? From what I see, this thing promotes the "average" and nothing higher. It's a way of unearthing efficiency and lowering thermals, while giving it a shot of nitrous when it's not keeping up with the mean! So will all the old data and graphs necessarily translate apples-to-apples as provided in the past? Will it look/play great? In theory yes, though will Adaptive V-Sync sire new glitchiness over the traditional lag and spikes as we've known them in the past? Averaging at 60 FPS will not look all that different from, say, a 7970, which under the old way of looking at an average hits, say, 80 FPS (that's what NVIDIA is betting on). In my mind this really stops the whole "I'm fastest" and changes it to "I'm the same at providing the average", because no one can tell the difference. Kind of the old "more than a mouthful is a waste"!
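
To put rough numbers on that point, here is a minimal Python sketch; the per-frame figures and "cards" are made up purely for illustration, not measurements of any real hardware. Once a 60 FPS cap is applied, a card that could average around 80 FPS uncapped and one that barely clears 60 end up reporting essentially the same average.

# Rough sketch of the point above (made-up per-frame numbers, not measured data):
# once both cards are held to a 60 FPS ceiling, their reported averages converge,
# even though one of them could have averaged ~80 FPS uncapped.

def average_fps(frame_rates, cap=None):
    """Average FPS over a run, optionally with a V-Sync-style per-frame cap."""
    if cap is not None:
        frame_rates = [min(f, cap) for f in frame_rates]
    return sum(frame_rates) / len(frame_rates)

card_a = [95, 88, 72, 81, 64, 80]   # hypothetical "fast" card, ~80 FPS uncapped
card_b = [66, 61, 63, 60, 62, 65]   # hypothetical card that just clears 60 FPS

print(average_fps(card_a), average_fps(card_b))          # ~80.0 vs ~62.8 uncapped
print(average_fps(card_a, 60), average_fps(card_b, 60))  # both exactly 60.0 capped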

I don’t know what this now means, but traditional testing like W1zzard has been doing may well have very little merit any longer. We might need more graphs like the "spiky" ones [H]ard|OCP has done as the games are played; although now there will be this slightly wavy green line hugging right at 60 FPS. Well, that’s boring, and it might really minimize graphics card reviews as we know them: sure, it plays BF3... 60 FPS.

It feels like NVIDIA took the ball, ran into left field, and is saying "the game as you knew it has changed." Which isn’t wrong or bad, but it really feels like the old way of figuring out the best card has changed... while the price remains the same! Now, here’s a question: why even have enthusiast cards now? In theory, any GPU that’s strong enough to game the newest or most demanding titles, as long as it can clock up fast enough for the few milliseconds needed to keep the game from dipping below 60 FPS, is all the best newest offerings need to be. The new mantra will be "we can render that @ XXXX resolution with this level/type of AA" (as in NVIDIA’s TXAA anti-aliasing algorithm). It’s no longer about being fastest or faster. So if AMD takes Cape Verde and does this, are we all okay with it?
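
As a toy illustration of that last point, here is a rough Python sketch of "boost only for the frames that would otherwise miss 60 FPS". The clock numbers, the inverse-scaling assumption, and the thresholds are all hypothetical; this is not NVIDIA's actual GPU Boost logic.

# Toy sketch only: hypothetical base/boost clocks and a crude scaling model,
# not NVIDIA's actual GPU Boost algorithm.
BASE_MHZ, MAX_BOOST_MHZ, TARGET_MS = 1006, 1110, 1000.0 / 60  # ~16.7 ms per frame

def clock_for_frame(predicted_ms_at_base):
    """Pick the lowest clock that keeps a frame under the 60 FPS budget,
    assuming frame time scales inversely with clock (a rough approximation)."""
    if predicted_ms_at_base <= TARGET_MS:
        return BASE_MHZ                    # frame already makes the deadline
    needed = BASE_MHZ * predicted_ms_at_base / TARGET_MS
    return min(needed, MAX_BOOST_MHZ)      # boost just enough, up to the cap

for ms in (12.0, 16.0, 18.5, 22.0):        # made-up per-frame render times at base clock
    print(ms, "->", round(clock_for_frame(ms)), "MHz")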

:twitch:
Posted on Reply