Monday, March 19th 2012
GIGABYTE GeForce GTX 680 Graphics Card Pictured
Christmas came early for Overclock.net forum member "ironman86", who showed off his swanky new GeForce GTX 680 graphics card, branded by GIGABYTE (model: GV-N680D5-2GD-B). The card sticks to the NVIDIA reference design by the book, except, of course, for a sci-fi sticker on the cooler and fan: a futuristic art piece with GIGABYTE and GTX 680 branding (the "8" obscured by the protective plastic film). The card indeed draws power from two 6-pin power connectors, settling the power-connector debate once and for all, since this is the first picture of a retail-channel GTX 680.
Source:
Overclock.net
88 Comments on GIGABYTE GeForce GTX 680 Graphics Card Pictured
And since Unigine is a tessellation-heavy benchmark, it makes a huge difference to optimize this value.
Simple Google search. First result yielded that. Not sure what you mean by 4K, though. OK, now this just sounds to me like someone who is brain-dead and just wants the latest and greatest, and if it has a .1 of a difference, that must mean it's super better than the regular non-.1 version, because they really have no tech knowledge to begin with. And even though those cards didn't have that .1, didn't NVIDIA still whoop their ass?
I dunno, what's the point of placing the connectors like that?
In EVGA We TRUST
Don't get bummed about not getting high-end looks on that card.
"In EVGA We TRUST" ? makes no sense.
I will trust whoever gives me a good product.
Can we continue using the normal maximum/average/minimum way of measuring performance when 60 FPS is always the target? From what I see, this thing promotes the "average" and nothing higher. It's a way of unearthing efficiency and lowering thermals, while giving it a shot of nitrous when it's not keeping up with the mean. So will all the old data and graphs necessarily translate to the apples-to-apples comparisons we've had in the past? Will it look/play great... in theory yes, though will Adaptive V-Sync introduce new glitchiness in place of the traditional lag and spikes as we've known them? Averaging 60 FPS will not look all that different from, say, a 7970, which by the old way of measuring an average might hit 80 FPS (that's what NVIDIA is betting on). This, in my mind, kills the whole "I'm fastest" and changes it to "I'm the same at providing the average", because no one can tell the difference. Kind of the old "more than a mouthful is a waste"!
I don't know what this now means, but traditional testing like W1zzard has been doing may well have very little merit any longer. We might need more graphs like [H]ard|OCP has done, those "spiky" frame-time graphs of the games as played; although now there will be this slightly wavy green line hugging right at 60 FPS. Well, that's boring, and it might really diminish graphics card reviews as we know them: sure, it plays BF3... 60 FPS.
It feels like NVIDIA took the ball and ran into left field and is saying, "the game as you knew it has changed". Which isn't wrong or bad, but it really feels like the old way of figuring out the best card has changed... while the price remains the same! Now, here's a question: why would there even be enthusiast cards anymore? In theory, any GPU that's strong enough to play the newest or most demanding titles, as long as it can clock fast enough for the few milliseconds needed to keep the game from dipping below 60 FPS, is all the newest offerings need to be. The new mantra will be "we can render that @ XXXX resolution with this level/type of AA" (as in NVIDIA's TXAA anti-aliasing algorithm). It's no longer about being fastest or faster. So if AMD takes Cape Verde and does this, are we all okay with it?
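For illustration only (not from the original post), here is a minimal Python sketch of the difference between the two reporting styles discussed above: a single average-FPS figure versus a frame-time view of the kind [H]ard|OCP publishes. The frame-time values are made-up sample data; the point is simply that an average near 60 FPS can hide the spikes a frame-time graph would show.

```python
# Illustrative sketch: min/avg/max FPS vs. frame-time percentiles.
# The frame times below are invented sample data, mostly ~16.7 ms (60 FPS)
# with a few spikes mixed in.

frame_times_ms = [16.7] * 95 + [33.4, 40.0, 16.7, 50.0, 16.7]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Classic review-style numbers.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)
max_fps = max(fps_per_frame)

# Frame-time percentile: exposes stutter that the average hides.
sorted_times = sorted(frame_times_ms)
p99_frame_time = sorted_times[int(0.99 * (len(sorted_times) - 1))]

print(f"avg {avg_fps:.1f} FPS, min {min_fps:.1f}, max {max_fps:.1f}")
print(f"99th-percentile frame time: {p99_frame_time:.1f} ms "
      f"(~{1000.0 / p99_frame_time:.0f} FPS during the worst 1% of frames)")
```

With this made-up data the average comes out near 57 FPS, which looks fine on a bar chart, while the worst 1% of frames run closer to 25 FPS, which is exactly the kind of detail a flat "averages at 60 FPS" comparison would hide.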
:twitch: