
GIGABYTE GeForce GTX 680 Graphics Card Pictured

7970 OC at 1005/1500 (default CCC)
heaven2012032008491694.jpg


GTX680 stock at 1006/1500 with boost
680m.jpg


7970 OC at 1005/1500 (with AMD optimized Tesselation selected in CCC)
heaven2012032009023656.jpg


http://www.xtremesystems.org/forums...orce-GTX-780&p=5071228&viewfull=1#post5071228
 
It's not fugly at all. Also, why would you care how the PCI-E power ports look when you'll never see them once the card is plugged in? That's a pretty lame gripe... but to each their own, I suppose.

I'm more into Graphical Design :)
 
Benchmarks put unrealistic and nonsensical amounts of load on GPU tasks, who would have known :D
 
DOES IT SUPPORT 4K and DirectX 11.1?

Was the bold necessary?

The new piece of information is that Kepler will support PCIE Gen 3 and DirectX 11.1, so it’s all set for Windows 8.

Source

Simple Google search. First result yielded that. Not sure what you mean by 4K, though.

DOES IT SUPPORT 4K and DirectX 11.1?

Yes, it does matter, because if it doesn't support 4K I won't buy it, and if it doesn't support DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which does support all of this. So basically, to me, if it doesn't support these features it's a worthless card, and generally speaking not fast, because without those features it would just be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the Nvidia 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.

OK, now this just sounds to me like someone who is brain dead and just wants the latest and greatest, and thinks a .1 difference must mean it's super better than the regular non-.1 version, because they really have no tech knowledge to begin with.

Kind of like how the Nvidia 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.

And even though those cards didn't have that .1, didn't Nvidia still whoop their ass?
 
NV's DX10.x kitbash has some DX10.1-like features.
 
192c.jpg

I dunno, what's the point of placing the connectors like that?
 
The design SUCKS! What the hell is this "GeForce GTX" thing? At least bring back the old GTX 280 glowing LED :( And no backplate?!

In EVGA We TRUST
 
People must understand that although the GTX 680 is a high-end card, its replacement will come relatively fast. It's just NVIDIA's quick answer to the HD 7900.
Don't get bummed about not getting high-end looks on this card.
"In EVGA We TRUST"? That makes no sense.
I will trust whoever gives me a good product.
 
I dunno, what's the point of placing the connectors like that?
Probably has to do with freeing up real estate on the PCB and the traces. The card is built from $250-300 worth of parts and components, but with that you get firmware/software optimizations that aim to keep gameplay at 60 fps... right?

Can we keep using the normal maximum/average/minimum way of reporting performance when 60 fps is always the target? From what I see, this thing promotes the "average" and nothing higher. It's a way of squeezing out efficiency and lowering thermals, while giving the card a shot of nitrous only when it isn't keeping up with the mean. So will all the old data and graphs still translate apples-to-apples with what was provided in the past? Will it look and play great? In theory yes, though Adaptive V-Sync may introduce new glitchiness in place of the traditional lag and spikes as we've known them. Averaging 60 fps will not look all that different from, say, a 7970 that hits 80 fps by the old way of measuring an average (that's what Nvidia is betting on). In my mind this really stops the whole "I'm fastest" contest and changes it to "I'm the same at providing the average," because no one can tell the difference. Kind of the old "more than a mouthful is a waste"!
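To put rough numbers on that "shot of nitrous only when it falls behind" idea, here is a toy Python sketch. It's my own guess at the behaviour, not NVIDIA's documented GPU Boost algorithm, and the base clock, boost ceiling, and step size are made-up illustration values.

# Toy "boost toward 60 fps" controller -- NOT NVIDIA's actual GPU Boost logic.
BASE_CLOCK_MHZ = 1006.0   # advertised base clock, used here for illustration
MAX_BOOST_MHZ = 1110.0    # hypothetical boost ceiling
TARGET_FPS = 60.0

def next_clock(current_clock_mhz, measured_fps, power_headroom):
    """Bump the clock only when the 60 fps target is missed; coast back otherwise."""
    if measured_fps < TARGET_FPS and power_headroom:
        return min(current_clock_mhz + 13.0, MAX_BOOST_MHZ)   # small step up
    if measured_fps > TARGET_FPS:
        return max(current_clock_mhz - 13.0, BASE_CLOCK_MHZ)  # relax toward base
    return current_clock_mhz

# A demanding scene drops to 52 fps, so the clock creeps upward until fps recovers.
clock = BASE_CLOCK_MHZ
for fps in [52, 55, 58, 61, 63]:
    clock = next_clock(clock, fps, power_headroom=True)
    print(f"{fps:>3} fps -> {clock:.0f} MHz")

Point being: a controller like this never chases a higher average, it only defends the 60 fps floor.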

I don't know what this now means, but traditional testing like W1zzard has been doing may have very little merit any longer. We might need more graphs like the "spiky" frame-time plots [H]ard|OCP has done as the games are played; although now there will just be this slightly wavy green line hugging 60 fps. Well, that's boring, and it might really diminish graphics card reviews as we know them: sure, it plays BF3... at 60 fps.
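For what it's worth, the min/average/max arithmetic that a flat 60 fps line would hide is easy to show. The frame times below are invented purely for illustration.

# Min/average/max fps from per-frame render times (made-up numbers).
frame_times_ms = [16.7] * 57 + [40.0, 45.0, 50.0]   # mostly smooth, three bad spikes

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)   # frames / seconds

print(f"average: {avg_fps:.1f} fps")          # ~55 fps, looks fine on a bar chart
print(f"min: {min(fps_per_frame):.1f} fps")   # 20 fps, the stutter you actually feel
print(f"max: {max(fps_per_frame):.1f} fps")   # ~60 fps

The single average still lands in the mid-50s, while the per-frame trace (what those spiky [H]-style graphs plot) shows the 20-25 fps dips the average hides.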

It feels like Nvidia took the ball, ran into left field, and is saying, "the game as you knew it has changed." Which isn't wrong or bad, but it really feels like the old way of figuring out the best card has changed... while the price remains the same! Now, here's a question: why can there even be enthusiast cards now? In theory, any GPU that's strong enough to run the newest or most demanding titles, as long as it can clock fast enough for the few milliseconds needed to keep the game from dipping below 60 fps, is all the best new offerings need to be. The new mantra will be "we can render that @ XXXX resolution with this level/type of AA" (as in Nvidia's TXAA anti-aliasing algorithm). It's no longer about being fastest or faster. So if AMD takes Cape Verde and does this, are we all okay with it?

:twitch:
 