Monday, March 19th 2012

GIGABYTE GeForce GTX 680 Graphics Card Pictured

Christmas came early for Overclock.net forum member "ironman86", who showed off his swanky new GeForce GTX 680 graphics card, branded by GIGABYTE (model: GV-N680D5-2GD-B). The card sticks to the NVIDIA reference design by the book, except of course for a sci-fi sticker on the cooler shroud and fan: a futuristic art piece with GIGABYTE and GTX 680 branding (the "8" obscured by the protective plastic film). The card indeed draws power from two 6-pin power connectors, settling the power-connector debate once and for all, since this is the first picture of a retail-channel GTX 680.
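For context on why the connector count matters: per the PCI Express spec, the x16 slot supplies up to 75 W and each 6-pin plug another 75 W, so a dual 6-pin board is rated for 225 W at most. A quick sketch of the arithmetic (illustrative Python, not from the source):

```python
# PCI Express power budget: the x16 slot delivers up to 75 W,
# each 6-pin plug adds 75 W, each 8-pin plug adds 150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def max_board_power(six_pin=0, eight_pin=0):
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(max_board_power(six_pin=2))               # GTX 680 layout -> 225 W
print(max_board_power(six_pin=1, eight_pin=1))  # 6+8-pin layout -> 300 W
```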

Source: Overclock.net

88 Comments on GIGABYTE GeForce GTX 680 Graphics Card Pictured

#2
NHKS
st.bone said:
DOES IT SUPPORT 4K and DirectX 11.1?
I guess it will support 4K like the 7970, but DX11.1 I'm not sure about; the Gigabyte box says only DX11. Still no official info on either of these specs.

That 'Heaven benchmark' is @ 1600x900; ironman86 @ OCN probably doesn't have a full-HD monitor. He says: "sorry for not in 1080p resolution,i only got 20" inch monitor to test"

Earlier, when someone asked him to confirm the clocks in the NVIDIA control panel, he did and said: "i see is same clock as GPU-Z,so im not posted it"
Posted on Reply
#3
Crap Daddy
NHKS said:
I guess it will support 4K like the 7970, but DX11.1 I'm not sure about; the Gigabyte box says only DX11. Still no official info on either of these specs.

That 'Heaven benchmark' is @ 1600x900; ironman86 @ OCN probably doesn't have a full-HD monitor. He says: "sorry for not in 1080p resolution,i only got 20" inch monitor to test"

Earlier, when someone asked him to confirm the clocks in the NVIDIA control panel, he did and said: "i see is same clock as GPU-Z,so im not posted it"
Hmm. Is he running the card at lowish clocks? Because the results are pretty bad.
Posted on Reply
#4
NHKS
Crap Daddy said:
Hmm. Is he running the card at lowish clocks? Because the results are pretty bad.
I guess not; his comment on the clocks (shown by the NV control panel) came before his Heaven benchmark.

Also, his CPU is an i3-2100 on a 32-bit OS; a bottleneck could be one of the reasons, but the Heaven scores cannot be compared directly anyway...

UPDATE: just to compare with ironman86's GTX 680 Heaven score of 1362, one of the members at OCN did a Heaven run using a 7970 @ 1600x900 (he has an i7), and here is his score
Posted on Reply
#6
Crap Daddy
NHKS said:
I guess not; his comment on the clocks (shown by the NV control panel) came before his Heaven benchmark.

Also, his CPU is an i3-2100 on a 32-bit OS; a bottleneck could be one of the reasons, but the Heaven scores cannot be compared directly anyway...
Something is wrong there. He also said that the clocks (700 MHz) are shown in the NV CP, but I still believe that is not the base clock for the GTX 680, hence the poor results.

I also don't think we're talking about any bottleneck, as Crysis 1 needs nothing more than two fast cores, which the i3 provides plenty of, and he reports around 50 FPS with only 4x AA at that resolution. If I remember correctly, I had above 50 at 1680x1050 (a higher resolution) with my GTX 570.

So it's either the BIOS, the driver, or settings in the NV CP.
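A rough back-of-the-envelope, assuming the widely rumored ~1006 MHz base clock (unconfirmed at this point) and, unrealistically, perfectly linear scaling of the Heaven score with core clock:

```python
# Back-of-the-envelope only: assumes the rumored ~1006 MHz base clock and
# (unrealistically) perfectly linear scaling of the Heaven score with clock.
observed_score = 1362          # ironman86's reported Heaven score
observed_clock_mhz = 700       # clock reportedly shown in the NV control panel
rumored_base_clock_mhz = 1006  # launch-rumor figure, not confirmed here

projected = observed_score * rumored_base_clock_mhz / observed_clock_mhz
print(f"Projected score at {rumored_base_clock_mhz} MHz: ~{projected:.0f}")
# -> ~1957, an upper bound under the linear-scaling assumption
```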
Posted on Reply
#7
Shurakai
Seems pretty on par. Can't take the minimum FPS into account, because I bet he ran that test ASAP without letting it loop a bit, hence the very low 8 FPS for a split second; still an acceptable average.
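A minimal sketch of the effect: if the minimum is taken over every frame, a single slow warm-up frame (shader compilation, asset streaming) tanks it while the average barely moves. The frame times below are hypothetical, purely for illustration:

```python
# Illustrative only: hypothetical frame times (ms) with one warm-up hitch
# at the start of the run (e.g. shader compilation on the first pass).
frame_times_ms = [125.0] + [18.0] * 299  # one 8 FPS spike, then steady ~55 FPS

def fps_stats(times_ms, skip_warmup=0):
    """Return (average FPS, minimum FPS), optionally discarding warm-up frames."""
    t = times_ms[skip_warmup:]
    avg_fps = 1000.0 * len(t) / sum(t)
    min_fps = 1000.0 / max(t)
    return avg_fps, min_fps

print(fps_stats(frame_times_ms))                 # ~(54.5, 8.0): min FPS tanks
print(fps_stats(frame_times_ms, skip_warmup=1))  # ~(55.6, 55.6): min recovers
```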
Posted on Reply
#8
[H]@RD5TUFF
I'm thinking that i3 gimped the results.
Posted on Reply
#9
Dj-ElectriC
I don't; the Unigine Heaven test gives the same results with either an i3-2100 or an i5-2500.
Posted on Reply
#10
Protagonist
Dj-ElectriC said:
DOES IT SUPPORT 4K and DirectX 11.1?
Does it that much matter to you?

Yes, it does matter, because if it doesn't support 4K I won't buy it, and if it doesn't support DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which does support all of this. Basically, to me, if it doesn't support these features it's a worthless card, and generally speaking not fast, because without those features it would just be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the NVIDIA 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.
Posted on Reply
#11
Dj-ElectriC
CAPS: I DON'T KNOW NOW, IT ISN'T OUT YET.

And the lack of DX10.1 support didn't stop the GTX 260 216SP from whooping the HD 4870's behind...
Posted on Reply
#12
beck24
Official benchies soon. These are rubbish.
Posted on Reply
#13
Dj-ElectriC
Alright, I had this aluminum biscuit tested again

Posted on Reply
#14
Protagonist
Dj-ElectriC said:
CAPS: I DON'T KNOW NOW, IT ISN'T OUT YET.

And the lack of DX10.1 support didn't stop the GTX 260 216SP from whooping the HD 4870's behind...
At least someone who had an HD 4870 could see the details in DX10.1, unlike those who had the NVIDIA 8 through 200 series, who could not see the DX10.1 details.

I had a 9800 GTX then, and I would not repeat the same mistake; I will go for features this time round for my next GPU purchase.

And for the record, I have used both camps. I'm currently on a GeForce GTX 460 1GB (336 CUDA cores, 256-bit), which I bought in July 2010, and it replaced my AMD Radeon HD 5770; otherwise it's a long list of GeForce, Radeon, and Intel IGPs.
Currently on 1920x1080 and not switching higher soon; maybe in 2013 I'll switch to the 3K or 4K displays that will be available.
Posted on Reply
#15
lZKoce
st.bone said:
DOES IT SUPPORT 4K and DirectX 11.1?

Yes, it does matter, because if it doesn't support 4K I won't buy it, and if it doesn't support DX11.1 I still won't buy it; I'll get an AMD Radeon HD 7xxx, which does support all of this. Basically, to me, if it doesn't support these features it's a worthless card, and generally speaking not fast, because without those features it would just be a cranked-up last-gen GPU by the name of Kepler.

Kind of like how the NVIDIA 8, 9, 100, and 200 series were all DX10, while the AMD Radeon HD 4xxx had DX10.1.
(facepalm) Does the lack of DX11.1 have a HUGE impact on your productivity, such as wind-tunnel testing, programming, rendering, and actual income from using it? And the card is not out yet, so on what basis you deduced that it's "not fast and worthless," I just don't know. :)
Posted on Reply
#16
Protagonist
lZKoce said:
(facepalm) Does the lack of DX11.1 have a HUGE impact on your productivity, such as wind-tunnel testing, programming, rendering, and actual income from using it? And the card is not out yet, so on what basis you deduced that it's "not fast and worthless," I just don't know. :)
It's my money that I'm planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even at a lower price, since my current card already does a great job without those features.
So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.
Posted on Reply
#17
Crap Daddy
st.bone said:
It's my money that I'm planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even at a lower price, since my current card already does a great job without those features.
So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.
You know that the GTX 680 has only 2 GB of memory? That's not a concern for you?
Posted on Reply
#18
lZKoce
st.bone said:
It's my money that I'm planning to spend on a new GPU to replace my GTX 460, which has served me very well, no complaints. But if it turns out that the GTX 680 has no 4K and DX11.1 support, then it should cost less (hence "worthless"), and I still would not buy it even at a lower price, since my current card already does a great job without those features.
So I want to spend my money on features... and a faster card than what I have currently, which does a great job anyway.
Of course it's your money. I don't want to be intrusive or rude. I was just pointing out that these arguments don't make sense to me personally. But I am biased anyway. So I am sorry if I caused any disturbance.
Posted on Reply
#19
Protagonist
Crap Daddy said:
You know that the GTX 680 has only 2 GB of memory? That's not a concern for you?
No, it's not.

At least Intel Ivy Bridge processors will support 4K; I plan on getting a 4K or 3K display next year, in 2013.
Posted on Reply
#21
Count Shagula
st.bone said:
No, it's not.

At least Intel Ivy Bridge processors will support 4K; I plan on getting a 4K or 3K display next year, in 2013.
4096×3072.

I cannot even begin to imagine the cost of such a screen, or what kind of hardware would be needed to run games on it.
Posted on Reply
#22
Protagonist
Count Shagula said:
4096×3072.

I cannot even begin to imagine the cost of such a screen, or what kind of hardware would be needed to run games on it.
I mean 4K (4096x2xxx)

or 3K (3860x2xxx)


Money is no problem.
Posted on Reply
#23
brandonwh64
Addicted to Bacon and StarCrunches!!!
st.bone said:
I mean 4K (4096x2xxx)

or 3K (3860x2xxx)


Money is no problem.
Even so, I doubt this card alone would handle that resolution in games such as BF3; you would need two or more at minimum.
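For a sense of scale, here is a rough pixel-count comparison; the exact "4K" panel resolution is an assumption (the DCI 4096×2160 variant is used below), since no such consumer display existed yet:

```python
# Rough load comparison: shading/fill cost scales roughly with pixel count,
# so relative megapixels give a first-order estimate of GPU demand.
resolutions = {
    "1600x900":  (1600, 900),
    "1920x1080": (1920, 1080),
    "4K (DCI)":  (4096, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.2f}x the pixels of 1080p")
# 4K works out to ~8.8 MP, over four times the pixels of 1080p.
```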
Posted on Reply
#24
Protagonist
brandonwh64 said:
Even so, I doubt this card alone would handle that resolution in games such as BF3; you would need two or more at minimum.
4K or 3K I don't intend for gaming, mostly for photo editing.
Posted on Reply
#25
brandonwh64
Addicted to Bacon and StarCrunches!!!
st.bone said:
4K or 3K I don't intend for gaming, mostly for photo editing.
Fair enough.
Posted on Reply