Tuesday, March 6th 2012

GeForce GTX 680 Final Clocks Exposed, Allegedly

Waiting on Kepler before making a new GPU purchase? Well, you'll have to wait a little longer. Thankfully, the wait can be eased with the latest leaks about NVIDIA's 28 nm chip and the GeForce GTX 680 it powers.

According to VR-Zone, the GTX 680 does indeed feature 1536 CUDA cores and a 256-bit memory interface, but it also has hot clocks, meaning the GPU core is set to 705 MHz while the shaders operate at 1411 MHz. The memory (most likely 2 GB) is supposed to be clocked at an effective 6000 MHz, giving a total memory bandwidth of 192 GB/s.
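The 192 GB/s figure is consistent with the leaked specs: a 256-bit bus moves 32 bytes per transfer, and GDDR5 at an effective 6000 MHz performs six billion transfers per second. A quick back-of-the-envelope check (variable names are ours, not NVIDIA's):

```python
# Sanity check of the leaked memory bandwidth figure:
# a 256-bit bus moves 256/8 = 32 bytes per transfer, and an
# effective "6000 MHz" GDDR5 data rate means 6e9 transfers/s.
bus_width_bits = 256
effective_transfers_per_sec = 6_000_000_000  # 6000 MHz effective data rate

bytes_per_transfer = bus_width_bits // 8          # 32 bytes
bandwidth_gb_per_sec = effective_transfers_per_sec * bytes_per_transfer / 1e9

print(bandwidth_gb_per_sec)  # prints 192.0
```

The same arithmetic explains why a 256-bit interface raised eyebrows: the GTX 580's wider 384-bit bus reached similar bandwidth at a much lower memory clock.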

NVIDIA's incoming card is 10 inches long, supports 3-way SLI, and has four display outputs: two DVI, one HDMI, and one DisplayPort. The GeForce GTX 680 is expected to be revealed on March 12 and should become available on March 23.

Source: VR-Zone

63 Comments on GeForce GTX 680 Final Clocks Exposed, Allegedly

#1
arnoo1
Thank god, NVIDIA sticks to hot clocks! This card must destroy the 7970: 3x more shaders than the GTX 580!
Me wants benchmarks so bad!!
#4
Live OR Die
by: legends84
is this $299??
:laugh: You're joking, right? More like 2x that, $600-700.
#5
DarkOCean
by: live or die
:laugh: You're joking, right? More like 2x that, $600-700.
+1
#7
djxinator
2GB of memory?

Looks like I'll get my 7950 Twin Frozr with triple ASUS 24-inch monitors on order.

Enjoy your "2D Surround" memory bottlenecks.
#8
btarunr
Editor & Senior Moderator
by: symmetrical
1536 CUDA cores!?!?!?!?
Yes, one thousand five hundred thirty-six. Only they're radically different from the ones found on current NV GPUs.
#9
Live OR Die
by: djxinator
2GB of memory?

Looks like I'll get my 7950 Twin Frozr with Triple ASUS 24inch Monitors on order.

Enjoy your "2D Surround" memory bottlenecks.
Triple monitors on a 7950? :wtf: That's 1 GB per screen, enjoy your bottleneck. Sorry, you won't avoid one unless you're playing games from the 80s.
#11
punani
by: btarunr
Yes, one thousand five hundred thirty-six. Only they're radically different from the ones found on current NV GPUs.
I must have missed this. Any link or additional info on how the new CUDA cores perform compared to the old ones?
#12
wolf
Performance Enthusiast
Bring on the benchmarks!! This has the potential to best a 7970 handily, IMO. Time will tell, and with any luck a price war ensues!
#13
blibba
And they're calling it the 680. I sense another naming scheme f**kup when they have to call GK100/GK110 the 780...
#14
20mmrain
Since this is a brand-new architecture, sure, it might have 3x as many CUDA cores... but I thought it was established a while ago that NVIDIA was moving to a less power-hungry architecture in which each CUDA core is less powerful? If that's still true, 1536 shaders might not be as hardcore as we think.
I don't know much about GPU architecture, but I do remember reading this point several times in this forum and others.
#15
pioneer
by: Hayder_Master
epic fail
Like AMD's Faildozer?? I hope that doesn't happen. NVIDIA isn't a fail producer like AMD; NVIDIA always does its job well. DX11 on the 5870 was like a joke, and the GTX 480 came to fix it. Now GK104 comes to defeat Tahiti, and GK100 comes to exterminate everything ;)
#16
marek100
OK, nice, but some benchmarks please.
#17
jigar2speed
by: blibba
And they're calling it the 680. I sense another naming scheme f**kup when they have to call GK100/GK110 the 780...
This is the only way they can fool people.
#18
Aquinus
Resident Wat-man
I like how the English VR-Zone cites the Chinese VR-Zone, and the Chinese VR-Zone doesn't have any sources. Damn it, nVidia, stop making us guess and just release the damn thing already! :banghead:
#19
blibba
AMD have exactly the same issue. Because they moved their whole naming scheme up a notch, abandoned the "X2" name, and couldn't bring themselves to not call the GCN launch flagship the 7970, the new higher-clocked version will either have to be the 7980 (leaving too little distance between it and any dual-GPU 7990) or the 8970 (implying it's something new and different, which it isn't).

Imo, the 7970 should have been launched as the 7960, allowing the new card to be called a 7970. The GK104 GTX 680 should be called, at most, the GTX 670 Ti, allowing the variously crippled GK100 variants to be the GTX 680 SE, GTX 680, GTX 680 Ti (and GTX 680 Ultra?). They could then safely call any dual-GPU variant of either GK104 or GK100 (unlikely to do both) the GTX 690 without looking like retards. Honestly, I'd actually buy and recommend cards more readily from either of them if they could get their naming sh*t in order. It's dishonest and violates my OCD.
#20
Aquinus
Resident Wat-man
by: blibba
AMD have exactly the same issue. Because they moved their whole naming scheme up a notch, abandoned the "X2" name, and couldn't bring themselves to not call the GCN launch flagship the 7970, the new higher-clocked version will either have to be the 7980 (leaving too little distance between it and any dual-GPU 7990) or the 8970 (implying it's something new and different, which it isn't).

Imo, the 7970 should have been launched as the 7960, allowing the new card to be called a 7970. The GK104 GTX 680 should be called, at most, the GTX 670 Ti, allowing the variously crippled GK100 variants to be the GTX 680 SE, GTX 680, GTX 680 Ti (and GTX 680 Ultra?). They could then safely call any dual-GPU variant of either GK104 or GK100 (unlikely to do both) the GTX 690 without looking like retards. Honestly, I'd actually buy and recommend cards more readily from either of them if they could get their naming sh*t in order. It's dishonest and violates my OCD.
The name shouldn't be what makes you decide what video card you're getting. Who cares what the name is if you're looking at the specs?
#21
blibba
by: Aquinus
The name shouldn't be what makes you decide what video card you're getting. Who cares what the name is if you're looking at the specs?
You misunderstand my point. Yes, one shouldn't pick a video card by its name. But many people do, and that's what makes the prevalent naming systems a bad thing: they deliberately mislead all but the best-informed customers.

Additionally, when their naming systems are retarded, what does that say about them? Do I really want to purchase a 3 billion transistor £400 GPU from a company that can't devise a functional, logical naming system? It doesn't exactly inspire confidence.
#22
Dj-ElectriC
192GB/s makes me worry...

by: pioneer
Like AMD's Faildozer?? I hope that doesn't happen. NVIDIA isn't a fail producer like AMD; NVIDIA always does its job well. DX11 on the 5870 was like a joke, and the GTX 480 came to fix it. Now GK104 comes to defeat Tahiti, and GK100 comes to exterminate everything ;)
The HD 5870 is poopoo, and the GTX 480, which consumes twice the power, fixed it. Sure, very reasonable.

by: Live OR Die
Triple monitors on a 7950? :wtf: That's 1 GB per screen, enjoy your bottleneck. Sorry, you won't avoid one unless you're playing games from the 80s.
An NVIDIA fanboy being a fanboy? Then go and buy your green GPUs that have less VRAM and don't support triple monitors on a single GPU.
I'm sure you'll be pleased with a couple of GTX 570s, a whole 1280 MB for 5760x1080.
#23
Live OR Die
by: Dj-ElectriC
192GB/s makes me worry...



The HD 5870 is poopoo, and the GTX 480, which consumes twice the power, fixed it. Sure, very reasonable.



An NVIDIA fanboy being a fanboy? Then go and buy your green GPUs that have less VRAM and don't support triple monitors on a single GPU.
I'm sure you'll be pleased with a couple of GTX 570s, a whole 1280 MB for 5760x1080.
I am a fanboy and glad to be one; AMD can go jump in front of a bus. And how do you even know the 680 can't support triple monitors? It has DisplayPort as well :rolleyes:.

And glad to see you have 2x 7970. I also had CrossFire, sold both cards, and am glad they're gone. 120 Hz or nothing!
#24
blibba
To be fair, I tried Eyefinity recently, and even in racing games, where it works best, I felt it was a total waste. I'd rather have a single large monitor, for which 2 GB is still overkill.

Remember also that GK104 is designed to be a cost-effective solution to take the fight to AMD in the mid-range later on; it's standing in as a high-end product now because GK100 is taking so long.
#25
Dj-ElectriC
by: Live OR Die
I am a fanboy and glad to be one; AMD can go jump in front of a bus. And how do you even know the 680 can't support triple monitors? It has DisplayPort as well :rolleyes:.

And glad to see you have 2x 7970. I also had CrossFire, sold both cards, and am glad they're gone. 120 Hz or nothing!
I don't pay a dime for the GPUs I have here, just the pleasures of being a reviewer.
And to conclude, you have nothing smart to say and probably agreed with what I said. It's OK, son.

I still don't know what the GTX 680 has to offer, so I can't really say anything about it. I just hope it will include out-of-the-box triple-monitor support.