Saturday, September 24th 2011

EVGA GTX 580 Classified + watercooling Doubles Core Clock Speed!

The EVGA GTX 580 Classified 3072MB, previously announced on TechPowerUp, is now available to buy in limited quantities, according to this forum post by an EVGA product manager. According to Gaming Blend, the card can amazingly reach a doubled 1.6GHz core clock when overclocked using waterblocks - GTX 590, eat your heart out! The card also has custom-designed VRMs to handle all the extra power it will draw, which means they won't squeal when the card is overclocked hard or when running intensive applications such as Folding@Home.
UPDATE: It turns out that the 1.6GHz overclock was actually achieved using LN2, not water. To confirm it, click the EVGA promo link after the jump and watch the extreme cooling section video, or just skip directly to the YouTube video here.



The design comes courtesy of Vince "k|ngp|n" Lucido and Illya "TiN" Tsemenko, who wanted a powerful card without the limitations of SLI, saying:
"While competitors spend time adding dual GPUs to cards that already support SLI, increasing them to 3 slots, crippling them with only 1.5GB of memory, and/or quadrupling the price, we wanted to take a different approach. One that made sense for both gamers and enthusiasts. With the EVGA GeForce GTX 580 Classified, we focused on the key elements that the GPU and memory need when overclocking: power, stability, noise/ripple reduction and an extreme OC mode without limits. The design was created with overclocking in mind."
EVGA promo page

EVGA product page

It's currently in stock at Newegg for $599.99.

Basic specs:

Performance
NVIDIA GTX 580
512 CUDA Cores
855 MHz GPU
1710 MHz Shader Clock

Memory
3072 MB, 384 bit GDDR5
4212 MHz (effective)
202.1 GB/s Memory Bandwidth

Interface
PCI-E 2.0 16x
DVI-I, DVI-I, EVBot
SLI Capable

Resolution & Refresh
240Hz Max Refresh Rate
2048x1536 Max Analog
2560x1600 Max Digital
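The headline memory-bandwidth and shader-clock figures above follow directly from the other listed specs. A quick sanity check (a sketch only; the formulas are the standard GDDR5 bandwidth arithmetic and GF110's fixed 2:1 shader-to-core clock ratio):

```python
# Sanity-check the GTX 580 Classified spec figures listed above.
bus_width_bits = 384
effective_mhz = 4212  # effective GDDR5 data rate

# Bandwidth = (bus width in bytes) x (effective transfer rate)
bandwidth_gbs = (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 202.2 GB/s; the spec sheet rounds to 202.1

# On GF110, the shader (CUDA core) domain runs at twice the GPU clock.
gpu_mhz = 855
shader_mhz = 2 * gpu_mhz
print(shader_mhz, "MHz")  # 1710 MHz, matching the listed shader clock
```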

Product Warranty
This product comes with a Limited Lifetime warranty with registration within 30 days of purchase.

51 Comments on EVGA GTX 580 Classified + watercooling Doubles Core Clock Speed!

#1
qubit
Overclocked quantum bit
Thanks to _JP_ for the tip. :toast:

Oh and this is post 5000 and I can now have my custom title - I finally got there! :rockout:
#2
tigger
I'm the only one
GZ mate on 5k
#3
IlluminAce
Congrats

Halfway to five figures mate. I don't know what's more impressive - the post count or the card :rolleyes:
#7
Jegergrim
I have pretty limited knowledge of the effect of increased core clock speed, but doubling it whilst keeping it stable - has that ever been done before? :confused:

And congratulations with your title Qubit! :D
#8
Cuzza
Seriously though? 1.6GHz = a 100% overclock. That's completely unheard of. I'll believe it when I see it.
#9
qubit
Overclocked quantum bit
by: Jegergrim
I have pretty limited knowledge of the effect of increased core clock speed, but doubling it whilst keeping it stable - has that ever been done before? :confused:
No, it hasn't been done before to my knowledge, which is why the card has been specially designed for this. The GPU is being run waay over its design limit and will consume an insane amount of power. I'll bet you money that these chips are cherry-picked for the best performance. It also wouldn't surprise me if the GPU burns out quickly while running like this; however, it will be stable while it still works. A product like this, being overclocked so insanely, is mainly for enthusiasts with very deep pockets who want extra epeen. It's also handy for benchmarks, which EVGA can boast about to theoretically increase sales.

OMG can you imagine how much power four of these babies in quad SLI at 1.6GHz must consume! :eek: I'm not sure if there's a PSU beefy enough to power them. Heck, would there be enough power from a standard wall socket, even? I dunno.
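As a rough back-of-the-envelope answer to the wall-socket question (the per-card and system wattages here are assumed figures for illustration, not measurements):

```python
# Rough wall-socket budget for a hypothetical quad-SLI setup at 1.6GHz.
# Per-card and rest-of-system wattages are assumptions, not measured values.
cards = 4
watts_per_card = 600      # assumed draw per heavily overclocked GTX 580
rest_of_system = 300      # assumed CPU, motherboard, drives, fans

total_w = cards * watts_per_card + rest_of_system

# A common North American household circuit: 15 A at 120 V.
wall_limit_w = 15 * 120

print(total_w, "W needed vs", wall_limit_w, "W available")  # 2700 vs 1800
```

Under those assumptions, one standard socket indeed wouldn't cut it.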

by: Jegergrim
And congratulations with your title Qubit! :D
Thanks dude. :cool: :toast::toast:
#10
Cuzza
Oh yeah, well done qubit, I've been here longer than you and look at my post count!
#13
qubit
Overclocked quantum bit
I see there are quite a few skeptical kitties on here about that sky-high overclock - and I don't blame you. (And I like the pics.)

You know how this could be definitively verified, don't you? Yeah, you know what I'm gonna say... wait for it... give it to W1zzard to test. Now wouldn't that be awesome? :rockout: Who knows, perhaps he can even hit 2GHz with that thing?! :eek:
#14
Cuzza
doooooooo eeeeeeet W1zzard
#15
Delta6326
Did I miss something? I don't see where they officially claim 1.6GHz...


#16
qubit
Overclocked quantum bit
by: Delta6326
Did I miss something? I don't see where they officially claim 1.6GHz...


http://img.techpowerup.org/110924/Capture0213.jpg
I agree, it's not officially claimed.

In fact, the Gaming Blend article states that it was achieved, but doesn't clarify who exactly achieved it and doesn't supply all the necessary details surrounding this achievement, which is odd.
#17
cadaveca
My name is Dave
Hydro-copper water-cooling, 0.6mm pin matrix for cooling transfer, water-flow monitoring, 4ghz memory clock speeds and a whopping 202gbs per second memory bandwidth makes this one beast of a card.
Uh, they seem to be claiming that there will be a watercooled edition with 1600 MHz GPU clocks. Based on the line above, I think they are referring to the shader clocks, which run at twice the GPU clock.


I think you interpreted the article wrong, qubit. ;)
#19
cadaveca
My name is Dave
LN2 is NOT water, as claimed on that page qubit linked. Nobody is really gonna doubt 1600 MHz core (3200 MHz shader) on LN2...
#20
Steevo
They fail already. I hit 1.1GHz at 1.4 Vcore on my 5870, and after years of running that, including F@H for quite some time, I am still stable with no damage. Of course, my custom BIOS is only set for 1052MHz core... but with enough voltage I could hit that with this core too, or even higher.


Let's see a 580 do that.......... Oh wait, it can't. Needz moar powah, then due to die stress it needs more replacements.
#21
v12dock
by: cadaveca
LN2 is NOT water, like claimed on that page qubit linked. Nobody is really gonna doubt 1600 MHz core(3200 MHz shader) on LN2...
I don't know about you but I always take my showers with LN2
#22
qubit
Overclocked quantum bit
by: I'm your huckleberry
http://www.evga.com/articles/00645/#extreme

This video is the proof using LN2. Screen cap of GPU-Z shows insane GPU Core clock of 1.6 GHz.
Thank you, yes, I thought it had to be on LN2. I'll update the article. :) So all you skeptical cats were right! Article now updated. Welcome to the forum. :toast:

But did you see how many PSUs they used - FOUR! :eek:
#23
micropage7
meh..
the card doesn't look like an enthusiast card, it looks plain

why don't they design it like this
#24
qubit
Overclocked quantum bit
by: micropage7
meh..
the card doesn't look like an enthusiast card, it looks plain
http://www.techpowerup.com/img/11-09-24/03g-p3-1588-ar_lg_7.jpg
why don't they design it like this
http://www.pcper.com/files/imagecache/article_max_width/review/2011-08-25/asus01.jpg
Well, it seems to depend on which photo you look at. It doesn't look too impressive in your photo, but in the first two photos in the article it looks wicked. How it would actually look in real life, I couldn't say...
#25
HalfAHertz
by: Steevo
They fail already. I hit 1.1GHz at 1.4 Vcore on my 5870, and after years of running that, including F@H for quite some time, I am still stable with no damage. Of course, my custom BIOS is only set for 1052MHz core... but with enough voltage I could hit that with this core too, or even higher.


Let's see a 580 do that.......... Oh wait, it can't. Needz moar powah, then due to die stress it needs more replacements.
Yeah? And I could do 1250MHz on a hd3300. What's your point? :p