Thursday, October 28th 2010

NVIDIA GeForce GTX 580 Reference Design Graphics Card Pictured

Here are the first pictures of what is touted to be NVIDIA's GeForce GTX 580 reference design graphics card, courtesy of sections of the Chinese media. Some interesting inferences can be drawn just from the looks of the card. To begin with, the cooler bears an uncanny resemblance to one of the earliest design iterations of the GeForce GTX 480 (pictured here and here). In its final iteration, NVIDIA gave the GTX 480 a more massive cooler, perhaps to keep up with its finalized clock speeds. If the design of the GTX 580 cooler is anything to go by, it means either that NVIDIA refined the GF100 architecture a great deal in the GF110 (on which the GTX 580 is based), increasing performance per Watt; or that since the GTX 580 is still in its development stage, its final version could look different. The GeForce GTX 580 is being designed as a counter to AMD's Radeon HD 6900 series single-GPU graphics cards, which are based on the new Cayman graphics core and slated for release in late November. It is expected to be 20% faster than the GTX 480.
Source: PCinLife

213 Comments on NVIDIA GeForce GTX 580 Reference Design Graphics Card Pictured

#51
the54thvoid
Intoxicated Moderator
BorgOvermind: Your argument is flawed.
If you talk $ (i.e. what you can afford), the logical choice is the 5970. So Red Team still wins.
Eh? I'm not arguing. My point IS that ATI makes the most expensive, the loudest, and the most power-hungry desktop cards. The other guy's post stated that when the economy struggles, i.e. people get poorer, more ATI people appear because "they can't afford 470s, etc.".

My post negated his declaration by stating that ATI fans pay MORE for the MORE power-hungry cards. That's not an ATI win.

And on topic, so I don't get my ass kicked: if someone has some solid specs for a GTX 580, that would be lovely, because it'll stop all this poop from going round in circles.

Don't hit me, Sneeky. I can't not respond to a guy quoting me.
#52
Athlon2K15
HyperVtX™
What was the last high-end nVidia GPU that wasn't hot? If you want a cool, low-power PC you get ATI; if you want a power-hungry monster you get NVIDIA. It's always been this way, and it won't change.
#53
CDdude55
Crazy 4 TPU!!!
mdm-adph: You know, you guys could start filtering admittance to threads based upon what cards people list on their system specs -- no Nvidia or Intel owners in AMD threads, no AMD owners in Intel threads, etc.

Matrox and Via owners could be permitted anywhere, of course.
That would suck; you shouldn't deny people a section of the forums based on what's in their specs. I own an Nvidia card, but I should still be able to discuss AMD products as I wish, provided there's no trolling or anything. I don't see why someone should be blocked off for no reason besides what's in their specs.
#54
Jonap_1st
HalfAHertz: They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales. You can see that in the many overclocked variants of the GTX 460 1GB that are already as fast as the GTX 470.
Let's hope it has the same arrangement as GF104, but it looks like GF110 (still on the same 40 nm process) will only replace the current GTX 480, with GF112 updating the current GTX 460. I don't know where the power and heat issues will stack up.

I love the 460 for its power, performance, and temps, but the future 580 kinda looks like it'll have the same ending as the 480: take the performance crown but forget to solve the other problems.
HalfAHertz: The way I see it, the GTX 580 just needs to take the performance crown back. It doesn't matter how much power it consumes, how hot it gets, or how big it is, as long as Nvidia's name gets out there. Nvidia doesn't really need the low-margin consumer market anymore. They recently finished a deal with the Chinese government for a supercomputer with over 7,000 Tesla GPUs, which resulted in the world's current record holder at over 2.5 PFLOPS.

For Nvidia the consumer market is just a way to get rid of all the cores that didn't make the binning process for Tesla and cut their losses.
So now they're more focused on bigger projects to make their name, rather than just selling standard consumer cards?

From what you say, it seems they're starting to lose focus on the standard market, especially the low-end segment. I hope they still try to cover both sides and optimize the whole lineup, not just for the sake of their name..
#55
DigitalUK
This just looks like Nvidia doing anything they can to steal AMD's thunder, and it seems to be working in a few small circles. Show people a picture of a card that doesn't exist and probably won't for 6-8 months, then say it's 20% faster than stuff out now ("so it should be"). Putting people off buying a new 6950/6970 could also put people off buying a new GTX 480.
#56
Xaser04
CDdude55: That would suck; you shouldn't deny people a section of the forums based on what's in their specs. I own an Nvidia card, but I should still be able to discuss AMD products as I wish, provided there's no trolling or anything. I don't see why someone should be blocked off for no reason besides what's in their specs.
Completely agree.

I currently own both a single heavily overclocked GTX 460 and two 460s in SLI, yet I am sorely tempted by a 6970 or 6990 as a replacement. My previous GPUs have been a mix of ATI and Nvidia in equal measure.

Heck, I am lining up an HD 5750 for my wife's PC (to replace a GTX 460 she gets lent on occasion), as there isn't anything from Nvidia that can match an HD 5750 for performance per £.

Why should I be prevented from discussing ATI products just because I have an Nvidia card in my specs?
#57
claylomax
nvidiaintelftw: Too bad Fermi isn't even that hot. I have a 470 and I've run a 480 for a while, and they do not get that hot. I don't know why you guys get so butthurt about things getting hot. We are PC enthusiasts; we should expect things to run hot. Fermi can run that hot, too; it's not like it's hurting it. My GTX 480 ran at 75°C gaming and my 470 ran at 65 to 70°C gaming. What's wrong with those temps? I had like 150 MHz overclocks on them both, too.
I've had my GTX 480 since May: idle 44°C, load 86°C. I still have my reference HD 4870 in my other system: idle 77°C, load 88°C. And how about the HD 4870 X2... when talking about hot cards, some people suffer memory loss. :shadedshu
#58
alucasa
You know, these AMD vs Nvidia topics are really getting tiresome. I'm not talking about the topics themselves, but the attitudes found within.
#59
ariff_tech
I don't like this card, the GTX 580.
I don't like nVidia.
I don't like the HD 6xxx.
I don't like AMD and ATI.

I want the Matrox G-series back.
I want the S3 Savage back.
I want the 3DLabs Permedia back.
I want the Trident Blade back.

But what I really need is a Voodoo card.
#60
Bjorn_Of_Iceland
I'm surprised; 3 pages down and no one has mentioned "wood screws".
#61
jamsbong
NdMk2o1o: Yeah, like your 4890 is a cool-running card. It was a monster for heat and power when it came out for ATI, and it isn't that far off Fermi now :ohwell:
There are many key differences between the 4890 and Fermi. The 4890 is old and uses 55 nm manufacturing. Of course, the most important thing is that the power consumption of Fermi is simply ridiculous compared to the 4890. Put Fermi's cooler onto my 4890 and you'll get arctic temperatures.

Honestly, I don't worry about my 4890 going the distance, as I've watercooled it.
#62
sneekypeet
Retired Super Moderator
Bjorn_Of_Iceland: I'm surprised; 3 pages down and no one has mentioned "wood screws".
It's a shame you didn't read all 3 of those pages, namely this post.
#63
btarunr
Editor & Senior Moderator
Jiraiya: img89.imageshack.us/img89/9397/39113955.jpg

bbs.expreview.com/viewthread.php?tid=37388&from=recommend_f
Thanks. 384-bit memory, 8+6 pin power, unchanged display logic, and a similar VRM can be made out from that picture.

This is pure FUD, but it looks like they took GF100 and burned its fat. Maybe they replaced the 32-core SM design with a more space-optimized 48-core one, removed some redundant components that weren't having much positive impact on performance relative to their power draw, and made up for it with higher clock speeds. GF110 could be GF104's architecture, up-scaled. It's Fermi done right.
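For what it's worth, the die-area side of that speculation is easy to put in numbers. A minimal sketch: the 32- and 48-core SM layouts below are the known GF100/GF104 designs, while everything said about GF110 is the same guesswork as above.

```python
# Back-of-the-envelope SM arithmetic. Only the GF100 (32 cores/SM) and
# GF104 (48 cores/SM) layouts are known designs; the GF110 conclusion
# is speculation, spelled out for illustration.

def total_cores(sm_count: int, cores_per_sm: int) -> int:
    """Total CUDA cores for a given streaming-multiprocessor layout."""
    return sm_count * cores_per_sm

full_gf100 = total_cores(16, 32)  # 512 cores across 16 SMs (full GF100)
full_gf104 = total_cores(8, 48)   # 384 cores across 8 SMs (full GF104)

# If GF110 really is "GF104 up-scaled", reaching GF100's 512 cores with
# 48-core SMs would take only 11 SMs instead of 16 (ceiling division):
sms_needed = -(-full_gf100 // 48)

print(full_gf100, full_gf104, sms_needed)  # 512 384 11
```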
HalfAHertz: They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales.
GF100 is larger than GF104. It always makes more business sense for them to clear GF100 inventory (the GTX 470, and the GTX 465, which is already cannibalized) and replace it with this high-clock 384-SP GF104. For this reason, many sources pointed out that the full GF104 could end up with the SKU name "GTX 475". So maybe there's some GF100 inventory to gulp down.
#64
KainXS
We don't know much of squat about Cayman, and we don't know much of squat about the 580, so why argue?

All we know is the memory bus; that's it.

For all we know it could be a rebrand, for example. We don't know.

Don't the newer cores (GF104) have 48 cores per SM?
#65
Tatty_Two
Gone Fishing
BorgOvermind: Your argument is flawed.
If you talk $ (i.e. what you can afford), the logical choice is the 5970. So Red Team still wins.
Speaking from a UK perspective, I would suggest that your argument is flawed also. The cheapest GTX 480 I can find in the UK is some £89 cheaper than the cheapest HD 5970; that's a price difference of 20-something percent. Now, seeing as the 5970 is 20% faster across the board at all resolutions, that would suggest that in the UK at least, the 480 is on at least an equal footing as a "value for money" proposition at the high end. If it were simply about performance then yes, AMD wins; however, I would suggest that since the GTX 480 was released it has probably outsold the 5970 by some margin, simply because it IS more affordable, much as the GTX 470 has done so well against the HD 5870. I love ATi, but I regularly own both; it is fail IMO to ignore both the strengths and weaknesses of each product.
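For anyone who wants to check the maths, here is a minimal sketch. The absolute prices are assumptions chosen only to be consistent with the £89 gap quoted above; on those numbers the GTX 480 actually edges the 5970 slightly on performance per £, which is the "equal footing" point.

```python
# Back-of-the-envelope value maths for the post above. The absolute prices
# are assumptions picked to be consistent with the quoted £89 gap;
# the 1.20x performance figure is the one the post itself uses.

gtx480_price = 356.0   # assumed UK street price, £
hd5970_price = 445.0   # assumed: £89 more than the GTX 480
hd5970_perf  = 1.20    # relative performance, GTX 480 = 1.00

price_gap_pct = (hd5970_price - gtx480_price) / hd5970_price * 100
print(f"price gap: {price_gap_pct:.0f}%")                     # 20%
print(f"perf/£, GTX 480: {1.00 / gtx480_price:.5f}")          # ~0.00281
print(f"perf/£, HD 5970: {hd5970_perf / hd5970_price:.5f}")   # ~0.00270
```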
#68
wolf
Performance Enthusiast
btarunr: This is pure FUD, but it looks like they took GF100 and burned its fat. Maybe they replaced the 32-core SM design with a more space-optimized 48-core one, removed some redundant components that weren't having much positive impact on performance relative to their power draw, and made up for it with higher clock speeds. GF110 could be GF104's architecture, up-scaled. It's Fermi done right.
Definitely seems like the GF104 architecture; it makes for more CUDA cores in less die area. Win-win IMO; they just need to improve the memory controller.

And contrary to popular belief, Fermi is not hot. Fermi is an architecture, and it's unfair to say it's inherently hot, because it's not; take the GTX 460 or GTS 450 as examples. GF100 is hot, there's no denying that, but Fermi, by definition, isn't.
#69
yogurt_21
HalfAHertz: They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales. You can see that in the many overclocked variants of the GTX 460 1GB that are already as fast as the GTX 470.

The way I see it, the GTX 580 just needs to take the performance crown back. It doesn't matter how much power it consumes, how hot it gets, or how big it is, as long as Nvidia's name gets out there. Nvidia doesn't really need the low-margin consumer market anymore. They recently finished a deal with the Chinese government for a supercomputer with over 7,000 Tesla GPUs, which resulted in the world's current record holder at over 2.5 PFLOPS.

For Nvidia the consumer market is just a way to get rid of all the cores that didn't make the binning process for Tesla and cut their losses.
Ah, but that means we may see a fully enabled GF104 in the 5xx series lineup then, which would be fine by me.

Now, at 20% faster than the GTX 480, we're looking at a hair better than 5970 performance, which wouldn't be bad if it were priced right. That's especially true if it draws less power and runs cooler than the GTX 480.
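Spelled out, the arithmetic is just two 20% figures cancelling; a minimal sketch, where both numbers are thread rumours rather than benchmarks:

```python
# The relative-performance chain behind the "hair better than a 5970" guess.
# Both 1.20x figures come from this thread, not from measured benchmarks.

gtx480 = 1.00
hd5970 = 1.20            # assumed: ~20% faster than a GTX 480
gtx580 = gtx480 * 1.20   # rumoured: 20% faster than a GTX 480

print(f"GTX 580 vs HD 5970: {gtx580 / hd5970:.2f}x")  # 1.00x, roughly parity
```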
#70
erocker
*
It looks like any other stock Nvidia card with "GTX 580" shopped onto it. Awesome. It will be interesting to see if these are actually selling in volume within the next six months.
wolf: Definitely seems like the GF104 architecture; it makes for more CUDA cores in less die area. Win-win IMO; they just need to improve the memory controller.

And contrary to popular belief, Fermi is not hot. Fermi is an architecture, and it's unfair to say it's inherently hot, because it's not; take the GTX 460 or GTS 450 as examples. GF100 is hot, there's no denying that, but Fermi, by definition, isn't.
That's because when most people refer to Fermi, they are referring to GF100: the GTX 480/470/465.
Jiraiya
Strange looking! Are those VRMs on the backside of the GPU? Could just be empty spots too, I guess..
#71
RejZoR
Neither was the Preshott... ;)
#72
Yellow&Nerdy?
Very interesting indeed. If this card is not a paper monster and is as good as Nvidia claims, it would help greatly in bringing down the prices of Cayman, which is great for everybody, no matter which company you prefer. The next few months will be very interesting.
#73
the54thvoid
Intoxicated Moderator
Oh dear...

If reports by Fudzilla are true, the 6970 is a hot little beast, as the AMD engineers try to squeeze every ounce of power out of it. I hope this is not true.

I'll not buy a 6970 that's hot, loud, and power-crazy (just as I didn't buy a GTX 480 for those reasons).

Heaven forbid, maybe I'll hang onto my 5850s until that GTX 580 comes out after all; then I'll decide which side does the best perf per Watt.
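When that day comes, the comparison itself is trivial; here's a minimal sketch with placeholder numbers. The TDPs are the official board powers for the existing cards, but the 1.35x figure is only a rough assumption, and the new cards' rows would have to wait for reviews.

```python
# Sketch of the perf-per-Watt tiebreaker. TDPs are the official board
# powers (HD 5850: 151 W, GTX 480: 250 W); the relative-performance
# numbers are rough placeholders to be replaced with review data once
# the GTX 580 and HD 6970 actually launch.

cards = {
    # name: (relative performance, board power in Watts)
    "HD 5850": (1.00, 151),   # baseline
    "GTX 480": (1.35, 250),   # assumed ~35% faster than an HD 5850
}

for name, (perf, watts) in cards.items():
    print(f"{name}: {perf / watts:.4f} relative perf per Watt")
```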
#74
Hayder_Master
Only 20% faster? Shame; they just want to beat the new 6970 by a bit.
#75
CDdude55
Crazy 4 TPU!!!
hayder.master: Only 20% faster? Shame; they just want to beat the new 6970 by a bit.
Hopefully it ends up being a good amount faster than that while maintaining reasonable power consumption and heat. That's all I really ask personally.

At least a 30% increase, with improvements in efficiency over the Fermi architecture. :)