
NVIDIA GeForce GTX 580 Reference Design Graphics Card Pictured

The cooler still looks nearly the same as Fermi's. If NV sticks with 40nm and makes the 580 with all 512 SPs enabled like they said, it would run hot and head for power disasters..

They could rework the chip based on the GF104 arrangement, so the 580 could be more power-friendly. But since we haven't seen a fully enabled GF104 chip either, that redesign would need more time, and the February release could be delayed..

They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales. You can see that in the many overclocked GTX 460 1GB variants that are already as fast as the GTX 470.

The way I see it, the GTX 580 just needs to take the performance crown back. It doesn't matter how much power it consumes, how hot it gets or how big it is, as long as Nvidia's name gets out there. Nvidia doesn't really need the low-margin consumer market anymore. They recently finished a deal with the Chinese government for a supercomputer with over 7,000 Tesla GPUs, which resulted in the world's current record holder at over 2.5 PFLOPS.

For Nvidia the consumer market is just a way to get rid of all the cores that didn't make the binning process for Tesla and cut their losses.
 
Your argument is flawed.
If you talk $ (aka what you can afford), the logical choice is the 5970. So the Red Team still wins.

Eh? I'm not arguing. My point IS that ATI makes the most expensive desktop card, and the loudest and most power-hungry one. The other guy's post stated that when the economy struggles, i.e. people get poorer, more ATI people appear because "they can't afford 470s etc."

My post negated his declaration by stating that ATI fans pay MORE for the MORE power-hungry cards. That's not an ATI win.

And, on topic so I don't get my ass kicked: if someone has some solid specs for the GTX 580, that would be lovely, because it would stop all this poop from going round in circles.

Don't hit me, Sneeky. I can't not respond to a guy quoting me.
 
What was the last high-end nVidia GPU that wasn't hot? If you want a cool, low-power PC you get ATI; if you want a power-hungry monster you get Nvidia. It's always been this way, and it will not change.
 
You know, you guys could start filtering admittance to threads based upon what cards people list on their system specs -- no Nvidia or Intel owners in AMD threads, no AMD owners in Intel threads, etc.

Matrox and Via owners could be permitted anywhere, of course.

That would suck; you shouldn't deny people a section of the forums based on what they have in their specs. I own an Nvidia card, but I should still be able to discuss AMD products as I wish, of course on the condition that there's no trolling. I don't see why someone should be blocked off for no reason other than what's in their specs.
 
They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales. You can see that in the many overclocked GTX 460 1GB variants that are already as fast as the GTX 470.

Let's hope it has the same arrangement as GF104. But looking at GF110 (still on the same 40nm process) only replacing the current GTX 480, and GF112 updating the current GTX 460, I don't know where the power and heat issues will end up..

I love the 460 for its power, performance and temps, but the future 580 kind of looks like it'll have the same ending as the 480: take the performance crown but forget to solve the other problems.

The way I see it, the GTX 580 just needs to take the performance crown back. It doesn't matter how much power it consumes, how hot it gets or how big it is, as long as Nvidia's name gets out there. Nvidia doesn't really need the low-margin consumer market anymore. They recently finished a deal with the Chinese government for a supercomputer with over 7,000 Tesla GPUs, which resulted in the world's current record holder at over 2.5 PFLOPS.

For Nvidia the consumer market is just a way to get rid of all the cores that didn't make the binning process for Tesla and cut their losses.

So now they're more focused on bigger projects to make their name bigger, rather than just selling a standard consumer card?

From what you say, it seems they're starting to lose focus on the standard market, especially the low-end segment. I hope they still try to cover both sides and optimize the whole, not just do it for the sake of the name..
 
This just looks like Nvidia doing anything they can to steal AMD's thunder, and it seems to be working in a few small circles. Show people a picture of a card that doesn't exist, and probably won't for 6-8 months, then say it's 20% faster than stuff out now, "so it should be". Putting people off buying a new 6950/6970 could also put people off buying a new GTX 480.
 
That would suck; you shouldn't deny people a section of the forums based on what they have in their specs. I own an Nvidia card, but I should still be able to discuss AMD products as I wish, of course on the condition that there's no trolling. I don't see why someone should be blocked off for no reason other than what's in their specs.

Completely agree.

I currently own both a single heavily overclocked GTX 460 and two 460s in SLI, yet I am sorely tempted by a 6970 or 6990 as a replacement. My previous GPUs have been a mix of ATI and Nvidia in equal measure.

Heck, I am lining up an HD 5750 for my wife's PC (to replace a GTX 460 she gets lent on occasion), as there isn't anything from Nvidia that can match an HD 5750 for performance per £.

Why should I be prevented from discussing ATI products just because I have an Nvidia card in my specs?
 
Too bad Fermi isn't even that hot. I have a 470 and I've run a 480 for a while, and they do not get that hot. I don't know why you guys get so butt-hurt about things getting hot; we are PC enthusiasts, we should expect things to run hot. Fermi can run that hot too, it's not like it's hurting it. My GTX 480 ran at 75°C gaming and my 470 ran at 65 to 70°C gaming. What's wrong with those temps? I had 150MHz overclocks on them both, too.

I've had my GTX 480 since May: idle 44°C, load 86°C. I still have my reference HD 4870 in my other system: idle 77°C, load 88°C. And how about the HD 4870 X2? When talking about hot cards, some people suffer memory loss. :shadedshu
 
I don't like this card, the GTX 580.
I don't like nVidia.
I don't like the HD 6xxx.
I don't like AMD and ATI.

I want the Matrox G-series back.
I want the S3 Savage back.
I want the 3DLabs Permedia back.
I want the Trident Blade back.

But what I really need is a Voodoo card.
 
I'm surprised: 3 pages in and no one has mentioned "wood screws".
 
Yeah, like your 4890 is a cool-running card. It was a monster for heat and power when it came out for ATI, and it isn't that far off Fermi now :ohwell:

There are many key differences between the 4890 and Fermi. The 4890 is old and uses the 55nm process. Of course, the most important thing is that Fermi's power consumption is simply ridiculous compared to the 4890's. Put Fermi's cooler onto my 4890 and you'd get arctic temperatures.

Honestly, I don't worry about my 4890 going the distance, as I watercooled it.
 

Thanks. 384-bit memory, 8+6-pin power, unchanged display logic, and a similar VRM can be made out from that picture.

This is pure FUD, but it looks like they took GF100 and burned its fat. Maybe they replaced the 32-core SM design with a more space-optimized 48-core one, cut some redundant components that weren't contributing much performance for their power draw, and made up for it with higher clock speeds. GF110 could be GF104's architecture, scaled up. It's Fermi done right.

They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales.

GF100 is larger than GF104. It always makes more business sense for them to clear GF100 inventory (GTX 470, and GTX 465, which is already cannibalized) and replace it with this high-clock 384-SP GF104. For this reason, many sources pointed out that the full GF104 could end up with the SKU name "GTX 475". So maybe there's still some GF100 inventory to gulp down.
 
We don't know much of squat about Cayman, and we don't know much of squat about the 580, so why argue?

All we know is the memory bus, that's it.

For all we know it could be a rebrand (for example); we just don't know.

Don't the newer cores (GF104) have 48?
 
Your argument is flawed.
If you talk $ (aka what you can afford), the logical choice is the 5970. So the Red Team still wins.

Speaking from a UK perspective, I would suggest that your argument is flawed too. The cheapest GTX 480 I can find in the UK is some £89 cheaper than the cheapest HD 5970; that's a price difference of 20-something percent. Seeing as the 5970 is 20% faster across the board at all resolutions, that would suggest that in the UK at least, the 480 is on an equal footing as a "value for money" proposition at the high end. If it were simply about performance then yes, AMD wins; however, I would suggest that since the GTX 480 was released it has probably outsold the 5970 by some margin, simply because it IS more affordable, much like how the GTX 470 has done so well against the HD 5870. I love ATi, but I regularly own both; it is fail IMO to ignore the strengths and weaknesses of each product.
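To make the value-for-money reasoning above concrete, here is a minimal sketch of the perf-per-£ arithmetic; the GTX 480 price is an illustrative assumption (only the £89 gap and the 20% performance figure come from the post), so treat the numbers as placeholders, not quotes from any retailer.

```python
# Hypothetical perf-per-pound comparison. Assumes the HD 5970 is 20% faster
# than a GTX 480 and costs £89 more; the £350 base price is made up.

def perf_per_pound(relative_perf: float, price_gbp: float) -> float:
    """Relative performance divided by price in pounds sterling."""
    return relative_perf / price_gbp

gtx480_price = 350.0                  # assumed GTX 480 street price in £
hd5970_price = gtx480_price + 89.0    # £89 dearer, per the post above

gtx480_value = perf_per_pound(1.0, gtx480_price)  # baseline performance
hd5970_value = perf_per_pound(1.2, hd5970_price)  # 20% faster

# With these figures the two cards land close in value terms, which is the
# post's point: the price gap roughly offsets the performance gap.
print(f"GTX 480: {gtx480_value:.5f} perf/£")
print(f"HD 5970: {hd5970_value:.5f} perf/£")
```

Swap in real street prices and the same two-line calculation answers the "which card wins on value" question directly.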
 
When are the 580s supposed to be released?
 
This is pure FUD, but it looks like they took GF100 and burned its fat. Maybe they replaced the 32-core SM design with a more space-optimized 48-core one, cut some redundant components that weren't contributing much performance for their power draw, and made up for it with higher clock speeds. GF110 could be GF104's architecture, scaled up. It's Fermi done right.

Definitely seems like GF104 architecture; it makes for more CUDA cores in less die area. Win-win IMO, they just need to improve the memory controller.

And contrary to popular belief, Fermi is not hot. Fermi is an architecture, and it's unfair to say it's inherently hot, because it isn't; take the GTX 460 or GTS 450 as examples. GF100 is hot, there's no denying that, but Fermi, by definition, isn't.
 
They haven't enabled the full GF104, not because they can't, but because it would cannibalize GTX 470 sales. You can see that in the many overclocked GTX 460 1GB variants that are already as fast as the GTX 470.

The way I see it, the GTX 580 just needs to take the performance crown back. It doesn't matter how much power it consumes, how hot it gets or how big it is, as long as Nvidia's name gets out there. Nvidia doesn't really need the low-margin consumer market anymore. They recently finished a deal with the Chinese government for a supercomputer with over 7,000 Tesla GPUs, which resulted in the world's current record holder at over 2.5 PFLOPS.

For Nvidia the consumer market is just a way to get rid of all the cores that didn't make the binning process for Tesla and cut their losses.

Ah, but that means we may see a fully enabled GF104 in the 5xx-series lineup then, which would be fine by me.

Now, at 20% faster than the GTX 480, we're looking at a hair better than 5970 performance, which wouldn't be bad if it were priced right. This is especially true if it draws less power and runs cooler than the GTX 480.
 
It looks like any other stock Nvidia card with "GTX 580" shopped onto it. Awesome. It will be interesting to see if these actually sell in volume within the next six months.

Definitely seems like GF104 architecture; it makes for more CUDA cores in less die area. Win-win IMO, they just need to improve the memory controller.

And contrary to popular belief, Fermi is not hot. Fermi is an architecture, and it's unfair to say it's inherently hot, because it isn't; take the GTX 460 or GTS 450 as examples. GF100 is hot, there's no denying that, but Fermi, by definition, isn't.

That's because when most people refer to Fermi, they are referring to GF100: the GTX 480/470/465.

[Attached image: 39113955.jpg]

Strange looking! Are those VRMs on the backside of the GPU? Could just be empty spots too, I guess..
 
So was not the Preshott... ;)
 
Very interesting indeed. If this card is not a paper monster and is as good as Nvidia claims, it would help greatly in bringing down the prices of Cayman, which is great for everybody, no matter which company you prefer. The next few months will be very interesting indeed.
 
Oh dear...

If the reports by Fudzilla are true, the 6970 is a hot little beast, as the AMD engineers try to squeeze every ounce of power out of it. I hope this is not true.

I'll not buy a 6970 that's hot, loud and power-crazy (just as I didn't buy a GTX 480 for those reasons).

Heaven forbid, maybe I'll hang onto my 5850s until that GTX 580 comes out after all; then I'll decide which side does the best perf per watt.
 
Only 20% faster? Shame, I just want it to beat the new 6970 by a bit.
 