
NVIDIA GeForce GTX 480 Reference Design Card Final Design Pictured

I really don't think there will be a dual-GPU card until they can do another two revisions of Fermi, or until 32 nm is taped out.

Nahh! I'm sure the dual-GPU version will be announced soon and out by the end of summer; they absolutely need to beat the 5970, and I need my new DX11 dual GPU too!!



That will be a GTX 460 Dual at best


Maybe you're right; only ASUS can do the crazy shit (Mars, Ares).
 
No, ATi advertises the HD 5870 as having a TDP of 188 W; in reality it pulls ~212 W under load. If you don't believe me, go look at some reviews.

I'm assuming the maximum possible power draw for the GTX 480 here is 300 W to get an 88 W difference; however, the article and the leaked information actually put it in the 395 W range. And 212 W is just the reference design; we have already seen partners release HD 5870s that pull over 230 W.

For those of you who can't be bothered to read real reviews:
http://tpucdn.com/reviews/Powercolor/HD_5750_Go_Green/images/power_maximum.gif
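To put rough numbers on that gap, here's a minimal sketch in Python; the wattages are just the figures quoted in this thread, not official specs:

# Rated TDP vs. measured load draw, using the numbers quoted above (all watts).
cards = {
    "HD 5870 (rated TDP)":   188,
    "HD 5870 (measured)":    212,
    "GTX 480 (assumed max)": 300,
    "GTX 480 (leaked)":      395,
}
for name, watts in cards.items():
    print(f"{name}: {watts} W")

# The 88 W difference mentioned above:
print("Gap:", cards["GTX 480 (assumed max)"] - cards["HD 5870 (measured)"], "W")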

It could be, but that would seriously overstress the PCI-E power specs. Maybe 305 W, but I would say that's the very maximum; any higher and they would have put two 8-pin connectors on the card.
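For anyone wondering where those limits come from, here's the PCI-E power budget arithmetic as a quick sketch (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin, per the PCI-E spec):

# In-spec power ceiling for common connector layouts (watts).
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

configs = {
    "6-pin + 6-pin": SLOT + 2 * SIX_PIN,          # 225 W (e.g. HD 5870)
    "6-pin + 8-pin": SLOT + SIX_PIN + EIGHT_PIN,  # 300 W (rumoured GTX 480)
    "8-pin + 8-pin": SLOT + 2 * EIGHT_PIN,        # 375 W
}
for combo, limit in configs.items():
    print(f"{combo}: {limit} W max in spec")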
Do you realize that the GTX 295, rated at a 289 W TDP, is already pulling 320 W in that graph?
Both nVidia and ATi rate their TDP by typical use, not "power virus" draw. ;)

Unless the only thing you do is stare at the furry donut all day, this is the graph to look for:
http://tpucdn.com/reviews/ASUS/EAH_5830_DirectCu/images/power_peak.gif
 
Hm... maybe the power measurement isn't accurate. AFAIK I've seen other reviews where the 5870 was under its 188 W TDP.
Take the GTX 295, for example: according to TPU it draws a max of 320 W while its TDP is 289 W.
So either all the TDPs are wrong, or the draw hasn't been measured accurately.

Doh, Zubasa beat me 8(
Actually, don't Nvidia and ATI use Futuremark for their TDP figures?
 
That heatsink with the direct-touch heatpipes sticking out looks amazing, like a muscle car: reeks of power, gasoline and oil.
http://us4.pixagogo.com/S54Wv7ew067...1YPo_/The_fast_and_the_furious_muscle_car.jpg

 

Burning oil from overheating, a 1 MPG Pinto with bald tires and one brown door, merging onto the information superhighway like the Titanic into an iceberg.


You can dress up a turd all you want, but at the end of the day it's still a turd.
 
Fermi, now with 3 slots for more cooling power?

I can almost hear it: "By occupying 3 slots instead of the standard two, we can deliver exceptional cooling capabilities to keep temperatures down and prolong the lifespan of the GTX 480."
 

It's only a matter of time. Nvidia doesn't care about being efficient; they'll just make behemoths so they can say they have the better card. I can't wait for the day ATi decides 'screw it, our very top-end single-GPU card can have the same die size and TDP as Nvidia's; now let's pack the punch and leave the rest of the lineup efficient and resource-friendly.' That would be the day Nvidia dies.
 
Aren't we all mostly missing something....

GTX 480 = ~300 W TDP and a price of £450*
HD 5870 = max 212 W TDP and a price of £310+
That's a >40% increase on both counts (worked out in the sketch below).

(* based on Fudzilla's quoted 450 euros, which will translate to 450 British quid.)
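Checking that ">40% on both counts" claim, as a minimal sketch using the thread's own figures:

# Percentage increase of the GTX 480 over the HD 5870, per the numbers above.
def pct_increase(old, new):
    return (new - old) / old * 100

print(f"TDP:   {pct_increase(212, 300):.0f}% higher")  # ~42%
print(f"Price: {pct_increase(310, 450):.0f}% higher")  # ~45%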

So, performance aside (because we don't know what it really is), why is everyone comparing these two cards? They are not in the same ballpark. It's like comparing a 2-litre engine car to a 1.4.

If this card, the cream of NV, is to be classed as the best-performing single GPU, its performance needs to be classed towards the HD 5970 end. And for those saying dual GTX 480: what planet do you live on? It has TREMENDOUSLY bad power issues (512 cores cut to 480) and an NV, not custom, cooling solution straight out of the blocks.

This will beat the 5870, but the margins need to be justified. Can we not start being realistic and bring in more logistical arguments, like:

yo momma's gonna kick yo ass when she sees her power bill, assuming the house hasn't burned down.

From a working technological position, ATI have won this round. If drivers were put aside, many more NV folk would switch.

And remember, before I get called an ATI fanboi: I went to red because I foresaw Fermi being delayed way back in September, going on all those rumours. So NV lost my custom.
 

They're being compared because the HD 5970 stomps all over it. Plus, people like to compare the single GPUs from both parties. We all know Fermi is the HD 2900 of 2010, but we just want to see how much of a fail it truly is, and whether there's anything to be salvaged.

Also, I bought my HD 5870 in October :) Epic card for the past six months! Now Fermi has been kicked in the head by ridiculous power consumption. I don't want an Easy-Bake Oven with a built-in hairdryer; I want a gaming machine where I can still hear the game over the fan!
 

The day Nvidia dies? It's already happening.
The 5970 is faster....
Nvidia fanboys are tired of waiting.
Their drivers aren't as good as they used to be, while ATi's drivers are improving at a rapid rate.
ATi is gaining more acceptance in the open-source community.
ATi is giving more out to the community and to developers.

Nvidia is:
Holding their proprietary standards close.
Delaying.
Showing off wood.
Renaming.

With Fusion coming, and Intel putting their "IGP" on-die and so on, Nvidia's chipset business AND low-low-end graphics solutions are obsolete; the Core 2 Duo generation will be dead very, very soon.
Nvidia's chipset business already started failing at the 6 series.

Nvidia's Tegra is doing great.
Their Tesla is doing great.

yeah...

I can only see Intel buying Nvidia when its stock starts falling.
 
The only way a dual-Fermi GPU is coming is if it's two 470 chips, undervolted and underclocked.
 
Isn't that as much power as the GTX 295?
And this should be as fast as a GTX 295, so... Nvidia is justified in doing so?


But really: that much power, that much heat, and that high a possible price... why is Nvidia even bothering with it?
 

Yep, and in that same peak-power graph the HD 4870 X2 is pulling well over 380 W, despite the 300 W limit it should be adhering to.

So where does that leave us? Where we started: knowing nothing about the actual power consumption of the GTX 480.
 
Seems we have another 5800 on our hands.

The new and improved DUST BUSTER!!! :laugh:
 

At least the HD 5870 doesn't need its fan at 100% (actually not even 60%) when under load :S The GTX 480 looks like it'll need everything that jumbo has got!
 

That Pinto/turd comparison made me really laugh... Thanks


LOL, as long as you don't OC anything....

It will be interesting to see the real card and the real specs.
 
A sad day has come when an SLI/i7 rig uses more power than my fucking microwave. :shadedshu
 
I call Global Warming Alarm!!!!! Nvidia is killing the polar bears..

(lol, OK, now I'll stop)
 
Thing is, with three of them and an i7, your comp can now pop popcorn just by resting the bag on top of your case...

I'll even be able to shove it outside in the winter, melt the snow in the -30°C weather and watch the steam rise..... I'll be the talk of the neighbourhood with my "Fermi Sauna"! :laugh: :roll:
 
LOL, either they put the decimal in the wrong place or that is a 10K RPM fan (I bet it's really quiet) :laugh:
You are evil :roll:
The only other way they can put the decimal point is to the right, and that would make it an 18.0 A fan. :p
I am sure we humans would not hear the fan itself anymore; the noise would most likely be ultrasound :nutkick:
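The joke in numbers, as a quick sketch: assuming a standard 12 V fan supply, and taking 1.80 A as the rating being joked about (the original label isn't shown in this thread):

# Fan power draw at 12 V for the two possible decimal placements.
VOLTS = 12
for amps in (1.80, 18.0):
    print(f"{amps:>5} A fan -> {VOLTS * amps:.1f} W")  # 21.6 W vs. 216 W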
 