
ZOTAC GeForce GTX 480 AMP! Edition

The fans have 4 wires, so they should be capable of being dynamically controlled.
Maybe there is something wrong with the sample TPU received. I've just read another ZOTAC GTX 480 test and it shows big variations in idle/load dBA, so the fans must be controlled.

the fans are temperature controlled. it's just that there is so little difference between the idle fan speed and load fan speed that the noise difference is smaller than 1 dBA
 
1. "Significantly lower power consumption thanks to lower temperatures" — since when does lower heat mean lower consumption?
2. PLEASE benchmark on new games!!!
 
the fans are temperature controlled. it's just that there is so little difference between the idle fan speed and load fan speed that the noise difference is smaller than 1 dBA

I wonder if some more aggressive fan speed control set by the user would change that. I think the fan speed curve Zotac is using just doesn't lower the fan speed enough when idle.
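For what it's worth, a more aggressive user-set curve like the one described here is usually just a few (temperature, duty) breakpoints with linear interpolation between them. A minimal sketch — the breakpoint values below are hypothetical, not ZOTAC's actual firmware curve:

```python
def fan_duty(temp_c, curve=((40, 25), (60, 40), (80, 70), (90, 100))):
    """Linearly interpolate fan duty (%) from (temp °C, duty %) breakpoints.

    The curve values are illustrative only; a real BIOS curve would differ.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]           # below the first point: floor duty
    for (t1, d1), (t2, d2) in zip(curve, curve[1:]):
        if temp_c <= t2:
            # linear interpolation between the two neighboring breakpoints
            return d1 + (d2 - d1) * (temp_c - t1) / (t2 - t1)
    return curve[-1][1]              # above the last point: full duty

print(fan_duty(35))   # 25   (idle: fan stays at its quiet floor)
print(fan_duty(85))   # 85.0 (load: fan ramps between the 80° and 90° points)
```

Dropping the first breakpoint's duty value is exactly the "lower the fan speed more when idle" change being suggested.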

1. Since when does lower heat mean lower consumption?
2. PLEASE benchmark on new games!!!

Cooler VRM area means they work more efficiently.
 
Cooler VRM area means they work more efficiently.
Most likely by a margin which might be considered a measurement error.
 
Most likely by a margin which might be considered a measurement error.

20-30 W definitely isn't a margin that would be considered a measurement error, and go back and read the review. W1z, being the great reviewer that he is, did additional testing to see what was going on, and sure enough the power draw goes up with temperature in a linear fashion.
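The linear trend described above is easy to quantify with an ordinary least-squares fit of board power against GPU temperature. A sketch — the readings below are made-up illustrative numbers, not W1zzard's actual measurements:

```python
# Hypothetical (GPU temperature, board power) readings for illustration only
temps = [60, 70, 80, 90]        # GPU temperature, °C
power = [250, 262, 274, 286]    # board power draw, W (made up, perfectly linear)

n = len(temps)
mean_t = sum(temps) / n
mean_p = sum(power) / n

# ordinary least-squares slope and intercept
slope = sum((t - mean_t) * (p - mean_p) for t, p in zip(temps, power)) / \
        sum((t - mean_t) ** 2 for t in temps)
intercept = mean_p - slope * mean_t

print(f"{slope:.2f} W per °C")  # 1.20 W per °C on this made-up data
```

A slope on the order of 1 W per °C over a 20-30 °C cooler delta would account for power differences in exactly the 20-30 W range mentioned.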
 
Varying power consumption in these cards is perhaps caused by wider than usual variances in the leakage current from these GPUs. Unsurprising considering their problems with the process. So it's the luck of the draw on whether you get one that's fairly efficient or one that's a particularly wasteful minifurnace. :)
 
Varying power consumption in these cards is perhaps caused by wider than usual variances in the leakage current from these GPUs. Unsurprising considering their problems with the process. So it's the luck of the draw on whether you get one that's fairly efficient or one that's a particularly wasteful minifurnace. :)

Once again, read the review: power consumption increased with temperature. So it has nothing to do with the luck of the draw.
 
So that means that the typical GTX 480 runs its VRMs so hot that they become rather impressively inefficient. That is pretty sloppy of NV. It could just be the tested samples here though.

But leakage current does vary between these GPUs. You can bet on that. That is the case with every GPU out there.

Regardless, these chips are just not that great. They need a refresh for sure. I can't wait to see how that turns out and whether this ends up being one of the more inefficient GPUs. I'd bring up the R600 -> RV670 transition, but they enjoyed a major new manufacturing process there and added PowerPlay. I can't see the refresh to this beast being on anything other than 40 nm again.
 
So that means that the typical GTX 480 runs its VRMs so hot that they become rather impressively inefficient. That is pretty sloppy of NV. It could just be the tested samples here though.

i measured gpu temperature, not vrm temperature
 
i measured gpu temperature, not vrm temperature

Right, but it is logical to assume that if the GPU is heating up, the VRMs are too. And since the VRMs are where the power is converted for the GPU, their efficiency was dropping and causing the higher power draw.
 
Right, but it is logical to assume that if the GPU is heating up, the VRMs are too. And since the VRMs are where the power is converted for the GPU, their efficiency was dropping and causing the higher power draw.

i doubt the difference in vrm temps will be that big, but unfortunately no sensor in vrms so no way to find out.
the power draw changes near instantly with gpu temperature, vrms would need some time to heat up from the reduced air flow of the fans going slower
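The near-instant tracking of power with die temperature is consistent with transistor leakage, which rises roughly exponentially with temperature and has no thermal lag, unlike VRM heat-up. A rough sketch of that relationship — the constants here are illustrative, not measured GF100 values:

```python
import math

def leakage_power(temp_c, p_ref=60.0, t_ref=60.0, k=0.02):
    """Leakage power (W) assuming exponential growth with die temperature.

    p_ref, t_ref and k are hypothetical constants chosen for illustration,
    not characterized GF100 silicon data.
    """
    return p_ref * math.exp(k * (temp_c - t_ref))

for t in (60, 75, 90):
    print(t, round(leakage_power(t), 1))   # 60.0 W -> ~81.0 W -> ~109.3 W
```

Even these toy numbers show how a 30 °C swing in die temperature alone can move board power by tens of watts the moment the temperature changes.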
 
Could it be the fan?

The reference GTX 480 fan draws high current and spins at high RPM, while this ZOTAC uses a very efficient cooler, so it keeps the GPU cooler at lower RPM.
 
Most likely by a margin which might be considered a measurement error.

It's a known fact that when electronics get hot, they become inefficient. That's why I don't let my CPU go over 60°C or my GPU over 70°C: they just start getting really inefficient, plus it's not good for them. When you are dealing with something with 3 billion transistors, it's definitely going to make a difference when they start expanding from heat and such.

Anyways, nice review W1zzard. Good to see the inclusion of power consumption graph showing how it uses more power when it gets hotter. I knew this happened but didn't realise it made that much of a difference. Probably why this has a lower power consumption: lower temps.
 
It's a known fact that when electronics get hot, they become inefficient. That's why I don't let my CPU go over 60°C or my GPU over 70°C: they just start getting really inefficient, plus it's not good for them. When you are dealing with something with 3 billion transistors, it's definitely going to make a difference when they start expanding from heat and such.
Chips are specified for much more heat than 60C and they have to maintain their design power across their designated temp range. Lifetime is of no concern to me because, comparatively, I'm not running 486s or Pentium IIIs anymore and what I have today will be just as pathetic in 10 years as those are today.

I'm still thinking that the GTX 4xx cards have significantly varying leakage current. These GPUs are obviously the most complex ever and are very difficult to manufacture so they are probably not having the best of luck with chip quality overall. Just doing a search regarding GF100 and 'leakage current' brings tons of results, unsurprisingly. They need 28nm to work amazingly well but I'm not sure what to expect there and it's a long time off I believe. It's amazing how far and hard they push processes these days compared to 10 years ago. That's competition for ya.
 
Chips are specified for much more heat than 60C and they have to maintain their design power across their designated temp range. Lifetime is of no concern to me because, comparatively, I'm not running 486s or Pentium IIIs anymore and what I have today will be just as pathetic in 10 years as those are today.

While they are specified for higher temperatures, it's still not good for them. It doesn't matter how well it's built, it's simply physics: when something heats up, it expands. When you have that many tiny little things all bunched together, expansion is the last thing you want, no matter how little.
 
I'm happy as is.

Seeing the stats, I'm happy with my 4870 X2 in CrossFire. OK, it pulls about 1000 W at full power, but the visuals are fine, even with only an i7 920. :)
 
Yeah, people do buy cards, not GPUs, and many of those same folks whine when they realize their 2 GB card has a 1 GB frame buffer and performs worse than a single GPU in games if a proper driver profile isn't present. You see it constantly on gaming forums.
 