Saturday, November 13th 2010

Disable GeForce GTX 580 Power Throttling using GPU-Z

NVIDIA shook the high-end PC hardware industry earlier this month with the surprise launch of its GeForce GTX 580 graphics card, extending the lead NVIDIA has been holding in single-GPU performance. The card also delivers some great performance-per-watt improvements over the previous generation. The reference design board, however, uses clock-speed throttling logic that reduces clock speeds when an extremely demanding 3D application such as FurMark or OCCT is run. While this is a novel way to protect components, saving consumers from potentially permanent damage to the hardware, it is a gripe for expert users, enthusiasts, and overclockers who know what they're doing.

GPU-Z developer and our boss W1zzard has devised a way to make disabling this protection accessible to everyone (who knows what he's dealing with), and came up with a nifty new feature for GPU-Z, our popular GPU diagnostics and monitoring utility: a command-line argument, "/GTX580OCP", that disables the speed-throttling mechanism. Start the GPU-Z executable (within Windows, using Command Prompt or a shortcut) with that argument, for example "X:\gpuz.exe /GTX580OCP", and it will disable the clock-speed throttling mechanism. Throttling stays disabled for the remainder of the session, so you can close GPU-Z afterwards; it will be enabled again on the next boot.
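For anyone unsure of the exact invocation, a minimal sketch as a one-line batch file (the file name, drive letter, and folder are hypothetical placeholders; point them at wherever you saved the test build):

```shell
:: run-gtx580ocp.bat -- hypothetical helper batch file for Windows.
:: Launching GPU-Z with the /GTX580OCP switch disables the GTX 580's
:: power-draw throttling; the path below is an example only.
"X:\Tools\gpuz.exe" /GTX580OCP
```

Once the switch has been applied, GPU-Z itself can be closed; the override persists until the next reboot.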
As an obligatory caution, be sure you know what you're doing. TechPowerUp is not responsible for any damage caused to your hardware by disabling this mechanism. Running the graphics card outside of its power specifications may result in damage to the card or motherboard. We have prepared a test build of GPU-Z (which otherwise carries the exact same feature set as GPU-Z 0.4.8). We also ran a power consumption test on our GeForce GTX 580 card to demonstrate how disabling this logic affects power draw.

DOWNLOAD: TechPowerUp GPU-Z GTX 580 OCP Test Build

116 Comments on Disable GeForce GTX 580 Power Throttling using GPU-Z

#1
W1zzard
on most decent motherboards you can draw well over 100 W from the slot alone without bad things happening - because motherboard designers specifically design for that.

if you buy a $30 motherboard then the boss of those guys told them to save 5 cents to meet their price target -> possible damage when drawing too much current for a long time
#2
Wile E
Power User
W1zzard said:
on most decent motherboards you can draw well over 100 W from the slot alone without bad things happening - because motherboard designers specifically design for that.

if you buy a $30 motherboard then the boss of those guys told them to save 5 cents to meet their price target -> possible damage when drawing too much current for a long time
My old crappy ECS KA3-MVP allowed me to manually set the wattage limit to over 100 W, let alone a quality board.

So basically you're saying the limiter is there to prevent crappy boards from burning out, not for the benefit of the card itself? (Which is pretty much what I suspected from the beginning)
#3
GC_PaNzerFIN
Nothing new about cards detecting FurMark and throttling. ATI has done it ever since 2008. I can't believe how big a fuss has been made out of nothing.
#4
W1zzard
Wile E said:
the limiter is basically to prevent crappy boards from burning out, not for the benefit of the card itself?
it's for both

GC_PaNzerFIN said:
ATi has done it ever since 2008
any idea where the system is described? i don't see any evidence of it when doing my power consumption testing. maybe you mean the VRM overheat protection, which reduces clocks on overheat?
#5
GC_PaNzerFIN
W1zzard said:

any idea where the system is described? i don't see any evidence of it when doing my power consumption testing. maybe you mean the VRM overheat protection, which reduces clocks on overheat?
Yes, they changed the method: the HD 4000 series detected the exe, and the HD 5000 series has hardware protection. The end result is the same.

http://www.anandtech.com/show/2841/11

#6
OneMoar
There is Always Moar
!firehazard :shadedshu
350 watts @ 12 V ≈ 29 amps, minus the 75-100 W that the PCIe bus can supply, so you are right on the edge of what the wires from your PSU can handle...
most wires are only rated for about 8-10 amps, then they melt
I put forth this question: when does performance outweigh the extra power usage and risk?
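For reference, a quick sanity check of those figures (all numbers are assumptions taken from the thread: 350 W board power, a 12 V rail, and the 75 W baseline the PCIe slot itself supplies):

```shell
# Back-of-the-envelope current estimate using the thread's own figures.
total_w=350   # assumed worst-case board power draw, in watts
slot_w=75     # baseline power the PCIe slot can supply
volts=12      # supply rail voltage
# Integer amps that must come through the PSU's 6/8-pin cables:
cable_amps=$(( (total_w - slot_w) / volts ))
echo "~${cable_amps} A through the PSU's PCIe power cables"
```

Divided among the several 12 V wires of the 6-pin and 8-pin connectors, that works out to a few amps per wire, which is the crux of the quality-of-wiring debate below.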
#7
Wile E
Power User
OneMoar said:
!firehazard :shadedshu
300 watts @ 12v = 30 amps
wires are only rated for about 20 > then melt
I put forth this question when does performance outweigh the extra power usage and risk
Not all wires melt at 20 A. Thicker (lower-gauge) wire = more amps, and not only that, but that load is spread across multiple wires.

A quality board and a quality PSU have no problems handling these kinds of loads.
#8
OneMoar
There is Always Moar
Wile E said:
Not all wires melt at 20 A. Thicker (lower-gauge) wire = more amps, and not only that, but that load is spread across multiple wires.

A quality board and a quality PSU have no problems handling these kinds of loads.
I know that, but consider what happens when you start adding the rest of the power-hungry components: bad things will happen unless you can find a PSU with 15-gauge wiring, and most are 20-18 gauge.
#9
Wile E
Power User
OneMoar said:
I know that, but consider what happens when you start adding the rest of the power-hungry components: bad things will happen unless you can find a PSU with 15-gauge wiring, and most are 20-18 gauge.
It's still not a problem with a high-quality PSU. If you can afford a 580, you can afford the high-quality PSU to go with it. I'll put 30 A from a video card through my PSU all day and never have to worry about it.
#10
bakalu
MikeMurphy said:
OK, if you don't care about the topic of this thread then please don't post in it. This isn't a snide remark or anything, just a way to get this thing back on topic.

Thanks,
HTC said:
@ bakalu: Any chance you could rename the Furmark EXE to whatever you like and run it again with your 580? If at any time you see the temp rising too much, please interrupt the program, but do post a screenie after.
My ASUS GeForce GTX 580 @ 880 MHz/1050. This overclock was 100% stable in games and 3DMark Vantage.

vCore = 1.138 V, stock fan set to 85%, room temp = 25°C. I renamed Furmark to "nemesis" and ran it for over 40 minutes. Here is my result:

Maximum temp: 70°C

What more can you say :D
#11
W1zzard
renaming furmark will not make any difference for nvidia's detection system
#12
Splave
A lot of crying in this thread? If you are a gamer/normal user this doesn't even apply to you. To a bencher, an innovation like this is invaluable. Thanks W1zzard :D
#13
INTHESUN
What is this meant to do? I just tried it and I'm not sure if I did it correctly.

Is this correct?
#15
trt740
Thx W1zzard, great addition.
#16
encor3
How do I use the command lines in GPU-Z? I haven't really used the program for more than just monitoring the GPU, so I would really appreciate an explanation of how to disable the throttling on my GTX 580...
Thanks in advance!

//encor3