
Disable GeForce GTX 580 Power Throttling using GPU-Z

The GTX 580 isn't quite what they announced. I was amazed when reviews pointed to lower power consumption and about 10%~20% more performance in some cases (against the GTX 480)... well, now we all know that's not true!
 
Yeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to protect the mobo as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card should simply pull a max of 75W from there to stay PCI-E compliant and draw the rest through its power connectors; therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?

+1

WTF.


How would the GTX 5xx damage the motherboard?

24 Pin P1 Connector Wires getting extremely hot

That is what can happen if you overload the PCI-E slot. Now, that was an extreme case of course, but once you start pulling more than 75W through the PCI-E connector, things can get hairy pretty quickly.
 
Yes, that's true... but he was running more than one card.


Is the slot/card not designed to stop it from sending more than 75 watts through it?
 
24 Pin P1 Connector Wires getting extremely hot

That is what can happen if you overload the PCI-E slot. Now, that was an extreme case of course, but once you start pulling more than 75W through the PCI-E connector, things can get hairy pretty quickly.

Thanks NT - that's quite a nasty burn on that connector there.

But my point is, wouldn't the card limit its power draw to stay within that limit and pull the rest from its power connectors? That would prevent any damage to the mobo and stay PCI-E standards compliant. I don't know if it would, which is why I'm throwing the question out to the community.
 
Yes, that's true... but he was running more than one card.


Is the slot/card not designed to stop it from sending more than 75 watts through it?

Not really, it will attempt to send as much as is demanded of it.

Thanks NT - that's quite a nasty burn on that connector there.

But my point is, wouldn't the card limit its power draw to stay within that limit and pull the rest from its power connectors? That would prevent any damage to the mobo and stay PCI-E standards compliant. I don't know if it would, which is why I'm throwing the question out to the community.

That is pretty much the idea behind this limit. The PCI-E slot provides 75W, a 6-pin PCI-E power connector provides 75W, and an 8-pin PCI-E power connector provides 150W. That is 300W. So once you go over that, it doesn't matter whether the power is coming from the PCI-E power connectors or the motherboard's PCI-E slot: you are overloading something somewhere, and you aren't PCI-E standards compliant.
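
To put that budget arithmetic in one place, here is a minimal sketch (just the spec figures from the post above added up, not anything the card itself computes):

```python
# Rough sketch of the PCI-E power budget described above (spec numbers only).
SLOT_W = 75        # PCI-E x16 slot
SIX_PIN_W = 75     # 6-pin PCI-E power connector
EIGHT_PIN_W = 150  # 8-pin PCI-E power connector

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Total power the spec allows for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# GTX 580 layout: one 6-pin + one 8-pin
limit = board_power_limit(six_pins=1, eight_pins=1)
print(limit)  # 300 -> drawing above this overloads some source and breaks the spec
```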
 
Sure something would go pop, but it still doesn't answer the question if the card would pull more than 75W from the mobo under such a condition. Properly designed, it should limit the current. I just don't know if it does or not and I don't think anyone else does either.
 
Sure something would go pop, but it still doesn't answer the question if the card would pull more than 75W from the mobo under such a condition. Properly designed, it should limit the current. I just don't know if it does or not and I don't think anyone else does either.

W1z might know if he has power consumption numbers from just the PCI-E slot.

However, if you assume a pretty even load across all the connectors, 1/4 from the PCI-E slot, 1/4 from the PCI-E 6-pin, and 1/2 from the PCI-E 8-pin, then once the power consumption goes over 300W the extra will be divided between all the connectors supplying power. I don't believe the power circuits on video cards are smart enough to shift load onto certain connectors once the power consumption goes over a certain level.
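
A quick sketch of that even-split assumption with the numbers filled in (purely illustrative; real cards don't necessarily divide load this way):

```python
# Sketch of the even-split assumption above: each source carries a fixed share
# of total board power, so any overload lands on every source in proportion.
SHARES = {"slot": 0.25, "6-pin": 0.25, "8-pin": 0.50}  # assumed split
RATINGS = {"slot": 75, "6-pin": 75, "8-pin": 150}      # spec ratings in watts

def connector_loads(total_draw_w: float) -> dict:
    """Per-connector draw for a given total, under the fixed-share assumption."""
    return {name: total_draw_w * share for name, share in SHARES.items()}

for total in (300, 360):
    loads = connector_loads(total)
    over = {n: round(loads[n] - RATINGS[n], 1) for n in loads if loads[n] > RATINGS[n]}
    print(total, {n: round(w, 1) for n, w in loads.items()}, "over spec:", over)
# At 360W total, the slot would be asked for ~90W, i.e. ~15W beyond its 75W rating.
```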
 
Thanks NT, that sounds quite likely. And because of this limitation, I'll bet that's why the current limiter operates the way it does.

W1zz, you wanna give us the definitive answer on this one?
 
Yeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to protect the mobo as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card should simply pull a max of 75W from there to stay PCI-E compliant and draw the rest through its power connectors; therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?

People privy to inside information, and who are smarter than you and I, decided it was necessary.

I suspect the tech specs published to manufacturers didn't account for the unusual power consumption under Furmark etc. This wouldn't have been an accident, but rather a decision to keep costs down on the power circuitry and cooling.
 
I don't believe the logic exists for that either. I believe some cards pull the memory and other power through the PCI-E slot and the core power through the connectors. I hope that is how they have the 580 set up.
 
@bakalu: Any chance you could rename the Furmark EXE to whatever you like and run it again with your 580? If at any time you see the temp rising too much, please interrupt the program, but do post a screenie after.
Can you answer my question?

Did you buy the GTX 580 to play games or to run Furmark?
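
For anyone wanting to run the rename test quoted above, a minimal sketch (the install path here is an assumption; adjust it to wherever Furmark actually lives):

```python
# Copy Furmark's executable under a neutral name so any driver-side detection
# keyed on the file name would not trigger; then run the copy and watch GPU-Z.
import shutil

SRC = r"C:\Program Files (x86)\Geeks3D\FurMark\FurMark.exe"     # assumed install path
DST = r"C:\Program Files (x86)\Geeks3D\FurMark\donut_test.exe"  # arbitrary new name

shutil.copy2(SRC, DST)
print(f"Copied to {DST}; launch this copy and compare clocks/temps against the original.")
```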
 
Can you answer my question?

Did you buy the GTX 580 to play games or to run Furmark?

Neither: I'm not buying it.

Took you a long time to reply, but no matter. Since I asked, W1zzard has stated that the card really does react to Furmark and OCCT, so what I asked is now irrelevant.
 
Neither: I'm not buying it.

Took you a long time to reply, but no matter. Since I asked, W1zzard has stated that the card really does react to Furmark and OCCT, so what I asked is now irrelevant.
I bought the GTX 580 to play games, so I don't care about the temperature of the GTX 580 when running Furmark.

The temperature of the GTX 580 when playing games is very cool, and that is what interests me.
 
I bought the GTX 580 to play games, so I don't care about the temperature of the GTX 580 when running Furmark.

The temperature of the GTX 580 when playing games is very cool, and that is what interests me.

OK, if you don't care about the topic of this thread, then please don't post in this thread. This isn't meant as a snide remark or anything, just a way to get this thing back on topic.

Thanks,
 
Yeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to protect the mobo as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card should simply pull a max of 75W from there to stay PCI-E compliant and draw the rest through its power connectors; therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?

The same way sticking a wire from your 12V rail onto the metal of your case makes shit melt. Excess power use will simply cause shit to fry.
 
I bought the GTX 580 to play games, so I don't care about the temperature of the GTX 580 when running Furmark.

Really? Funny because the first thing you posted on this thread was ...

Maximum Temp with Furmark - 70°C
http://forum.amtech.com.vn/attachme...eforce-gtx-580-da-co-mat-o-amtech-temp-70.jpg

Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 480 when running Furmark
http://forum.amtech.com.vn/attachme...tx-480-bai-binh-phuc-han-gtx480-full-load.jpg

Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 580 when running Furmark
http://forum.amtech.com.vn/attachme...0-da-co-mat-o-amtech-cs-peak-chay-furmark.jpg

The temperature of the GTX 580 when playing games is very cool, and that is what interests me.

If you say so ...
 
So the card actually reaches above 360W, just as I anticipated. If they had left it unleashed, it would have exceeded the 300W PCI-E spec limit. Good thing it can be unlocked easily, though.
 
Not really, it will attempt to send as much as is demanded of it.



That is pretty much the idea behind this limit. The PCI-E slot provides 75W, a 6-pin PCI-E power connector provides 75W, and an 8-pin PCI-E power connector provides 150W. That is 300W. So once you go over that, it doesn't matter whether the power is coming from the PCI-E power connectors or the motherboard's PCI-E slot: you are overloading something somewhere, and you aren't PCI-E standards compliant.

PCI-E can give more than 75W.
PCI-E 1.1 can only give 75W; 2.0 can give more than that, 150W if I recall right...
 
I could do with a stop-throttling tool for my HD5870, as when I watch films on my secondary display it throttles down and causes stuttering playback.

Will it work on the HD5870 as well?
 
Will it work on the HD5870 as well?

No. This is only for the GTX 580 power throttling, which is a unique mechanism at this time that no other card before has ever used.
 
PCI-E can give more than 75W.
PCI-E 1.1 can only give 75W; 2.0 can give more than that, 150W if I recall right...
I'm not 100% sure, but I don't think so. A card expecting 150W from the slot would not run on a 1.0 slot, yet the compatibility is 100%. The additional watts come from the external connectors.
75W + 2x75W from two 6-pins makes 225W; 8-pins are used if the power draw is larger than 225W.
A card getting 150W from the slot could reach 350W+ with the extra 8-pin, which is not the case.
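
Putting the two slot assumptions side by side (a sketch of the arithmetic from the post above; the 150W figure is only the hypothetical being ruled out here):

```python
# Budget arithmetic from the post above: compare the real 75W slot against the
# hypothetical 150W slot that the PCI-E 2.0 claim would imply.
SIX_PIN_W, EIGHT_PIN_W = 75, 150

def budget(slot_w: int, six_pins: int, eight_pins: int) -> int:
    return slot_w + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# Two 6-pins on a 75W slot -> 225W, the point where cards move to an 8-pin.
print(budget(75, six_pins=2, eight_pins=0))    # 225
# GTX 580 layout (6-pin + 8-pin) on a 75W slot -> the familiar 300W ceiling.
print(budget(75, six_pins=1, eight_pins=1))    # 300
# Same layout on a hypothetical 150W slot -> 375W, which no such card is rated for.
print(budget(150, six_pins=1, eight_pins=1))   # 375
```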
 
I'm not 100% sure, but I don't think so. A card expecting 150W from the slot would not run on a 1.0 slot, yet the compatibility is 100%. The additional watts come from the external connectors.
75W + 2x75W from two 6-pins makes 225W; 8-pins are used if the power draw is larger than 225W.
A card getting 150W from the slot could reach 350W+ with the extra 8-pin, which is not the case.

^ you're correct on the wattages.


Putting what I said in simpler terms:

Drawing more than 75W from the slot won't magically turn the slot off, or anything else like that... if the card has no internal mechanism to deal with the power draw, the wiring feeding the slot will just start to overheat, and bad things can happen.
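
As a rough back-of-the-envelope for why that wiring heats up, here is the I²R arithmetic with an assumed round-number resistance (not a measured value):

```python
# Back-of-the-envelope I^2 * R heating for the 12V wiring feeding the slot.
# The resistance below is an assumed round number purely for illustration.
RAIL_V = 12.0          # slot power is mostly delivered on the 12V rail
RESISTANCE_OHM = 0.02  # assumed combined contact + trace resistance

for draw_w in (75, 110, 150):
    current_a = draw_w / RAIL_V                # I = P / V
    heat_w = current_a ** 2 * RESISTANCE_OHM   # power dissipated in the wiring itself
    print(f"{draw_w}W draw -> {current_a:.1f}A, ~{heat_w:.2f}W dumped into the wiring")
# Heat in the wiring scales with the square of the current, so a modest
# overdraw raises connector heating disproportionately.
```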
 