
Official Statement from AMD on the PCI-Express Overcurrent Issue

Why is this an issue? Both the GTX 750 Ti and the GTX 950 drew significantly more than 75W from the PCIe slot! In short bursts, there is absolutely no problem with brief power spikes. It's a shitty deal if you have an older/cheap mobo and this is an issue, but seriously, quality motherboards are dirt cheap!

:toast:

None of those breaks the 75W average spec (actually, for 12V it's even less), while the RX 480 breaks the average spec by a significant amount. It's not just short bursts. Huge difference.
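The "sustained average vs. brief spike" distinction the two posts above are arguing over can be sketched as a toy check. The 5.5 A figure is the PCIe CEM 12 V limit for a x16 slot (~66 W); the sample currents are made up for illustration:

```python
# Toy check: brief spikes are averaged away, sustained overdraw is not.
PCIE_12V_AMP_LIMIT = 5.5  # amps on the 12 V slot rail (PCIe CEM spec)

def sustained_overdraw(samples_amps, window=10):
    """True if any windowed average exceeds the limit.

    A single spiky sample is averaged away; only sustained draw
    above the limit trips the check.
    """
    for i in range(len(samples_amps) - window + 1):
        if sum(samples_amps[i:i + window]) / window > PCIE_12V_AMP_LIMIT:
            return True
    return False

spiky = [4.8] * 20 + [7.0] + [4.8] * 20  # one brief spike: in spec
steady_high = [6.5] * 40                 # sustained overdraw: out of spec
print(sustained_overdraw(spiky), sustained_overdraw(steady_high))  # False True
```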
 
Honestly, I think Nvidia pushed AMD with their new GPUs.
The fact that you can barely OC the RX 480 makes me believe AMD quickly shipped as high a clock out of the box as possible to make the cards look good in the performance charts, but that originally they were not meant to run this high.
I guess it's best to get a custom-designed RX 480 from a partner that has an 8-pin connector and more cooling capacity.

The 6-pin was used just because AMD targets OEMs, and they don't usually sell PCs with high-end PSUs. That way, they can put a cheap PSU into a PC with Polaris.

The custom ones don't need to take this into consideration, so they will clock very high (rumors say around 1400+ MHz, which would reach or even surpass a stock 980).

None of those breaks the 75W average spec (actually, for 12V it's even less), while the RX 480 breaks the average spec by a significant amount. It's not just short bursts. Huge difference.

This is driver-fixable for those who don't know how to do it themselves; on the current driver, those who want to do it now can set -10% in the power settings. And no decrease in performance either. :toast:

As for the custom 480s, they will have alternative BIOSes as usual and won't depend on the default driver settings.
 
This is driver-fixable for those who don't know how to do it themselves; on the current driver, those who want to do it now can set -10% in the power settings. And no decrease in performance either. :toast:
You think they're wasting power for no reason? A -10% limit will change performance; otherwise AMD would have done it already.

There is a more technical solution, which changes neither power draw nor performance, but it is still under investigation. Its feasibility depends on whether the PCI-E bus and the 6-pin are connected in parallel as inputs to all VREGs, or whether they supply different parts of the VREGs.
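The two wiring possibilities can be illustrated with a toy model (pure illustration, not AMD's actual board design; the 3+3 phase split and 160 W figure are assumptions):

```python
# If the slot and the 6-pin feed *separate* groups of VRM phases, the
# draw split is fixed by the phase assignment; if they feed all phases
# in parallel, the split could in principle be rebalanced in firmware.

def split_by_phases(total_w, slot_phases, six_pin_phases):
    """Separate phase groups: input draw splits by phase count."""
    n = slot_phases + six_pin_phases
    return total_w * slot_phases / n, total_w * six_pin_phases / n

# Assumed 3+3 phase split at 160 W board power:
slot_w, pin_w = split_by_phases(160, 3, 3)
print(slot_w, pin_w)  # 80.0 80.0 -> slot above the 75 W budget
```

With a fixed per-phase split like this, no driver tweak short of lowering total power can pull the slot back under 75 W, which is why the wiring question matters.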
 
This is driver-fixable for those who don't know how to do it themselves; on the current driver, those who want to do it now can set -10% in the power settings. And no decrease in performance either. :toast:

Well, logic suggests that what you say cannot be accurate. If there is no decrease when you reduce the power, then why not -20%, or even -50%?
 
Reading AMD's statement cynically, one could say that they deliberately released the cards like this so they'd look better in the benchmarks, because once the power use is brought down, the performance will be significantly lower and hurt their value for money. It will be interesting to see exactly how big the performance hit is.

@W1zzard are we gonna see a quick retest review with a handful of benchmarks with the revised driver to check this out?
 
If AIB vendors are fixing the "out of spec PCIe limits" with an 8-pin, or possibly 8+6-pin, to reduce excessive power draw from the PCIe slot, then AMD shouldn't even need to issue a statement about a driver fix. Limiting the card's power even a little affects many aspects. @HD64G, why not bench your VGA before limiting it, then apply a 10% reduction via tuning software and run the same test again? We want to see whether limiting power really doesn't reduce the card's performance, as you claimed.
 
Reading AMD's statement cynically, one could say that they deliberately released the cards like this so they'd look better in the benchmarks, because once the power use is brought down, the performance will be significantly lower and hurt their value for money. It will be interesting to see exactly how big the performance hit is.

@W1zzard are we gonna see a quick retest review with a handful of benchmarks with the revised driver to check this out?

Except people actually report higher performance when restricting its power...
 
Except people actually report higher performance when restricting its power...
Reducing operating voltage MAY increase performance in thermally limited situations. Decreasing the total power limit without changing anything else will have the opposite effect.
Reducing operating voltage MAY work on some GPUs, but due to the obvious negative effect on stability (there is a reason they set the VID where it is), it is too risky to do on all cards.
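A toy model makes the difference between the two knobs concrete. Assuming dynamic power only, P = C · f · V², the clock a fixed power cap can sustain is f = P / (C · V²); the constant and all numbers below are illustrative, not measured:

```python
# Toy model: sustainable clock under a power cap, dynamic power only.
C = 9.0e-8  # made-up "switching capacitance" constant

def max_clock_mhz(power_cap_w, volts):
    return power_cap_w / (C * volts ** 2) / 1e6

stock       = max_clock_mhz(150, 1.15)  # stock cap, stock voltage
undervolted = max_clock_mhz(150, 1.05)  # same cap, lower voltage
capped      = max_clock_mhz(135, 1.15)  # -10% cap, stock voltage

# Undervolting raises the sustainable clock; capping alone lowers it.
print(f"{stock:.0f} {undervolted:.0f} {capped:.0f}")  # 1260 1512 1134
```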
 
I'm wondering why no one thought to consider that the BIOS might be the problem, and that AMD simply gave all cards a BIOS that allowed maximum power draw, overriding the driver-based tools that would increase it. I found it interesting that power draw was high yet no OC was possible using the driver-based tools that give the GPU more power; put those two together, then consider the ASUS and MSI cards recently reviewed, and you get a potential BIOS problem.

Perhaps AMD gave the card a BIOS that allowed it to exceed the PCIe spec because it wanted the card reviewed in the best light, knowing that reviewers sometimes do not investigate OC? Given how its clocks are "managed" compared to Nvidia's Boost, this actually seems like the most reasonable explanation for what happened. Not every site has the equipment to accurately measure power consumption of PCIe devices, so many sites wouldn't even be able to detect such an issue.
 
Hahaha... Dead on arrival... People have more and more reason to wait and buy the 1060 now...
Get outta here, you fanboy!
As if the GTX 970 or 960 is clean; this only happens when you overclock. Try to do some research before commenting, you sound like an idiot.
This is just some Reddit fanboy making a big deal out of nothing. I didn't hear any crying when Nvidia came out with the 960 or 970.
 
None of those breaks the 75W average spec (actually, for 12V it's even less), while the RX 480 breaks the average spec by a significant amount. It's not just short bursts. Huge difference.
Gotta do a bit more research.
Here's a video I found to be correct.
 
Get outta here, you fanboy!
As if the GTX 970 or 960 is clean; this only happens when you overclock. Try to do some research before commenting, you sound like an idiot.
This is just some Reddit fanboy making a big deal out of nothing. I didn't hear any crying when Nvidia came out with the 960 or 970.

Well, for whatever it's worth, I have a habit of checking out UserBenchmark whenever new hardware launches, just to check the performance and popularity of CPUs and GPUs.

Around midnight of the launch day, going into July 30th, it ranked 74th in market share. Today it is tied with the R9 290 at 17th and obviously climbing. I'm betting it'll, in terms of sales, trade blows with the GTX 1070 from here on out and probably slightly surpass it at some point once the 4GB versions hit the market.

Not exactly DOA by my reckoning.

I just think Nvidia fanboys (who are up there with Apple and Nintendo fanboys, imo) were spring-loaded to fire at any sudden movement after the 970 fiasco.
 
Except people actually report higher performance when restricting its power...
That makes no sense. Perhaps there's something else going on here? Without details, one can't say what the true situation is, but simply lowering power consumption isn't going to increase performance.

Reducing operating voltage MAY increase performance in thermally limited situations. Decreasing the total power limit without changing anything else will have the opposite effect.
Reducing operating voltage MAY work on some GPUs, but due to the obvious negative effect on stability (there is a reason they set the VID where it is), it is too risky to do on all cards.
This sounds more plausible.
 
If it's thermal throttling, it can throttle harder than it would if it were power limited... this one probably depends on the quality of the case cooling...
 

Lol. "RIP AMD", and here you are referencing one of the most biased sources on seekingalpha.com to prove your point. Not sure if you know that Hibben is known for his anti-AMD mentality.

This says a lot about how biased you are as well.
 
I've heard of cases where cards are power throttling at the factory overclock, and some are doing it at stock too, so increasing the power limit raises the card's performance because it's drawing too much power. Some other, lower-binned ones run hotter and require an undervolt or a lower power limit to reduce temps, which increases performance since the boost clock no longer thermal throttles. It's pretty confusing, really.
 
If you have a junky motherboard, maybe, but I doubt any mainstream brand doesn't build in A LITTLE reserve.

I think it has more to do with the limitations of the little connector pins in the slot.
 
Why is this an issue? Both the GTX 750 Ti and the GTX 950 drew significantly more than 75W from the PCIe slot! In short bursts, there is absolutely no problem with brief power spikes. It's a shitty deal if you have an older/cheap mobo and this is an issue, but seriously, quality motherboards are dirt cheap!

:toast:
When did that happen? My MSI GTX 750 Ti overclocked to 1300/1500 never exceeded the 65W mark during stress testing (according to sensor readouts). Under normal conditions it stays below 58W.
The GTX 950 is a 90W card, so under normal circumstances the only theoretical way it can overdraw from the PCI-E slot is if it draws no power from the 6-pin connector at all (or if you own one of those newer bus-powered cards from ASUS or EVGA).

When it comes to motherboards, you'd be surprised how many shitty products hit the market nowadays. Just because it's high-end does not mean that it won't break.
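The GTX 950 point above is easy to check as back-of-envelope arithmetic (taking the 90 W TDP and 75 W slot budget as given):

```python
# For a 90 W card with a 6-pin, the slot stays inside its 75 W budget
# as long as at least this much comes from the 6-pin:
SLOT_LIMIT_W = 75
TDP_W = 90

min_from_six_pin = max(0, TDP_W - SLOT_LIMIT_W)
print(min_from_six_pin)  # 15 -> any sane slot/6-pin split is in spec
```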
 
Even the aforementioned Diablotek could handle powering one of these, paired with a latest-gen Skylake CPU, a couple of sticks of RAM, and some storage. It would all easily fit into a 250W envelope (absolute peak power draw; realistically less than that), which even the worst of the worst PSUs can manage, at least for a while.
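A rough budget with assumed (not measured) component figures illustrates why such a build fits the envelope:

```python
# Assumed figures for a modest RX 480 build; only the 150 W board
# power is from AMD's spec, the rest are typical ballpark numbers.
budget_w = {
    "RX 480 board power":  150,
    "Skylake i5 TDP":       65,
    "2x DDR4 DIMMs":         6,
    "SSD":                   5,
    "Motherboard + fans":   20,
}
total = sum(budget_w.values())
print(total)  # 246 -> under the ~250 W envelope, with the usual caveats
```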

That being said, everyone should have the common sense not to skimp on the PSU. No need to go crazy, a nice $30-or-so PSU from a reputable manufacturer should do fine, as @newtekie1 pointed out.

Sure, but like you pointed out, why skimp on the PSU? And if you have a latest-gen Skylake, or even a last-gen Haswell, or even an Ivy Bridge, and your PSU doesn't have an 8-pin connector, seriously, go buy a new one before upgrading your graphics card!
 
I think it has more to do with the limitations of the little connector pins in the slot.

No, that can't be it, because otherwise connectors to boost the amperage to the slot (as my board has) would make little sense.
 
Sure, but like you pointed out, why skimp on the PSU? And if you have a latest-gen Skylake, or even a last-gen Haswell, or even an Ivy Bridge, and your PSU doesn't have an 8-pin connector, seriously, go buy a new one before upgrading your graphics card!
Yeah, I mean even the most entry-level $50 PSU comes with both an 8-pin and a 6-pin (EVGA), so I don't see the logic in not having it and instead opting for the 6-pin.

When did that happen? My MSI GTX 750 Ti overclocked to 1300/1500 never exceeded the 65W mark during stress testing (according to sensor readouts). Under normal conditions it stays below 58W.
The GTX 950 is a 90W card, so under normal circumstances the only theoretical way it can overdraw from the PCI-E slot is if it draws no power from the 6-pin connector at all (or if you own one of those newer bus-powered cards from ASUS or EVGA).

When it comes to motherboards, you'd be surprised how many shitty products hit the market nowadays. Just because it's high-end does not mean that it won't break.
Never checked my 950, but it's the zero-power-connector one from ASUS. I'd be curious how much it draws under load.
 
Yeah, I mean even the most entry-level $50 PSU comes with both an 8-pin and a 6-pin (EVGA), so I don't see the logic in not having it and instead opting for the 6-pin.

Never checked my 950, but it's the zero-power-connector one from ASUS. I'd be curious how much it draws under load.


Even the $30 430W EVGA has an 8-pin.

And the ASUS GTX 950 with no power connector pulls a maximum of 76W from the slot. So they managed to keep it right at the limit.

power_maximum.png
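Worth noting that the 75 W slot budget is itself split by rail in the PCIe CEM spec, roughly 5.5 A on 12 V plus 3 A on 3.3 V, so a 76 W total reading sits right at the combined cap:

```python
# Per-rail slot budget for a x16 slot under the PCIe CEM spec:
limit_12v = 12.0 * 5.5  # 66.0 W on the 12 V rail
limit_3v3 = 3.3 * 3.0   # ~9.9 W on the 3.3 V rail
print(round(limit_12v + limit_3v3, 1))  # 75.9
```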
 