Saturday, July 2nd 2016

Official Statement from AMD on the PCI-Express Overcurrent Issue

AMD sent us this statement in response to growing concern among our readers that the Radeon RX 480 graphics card violates the PCI-Express power specification by overdrawing power from its single 6-pin PCIe power connector and the PCI-Express slot. Combined, the card's total power budget should be 150W; however, it was found to draw well over that limit.
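For context, the 150W budget is the sum of two PCI-SIG limits: 75W drawn through the x16 slot and 75W through the 6-pin connector. A quick back-of-the-envelope check in Python; the 166W total used here is an assumed example figure in line with measurements circulating in reviews, not an official number:

```python
# Back-of-the-envelope check of the RX 480 power-budget numbers.
# The 166W total draw is an assumed example figure, not an official one.
SLOT_LIMIT_W = 75.0      # PCI-SIG limit for power drawn through a x16 slot
SIX_PIN_LIMIT_W = 75.0   # PCI-SIG limit for a 6-pin PCIe power connector
RAIL_VOLTAGE = 12.0

total_budget_w = SLOT_LIMIT_W + SIX_PIN_LIMIT_W   # 150.0 W
measured_total_w = 166.0                          # assumed measurement
excess_w = measured_total_w - total_budget_w      # 16.0 W over budget

# If the overdraw splits evenly between slot and connector, each source
# carries ~83W, i.e. roughly 6.9A on its +12V pins:
per_source_w = measured_total_w / 2
per_source_a = per_source_w / RAIL_VOLTAGE

print(f"{excess_w:.0f}W over budget, ~{per_source_a:.1f}A per source on +12V")
```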

AMD has had out-of-spec power designs in the past, with the Radeon R9 295X2 for example, but that card is targeted at buyers with reasonably good PSUs. The RX 480's target audience could face trouble powering the card. Below is AMD's statement on the matter. The company stated that it's working on a driver update that could cap the power at 150W. It will be interesting to see how that power limit affects performance.
"As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8 Gbps for GDDR5. Recently, we identified select scenarios where the tuning of some RX 480 boards was not optimal. Fortunately, we can adjust the GPU's tuning via software in order to resolve this issue. We are already testing a driver that implements a fix, and we will provide an update to the community on our progress on Tuesday (July 5, 2016)."

358 Comments on Official Statement from AMD on the PCI-Express Overcurrent Issue

#51
ppn
Because you don't SLI the 750 Ti (which draws 57 watts, by the way: reviews/NVIDIA/GeForce_GTX_750_Ti/23.html), but CrossFire RX 480 is advertised. That's 150 watts through the slots, plus 25+ watts for DDR3, and all of that on the 4-pin +12V.
#52
Jism
The ATX 24-pin delivers one or two +12V wires to the motherboard, providing power for the PCI-Express slots and everything else. Any modern budget motherboard also carries a 4-pin power connector for the CPU, and on some motherboards that power line is shared as well.

If you think the extra 16 watts is a problem, then buy a proper motherboard, or a $10 PCI-Express "booster".
#54
McSteel
Jism
This is such a storm in a glass of water....

If both the 6-pin and the motherboard provide 75W for a total of 150 watts, and the card exceeds that at 166 watts, this means 16 watts split by two (8 watts per source) is being pulled more than it should be.

I think any motherboard is capable of doing more than 25 watts on top, otherwise that system or motherboard would already be at its limits.
The problem is most budget motherboards aren't as capable as you'd like... During the Bitcoin GPU mining craze, I've seen dozens of fried slots and 24-pin ATX connectors...

The connector itself will carry up to 14A (7A per wire pair; two +12V wires plus two GND/COMs on the connector), which is 168W. And that's at 20°C; less at higher temps.
It wouldn't be a problem if the card only made brief excursions past 75W, but it consistently draws that much from the slot. Add another card in a CF setup, on a board without additional +12V power connectors, and you can be pretty sure you're running at the very limit of the ATX connector's capabilities. As for the individual PCI-E slots, I imagine that depends on the quality of the MoBo itself...

Even so, most people will probably be just fine. Some problems, which could've been avoided, are to be expected though.
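McSteel's 168W figure follows directly from the pin ratings he cites; here is the same arithmetic as a small Python sketch (the ~83W per-slot draw for a CrossFire pair is an assumed example value, not a measurement):

```python
# Capacity of the ATX 24-pin's +12V circuit, using the figures from the
# post above: two +12V wires rated at roughly 7A each at 20°C.
PINS_12V = 2
AMPS_PER_PIN = 7.0
VOLTS = 12.0

capacity_w = PINS_12V * AMPS_PER_PIN * VOLTS    # 168.0 W at 20°C

# Two cards each pulling an assumed ~83W through their slots would leave
# almost no headroom -- the CrossFire concern raised above:
two_card_slot_draw_w = 2 * 83.0
headroom_w = capacity_w - two_card_slot_draw_w  # only 2.0 W to spare
print(capacity_w, headroom_w)
```

And that 168W is the cold-connector rating; at real operating temperatures the safe figure is lower still.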
#55
Jism
Wasn't it that Bitcoin cards could be tweaked for maximum efficiency? I.e., clock down the memory to get the best power usage and the highest possible MH/s?

If you stack multiple cards in just one motherboard and expect that a single 24-pin ATX connector is going to supply each card with sufficient power, then I think you should consider a better motherboard with an external PCI-Express power source, or the use of PCI-Express boosters (nothing but an add-in card that provides extra current for the PCI-Express bus).

Again, I don't think 16 watts should cause huge problems. Shared, you're still talking about 8W at maximum usage.
#56
Divenity
the54thvoid
Good they're addressing it but they can't blame the memory speed. The GTX 1070 runs at the 'unprecedented' 8Gbps.

Then again, it's not a huge issue as it only really affects much older mobos?
The GTX 1070 also has an 8-pin power connector, not a 6-pin, so it doesn't need to overdraw the PCI-E slot.
#57
RejZoR
So much drama about something AMD already confirmed as fixable via something as simple as DRIVERS, which any noob can install. Jesus Christ, everyone stop losing your shit already.

Let's just wait for this driver and see if things are resolved. Then whine about it if it isn't actually fixed.
#58
rtwjunkie
PC Gaming Enthusiast
So, a driver will limit power draw to 150W. Now... what will this do to the AIB boards that are adding more power connectors to up the power available for higher clocks?

If the driver affects all RX 480s, then it seems AMD will be dooming the AIB makers to lackluster performance and sales.
#59
Divenity
If it comes to that, rtw, I'm sure they will fix it.
#60
rtwjunkie
PC Gaming Enthusiast
Divenity
If it comes to that, rtw, I'm sure they will fix it.
I hope you're right. I'm not sure how they would do that without unnecessarily confusing things by adding specific model recognition into those drivers. It really needs a hardware fix, and sending out rev. 02 cards. That way the AIBs can get on with making these cards better.
#61
RejZoR
rtwjunkie
So, a driver will limit power draw to 150W. Now... what will this do to the AIB boards that are adding more power connectors to up the power available for higher clocks?

If the driver affects all RX 480s, then it seems AMD will be dooming the AIB makers to lackluster performance and sales.
How do you think NVIDIA is separating Founders Edition (reference) GTX 1080 cards with messed-up fan profiles from the custom models?

EDIT:
There is no need for a "hardware" fix. Have you ever fiddled with Maxwell II Tweaker? It does exactly what AMD will fix via drivers. It's what I'm doing with my GTX 980. It's what thousands of Maxwell 2 users are doing. It's not rocket science once you figure it out, and considering AMD knows where everything is in their BIOS, it's a walk in the park. They can tap in with drivers easily; basically, their new WattMan is what they'll most likely use anyway.
#62
rtwjunkie
PC Gaming Enthusiast
RejZoR
How do you think NVIDIA is separating founders edition (reference models) of GTX 1080 with messed up fan profiles from the custom models?
I thought NVIDIA said BIOS update, not driver? AMD is saying driver update here, which is a more sweeping application.
#63
GhostRyder
Still foolish not to just put an 8-pin as the default... If they wanted to do this with the 4GB version and limit the board spec to 150W with a 6-pin, then that's fine, but they should have at least given the 8GB version an 8-pin reference.

This was just a foolish design choice.
#64
RejZoR
rtwjunkie
I thought NVIDIA said BIOS update, not driver? AMD are saying here driver update, which is a more sweeping application.
NVIDIA fixed fan issues with a driver. They can tap into anything; they are the makers of the hardware and BIOS. The same goes for AMD in this case. If the RX 480 has similar power delivery logic to the GTX 9xx and GTX 10xx series (which I suspect it does), they can do exactly the same thing via drivers. They don't need to issue complicated and risky BIOS updates; they can do it via a driver update that simply taps into specific parameters between driver, BIOS, and hardware. It's how they fixed certain issues with R9 290X cards, if you remember. I think it was about thermal throttling and fan profiles as well. It's how you fix things the easiest. BIOS updates are just too complicated and risky for average users, whereas installing drivers is a matter of a few clicks that basically any noob can do risk-free.
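Conceptually, a driver-side power cap of the kind described above is just a feedback loop over telemetry the driver already reads: when measured board power exceeds the target, step down a clock/voltage (DPM) state. A deliberately simplified, hypothetical sketch; every name in it is invented for illustration and is not AMD's actual driver interface:

```python
# Hypothetical illustration of a driver-side power cap: step down the
# clock/voltage (DPM) state whenever measured board power exceeds the
# target. All names here are invented; they are not AMD's real APIs.

POWER_CAP_W = 150.0

def enforce_power_cap(read_board_power_w, set_dpm_state, state, min_state=0):
    """Run one iteration of a naive power-limit loop.

    read_board_power_w -- callable returning current board power in watts
    set_dpm_state      -- callable that applies a performance-state index
    state              -- current performance state (higher = faster clocks)
    Returns the (possibly lowered) state index.
    """
    if read_board_power_w() > POWER_CAP_W and state > min_state:
        state -= 1            # drop one clock/voltage state
        set_dpm_state(state)
    return state

# Simulated telemetry: the card starts over the cap, then settles under it.
readings = iter([180.0, 160.0, 140.0])
applied_states = []
state = 7
for _ in range(3):
    state = enforce_power_cap(lambda: next(readings),
                              applied_states.append, state)
print(state, applied_states)  # 5 [6, 5]
```

A real implementation lives in the driver/firmware, runs at much finer granularity, and raises the state again when headroom returns; the point is only that no BIOS flash is needed for this kind of fix.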

GhostRyder
Still foolish not to just put an 8-pin as the default... If they wanted to do this with the 4GB version and limit the board spec to 150W with a 6-pin, then that's fine, but they should have at least given the 8GB version an 8-pin reference.

This was just a foolish design choice.
Costs, my friend, costs. They wanted to make a really affordable product. Placing an 8-pin instead of a 6-pin could result in a higher price. They always say this when bulk components would add 5p to the final product, but it then somehow becomes a $20 addition...
#65
ZoneDymo
GhostRyder
Still foolish not to just put an 8-pin as the default... If they wanted to do this with the 4GB version and limit the board spec to 150W with a 6-pin, then that's fine, but they should have at least given the 8GB version an 8-pin reference.

This was just a foolish design choice.
Honestly, I think Nvidia pushed AMD with their new GPUs.
The fact that you can barely OC the RX 480 makes me believe AMD quickly shipped as high a clock out of the box as possible to make the cards look good in the performance section, but that originally they were not meant to run this high.
I guess it's best to get a custom-designed RX 480 from a partner that indeed has an 8-pin connector and more cooling capability.
#66
BiggieShady
GhostRyder
This was just a cheaper design choice.
ftfy

Just like we have a saying "I'm not rich enough to buy cheap things", AMD should say "we are not rich enough to do cheap designs" :laugh:
#68
OneCool
W1zzard
bta seems asleep, I saw the email after crawling out of bed with my gf, so I thought "let's get this out to the people"
It wasn't worth it. Should have stayed in bed with your gf.
#69
Batou1986
I guess AMD is just going to keep silent on the fact that this is somewhat of an issue on PCIe 3.0 but a huge problem for PCIe 2.0 boards, like every single AM3+ board on the market.
:banghead:
#70
sith'ari
okidna
And they are priced decently as well: https://www.overclockers.co.uk/detail/index/sArticle/61887
$250 (edit: I just noticed that they are £, not $, so even more expensive than I thought). Exactly the price that I had predicted in the past for the RX 480 8GB version, but when I said that, a lot of people disagreed. ( https://hardforum.com/threads/radeon-rx-480-competition-poll.1903083/page-3#post-1042373817 )
RejZoR said:
Costs my friend, costs. They wanted to make really affordable product. Placing 8pin instead of 6pin could result in higher price. They always say this when bulk components could cost 5p to the final product, but it then somehow becomes a $20 addition...
Yeah, that's the whole point: when a company decides to go dirt-cheap and transfer the cost from themselves to the customer, then I'd say we have a problem :rolleyes:
#71
R-T-B
$ReaPeR$
is it really that serious of a problem though? because people seem to be panicking about this..
If you have a junky motherboard, maybe, but I doubt any mainstream brands don't build in A LITTLE reserve.
#72
G33k2Fr34k
sith'ari
Yeah, very cheap, especially if I have to spend another ~$100 to replace my likely damaged/fried motherboard!! :fear:
No matter what people have said in this thread, that's the first time I've read a review (two reviews actually, TPU & Tom's) that recognises the GPU as a possible threat to the rest of the hardware!!
The design of the PCB is certainly not cheap, according to PC Perspective. The 480 has a better 6+1 power-phase design in addition to a beefier VRM setup. It seems that this issue only affects older motherboards. Newer ones don't have that problem.
#73
R-T-B
Jism
Wasn't it that Bitcoin cards could be tweaked for maximum efficiency? I.e., clock down the memory to get the best power usage and the highest possible MH/s?
Going over the 12V draw spec on the PCIe rail was normal in Bitcoin mining. If you had a cheap board, it could and would burn up with 5 cards in it all drawing over spec. Not unheard of; I even experienced it once. It doesn't smell good.
#74
rtwjunkie
PC Gaming Enthusiast
RejZoR
NVIDIA fixed fan issues with a driver. They can tap into anything; they are the makers of the hardware and BIOS. The same goes for AMD in this case. If the RX 480 has similar power delivery logic to the GTX 9xx and GTX 10xx series (which I suspect it does), they can do exactly the same thing via drivers. They don't need to issue complicated and risky BIOS updates; they can do it via a driver update that simply taps into specific parameters between driver, BIOS, and hardware. It's how they fixed certain issues with R9 290X cards, if you remember. I think it was about thermal throttling and fan profiles as well. It's how you fix things the easiest. BIOS updates are just too complicated and risky for average users, whereas installing drivers is a matter of a few clicks that basically any noob can do risk-free.
Ok, thanks for the update and correct info!
#75
GC_PaNzerFIN
Bansaku
Why is this an issue? Both the GTX 750 Ti and the GTX 950 drew significantly more than 75W from the PCI bus! In short bursts, there is absolutely no problem with brief power spikes. Shitty deal if you have an older/cheap mobo where this is an issue; seriously, quality motherboards are dirt cheap!

:toast:
Neither of those breaks the average spec of 75W (actually, for 12V it's even less), while the RX 480 breaks the average spec by a significant amount. It's not short bursts only. Huge difference.
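The average-versus-burst distinction is easy to see with a few sample numbers (the two waveforms below are made up purely for illustration, not measurements of either card):

```python
# Illustrative (made-up) slot power samples in watts, taken at equal
# intervals, for two hypothetical cards.
SLOT_AVG_LIMIT_W = 75.0

bursty = [60, 90, 55, 85, 58, 62]      # brief spikes, compliant average
sustained = [82, 84, 81, 85, 83, 83]   # consistently above the limit

def average(samples):
    return sum(samples) / len(samples)

print(average(bursty) <= SLOT_AVG_LIMIT_W)     # True  (~68.3W average)
print(average(sustained) <= SLOT_AVG_LIMIT_W)  # False (83.0W average)
```

The first card spikes past 75W yet stays within the average spec; the second never spikes as high but breaks the average continuously, which is what stresses connectors and traces.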