Saturday, July 2nd 2016

Official Statement from AMD on the PCI-Express Overcurrent Issue

AMD sent us this statement in response to growing concern among our readers that the Radeon RX 480 graphics card violates the PCI-Express power specification by overdrawing power from both its single 6-pin PCIe power connector and the PCI-Express slot. Combined, the total power budget of the card should be 150W; however, it was found to draw well over that limit.

AMD has had out-of-spec power designs in the past, with the Radeon R9 295X2 for example, but that card was targeted at buyers with reasonably good PSUs. The RX 480's target audience could face trouble powering the card. Below is AMD's statement on the matter. The company stated that it's working on a driver update that could cap the power at 150W. It will be interesting to see how that power limit affects performance.
"As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8 Gbps for GDDR5. Recently, we identified select scenarios where the tuning of some RX 480 boards was not optimal. Fortunately, we can adjust the GPU's tuning via software in order to resolve this issue. We are already testing a driver that implements a fix, and we will provide an update to the community on our progress on Tuesday (July 5, 2016)."

358 Comments on Official Statement from AMD on the PCI-Express Overcurrent Issue

#151
RejZoR
Behold, it's "AMD fanboys" analysis when someone has to defend NVIDIA. But when it's the other way around, it was nothing, you know, NVIDIA is working on a driver fix, no need to make drama. But throw AMD in the same scenario and whole internet is losing their shit. Sometimes I'm ashamed for owning a NVIDIA card...
Posted on Reply
#152
okidna
RejZoR
Behold, it's "AMD fanboys" analysis when someone has to defend NVIDIA. But when it's the other way around, it was nothing, you know, NVIDIA is working on a driver fix, no need to make drama. But throw AMD in the same scenario and whole internet is losing their shit. Sometimes I'm ashamed for owning a NVIDIA card...
Oh sorry, that's my personal opinion. Need to do a quick edit before the Nekker army comes back to haunt my sleep.
Posted on Reply
#153
D007
They both screw up. AMD and Nvidia.. Why does it always have to turn into a competition? lol..
I just think it's shitty, regardless of who did it.. Sounds like the power spec is way out.
Kind of a big deal, seeing as they kept screaming how little power it used..
Posted on Reply
#154
cdawall
where the hell are my stars
I'm finding my lack of care and concern to be growing with this. It's a simple fix that is already being implemented by AMD.
Posted on Reply
#155
HD64G
Tsukiyomi91
If AIB vendors are fixing the "out of spec PCIe limits" with an 8-pin or possibly 8 + 6-pin to reduce excessive power draw from the PCIe slot, then AMD shouldn't even come up with a statement that it will release a driver fix. Limiting the card's power even by a little affects a lot of aspects. @HD64G why not u bench your VGA before limiting it & then u throw a 10% reduction to it via tuning software & then run the same test again? We wanna see if limiting power does not reduce the card's performance, as per what you claimed.
The answer is in this article mate: http://semiaccurate.com/2016/07/01/investigating-thermal-throttling-undervolting-amds-rx-480/
Posted on Reply
#156
NDown
RejZoR
Behold, it's "AMD fanboys" analysis when someone has to defend NVIDIA. But when it's the other way around, it was nothing, you know, NVIDIA is working on a driver fix, no need to make drama. But throw AMD in the same scenario and whole internet is losing their shit. Sometimes I'm ashamed for owning a NVIDIA card...
Well what do you expect when most of their fanbase are manchildren/literal kids

you can't have a good gaming experience if you don't have the GeForce GTX® logo/sticker in your PC after all :^)

most probably don't care about efficiency either, or they are simply too new to remember the HD5xxx vs GTX 4xx series
Posted on Reply
#157
cdawall
where the hell are my stars
Oh, and one more thing: when this driver pops up allowing them to limit PCIe to an actual 75W, everyone does understand the cheap low-quality boards or old worn-out boards are still going to pop, right? This issue isn't going away. Junk was never made to survive with actual full spec being pulled over PCIe.

This doesn't even account for the insane number of people who will never update past the driver that came on the DVD
Posted on Reply
#158
sith'ari
RejZoR
Behold, it's "AMD fanboys" analysis when someone has to defend NVIDIA. But when it's the other way around, it was nothing, you know, NVIDIA is working on a driver fix, no need to make drama. But throw AMD in the same scenario and whole internet is losing their shit. Sometimes I'm ashamed for owning a NVIDIA card...
what are you talking about? what drama? what driver fix? have you read the link posted by @okidna? ( http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix ).
The GTX 960 was tested, and even while overclocked its power delivery was excellent (*unlike the RX 480).
Posted on Reply
#159
okidna
cdawall
This doesn't even account for the insane number of people who will never update past the driver that came on the DVD
This is an issue I run into very frequently: friends or colleagues complain about their poor FPS when playing a new game, not knowing that both AMD and NVIDIA now use a different approach when it comes to new game support.

sith'ari
what are you talking about? what drama? what driver fix? have you read the link posted by @okidna? ( http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix ).
The GTX 960 was tested, and even while overclocked its power delivery was excellent.
No no no, that's my fault, not RejZoR's fault. I wrote "fanboy analysis" in my original post before removing it because it's my personal opinion; I shouldn't have written it in the first place.
Posted on Reply
#160
sith'ari
okidna
No no no, that's my fault, not RejZoR's fault. I wrote "fanboy analysis" in my original post before removing it because it's my personal opinion; I shouldn't have written it in the first place.
Oh, apologies to RejZoR then!;)
Posted on Reply
#161
alucasa
Anyone commit suicide yet? Some should have according to wild reactions here.
Posted on Reply
#162
newtekie1
Semi-Retired Folder
john_
In that review, average power was 74W. After overclocking the GPU to over 1400MHz and the memory to over 2000MHz, the results were close to 20% extra performance. I think I am going to doubt that someone gets 20% extra performance and stays under the 75W limit. Probably the card goes to 90W (20% extra power for 20% extra performance), if not more, considering that power consumption usually goes up faster than performance. If they were getting 1-3% extra performance, I would have agreed with you.
Again, not how it works. GPU Boost is designed to adjust the clock speed to keep the card below the power limit. So even if you do overclock the GTX950, GPU boost guarantees that the card will stay right around 75w. The only way to go beyond the 75w would be to adjust the power limit in your overclocking software or by BIOS. Either way, at that point the user would be aware they are overloading the PCI-E slot.

john_
Are those SSDs advertised as SLC SSDs? I believe not. Well, if they were Nvidia products, they probably would be. And people would be happy to convince themselves that being MLC while performing as SLC would make them equal to SLCs. And anyone saying the opposite would have been a stupid fanboy that hates Nvidia and doesn't acknowledge Nvidia's superior engineering. "It is a good design".

Also, SLC vs MLC is not just a performance difference. If I am not mistaken, SLCs are considered to have better longevity. The same applies to the 970. It is not just those slow 500 MBs. Also less cache, fewer ROPs, less memory bandwidth. The specs were completely wrong and we shouldn't be giving any excuses to companies.
Most of them do put blurbs in their advertising about this feature, yes.

Also, the advertising for the GTX970 was correct. They didn't advertise ROP count, they didn't advertise cache size. So you don't really have a point there.
Posted on Reply
#163
ensabrenoir
NDown
Well what do you expect when most of their fanbase are manchildren/literal kids

you can't have a good gaming experience if you don't have the GeForce GTX® logo/sticker in your PC after all :^)

most probably don't care about efficiency either, or they are simply too new to remember the HD5xxx vs GTX 4xx series
:eek:.......you dare mock us? At least get your facts right!!!!! It takes a GeForce GTX logo AND Racing Stripes, you barbarian!
Posted on Reply
#164
sith'ari
ensabrenoir
:eek:.......you dare mock us? At least get your facts right!!!!! It takes a GeForce GTX logo AND Racing Stripes, you barbarian!
Probably trolling. If you look at his system he owns a GTX 970, so.......... !

P.S. Anyway, just like so many guys already said, i would like to emphasize this video: whoever wants to take a quick look, go to 20:50 and watch the estimation of the possible threat that the RX 480 poses to our systems!!
Posted on Reply
#165
RejZoR
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it for 3 years like this, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely either limit the card to an actual 150W, or rework the power delivery so the 6-pin takes more and the PCIe slot stays within limits). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...
Posted on Reply
#166
sith'ari
RejZoR
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it for 3 years like this, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely either limit the card to an actual 150W, or rework the power delivery so the 6-pin takes more and the PCIe slot stays within limits). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...
Mate, i haven't paid 200€ for a high-end PSU and another 400€ for a high-end UPS only to let an RX 480 endanger my system!!:mad:
But , hey, with your money you can do what you want.;)
Posted on Reply
#167
ensabrenoir
sith'ari
Probably trolling. If you look at his system he owns a GTX 970, so.......... !

P.S. Anyway, just like so many guys already said, i would like to emphasize this video: whoever wants to take a quick look, go to 20:50 and watch the estimation of the possible threat that the RX 480 poses to our systems!!
...so am I;)
Posted on Reply
#168
RejZoR
sith'ari
Mate, i haven't paid 200€ for a high-end PSU and another 400€ for a high-end UPS only to let an RX 480 endanger my system!!:mad:
But , hey, with your money you can do what you want.;)
[Drama Intensifies]

You don't even have an RX 480 and you're acting like it has already fried your PCIe circuitry...
Posted on Reply
#169
john_
newtekie1
Again, not how it works. GPU Boost is designed to adjust the clock speed to keep the card below the power limit. So even if you do overclock the GTX950, GPU boost guarantees that the card will stay right around 75w. The only way to go beyond the 75w would be to adjust the power limit in your overclocking software or by BIOS. Either way, at that point the user would be aware they are overloading the PCI-E slot.
You are avoiding answering the question here. How can you have 100% performance at 74W and then get 120% performance while remaining at 74W? Simple answer: you can't.
Most of them do put blurbs in their advertising about this feature, yes.

Also, the advertising for the GTX970 was correct. They didn't advertise ROP count, they didn't advertise cache size. So you don't really have a point there.
None advertise it as SLC. If you have any example of an MLC SSD that says "it uses SLC" you are free to show it.


In the end I feel like a stupid little fool, trying to have an honest conversation with people who would commit suicide before posting anything questionable about Nvidia.
Posted on Reply
#170
xorbe
RejZoR
[Drama Intensifies]

You don't even have an RX 480 and you're acting like it has already fried your PCIe circuitry...
A couple websites, one dusty motherboard, and everyone's PCs are on fire! What happened to the poster claiming the RX480 was sustaining 254 watts? Maybe that was at [H].

The one long-time poster I've seen with RX480 at [H] has 2 cards and stress tested them for hours without issue in one PC.
Posted on Reply
#171
sith'ari
RejZoR
[Drama Intensifies]
You don't even have an RX 480 and you're acting like it has already fried your PCIe circuitry...
Since i have such expensive protection equipment, that means that i hate taking risks........AT ALL!
( P.S. Of course you are correct. I don't own an RX 480, and for the last few years i'm nowhere near interested in buying AMD's GPUs. Already explained myself at post #43 of this thread. If you like, take a look at it;) )
Posted on Reply
#172
zAAm
RejZoR
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it for 3 years like this, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely either limit the card to an actual 150W, or rework the power delivery so the 6-pin takes more and the PCIe slot stays within limits). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...
Sure it won't fry your system after 3 gaming sessions, but then again, VW's diesel engines won't destroy the planet in 3 months either ;) It isn't a big issue as long as they actually fix it, but we'll need to see if they can actually limit single phases via software control and if they cannot, how a global TDP limitation will affect the value proposition of the card in terms of potential reduced performance...
Posted on Reply
#173
RejZoR
VW outright lied and cheated intentionally. There is nothing to fix because what they've done was intentional. AMD simply cocked it up and they are already working on a fix for it. That's a big difference.
Posted on Reply
#174
Secoya
$ReaPeR$
how many are affected? find me the number. we are talking about 16 watts over the spec, 16 ffs.

[IMG]http://www.guru3d.com/index.php?ct=articles&action=file&id=23242[/IMG]
If even one person was affected by the reckless over-spec wattage draw of the RX 480, simply because AMD didn't want to put a proper power connector onto the card for PR reasons, then it is one too many. Science Studio has a video review posted on Youtube showing that the RX 480 isn't even playable with certain motherboards simply because it is WAY over spec. Gamers Nexus showed the card pulling 192W during testing. Don't link a chart showing calculated TDP and stand back and point saying THERE! This is the most ludicrous move in GPU history and it serves AMD right to have it blow up in their faces for doing it. Some cards do in fact go over spec, particularly when OC'd, the difference of course being that they pull that extra wattage through the 6- or 8-pin connector and not the PCIe slot. If you have a good power supply, it doesn't affect you. BUT with the RX 480, this is not the case, and it's a problem that exists at the hardware level because of the way the phases are laid out. It's actually pulling more from the PCIe slot than it is from the 6-pin connector.

Now, let's go back to the Gamers Nexus observed wattage pull when OC'd: 192W against a 150W power budget, with MORE THAN HALF of that coming through the PCIe slot. That's 128% of the spec, like trying to pull 32 amps from a 25-amp wall socket. You could burn your house down if there were no fail-safes. Even a non-OC'd card will pull 86W from the PCIe slot, or 15% more than the max. These aren't guidelines, they are absolute limits.

The RX 480 is the ONLY GPU in history to average more than 75W from the PCIe slot at stock clocks. The only way to prevent it is to limit wattage to 150W, really less, because it pulls more from the PCIe slot than the 6-pin connector, so let's say 145W. That would require a 14.4% under-clock of the card. A card that is already 15% less powerful than a GTX 1060, which will be able to OC AT LEAST another 20%. So now we're talking about a GTX 1060 that will be around 50% more powerful than a reference RX 480 and 25% more powerful than an OC'd AIB 480, for about $250, released at the same time as the AIB cards, which will most likely cost $300.

This was a MUST WIN for AMD, but instead it became worst case scenario.
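The percentages in the post above can be sanity-checked directly. A quick sketch, taking the figures as cited in the thread (192W total OC'd, 86W from the slot at stock) against the PCIe spec limits (150W board budget, 75W slot limit); nothing here is an independent measurement:

```python
# Sanity-checking the percentages quoted above.
# Inputs are the figures as cited in the thread, not new measurements;
# 150 W and 75 W are the combined and per-slot PCIe limits.

def pct_over(measured_w: float, limit_w: float) -> float:
    """How far a measured draw exceeds a spec limit, in percent."""
    return (measured_w / limit_w - 1.0) * 100.0

# 192 W total draw vs the 150 W board budget:
total_over = pct_over(192.0, 150.0)   # 28% over spec, i.e. 128% *of* spec

# 86 W from the slot vs the 75 W slot limit:
slot_over = pct_over(86.0, 75.0)      # ~14.7% over, roughly the "15%" cited

print(f"Total: {total_over:.1f}% over budget, slot: {slot_over:.1f}% over limit")
```

The 32-amps-from-a-25-amp-socket analogy matches the first number exactly (32/25 = 1.28).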
Posted on Reply
#175
zAAm
RejZoR
VW outright lied and cheated intentionally. There is nothing to fix because what they've done was intentional. AMD simply cocked it up and they are already working on a fix for it. That's a big difference.
It was meant as an analogy to the effect, not the intention of the company. No need to get all up in arms ;)
Posted on Reply