
Official Statement from AMD on the PCI-Express Overcurrent Issue

I can't even get mine to POST at the moment. I'm on a 420 W Seasonic PSU with a mini-ITX board, so I didn't think it would be an issue, but I'll likely have to buy a new PSU or return the card...
 
Correct, the 960 Strix is much worse in terms of power consumption over the PCIe slot.
If the 960 Strix does not concern you, then the RX 480 is just fine and dandy.



The problem is slot power draw, not total power draw. The 960 Strix was proven to "overdraw" from the 6-pin external connector; its slot power draw stays below 50 W even when overclocked...
 
Unless you paid €20 for the motherboard, you are blowing this shit way out of proportion.
And everyone does realize we are talking about 7 W over spec on the mainboard, right?


I was debating whether to say this, but why not. The belief that 75 W is what you can draw from the PCI-e slot is true, but that 75 W is the COMBINED draw; some people reading up to this point should already have an idea about this. Officially, the spec says you can pull 66 W from the +12 V pins in the board, so sorry to burst the bubble a bit there cdawall, but it's 16 W over spec. The other 9 W comes from the +3.3 V pins. In the PCPer video linked they do say this, but it seems like a lot of people missed it. Below is a table showing the draw, and if you know how to do the conversions you will see it is the truth.

[Table: PCIe slot power draw by voltage rail]
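The conversions behind that table can be sketched quickly. As an illustration only (the per-rail current limits used here, 5.5 A on +12 V and 3 A on +3.3 V, are the commonly cited x16 graphics-slot figures, not values read off the chart):

```python
# PCIe x16 graphics slot power budget, split by rail.
slot_12v_watts = 12.0 * 5.5   # 5.5 A allowed on the +12 V pins -> 66 W
slot_3v3_watts = 3.3 * 3.0    # 3 A allowed on the +3.3 V pins  -> ~9.9 W
combined = slot_12v_watts + slot_3v3_watts
print(round(slot_12v_watts, 1), round(slot_3v3_watts, 1), round(combined, 1))
# prints: 66.0 9.9 75.9
```

So the familiar "75 W slot" number is the combined budget; a card that feeds its GPU only from the 12 V pins has just 66 W of headroom there, which is why 82 W measured on the 12 V pins is 16 W over spec even though it is only 7 W over the combined figure.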
 


The problem is slot power draw, not total power draw. The 960 Strix was proven to "overdraw" from the 6-pin external connector; its slot power draw stays below 50 W even when overclocked...

Incorrect. The GTX 960 Strix was regularly peaking at over 225 W on the PCIe slot.
The RX 480 has much lower peaks, around 160 W, and therefore should be of no concern to anyone who accepted Nvidia's previous offerings.
 
Incorrect. The GTX 960 Strix was regularly peaking at over 225 W on the PCIe slot.

Do you have a confirmed source for that? A 225 W draw for a Maxwell card that far down the hierarchy seems implausible.

EDIT: lol - I didn't actually use Tom's Hardware as a source for power draw (I don't rate them too highly); then I watched the video... As for that guy on AdoredTV: as a Glaswegian, I can tell you his accent is a sham. Scottish people don't talk that way; when they do, they get a punch in the mouth.

[Chart: maximum power consumption]
 
Incorrect. The GTX 960 Strix was regularly peaking at over 225 W on the PCIe slot.
The RX 480 has much lower peaks, around 160 W, and therefore should be of no concern to anyone who accepted Nvidia's previous offerings.

Please read the entire thread; there's no need for everything to be repeated a hundred times for everyone.
Also, this.
 
Do you have a confirmed source for that? A 225 W draw for a Maxwell card that far down the hierarchy seems implausible.

EDIT: lol - I didn't actually use Tom's Hardware as a source for power draw (I don't rate them too highly); then I watched the video... As for that guy on AdoredTV: as a Glaswegian, I can tell you his accent is a sham. Scottish people don't talk that way; when they do, they get a punch in the mouth.

[Chart: maximum power consumption]

Tom's Hardware was one of four websites that promoted this faux furor.
You are using a total power consumption chart when the real issue here is the amount of power coming through the mainboard slot, so it isn't relevant to the subject.
Thank you for dismissing the video based on the author's accent. Establishing the accuracy of technical information from a speaker's accent should clearly be Standard Operating Procedure.
 
Thank you for dismissing the video based on the author's accent. Establishing the accuracy of technical information from a speaker's accent should clearly be Standard Operating Procedure.

I think the point was more along the lines of "if he's faking an accent, how honest is he?"

Incorrect. The GTX 960 Strix was regularly peaking at over 225 W on the PCIe slot.

Bullshit. The PCIe slot's limit per spec is 75 W. I mean it can do more, especially with boosters, but 225 W? No, it'd melt and we'd have lawsuits. The card pulls the majority of that by overdrawing the 6-pin, which is far less of an issue.
 
I was debating whether to say this, but why not. The belief that 75 W is what you can draw from the PCI-e slot is true, but that 75 W is the COMBINED draw; some people reading up to this point should already have an idea about this. Officially, the spec says you can pull 66 W from the +12 V pins in the board, so sorry to burst the bubble a bit there cdawall, but it's 16 W over spec. The other 9 W comes from the +3.3 V pins. In the PCPer video linked they do say this, but it seems like a lot of people missed it. Below is a table showing the draw, and if you know how to do the conversions you will see it is the truth.

[Table: PCIe slot power draw by voltage rail]

±8%, which puts you at about 72 W
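As a rough check of that figure, here is the arithmetic, assuming the 8% allowance is applied to the 66 W +12 V slot budget; it lands just under 72 W:

```python
# Tolerance check (assumption: the +/-8% applies to the 66 W +12 V budget).
base_12v_limit = 66.0           # +12 V slot budget in watts
ceiling = base_12v_limit * 1.08 # upper bound with an 8% allowance
print(round(ceiling, 2))
# prints: 71.28
```

Either way, a sustained ~82 W on the 12 V pins is still well past that ceiling, which is the point being argued.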
 
Almost to page 10, guys! Keep up the good work! Seriously though... we've got to hold these companies, Nvidia (memory-gate), AMD (PCI-gate), Volkswagen (emissions-gate: gotta have a car analogy), to the highest standards or we all lose. It ain't right to unfairly trash them... but no free passes on anything that could potentially lead to the fleecing of our pockets or to our detriment.
 
Incorrect. The GTX 960 Strix was regularly peaking at over 225 W on the PCIe slot.
The RX 480 has much lower peaks, around 160 W, and therefore should be of no concern to anyone who accepted Nvidia's previous offerings.
The "225 watts" I bet is the info you got from Tom's Hardware. If you watch PCPer they explain it, but I guess I will have to. DC-DC converters operate by switching on and off very fast; they have no mid state, simply because it's more efficient to be fully on or off. That 225 W you see is a power spike that lasts only a few milliseconds at a time. If you look at ALL video cards you will see the same thing at some point.

The problem with Tom's Hardware's power graph is that even though it's technically correct, when you show it to people with no knowledge of basic electronics, it's easy for them to see that spike and cry wolf that the card uses 225 W. The slot can take spike loads for short periods without issues, because the heat created in such a short time does no damage. In easy terms it's like a dragster: you can run the engine for a short time without burning up the motor, but if you ran it for 10 minutes it would be damaged.

Tom's Hardware does show an average usage in their graph, but what they show overall has people up in arms over what are normal spike loads that happen all the time. The real problem is when the average load is higher than what the spec allows. If it happens for a short time it probably won't hurt the machine, but do it for hours on end, day after day, and damage will happen.
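The spike-vs-average point is easy to demonstrate with a toy trace. All numbers below are made up for illustration, not measurements of any card:

```python
# Toy power trace: 10 s sampled at 1 kHz, a steady 70 W draw with a
# 225 W spike lasting 2 ms once per second. The peak looks alarming,
# but the time-averaged load barely moves.
samples = [225.0 if ms % 1000 < 2 else 70.0 for ms in range(10_000)]
peak = max(samples)
average = sum(samples) / len(samples)
print(peak, round(average, 2))
# prints: 225.0 70.31
```

A chart drawn from the raw samples shows the 225 W peaks, while the thermally relevant number is the ~70 W average; that is the distinction this post is making.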
 
I think the point was more along the lines of "if he's faking an accent, how honest is he?"

Correct.
Perhaps if I posted a SoundCloud clip of my voice he would take my word for it?
 
There is no way the 960 Strix was pulling 225 W at the slot. It would melt the board.
 
The "225 watts" I bet is the info you got from Tom's Hardware. If you watch PCPer they explain it, but I guess I will have to. DC-DC converters operate by switching on and off very fast; they have no mid state, simply because it's more efficient to be fully on or off. That 225 W you see is a power spike that lasts only a few milliseconds at a time. If you look at ALL video cards you will see the same thing at some point.

The problem with Tom's Hardware's power graph is that even though it's technically correct, when you show it to people with no knowledge of basic electronics, it's easy for them to see that spike and cry wolf that the card uses 225 W. The slot can take spike loads for short periods without issues, because the heat created in such a short time does no damage. In easy terms it's like a dragster: you can run the engine for a short time without burning up the motor, but if you ran it for 10 minutes it would be damaged.

Tom's Hardware does show an average usage in their graph, but what they show overall has people up in arms over what are normal spike loads that happen all the time. The real problem is when the average load is higher than what the spec allows. If it happens for a short time it probably won't hurt the machine, but do it for hours on end, day after day, and damage will happen.

This average-load vs. peak-spike argument is a diversion that favors Nvidia's lower average despite its more extreme oscillations.
If you agree that overclocking cards like the 750s and 950 SE exceeds the PCIe power specs, then where are all the broken mainboards?
 
If you agree that overclocking cards like the 750s and 950 SE exceeds the PCIe power specs, then where are all the broken mainboards?

Doesn't that support the idea that your assertion is in fact false? If Nvidia had cards out that were really breaking the limits as badly as you claim, we'd have bricked computers left and right, as well as thousands of people suing Nvidia. But that doesn't exist, so obviously something is wrong with your claim that Nvidia cards (like the Strix 960) are somehow pulling 225 W from the PCI-e slot, despite that being physically impossible.
 
There really aren't bricked computers left and right from the 480s either...
 
There really aren't bricked computers left and right from the 480s either...
Yes, and I'm really getting bored of this drama... we are having the same conversation over and over and over, for 16 W over spec that will be fixed with a driver update. I mean, the horror... and the only person here who actually owns a 480 isn't complaining, ffs.
 
This average-load vs. peak-spike argument is a diversion that favors Nvidia's lower average despite its more extreme oscillations.
If you agree that overclocking cards like the 750s and 950 SE exceeds the PCIe power specs, then where are all the broken mainboards?
If the card did exceed the power draw, there would have been stories of it happening, but there haven't been, probably due to the power limits enforced on the card to prevent it.

There really aren't bricked computers left and right from the 480s either...
If the power circuits on a board are working properly, the machine will just shut itself off to protect itself; if they aren't, then postings of PCI-e slots not working will start to appear. Either one of those two is bad, no other way to spell it out.

Yes, and I'm really getting bored of this drama... we are having the same conversation over and over and over, for 16 W over spec that will be fixed with a driver update. I mean, the horror... and the only person here who actually owns a 480 isn't complaining, ffs.
New card, limited stock, so it's kind of hard for everyone to have one already. With just a handful of people owning it, the reality is that this was the best case for AMD: the issue was found early on and not months later. Yes, it hurt PR-wise, but it could have hurt even more if it were found out six months down the road that hundreds of thousands of people's machines had been damaged; that would cost AMD a ton of cash they don't have.
 
If the power circuits on a board are working properly, the machine will just shut itself off to protect itself; if they aren't, then postings of PCI-e slots not working will start to appear. Either one of those two is bad, no other way to spell it out.

I did that coming up on 10 years ago with an AM2 board and 3-way CrossFire. The cards were 3850s; it was a common issue with the board. Guess what? No one died and people kept buying the product.
 
If the card did exceed the power draw, there would have been stories of it happening, but there haven't been, probably due to the power limits enforced on the card to prevent it.


If the power circuits on a board are working properly, the machine will just shut itself off to protect itself; if they aren't, then postings of PCI-e slots not working will start to appear. Either one of those two is bad, no other way to spell it out.


New card, limited stock, so it's kind of hard for everyone to have one already. With just a handful of people owning it, the reality is that this was the best case for AMD: the issue was found early on and not months later. Yes, it hurt PR-wise, but it could have hurt even more if it were found out six months down the road that hundreds of thousands of people's machines had been damaged; that would cost AMD a ton of cash they don't have.

And that was my point: the drama is too much compared to the actual damage, and this is something that can actually be fixed with a driver update. So it is becoming more and more pointless whining from people who will probably never own a 480.
 
Doesn't that support the idea that your assertion is in fact false? If Nvidia had cards out that were really breaking the limits as badly as you claim, we'd have bricked computers left and right, as well as thousands of people suing Nvidia. But that doesn't exist, so obviously something is wrong with your claim that Nvidia cards (like the Strix 960) are somehow pulling 225 W from the PCI-e slot, despite that being physically impossible.

That's my point.
Since Nvidia cards did not brick boards with their higher power spikes, the furor over the RX 480 is needless unless there is a bias.

AMD will likely make unnecessary changes just to mollify the uproar.

If the card did exceed the power draw then would been a story of it happening but it hasn't happened probably due to power limits enforced on the card to prevent it.

Those cards routinely exceed the 75 W limit with their power spikes, just like the RX 480. Yet PCI-SIG certifies all these cards. Why?
Because it isn't an issue!
 
That's my point.
Since Nvidia cards did not brick boards with their higher power spikes, the furor over the RX 480 is needless unless there is a bias.

AMD will likely make unnecessary changes just to mollify the uproar.
High power spikes are normal when a DC-DC converter switches on, like a light bulb that draws a lot of power to turn on quickly and then drops down. The problem that could come from all this: people want to build a super-cheap $550 gaming machine, and it's not going to be good if the machine keeps shutting itself down in the middle of gameplay. Most people probably wouldn't have the troubleshooting skills to figure out that the GPU drawing too much power from the board is the cause.
And that was my point: the drama is too much compared to the actual damage, and this is something that can actually be fixed with a driver update. So it is becoming more and more pointless whining from people who will probably never own a 480.
Well, it is a two-way street; the same people whined and complained about the GTX 970 issue, and most of them were never likely to buy one.

Those cards routinely exceed the 75 W limit with their power spikes, just like the RX 480. Yet PCI-SIG certifies all these cards. Why?
Because it isn't an issue!

The problem with what you say there is the spikes: if you look at all GPUs, they spike to 100+ W all the time; it's just the nature of DC-DC converters. It's the overall average draw over time that becomes the problem. Drawing 225 W for a matter of milliseconds will do no damage, but pulling 100 W constantly for, say, 2-3 minutes can, as the heat is able to build up and melt something. If you watch the video PCPer did on the issue, that is one of the things they cover.
 
The problem with what you say there is the spikes: if you look at all GPUs, they spike to 100+ W all the time; it's just the nature of DC-DC converters. It's the overall average draw over time that becomes the problem. Drawing 225 W for a matter of milliseconds will do no damage, but pulling 100 W constantly for, say, 2-3 minutes can, as the heat is able to build up and melt something. If you watch the video PCPer did on the issue, that is one of the things they cover.

If you watch the video PCPer did you'll notice their board was fine.
 
If you watch the video PCPer did you'll notice their board was fine.
Did you also notice they were using a high-end X99 board? More expensive, higher-end boards can handle it because they are overbuilt; it's the cheap sub-$100 boards and older ones that people are worried about.
 
Did you also notice they were using a high-end X99 board? More expensive, higher-end boards can handle it because they are overbuilt; it's the cheap sub-$100 boards and older ones that people are worried about.

The only board I have seen with any reports of actual failure was a throwaway ASRock 970 board that looked like it had been drenched in Coke at some point and coated in fur. The board failed after 7 hours straight of TW3 using an 8350 @ 4.5 GHz and a 500 W Corsair CX PSU. I expect we will see multiple reports like that: cheap power supplies with heavy vdroop on the 12 V rail mean more current gets pulled across an already stressed motherboard 12 V line. Outside of that instance I have yet to see anything other than shutdowns.

As I have already said multiple times, dropping the consumption down a few watts isn't going to fix this; people with cheap motherboards are still going to have failures after long gaming sessions with these cards, or with any other high-wattage draw across PCI-e. The 750 Tis already did this in OEM units given VGA upgrades.
 