
Official Statement from AMD on the PCI-Express Overcurrent Issue

Ok, so this thread seems to be spiraling out into a war. Should we lock and load?

In all seriousness, here is what it boils down to:

1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to look lower power, instead of being smart and giving us the clocking headroom with fewer problems.
2: AMD needs to release a driver fix to stop the card from overdrawing from the PCIe slot, and either shift the load to the 6-pin or limit it.
3: Even if you buy this card, you're not going to kill your motherboard with it unless you have the most basic/cheap motherboard possible, and even then I would be skeptical.

Fact is, this should not be a problem, but it is. Is it a big problem that is going to result in dead motherboards? No, because motherboards, especially in this day and age, are pretty tough even on the cheap side. I have overloaded a motherboard's PCIe slots before; it takes a lot to actually do some damage. But the fact is AMD was beyond foolish not only to skip the 8-pin, but to let this pass through like this instead of allowing the 6-pin to take the brunt. PSUs in this day and age have an 8-pin at minimum, even on the cheapest entry-level unit you would want to buy to support your gaming rig (speaking ~500 watt). Either way though, this does not ruin the card or the value of what you're getting, but it definitely makes the aftermarket variants look a lot more appealing.
 
1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to look lower power, instead of being smart and giving us the clocking headroom with fewer problems.

Ed from Sapphire gave a cryptic answer while under NDA when asked what connector the RX 480 NITRO would have; he said it has an 8-pin but that you really don't need it. He seemed to suggest that plugging in the additional 2 pins was optional.
 
That's why AIBs have different hardware IDs to identify specific hardware, so that the driver doesn't "assume" things like this, it "knows" things like this.

Oh man you're fucking delusional. Source please.
 
Ed from Sapphire gave a cryptic answer while under NDA when asked what connector the RX 480 NITRO would have; he said it has an 8-pin but that you really don't need it. He seemed to suggest that plugging in the additional 2 pins was optional.

Nothing cryptic about that; those 2 extra pins are just ground. Again, it's quite safe to draw more than 75W from the 6-pin connector if you have a high-end PSU.

I thought AIBs were adding 8-pins for higher total power limits for overclocking?
I have not seen anything concrete that shows an 8-pin would decrease the draw on the PCIe slot, as my understanding is that this is regulated by the GPU itself.

That is correct, the reference RX 480 has a solid VRM. It's just routed for a 50/50 power distribution between the PCIe slot and the PCIe connector.
 
Son, you just went full retard. I think we're about done here.
Nope. I will have to try really, really hard to drop to your level.

You still are really not getting how Nvidia Boost works...
Thanks to GPU Boost, basically 0W.

Maybe I am missing something, but I don't see anyone explaining to me how you can get 20% extra performance and not consume any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks, it gets overclocked, it scores 20% higher in performance, and I have to assume that power consumption on average remained under 75W because of Nvidia Boost? Oh, please explain.

I learned about PCI-E bus because of this:
[attachment 76280]
...and this:
[attachment 76281]

...and about 28 more reasons in my office - I fix this stuff occasionally, if you know what I mean.

Now, if it comes down to defensive insults, what makes you a specialist in this area?

P.S. Boards are not for sale! Can trade a Z77 for cheap air conditioning :toast:
When you don't have any arguments, just throw degrees and hardware in the other guy's face. That will make you look more credible, I guess. You think you are the first person on the internet who starts a post with "You should hear me, I am an engineer" and then you can't believe the BS he writes? I am not saying that you are talking BS. I am just saying that taking pictures of your hardware doesn't make you an expert. You think I bought my PC yesterday? And no, I haven't thought about PCIe power draw, and I bet 99% of those posting in here haven't either. The last time I was worrying about a graphics card and a bus was when running a GeForce 2 MX64 with the AGP at 83MHz.

Reinforcements... :ohwell:
 
Nope. I will have to try really, really hard to drop to your level.




Maybe I am missing something, but I don't see anyone explaining to me how you can get 20% extra performance and not consume any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks, it gets overclocked, it scores 20% higher in performance, and I have to assume that power consumption on average remained under 75W because of Nvidia Boost? Oh, please explain.
:ohwell:When you don't have any arguments, just throw degrees and hardware in the other guy's face. That will make you look more credible, I guess. You think you are the first person on the internet who starts a post with "You should hear me, I am an engineer" and then you can't believe the BS he writes? I am not saying that you are talking BS. I am just saying that taking pictures of your hardware doesn't make you an expert. You think I bought my PC yesterday? And no, I haven't thought about PCIe power draw, and I bet 99% of those posting in here haven't either. The last time I was worrying about a graphics card and a bus was when running a GeForce 2 MX64 with the AGP at 83MHz.

Reinforcements...

I don't think you would get that 20% performance out of it unless you have a truly amazing chip. Nvidia has power restrictions set in the BIOS; if you don't ask for more power while overclocking (i.e. don't touch the TDP percentage), it will throttle clocks to keep the power within what the BIOS allows.
 
 
Nope. I will have to try really, really hard to drop to your level.
.............................................................

Sorry mate, but your following comment that @Assimilator has quoted, wasn't among your best ones:
john_ said:
You see, there are many things that the press will not tell you. You just learned about the PCIe bus power draw because the RX480 is an AMD card. If it was an Nvidia card, you wouldn't have known about it.

Seriously?!!
If it was an Nvidia card we would have never heard about it? :eek:
AMD managed to confuse the entire gaming community with their propaganda vs. the GTX 970 memory size, and made people believe that the card had less than the advertised memory, and now you expect me to believe that if NV's cards had similar power issues (which is something far greater than the memory size, since it's a safety matter), no one would know?!! o_O
 
Oh, please explain.
Ok, I'll be the one explaining this time.
The low-power 950 is at all times limited to a 75W power target... at all times. The sample that @W1zzard reviewed probably had considerably better ASIC quality than average, meaning it was able to reach higher clocks at lower voltages than an average sample. The rest is Boost 2.0: the power target is the same 75W, clocks are offset by 200 MHz, and the boost tightens the voltages to stay inside 75W, and voila, a stable overclock. Every review has a dynamic OC clock vs. voltage table... as you can see, there are multiple clock samples for each voltage state.
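Not Nvidia's actual algorithm, obviously, but here is a little Python sketch of the idea described above: walk up a clock/voltage table and settle at the highest state whose estimated power still fits inside the 75W target. The table, the effective capacitance, and the target below are made-up illustration values, not data from any real card.

```python
# Toy model of a power-target-limited boost, as described above.
# All numbers here (table, capacitance, target) are made up for illustration.

POWER_TARGET_W = 75.0
EFFECTIVE_C = 4.3e-8  # hypothetical effective switching capacitance, P = C * V^2 * f

# Hypothetical clock/voltage states, lowest to highest.
DVFS_TABLE = [
    (1024, 0.900),
    (1126, 0.950),
    (1228, 1.000),
    (1330, 1.050),
    (1430, 1.100),
    (1532, 1.150),
]

def estimated_power_w(clock_mhz: float, voltage_v: float) -> float:
    """Simplified dynamic power estimate: P = C * V^2 * f (f in Hz)."""
    return EFFECTIVE_C * voltage_v ** 2 * clock_mhz * 1e6

def pick_boost_state() -> tuple:
    """Highest table entry whose estimated power stays inside the target."""
    chosen = DVFS_TABLE[0]
    for clock_mhz, voltage_v in DVFS_TABLE:
        if estimated_power_w(clock_mhz, voltage_v) <= POWER_TARGET_W:
            chosen = (clock_mhz, voltage_v)
        else:
            break
    return chosen

clock, volts = pick_boost_state()
print(f"Settles at {clock} MHz @ {volts:.3f} V (~{estimated_power_w(clock, volts):.0f} W)")
```

With these made-up values the loop stops at 1430 MHz / 1.100 V at roughly 74W, because the next state would exceed the target.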
 
Ok, I'll be the one explaining this time.
The low-power 950 is at all times limited to a 75W power target... at all times. The sample that @W1zzard reviewed probably had considerably better ASIC quality than average, meaning it was able to reach higher clocks at lower voltages than an average sample. The rest is Boost 2.0: the power target is the same 75W, clocks are offset by 200 MHz, and the boost tightens the voltages to stay inside 75W, and voila, a stable overclock. Every review has a dynamic OC clock vs. voltage table... as you can see, there are multiple clock samples for each voltage state.

The thing, BiggieShady, is that I am not talking about frequencies here. The card could boost to 2GHz and stay under 75W under certain conditions. But I am not talking about frequencies, am I? I am talking about performance. If the card wasn't gaining 20% performance but 1-3%, I wouldn't be making any fuss about it.
 
But I am not talking about frequencies, am I? I am talking about performance.
If you are talking about performance then you are talking about frequency ... you are not gaining performance by dynamically adding compute units :rolleyes:
 
All this talk for no reason.
Everything is back to square one.
Nothing solved, nothing learned.
Just fanboys fighting all over TPU.

Truth be told, nobody should compare AMD to Nvidia, because they have different approaches to the GPU market.

I have honestly owned 4 AMD cards, none of which blew up my system (on a side note, my old PSU almost did; people know what I mean).

I guess this one won't either.
JayzTwoCents said clearly in his review of this card: "It makes systems unstable if they have low- or mid-class mobos."

It does not blow up hardware. And it never will. It is just an excuse to make AMD look bad, just because of a small problem their product has.

So what? No company, for any type of product, has perfection, and nobody bitches over most of those brands and names.
 
If you are talking about performance then you are talking about frequency ... you are not gaining performance by dynamically adding compute units :rolleyes:
Does anyone understand basic things here?

By increasing the frequency you don't necessarily gain performance. If the card is limited in how much power it will take from the PCIe bus, remaining under or at 75W, it will throttle. But if the result of overclocking the card is 20% extra performance, then the card doesn't stop at 75W; it asks for and gets more power from the PCIe bus. Remember: at standard speeds, based on the review, the card is already at 74W average. Not talking about the peak at 79W; let's ignore that. If at 100% performance you have an average power consumption of 74W, then even if you keep the voltages stable, by increasing the frequency of the GPU AND the frequency of the GDDR5 you are going higher in power consumption. And power consumption probably increases by more than the 20% performance gain. For Nvidia's Boost to do some magic to keep the card at 75W, it would have to drop voltages automatically at higher frequencies and the card would have to remain stable.
 
Ok, so this thread seems to be spiraling out into a war. Should we lock and load?

In all seriousness, here is what it boils down to:

1: AMD decided to put a 6-pin instead of an 8-pin on the reference card to look lower power, instead of being smart and giving us the clocking headroom with fewer problems.
2: AMD needs to release a driver fix to stop the card from overdrawing from the PCIe slot, and either shift the load to the 6-pin or limit it.
3: Even if you buy this card, you're not going to kill your motherboard with it unless you have the most basic/cheap motherboard possible, and even then I would be skeptical.

Fact is, this should not be a problem, but it is. Is it a big problem that is going to result in dead motherboards? No, because motherboards, especially in this day and age, are pretty tough even on the cheap side. I have overloaded a motherboard's PCIe slots before; it takes a lot to actually do some damage. But the fact is AMD was beyond foolish not only to skip the 8-pin, but to let this pass through like this instead of allowing the 6-pin to take the brunt. PSUs in this day and age have an 8-pin at minimum, even on the cheapest entry-level unit you would want to buy to support your gaming rig (speaking ~500 watt). Either way though, this does not ruin the card or the value of what you're getting, but it definitely makes the aftermarket variants look a lot more appealing.

Good intentions, not quite the most accurate info, though...

1: AMD decided to split the power supply 50/50 between the external power connector (happens to be 6-pin in this case) and the PCI-E slot. To illustrate:

[image: front_full.jpg]


This is a problem because, while the official spec for the 6-pin connector is 75W, it can realistically provide upwards of 200W continuously without any ill effects.
The PCI-E slot and the card's x16 connector, on the other hand, have 5 (five) flimsy pins at their disposal for power transfer. Those cannot physically supply more than a bit above 1A each. The better ones can sometimes handle 1.2A before significantly accelerating oxidation (both due to heating and to passing current) and thus increasing resistance, requiring more current to pass to supply enough power, further increasing the oxidation rate... It's a feedback loop eventually leading to failure (see the rough per-pin numbers in the sketch after this post).

2: AMD cannot fix this via drivers, as there are trace breaks with missing resistors and wires that would otherwise bridge the PCI-E slot supply to the 6-pin power connector. Bridging them would make the connector the naturally preferred path for the current, as it has lower resistance, and that's the path current prefers to take. It can only be permanently fixed by physical modification; no other methods. AMD can lower the total power draw and thus, by extension, relieve the stress on the PCI-E slot, but it will probably cost some of the GPU's performance. We'll see.

3: Buying and using this card won't kill your motherboard... straight away. Long-term consequences are unpredictable, but they cannot be positive. Would driving your car in first gear only, bumping into the RPM limiter all the time, kill your car? Well, not right away, but... yeah. It's the same here: you're constantly at the realistic limit of an electromechanical system, and constant stress is not going to make it work longer or better, that's for sure.

The AIB partners would do well to design their PCBs such that the PCI-E slot only supplies power once more than 150W is being drawn from the auxiliary power connector, or something like that. Perhaps give one of the six phases to the slot and the remaining five to the connector... Or better yet, power the memory from the slot and the GPU from the power connector exclusively. Breaking the PCI-E spec that way is much less damaging, thanks to the actual capabilities of the Molex Mini-Fit Jr. 2x3-pin connector that we like to call the 6-pin PCI-E power.
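To put rough numbers on the per-pin argument from point 1, here is a back-of-the-envelope Python sketch. The five-pin count and the ~1 to 1.2A comfort range come from the post above, not from a spec document, and the slot-draw wattages are illustrative examples rather than measurements.

```python
# Back-of-the-envelope current per 12 V slot pin, based on the figures above
# (5 power pins, ~1.0-1.2 A each as a comfortable continuous range).

SLOT_12V_PINS = 5
SLOT_VOLTAGE_V = 12.0
PIN_COMFORT_A = 1.2  # upper end of the range quoted in the post above

def amps_per_pin(slot_draw_w: float) -> float:
    """Current through each 12 V pin, assuming the load is shared evenly."""
    return slot_draw_w / SLOT_VOLTAGE_V / SLOT_12V_PINS

# Example slot draw levels (illustrative only):
for watts in (66, 75, 85, 95):
    amps = amps_per_pin(watts)
    verdict = "within" if amps <= PIN_COMFORT_A else "beyond"
    print(f"{watts:>3} W from the slot -> {amps:.2f} A per pin ({verdict} the ~{PIN_COMFORT_A} A comfort range)")
```

Under these assumptions, anything much past ~72W drawn through the slot pushes each 12V pin beyond the comfortable per-pin current named in the post.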
 
Maybe I am missing something, but I don't see anyone explaining to me how you can get 20% extra performance and not consume any more power. Can someone explain that magic to me? The card consumes 74W on average at default clocks, it gets overclocked, it scores 20% higher in performance, and I have to assume that power consumption on average remained under 75W because of Nvidia Boost? Oh, please explain.

Because performance is not directly related to power draw. Raising clock speeds does very little to power draw; it is raising the voltage that increases power draw. On the GTX 950 with the 6-pin, the GPU runs at 1.3V. On the GTX 950 without the 6-pin, the GPU runs at 1.0V. That is a massive difference, and the reason the card stays at 75W. It is also the reason the 6-pinless GTX 950 barely overclocks enough to match the stock speeds the 6-pin GTX 950 runs. The GTX 950 Strix with no overclock boosts to 1408MHz (@1.3V); the 6-pinless GTX 950 with an overclock only boosts to 1440MHz (@1.0V). That 1.0V is why it stays under 75W, and GPU Boost will lower that voltage and the clock speeds if it needs to in order to stay under 75W.
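For a rough sense of how big that voltage gap is, here is a quick Python estimate using the simplified dynamic-power relation P ∝ V²·F (the same oversimplified model discussed later in the thread). It only covers dynamic power, so treat the result as a ballpark, not a measurement.

```python
# Rough comparison of the two GTX 950 configs mentioned above, using the
# simplified dynamic-power relation P ~ V^2 * F. Ballpark only: it ignores
# leakage, memory power, and everything else on the board.

with_6pin = {"clock_mhz": 1408, "volts": 1.3}  # GTX 950 Strix figures from the post above
no_6pin   = {"clock_mhz": 1440, "volts": 1.0}  # 6-pinless GTX 950, overclocked

def relative_power(a: dict, b: dict) -> float:
    """Power of config a relative to config b under P ~ V^2 * F."""
    return (a["volts"] / b["volts"]) ** 2 * (a["clock_mhz"] / b["clock_mhz"])

ratio = relative_power(no_6pin, with_6pin)
print(f"6-pinless card at 1440 MHz / 1.0 V draws roughly {ratio:.0%} "
      f"of the 1408 MHz / 1.3 V card's dynamic power")
```

Under this model the 1.0V card comes out at roughly 60% of the 1.3V card's dynamic power despite the slightly higher clock, which is the point being made about the voltage difference.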
 
Because performance is not directly related to power draw. Raising clock speeds does very little to power draw; it is raising the voltage that increases power draw. On the GTX 950 with the 6-pin, the GPU runs at 1.3V. On the GTX 950 without the 6-pin, the GPU runs at 1.0V. That is a massive difference, and the reason the card stays at 75W. It is also the reason the 6-pinless GTX 950 barely overclocks enough to match the stock speeds the 6-pin GTX 950 runs. The GTX 950 Strix with no overclock boosts to 1408MHz (@1.3V); the 6-pinless GTX 950 with an overclock only boosts to 1440MHz (@1.0V). That 1.0V is why it stays under 75W, and GPU Boost will lower that voltage and the clock speeds if it needs to in order to stay under 75W.

Power draw goes up with frequency, not as much as by increasing the voltage, but it does go up. And not by very little; you are wrong there, especially when you overclock both the memory and the GPU.

Please try NOT to ignore the fact that the average power draw in the review at defaults is 74W. Even if the GTX 950 runs at 1.0V instead of 1.3V, in the end it consumes 74W on average. So even if the difference in voltage is massive, as you say, the card still uses 74W on average. So the starting line is there, at 74W. The card overclocks really well in W1zzard's review and it gets 20% extra performance (I keep writing this, everyone conveniently ignores it). You don't get 20% extra performance with lower clocks and voltage. So, if the card is at 74W at defaults, for that 20% extra performance it probably jumps to 90W through the PCIe bus. If it was staying at 75W, there wouldn't have been any serious performance gains and W1zzard's conclusion would have been that the card is power limited.

Am I right @W1zzard ?
 
For CPUs and GPUs, the power dissipation increases linearly with frequency and with the square of the voltage. In simple terms, P = C*V²*F, where C = internal capacitance (specific to the individual specimen), V = voltage and F = frequency. This is an oversimplification but provides a nice model that's fairly accurate until you get to LN2 stuff...
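As a quick worked example of that relation, here is a short Python snippet plugging in the 74W baseline and the +20% clock figure being argued about above; the +5% voltage case is purely hypothetical, added only for comparison.

```python
# Worked example of P = C * V^2 * F, expressed as ratios so the capacitance
# term cancels out. The 74 W baseline and the +20% clock come from the
# discussion above; the +5% voltage case is a hypothetical for comparison.

base_power_w = 74.0

def scaled_power(p0_w: float, freq_scale: float, volt_scale: float) -> float:
    """P1 = P0 * (V1/V0)^2 * (F1/F0), from P = C * V^2 * F."""
    return p0_w * volt_scale ** 2 * freq_scale

print(f"+20% clock, same voltage: {scaled_power(base_power_w, 1.20, 1.00):.1f} W")
print(f"+20% clock, +5% voltage:  {scaled_power(base_power_w, 1.20, 1.05):.1f} W")
```

With this model a pure +20% clock bump lands around 89W, and any voltage increase on top pushes it further, which is the scaling the formula describes.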
 
@McSteel
Are you sure the phases are physically tied to one power input or the other? I wanted to ask just that, if anyone can trace the wiring on the PCB...

Either way, if AMD limits power draw to an actual 150W, that technically wouldn't really be cheating; they'd just be bringing it to what they've been advertising the whole time. Assuming they did it on purpose to boost framerates in reviews, hoping no one would notice, is just foolish, seeing what kind of shitstorm everyone made out of this. Especially since all reviewers also tackle power consumption, that would also be a straight giveaway, like it was now.

So, calling it intentional, I'm not buying it. No one is this stupid.
 
Power draw goes up with frequency, not as much as by increasing the voltage, but it does go up. And not by very little; you are wrong there, especially when you overclock both the memory and the GPU.

Please try NOT to ignore the fact that the average power draw in the review at defaults is 74W. Even if the GTX 950 runs at 1.0V instead of 1.3V, in the end it consumes 74W on average. So even if the difference in voltage is massive, as you say, the card still uses 74W on average. So the starting line is there, at 74W. The card overclocks really well in W1zzard's review and it gets 20% extra performance (I keep writing this, everyone conveniently ignores it). You don't get 20% extra performance with lower clocks and voltage. So, if the card is at 74W at defaults, for that 20% extra performance it probably jumps to 90W through the PCIe bus. If it was staying at 75W, there wouldn't have been any serious performance gains and W1zzard's conclusion would have been that the card is power limited.

Am I right @W1zzard ?

No one is ignoring it. We just keep telling you it is happening with no extra power draw. You are ignoring what we keep telling you. Clock speeds do not affect power draw by a noticeable amount, maybe 1W. Voltage affects power draw. GPU Boost guarantees the card stays within its power limit. Nvidia learned from their mistakes already; they went through this growing phase with Fermi, and have developed a very good tech to guarantee cards don't go over their power limit.
 
@McSteel
Are you sure the phases are physically tied to one power input or the other? I wanted to ask just that, if anyone can trace the wiring on the PCB...

Either way, if AMD limits power draw to an actual 150W, that technically wouldn't really be cheating; they'd just be bringing it to what they've been advertising the whole time. Assuming they did it on purpose to boost framerates in reviews, hoping no one would notice, is just foolish, seeing what kind of shitstorm everyone made out of this. Especially since all reviewers also tackle power consumption, that would also be a straight giveaway, like it was now.

So, calling it intentional, I'm not buying it. No one is this stupid.

Yeah, you can see that in this video. Ok, the guy in it may not hold a master's in electronics, but it's clear the power phases are completely separated, and the GPU simply draws power 50/50 from them.
A bit more current is drawn from the slot than from the aux connector, simply due to the higher resistance of the slot's power pins...

I'm sure @W1zzard could confirm if he could find a bit of free time to do it :)
 
No one is ignoring it. We just keep telling you it is happening with no extra power draw. You are ignoring what we keep telling you. Clock speeds do not affect power draw by a noticeable amount, maybe 1W. Voltage affects power draw. GPU Boost guarantees the card stays within its power limit. Nvidia learned from their mistakes already; they went through this growing phase with Fermi, and have developed a very good tech to guarantee cards don't go over their power limit.
In your dreams, that thing you wrote and I put in bold. In fact, it would have been a dream of mine too, to just increase frequencies on my hardware and expect only 1W more power consumption after getting 20% extra performance. Not to mention that in that case the RX 480 would have stayed close to 166W at any frequency, yet it goes to 187W if I remember correctly. Doesn't it? Yes, yes, I know. GPU Boost is a magical feature offering free performance with 1 extra watt.

No need to quote me again. Just read McSteel's post and stop there. Save us both some time.
 
In your dreams, that thing you wrote and I put in bold. In fact, it would have been a dream of mine too, to just increase frequencies on my hardware and expect only 1W more power consumption after getting 20% extra performance. Not to mention that in that case the RX 480 would have stayed close to 166W at any frequency, yet it goes to 187W if I remember correctly. Doesn't it? Yes, yes, I know. GPU Boost is a magical feature offering free performance with 1 extra watt.

No need to quote me again. Just read McSteel's post and stop there. Save us both some time.

With normal operation, when the clocks go up the voltage goes up with them. That is why W1z includes voltage/clock tables in his reviews. AMD had to increase the voltage on the RX 480 to keep it stable at the clock speeds they wanted (this is also probably why it overclocks so poorly at stock voltage). However, when W1z does his overclocking he does not increase the voltage; he leaves it at stock. So while he increases the clock speeds, the voltage stays the same, so the current going through the GPU stays the same, and you get no real power consumption increase.

In fact, one of the tricks of overclocking Nvidia cards is to actually lower the voltage to get higher clock speeds. If your card is stable but hitting the power limit, you can lower the voltage and raise the clocks to get better performance. It is a commonly used trick, and one I had to use on my GTX 970s.
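A small Python sketch of the idea behind that trick, using the same simplified P = C·V²·F model from earlier in the thread. The power cap and the capacitance value are assumed illustration numbers, and whether the chip is actually stable at the resulting clocks is a separate question entirely.

```python
# Sketch of the undervolt-for-headroom trick under a fixed power cap,
# using the simplified P = C * V^2 * F model. Assumed numbers, not measurements.

POWER_CAP_W = 75.0
EFFECTIVE_C = 4.3e-8  # hypothetical effective switching capacitance

def max_clock_mhz(voltage_v: float) -> float:
    """Highest clock the power cap allows at a given voltage (stability not modelled)."""
    return POWER_CAP_W / (EFFECTIVE_C * voltage_v ** 2) / 1e6

for volts in (1.20, 1.10, 1.00):
    print(f"{volts:.2f} V -> cap allows up to ~{max_clock_mhz(volts):.0f} MHz")
```

The point it illustrates: under a fixed power cap, every step down in voltage buys a noticeably higher clock ceiling, which is why the undervolt-and-raise-clocks approach can gain performance on power-limited cards.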
 