
Official Statement from AMD on the PCI-Express Overcurrent Issue

I don't remember, but is the PCIe slot actually drawing this much? Sure, it's 166W total, but is it actually coming from the PCIe slot or over the 6-pin? Everyone seems to just assume the 6-pin is held to a strict 75W, so it has to be the PCIe slot then. But is it? Who has actually measured the draw at the PCIe slot? Can't remember any testers who would do this at the moment...

PCPer did: a stock analysis, an analysis with the power limit increased, and even a debunking of the analysis about the GTX 960 STRIX: http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480
 
Behold, it's an "AMD fanboys" analysis whenever someone has to defend NVIDIA. But when it's the other way around, it's nothing, you know, NVIDIA is working on a driver fix, no need to make drama. Throw AMD into the same scenario, though, and the whole internet loses its shit. Sometimes I'm ashamed of owning an NVIDIA card...
 
Behold, it's an "AMD fanboys" analysis whenever someone has to defend NVIDIA. But when it's the other way around, it's nothing, you know, NVIDIA is working on a driver fix, no need to make drama. Throw AMD into the same scenario, though, and the whole internet loses its shit. Sometimes I'm ashamed of owning an NVIDIA card...

Oh sorry, that's my personal opinion. I need to do a quick edit before the Nekker army comes back to haunt my sleep.
 
They both screw up, AMD and Nvidia. Why does it always have to turn into a competition? lol..
I just think it's shitty, regardless of who did it. Sounds like the power spec is way out.
Kind of a big deal, seeing as they kept screaming about how little power it used.
 
I'm finding my lack of care and concern to be growing with this. It's a simple fix that is already being implemented by AMD.
 
If AIB vendors are fixing the "out of spec PCIe limits" with an 8-pin or possibly 8 + 6-pin to reduce the excessive power draw from the PCIe slot, then AMD shouldn't even have come out with a statement that it will release a driver fix. Limiting the card's power even a little affects a lot of aspects. @HD64G, why not bench your card before limiting it, then apply a 10% power reduction via tuning software and run the same test again? We want to see whether limiting power really doesn't reduce the card's performance, as you claimed.

The answer is in this article, mate: http://semiaccurate.com/2016/07/01/investigating-thermal-throttling-undervolting-amds-rx-480/
 
Behold, it's an "AMD fanboys" analysis whenever someone has to defend NVIDIA. But when it's the other way around, it's nothing, you know, NVIDIA is working on a driver fix, no need to make drama. Throw AMD into the same scenario, though, and the whole internet loses its shit. Sometimes I'm ashamed of owning an NVIDIA card...

Well, what do you expect when most of their fanbase are manchildren or literal kids?

You can't have a good gaming experience if you don't have the GeForce GTX® logo/sticker on your PC, after all :^)

Most probably don't care about efficiency either, or they're simply too new to remember the HD 5xxx vs. GTX 4xx series.
 
Oh, and one more thing: when this driver pops up allowing them to limit the PCIe slot to an actual 75W, everyone does understand that cheap, low-quality boards or old worn-out boards are still going to pop, right? This issue isn't going away. Junk was never made to survive even the actual full spec being pulled over the PCIe slot.

This doesn't even account for the insane number of people who will never update past the driver that came on the DVD.
 
Behold, it's an "AMD fanboys" analysis whenever someone has to defend NVIDIA. But when it's the other way around, it's nothing, you know, NVIDIA is working on a driver fix, no need to make drama. Throw AMD into the same scenario, though, and the whole internet loses its shit. Sometimes I'm ashamed of owning an NVIDIA card...

What are you talking about? What drama? What driver fix? Have you read the link posted by @okidna? ( http://www.pcper.com/reviews/Graphi...s-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix ).
The GTX 960 was tested, and even while overclocked its power delivery behaved excellently (*unlike the RX 480).
 
This doesn't even account for the insane number of people who will never update past the driver that came on the DVD.

This is an issue I run into very frequently: friends or colleagues complain about their poor FPS when playing a new game, not knowing that both AMD and NVIDIA now use a different approach when it comes to new game support.

What are you talking about? What drama? What driver fix? Have you read the link posted by @okidna? ( http://www.pcper.com/reviews/Graphi...s-Radeon-RX-480/Evaluating-ASUS-GTX-960-Strix ).
The GTX 960 was tested, and even while overclocked its power delivery behaved excellently.

No no no, that's my fault, not RejZoR's fault. I wrote "fanboy analysis" in my original post before removing it because it's my personal opinion; I shouldn't have written it in the first place.
 
No no no, that's my fault, not RejZoR's fault. I wrote "fanboy analysis" in my original post before removing it because it's my personal opinion; I shouldn't have written it in the first place.

Oh, apologies to RejZoR then! ;)
 
In that review, average power was 74W. After overclocking the GPU to over 1400MHz and the memory to over 2000MHz, the result was close to 20% extra performance. I doubt that someone gets 20% extra performance and stays under the 75W limit. The card probably goes to 90W (20% extra power for 20% extra performance), if not more, considering that power consumption usually climbs faster than performance does. If they were getting 1-3% extra performance, I would have agreed with you.
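To make that argument concrete, here is a rough back-of-the-envelope sketch (my own illustration, not something from the review) that assumes power scales at least linearly with performance; in reality it usually scales worse once voltage rises with clocks, so these numbers are if anything optimistic:

```python
# Rough estimate of overclocked power draw, assuming power scales at least
# linearly with performance (hypothetical figures for illustration).
stock_power_w = 74.0   # average board power at stock, as quoted in the review
perf_gain = 0.20       # ~20% extra performance from the overclock

linear_w = stock_power_w * (1 + perf_gain)             # ~88.8 W
superlinear_w = stock_power_w * (1 + perf_gain) ** 2   # ~106.6 W, crude stand-in
                                                        # for voltage/clock scaling

print(f"linear estimate:      {linear_w:.1f} W")
print(f"superlinear estimate: {superlinear_w:.1f} W")
# Either way, staying under 75 W would require the power limit to clamp clocks.
```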

Again, that's not how it works. GPU Boost is designed to adjust the clock speed to keep the card below the power limit. So even if you do overclock the GTX 950, GPU Boost keeps the card right around 75W. The only way to go beyond 75W would be to raise the power limit in your overclocking software or in the BIOS. Either way, at that point the user would be aware they are overloading the PCI-E slot.
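To illustrate that last point, here's a minimal sketch (my addition, not from this thread) using the NVML bindings from the pynvml package, assuming it is installed: reading the enforced power limit is trivial, but raising it is an explicit, privileged action rather than something an overclock does behind your back.

```python
# Sketch: inspect (and, with admin rights, deliberately raise) a GPU's power
# limit via NVML. GPU Boost otherwise clamps clocks to stay under this limit.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
enforced_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu)          # milliwatts
draw_mw = pynvml.nvmlDeviceGetPowerUsage(gpu)                      # milliwatts

print(f"default limit:  {default_mw / 1000:.0f} W")
print(f"enforced limit: {enforced_mw / 1000:.0f} W")
print(f"current draw:   {draw_mw / 1000:.0f} W")

# Raising the limit is a conscious, privileged choice (example value only):
# pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 90_000)  # 90 W

pynvml.nvmlShutdown()
```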

Are those SSDs advertised as SLC SSDs? I believe not. Well, if they were Nvidia products, they probably would be. And people would be happy to convince themselves that an MLC drive performing like an SLC one makes it equal to SLC. And anyone saying otherwise would be a stupid fanboy who hates Nvidia and doesn't acknowledge Nvidia's superior engineering. "It is a good design".

Also, SLC vs MLC is not just a performance difference. If I am not mistaken, SLC is considered to have better longevity. The same applies to the 970. It is not just that slow 500 MB segment; there's also less cache, fewer ROPs, less memory bandwidth. The specs were completely wrong and we shouldn't be making excuses for companies.

Most of them do put blurbs in their advertising about this feature, yes.

Also, the advertising for the GTX970 was correct. They didn't advertise ROP count, they didn't advertise cache size. So you don't really have a point there.
 
Well, what do you expect when most of their fanbase are manchildren or literal kids?

You can't have a good gaming experience if you don't have the GeForce GTX® logo/sticker on your PC, after all :^)

Most probably don't care about efficiency either, or they're simply too new to remember the HD 5xxx vs. GTX 4xx series.


:eek:.......you dare mock us? At least get your facts right!!!!! It takes a GeForce GTX logo AND racing stripes, you barbarian!
 
:eek:.......you dare mock us? At least get your facts right!!!!! It takes a GeForce GTX logo AND racing stripes, you barbarian!

Probably trolling. If you look at his system he owns a GTX 970, so.......... !

P.S. Anyway, just like so many guys have already said, I would like to emphasize this video. Whoever wants to take a quick look, go to 20:50 and watch the estimate of the possible threat the RX 480 poses to our systems!!
 
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it like this for 3 years, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely limit the card to an actual 150W, or shift the power delivery so the 6-pin accepts more and the PCIe slot stays within its limit). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...
 
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it like this for 3 years, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely limit the card to an actual 150W, or shift the power delivery so the 6-pin accepts more and the PCIe slot stays within its limit). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...

Mate, I haven't paid 200€ for a high-end PSU, and another 400€ for a high-end UPS, only to let an RX 480 endanger my system!! :mad:
But hey, with your money you can do what you want. ;)
 
Probably trolling. If you look at his system he owns a GTX 970, so.......... !

P.S. Anyway, just like so many guys have already said, I would like to emphasize this video. Whoever wants to take a quick look, go to 20:50 and watch the estimate of the possible threat the RX 480 poses to our systems!!

...so am I;)
 
Mate, I haven't paid 200€ for a high-end PSU, and another 400€ for a high-end UPS, only to let an RX 480 endanger my system!! :mad:
But hey, with your money you can do what you want. ;)

[Drama Intensifies]

You don't even have an RX 480 and you're making it sound like it has already fried your PCIe circuitry...
 
Again, that's not how it works. GPU Boost is designed to adjust the clock speed to keep the card below the power limit. So even if you do overclock the GTX 950, GPU Boost keeps the card right around 75W. The only way to go beyond 75W would be to raise the power limit in your overclocking software or in the BIOS. Either way, at that point the user would be aware they are overloading the PCI-E slot.
You are avoiding the question here. How can you have 100% performance at 74W and then get 120% performance while remaining at 74W? Simple answer: you can't.

Most of them do put blurbs in their advertising about this feature, yes.

Also, the advertising for the GTX970 was correct. They didn't advertise ROP count, they didn't advertise cache size. So you don't really have a point there.
None of them advertise it as SLC. If you have an example of an MLC SSD that claims "it uses SLC," feel free to show it.


In the end I feel like a stupid little fool, trying to have an honest conversation with people who would rather die than post anything questionable about Nvidia.
 
[Drama Intensifies]

You don't even have an RX 480 and you're making it sound like it has already fried your PCIe circuitry...

A couple of websites, one dusty motherboard, and everyone's PCs are on fire! What happened to the poster claiming the RX 480 was sustaining 254 watts? Maybe that was at [H].

The one long-time poster I've seen with an RX 480 at [H] has two cards and stress tested them for hours without issue in one PC.
 
RejZoR said:
[Drama Intensifies]
You don't even have an RX 480 and you're making it sound like it has already fried your PCIe circuitry...

Since I have such expensive protection equipment, that means I hate taking risks........AT ALL!
( P.S. Of course you are correct. I don't own an RX 480, and for the last few years I've been nowhere near interested in buying AMD's GPUs. I already explained myself in post #43 of this thread. If you like, take a look at it ;) )
 
Jesus, people think this will just fry the system after 3 gaming sessions. Sure, it's not healthy if you use it like this for 3 years, but c'mon, the card was released what, 3 days ago? And everyone is still going absolutely batshit insane over it despite a fix being promised (which will most likely limit the card to an actual 150W, or shift the power delivery so the 6-pin accepts more and the PCIe slot stays within its limit). In either case you wouldn't actually be "losing" performance, because 150W was advertised from the beginning. But oh well. It's page 7 already...

Sure it won't fry your system after 3 gaming sessions, but then again, VW's diesel engines won't destroy the planet in 3 months either ;) It isn't a big issue as long as they actually fix it, but we'll need to see whether they can actually limit individual phases via software control and, if they can't, how a global TDP limitation will affect the value proposition of the card in terms of potentially reduced performance...
 
VW outright lied and cheated intentionally. There is nothing to fix because what they did was intentional. AMD simply cocked it up and they are already working on a fix. That's a big difference.
 
How many are affected? Find me the number. We are talking about 16 watts over the spec, 16, ffs.


If even one person was affected by the reckless over-spec wattage draw of the RX 480, simply because AMD didn't want to put a proper power connector on the card for PR reasons, then it is one too many. Science Studio has a video review posted on YouTube showing that the RX 480 isn't even playable with certain motherboards simply because it is WAY over spec. Gamers Nexus showed the card pulling 192W during testing. Don't link a chart showing calculated TDP and stand back pointing and saying THERE! This is the most ludicrous move in GPU history and it serves AMD right to have it blow up in their faces. Some cards do in fact go over spec, particularly when OC'd, the difference of course being that they pull that extra wattage through the 6- or 8-pin connector and not the PCIe slot. If you have a good power supply, it doesn't affect you. BUT with the RX 480 this is not the case, and it's a problem that exists at the hardware level because of the way the phases are laid out: it's actually pulling more from the PCIe slot than it is from the 6-pin connector.

Now, let's go back to the Gamers Nexus observed wattage pull when OC'd: 192W against a 150W power spec, with MORE THAN HALF of that coming through the PCIe slot. That means running at 128% of spec. It's like trying to pull 32 amps from a 25-amp wall socket. You could burn your house down if there were no fail-safes. Even a non-OC'd card will pull 86W from the PCIe slot, or 15% more than the maximum. These aren't guidelines, they are absolute limits.
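For anyone who wants to verify those percentages, here's a quick sanity check (my own sketch, using the Gamers Nexus figures quoted above and the 75W slot / 150W board limits):

```python
# Quick sanity check of the over-spec percentages quoted above.
slot_limit_w = 75.0    # PCI-E slot specification
board_limit_w = 150.0  # slot (75 W) + 6-pin connector (75 W)

oc_total_w = 192.0     # Gamers Nexus figure while overclocked
stock_slot_w = 86.0    # slot draw at stock clocks

print(f"OC total vs. board spec:  {oc_total_w / board_limit_w:.0%}")   # 128% of spec
print(f"stock slot vs. slot spec: {stock_slot_w / slot_limit_w:.0%}")  # ~115%, i.e. ~15% over
print(f"wall-socket analogy:      {32 / 25:.0%}")                      # 32 A on a 25 A circuit
```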

The RX 480 is the ONLY GPU in history to average more than 75W from the PCIe slot at stock clocks. The only way to prevent it is to limit wattage to 150W, really less because it pulls more from the PCIe slot than from the 6-pin connector, so let's say 145W. That would require roughly a 14.4% under-clock of the card. A card that is already about 15% less powerful than a GTX 1060, which will be able to OC AT LEAST another 20%. So now we're talking about a GTX 1060 that will be around 50% more powerful than a reference RX 480 and 25% more powerful than an OC'd AIB 480, for about $250, released at the same time as the AIB cards, which will most likely cost $300.

This was a MUST-WIN for AMD, but instead it became a worst-case scenario.
 