
AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors


AleksandarK

News Editor
Staff member
AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. While there were some expectations that AMD would mimic NVIDIA's approach, which requires the newer 16-pin 12V-2×6 connector for its GeForce RTX 50 series, the latest information points to a more conventional power setup. AMD plans to release its next generation of graphics cards in the first quarter, but most technical details remain unknown. The company's choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands up to 375 W. The standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT can draw up to 330 W of power at peak load.
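A quick back-of-the-envelope check on those numbers (spec values only, my own illustration): the PCIe x16 slot is specified for up to 75 W, a 6-pin plug for 75 W, and an 8-pin plug for 150 W, so a card with two 8-pin connectors tops out at the 375 W cited above, comfortably over the leaked 330 W peak.

```python
# Rough power-budget math for conventional PCIe power delivery (spec values).
# Illustrative only; real boards may be designed with different headroom.

SLOT_W = 75        # PCIe x16 slot, per spec
SIX_PIN_W = 75     # 6-pin auxiliary connector, per spec
EIGHT_PIN_W = 150  # 8-pin auxiliary connector, per spec

def board_budget(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Total spec power budget for a card using the slot plus auxiliary plugs."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(eight_pins=2))              # 375 W -- the figure cited above
print(board_budget(six_pins=1, eight_pins=1))  # 300 W
print(board_budget(eight_pins=2) >= 330)       # True: covers the leaked 330 W peak
```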

Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA's approach is different: the company reportedly requires all board partners to use the 12V-2×6 connector for the RTX 50 series, removing the option of traditional PCI Express power connectors. In contrast, AMD's decision gives its manufacturing partners more flexibility in their design choices, and MBA (Made by AMD) reference cards don't enforce the new 12V-2×6 power connector standard either. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving questions about performance specifications unanswered; early underwhelming performance leaks should be treated as unreliable until final drivers and optimizations land.



View at TechPowerUp Main Site | Source
 
As long as one 8-pin is enough, all is good.
If you need 2 or 3 of those, better go with the new standard IMO.

Anyway, if AIBs can choose which connector to use, I see no problem - all options will be available. No real right/wrong answer here.
 
Good, though I don't really mind the whatever-pin on my 4070S. You plug it in, and it works fine.


Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.


The problem is that the new power connector is an engineering mistake - both mechanically and electrically, it isn't qualified for the task at hand: carrying high currents safely.
The problem is that it is too small, too weak, too unstable.

There is a reason why the AMD engineers only use the approved 6/8-pin power connectors - they have been proven safe for decades.
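Rough numbers behind that argument (spec ratings only, assuming an even current split across the 12 V pins, my own sketch): the 8-pin carries its 150 W rating over three 12 V contacts, while the 16-pin 12V-2×6 carries up to 600 W over six smaller contacts, so per-pin current roughly doubles and the safety margin is commonly said to be much slimmer.

```python
# Per-pin current at rated load, assuming an even split across the 12 V pins.
# Spec ratings: 8-pin PCIe = 150 W over 3 power pins; 12V-2x6 = 600 W over 6.
V_RAIL = 12.0

def amps_per_pin(rated_watts: float, power_pins: int) -> float:
    """Average current per 12 V contact at the connector's rated load."""
    return rated_watts / V_RAIL / power_pins

print(f"8-pin:   {amps_per_pin(150, 3):.1f} A per pin")  # ~4.2 A
print(f"12V-2x6: {amps_per_pin(600, 6):.1f} A per pin")  # ~8.3 A
```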
 
If the prices are good, I might end up with a Radeon GPU to replace my 3080.

My last Radeon GPU was a 4890 that I overclocked to 1 GHz, LOL.
 
to replace my 3080.
Ain't gonna be much faster than that, even by today's ridiculous standards of +5% being a whopping upgrade. I'd rather skip this generation. 9070 XT is unlikely to be significantly faster than 3090 (3090 Ti if we're feeling really ambitious) and your 3080 isn't really far behind. More sense in waiting for 4080 series or better GPUs to become affordable.
 
As long as one 8-pin is enough, all is good.
In a perfect world, they would've been squeezing ~330 W out of one 8-pin (AWG14), so add ~75 W from the PCIe slot on top of that and only the hungriest GPUs would've needed more than one.
But I doubt one will be enough for the 9070 XT.
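Rough math behind squeezing ~330 W out of a single plug (my own assumed numbers, not the spec): an 8-pin has three 12 V contacts, and heavy-duty Mini-Fit-style terminals with AWG16/14 wire are commonly rated somewhere around 8-10 A each, so the physical ceiling sits well above the official 150 W rating.

```python
# Headroom estimate for one 8-pin plug vs. its 150 W spec rating.
# ASSUMPTION: ~9 A per contact (HCS-style terminal, heavy-gauge wire);
# real limits depend on the exact terminal, wire gauge and temperature rise.
V_RAIL = 12.0
POWER_PINS = 3        # an 8-pin PCIe plug has three 12 V contacts
AMPS_PER_PIN = 9.0    # assumed figure, see note above
SPEC_RATING_W = 150   # official 8-pin rating
SLOT_W = 75           # PCIe x16 slot, per spec

ceiling_w = POWER_PINS * AMPS_PER_PIN * V_RAIL
print(ceiling_w)                  # 324.0 W from the plug alone, near the ~330 W above
print(ceiling_w + SLOT_W)         # ~399 W with the slot's contribution
print(ceiling_w / SPEC_RATING_W)  # ~2.2x the official rating
```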
A miserable 10 GB vs 60% more VRAM.
This matters in like four games, and in five more if we talk absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway), and the 3080 has the edge to stay solid in every other title. Especially the ones where DLSS is the only upscaler that works correctly. I would've agreed if this were a comparison with an 8 GB GPU, but 10 GB is nowhere near obsolete, and the 320-bit bus really helps a lot.

The leaks we got suggest the 9070 XT just barely outperforming the 7900 GRE, which is roughly 3090/3090 Ti territory. That's faster than a 3080, sure, but not by a lot.
 
This matters in like four games, and in five more if we talk absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway), and the 3080 has the edge to stay solid in every other title.

Wrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but to stay future-proof for at least 2-3 years, you need 20 GB.

Watch:

 
Ain't gonna be much faster than that, even by today's ridiculous standards of +5% being a whopping upgrade. I'd rather skip this generation. 9070 XT is unlikely to be significantly faster than 3090 (3090 Ti if we're feeling really ambitious) and your 3080 isn't really far behind. More sense in waiting for 4080 series or better GPUs to become affordable.
This thing will be barely any faster than the four-year-old 6900 XT in raster, let alone the 7900 XTX that it's supposed to beat for half the price.
 
Wrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but to stay future-proof for at least 2-3 years, you need 20 GB.

Watch:

All games from this video are perfectly playable on an RTX 3070 Ti, an 8 GB GPU, at high settings with some ray tracing going on. VRAM allocation != VRAM usage. The fact that the driver has allocated 14 GB doesn't mean the game will have issues if you have less than 14 GB of VRAM. We'd see horrible benchmark results on the 3080 otherwise, but...
[attached: relative performance chart from the latest TPU GPU review]

This is the most recent TPU GPU review. The hardest benchmarking mode possible, 2160p. No DLSS, everything on Ultra (RT off tho), no slacking. And still, 3080 is only 14% behind 3090. It wins against 7800 XT despite less VRAM. It doesn't trail behind 7900 GRE much, just a tiny gap of 7.5%.

I don't see how 10 GB is in any way problematic at pedestrian resolutions like 1440p. Just no way. Go from Ultra textures to Medium-High and you'll find yourself with half your VRAM doing a whole lot of nothing, waiting for instructions, and the games won't look like garbage because textures are overtuned anyway. Yes, sure, having more is great, but you're stretching it.
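On the allocation-vs-usage point: the VRAM number that overlays and monitoring tools show is how much memory is currently allocated/resident, not how much the game actively touches each frame, so a big readout on a 24 GB card doesn't by itself prove a 10 GB card would choke. A minimal sketch of where that figure comes from, assuming an NVIDIA GPU and the nvidia-ml-py (pynvml) package:

```python
# Reads the driver-reported VRAM figure. Note this is *allocated* memory,
# not the per-frame working set, which is why a large number here doesn't
# automatically mean a card with less VRAM would stutter.
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```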
 
Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.


The problem is that the new power connector is an engineering mistake - both mechanically and electrically, it isn't qualified for the task at hand: carrying high currents safely.
The problem is that it is too small, too weak, too unstable.

There is a reason why the AMD engineers only use the approved 6/8-pin power connectors - they have been proven safe for decades.
Old news. Are you seeing any reports from the past months? No.
 
The 8GB GPUs are obsolete today.
How does my colleague play them then? xD

I also have a 12 GB GPU and I have never run out of VRAM in any game. Perhaps once, when I enabled settings that "run" at 20 FPS on a 4090... Other than that, "8 GB is obsolete" is only true in the sense that the leather jacket guy is too greedy and provides too little generational uplift.
 
Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.


The article is from June 2023

Yes, they can't disappear:

April 2024.

It's not exactly looking like they're very frequent; what I do see is loser-YouTuber territory here. Big screamy face, all-caps headline, outrage!

That being said, I do have a few boxes of popcorn waiting for the 5090 release.
 
a few boxes of popcorn waiting for the 5090 release
Doubt there'll be anything more cheeky than the price and the fact it can't run %game_name% at some 255-million-pixel resolution at beyond-ultra settings, with all the folks going apeshit over it.
 
Doubt there'll be anything more cheeky than the price and the fact it can't run %game_name% at some 255-million-pixel resolution at beyond-ultra settings, with all the folks going apeshit over it.
Stop sounding so rational, this is the internet
 
It will be. Because the VRAM bottleneck will be solved. A miserable 10 GB vs 60% more VRAM.
The new AMD GPU is still slow and a bad upgrade.
Buying a new GPU just to get more VRAM without getting more performance is stupid.

Better to go with a 5070 Ti to get a performance boost.
 
"conventional PCI Express connectors can adequately handle power demands up to 375 W" - considering that no consumer card should eat more than 375 W, there should be no need of a 12-pin connector on a consumer card, ever.
 
I'm sorry, but those videos are YouTuber BS, and it's the same across all three videos.

There will always be cases, because there are thousands of cards on the market and there is always an error rate in any hardware. In the case of the 4090, there will always be the odd bad connector, but after some time you can't talk about a generalized problem; it's something that affected a small percentage of users and that today has no more relevance than the 5 minutes of glory of some YouTuber or some random post on Reddit.
 
@AusWolf
Fucking preach. And yet you would still run into people saying "well, we can do cards pulling 600 W and the cooling works, so why limit ourselves, it's performance". I wouldn't grab anything above 250 W for myself, but hey, if people want space heaters, it's their choice.

As for the connector, I would trust W1zz over outrage grifters any day of the week - if he says that over dozens of cards and thousands of plug-unplug cycles he didn't run into any problems, and none of his acquaintances/contacts did either, then the whole thing is overblown and is just cases of user error and/or rare defective cards, which happens.
 
Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.


The problem is that the new power connector is an engineering mistake - both mechanically and electrically, it isn't qualified for the task at hand: carrying high currents safely.
The problem is that it is too small, too weak, too unstable.

There is a reason why the AMD engineers only use the approved 6/8-pin power connectors - they have been proven safe for decades.
Since what October? That piece of news is 1.5 years old.
BTW, all I said is that mine works fine, and you put that idiotic "laugh" reaction on it. Sorry it didn't burn my PC down, which you would probably like.
 