Tuesday, September 15th 2020

AMD Radeon RX 6000 "Big Navi" RDNA2 Reference Design Pictured

AMD revealed its Radeon RX 6000 series graphics card reference design. This card will likely be AMD's flagship product based on its RDNA2 graphics architecture. The card features a refreshingly new dual-slot, triple-axial-fan cooling solution that uses large high-airflow fans with webbed impellers, and an aluminium fin-stack heatsink that spans the entire length of the roughly 30 cm-long card. A variation of the insert with the Radeon branding was teased last year. This is AMD's second reference design with triple axial fans, after the Radeon VII.

The card features two 8-pin PCIe power inputs right where you'd expect them. Display outputs include a pair of DisplayPorts, an HDMI port, and a USB Type-C port. Since the fin-stack arrangement guides air exhaust out of the top of the card (and none out of the rear I/O), AMD has used a sealed I/O shield, as on the Radeon Fury. AMD partnered with Epic Games on a Fortnite treasure-hunt map that lets you view a 3D model of the card from more angles. We'll spare you the treasure hunt with a video by Anshel Sag; the Fortnite video follows.


The USB-C connector got us thinking about AMD's rationale behind it. VirtualLink is dead in the water, with almost no takers from the VR HMD segment. When this card was in development, AMD would have seen clear signs of VirtualLink's lukewarm market response. On the other hand, there are plenty of USB-C Thunderbolt-compatible professional monitors. It makes us wonder whether AMD simply strapped on an Intel "Titan Ridge" controller and gave this card Thunderbolt capability with DisplayPort passthrough. A grand-unified Type-C port would be one that combines DisplayPort, Thunderbolt, and USB3/USB4 connectivity.
Sources: AMD Radeon RX (Twitter), via VideoCardz

74 Comments on AMD Radeon RX 6000 "Big Navi" RDNA2 Reference Design Pictured

#51
Fluffmeister
Super XP
Regarding RDNA2, that architecture is meant to compete with Ampere. I wouldn't call that 2 years behind ya know. Lol
I said the 2080 Ti was two years old and EOL, which is true. Stop getting so defensive... gee.
Posted on Reply
#52
Super XP
Fluffmeister
I said the 2080 Ti was two years old and EOL, which is true. Stop getting so defensive... gee.
No not defensive at all. So AMD shouldn't have a problem beating it with a $450 or lower GPU.
Posted on Reply
#53
Fluffmeister
Super XP
No not defensive at all. So AMD shouldn't have a problem beating it with a $450 or lower GPU.
Let's hope so, it's sad to see a lack of competition for soo long.
Posted on Reply
#54
Super XP
Fluffmeister
Let's hope so, it's sad to see a lack of competition for soo long.
Agreed.
Posted on Reply
#56
Super XP
theoneandonlymrk
The 6900XT HBM rumour just got a boost off jay2cents , might be worth adding to the list.
Oh No, no more HBM. It ain't worth the extra cost. The HBM version coming is based on CDNA, not RDNA, and they are different architectures, last I checked.
Posted on Reply
#57
theoneandonlymrk
Super XP
Oh No, no more HBM. It ain't worth the extra cost. The HBM version coming is based on CDNA, not RDNA, and they are different architectures, last I checked.
I think it's the dog's balls, mate; cost only limits its greatness. Put HBM2e on any design and it would perform. My Vega 64 has been an absolute pleasure: it earned its price mining, games at 4K in GTA V while most said it was impossible (@60), and crunched many thousands of work units for F@H. If only every card had such a life, imagine the petaflops F@H would have, probs a zetta+ ;p.
Posted on Reply
#58
ShurikN
theoneandonlymrk
The 6900XT HBM rumour just got a boost off jay2cents , might be worth adding to the list.
I just watched his video (with the leaked dual fan cooler) and I didn't hear him mention HBM.
The one who has been mentioning HBM was @buildzoid
Posted on Reply
#59
theoneandonlymrk
ShurikN
I just watched his video (with the leaked dual fan cooler) and I didn't hear him mention HBM.
The one who has been mentioning HBM was @buildzoid

You are right; he did mention the 6900 XT, though. Coreteks, another YouTuber, mentioned an HBM2e 6900 XT many times; it's all rumours until it's out, though.
Posted on Reply
#60
InVasMani
kayjay010101
It literally just means this: say an RDNA(1) card has a TDP of 200 W and gets 100 FPS in a game, and an RDNA2 card gets 150 FPS at the same wattage in that specific scenario between those two cards. Another way to look at it is that you could get the same 100 FPS at roughly 133 W (100 ÷ 0.75 FPS/W).
So yeah, without knowing what they're comparing, it's meaningless. We'll have to wait and see.
Regardless, at least performance per watt is going to increase at the same wattage. Not everyone has an enormous, power-hungry PSU, which is something to keep in mind.
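As a minimal sketch of the perf-per-watt arithmetic in the quote above (the function name and figures are illustrative placeholders, not benchmarks):

```python
# Hypothetical perf-per-watt comparison using the example numbers above.
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

rdna1 = perf_per_watt(100, 200)   # 0.50 FPS/W
rdna2 = perf_per_watt(150, 200)   # 0.75 FPS/W at the same 200 W
uplift = rdna2 / rdna1 - 1        # 0.5, i.e. a 50% perf/W gain

# The flip side: holding 100 FPS constant, the more efficient card
# would need only 100 / 0.75, about 133 W, instead of 200 W.
watts_needed = 100 / rdna2
print(f"{uplift:.0%} better perf/W; 100 FPS at ~{watts_needed:.0f} W")
```

Either framing (more FPS at fixed power, or fixed FPS at less power) describes the same efficiency ratio, which is why the comparison baseline matters.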
Posted on Reply
#62
Wshlist
Xzibit
VR Headset ?

Nvidia tried it last gen on the RTX cards with VirtualLink.
Wikipedia says VirtualLink failed because there were signaling issues in practice.
It would be odd to come into the VirtualLink game both late and after it died.
But VR can still be aided by Type-C, of course, so that seems more likely: regular USB 3.x with DisplayLink.
Posted on Reply
#63
mtcn77
theoneandonlymrk
Put HBM2e on any design and it would perform. My Vega 64 has been an absolute pleasure: it earned its price mining, games at 4K in GTA V while most said it was impossible (@60)
True dat, don't mind the echoes of envy, vying for a drop of that 99p bandwidth. Real cards have GB/s#.
Posted on Reply
#64
BigBonedCartman
Epic was desperate for user logins after Apple removed Fortnite from the App Store, so they made a deal with AMD to reveal these cards in-game, requiring users to log in to access the in-game location.
Posted on Reply
#65
Caring1
ernorator
Two slot card - check
No blower - check
3 air pressure fans - check
no blocked sides like in VII - check

fk me, is it truly looking like a reference card from AMD with a cooler that does not suck?
It blows.
Right at the glass side of the average gamer's case, where air can't flow.
Posted on Reply
#66
R0H1T
Super XP
Oh No, no more HBM. It ain't worth the extra cost. The HBM version coming is based on CDNA, not RDNA, and they are different architectures, last I checked.
It is at the top end, somewhere around $1,000 at least. So if the single-die RX 69xx exists, it'll likely compete with the 3080 Super/Ti; otherwise there's no point giving it HBM with low margins and mediocre performance, or having it compete with cards up to the vanilla 3080. There's little to no margin in there for AMD. That's not to say that AMD will indeed compete with the top-end non-halo RTX cards, just that high-end memory (costs) and high-end margins go hand in hand.
Posted on Reply
#67
HenrySomeone
Vya Domus
Ah, because whatever Nvidia does dictates what everyone else should do as well, right ?
Well, as the undisputed market leader of the last 14 years... they sort of do (like it or not, the latter in your case obviously), so much so in fact that the inventive boys from team red shamelessly copied the Founders Edition cooler design from the 2000 series, of course with an extra fan, because, well, you know, you don't want your flagship to thermal-throttle in 30 seconds... :D
R0H1T
It is at the top end, somewhere around $1,000 at least. So if the single-die RX 69xx exists, it'll likely compete with the 3080 Super/Ti; otherwise there's no point giving it HBM with low margins and mediocre performance, or having it compete with cards up to the vanilla 3080. There's little to no margin in there for AMD. That's not to say that AMD will indeed compete with the top-end non-halo RTX cards, just that high-end memory (costs) and high-end margins go hand in hand.
But (somewhat) competing with the regular/vanilla x80-series cards is the most that all of AMD's HBM-equipped cards so far have managed; the Fury X was barely faster than the 980 when you OCed both, and the Vega 64 was generally a bit slower than the 1080 when you did the same. Expecting them to somehow magically measure up to the (supposed) 3080 Super/Ti after nearly a decade of not being able to do so (and continually getting worse) is wildly optimistic, to say the least...
Posted on Reply
#68
Chrispy_
Damnit! :(
I wanted to see a blower.

Look, I know y'all hate blowers but AIB manufacturers are godawful at doing blowers properly. The industry (and some enthusiasts) NEED blowers, and decent quality ones, at that.

Kepler/Maxwell/Pascal had excellent reference blowers.
Polaris (RX 480) had a great blower for its price and power level, and Vega actually had a good blower too, but it got a bad rap because AMD juiced those cards FAR too hard for any single-fan solution; card-only consumption of up to 362 W, and only one fan, what the hell were they thinking?!

Now, we have no reference blowers, and for people who need them it'll be the awful cheap plastic garbage that AIB partners put together for as little care and as much profit as possible.
Posted on Reply
#69
InVasMani
Caring1
It blows.
Right at the glass side of the average gamers case where air can't flow.
I'd have rotated the heatsink fin orientation 45 degrees and given the three fans a progressive 5, 10, and 15-degree tilt to push the air out the rear of the case, though the bracket would need to be a more traditional style. In the right case you could reverse the fans and have it suck that heat right out of the case door, to be honest. I'm not crazy about that cooling design decision and I don't consider it ideal, but it can be made to work. Additionally, in cases that allow for vertical mounting, all that heat would just get blown directly upward, though in either instance heat rises.
Posted on Reply
#70
Master Tom
lemoncarbonate
This conventional design looks so much better than the RTX 3000 series imo.

Why 2x 8-pin though? Can we get a card with good performance with just 1x 8-pin, like the GTX 1000 series?
What is the problem with 2 connectors? Why is one connector better?
Posted on Reply
#71
R-T-B
mtcn77
Heh, you wouldn't believe me then. Would you believe it now, @R-T-B? All aboard the hype train! Choo! Choo!
I don't care. Facts are facts and hype trains don't serve them.
Posted on Reply
#74
mtcn77
R-T-B
Really dude, I'm not losing sleep over any of this. Why so serious?
Some of us act like Nvidia is in on some special insider exclusivity dealings, which I would love to believe, of course.
They wouldn't just let it be known to be a cutback, but how are we to know things of this nature anyway... Samsung is big and strong, going so far as to poach talent whenever possible. It's a match made in heaven, with or without this process working out.
Posted on Reply