Monday, May 27th 2019

AMD Announces Radeon RX 5700 Based on Navi: RDNA, 7nm, PCIe Gen4, GDDR6

AMD at its 2019 Computex keynote today unveiled the Radeon RX 5000 family of graphics cards, which leverages its new Navi graphics architecture and 7 nm silicon fabrication process. Navi isn't just an incremental upgrade over Vega with a handful of new technologies, but the biggest overhaul to AMD's GPU SIMD design since Graphics Core Next, circa 2011. Called RDNA, or Radeon DNA, AMD's new compute unit is a clean-slate SIMD design with a 1.25X IPC uplift over Vega, an overhauled on-chip cache hierarchy, and a more streamlined graphics pipeline.

In addition, the architecture is designed to increase performance-per-Watt by 50 percent over Vega. The first part to leverage Navi is the Radeon RX 5700. AMD ran a side-by-side demo of the RX 5700 against the GeForce RTX 2070 in Strange Brigade, where NVIDIA's $500 card was beaten. "Strange Brigade" is one game where AMD generally fares well, as it is heavily optimized for asynchronous compute. Navi also ticks two big technology check-boxes: PCI-Express gen 4.0 and GDDR6 memory. AMD has planned July availability for the RX 5700, and did not disclose pricing.
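As a rough illustration of how the two headline claims combine, here is a back-of-envelope sketch. The 1.25X IPC and 50% performance-per-Watt figures come from the announcement; the assumption that clocks stay equal is purely illustrative, not an AMD spec.

```python
# Back-of-envelope sketch of AMD's claimed RDNA gains over Vega.
# The 1.25x IPC and 1.5x perf/W figures are from the announcement;
# equal clocks vs. Vega is an assumption for illustration only.

def relative_performance(ipc_gain, clock_ratio):
    """To first order, throughput scales as IPC x clock."""
    return ipc_gain * clock_ratio

def relative_power(perf_gain, perf_per_watt_gain):
    """Power implied by a performance gain at a given efficiency gain."""
    return perf_gain / perf_per_watt_gain

perf = relative_performance(1.25, 1.0)   # 1.25x IPC at the same clocks
power = relative_power(perf, 1.5)        # 50% better perf/W
print(f"{perf:.2f}x performance at {power:.3f}x power")
```

Under those assumptions, the claims imply roughly the same performance class at noticeably lower power, or more performance at the same power, depending on where AMD sets clocks.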
Add your own comment

202 Comments on AMD Announces Radeon RX 5700 Based on Navi: RDNA, 7nm, PCIe Gen4, GDDR6

#126
Casecutter
So let me get this straight: this RX 5700 could be the spiritual successor to the AMD "70" series (aka the 570), which is normally built from a cut-down ("gelded") version of the full mainstream silicon?

Sure, Strange Brigade is a "ringer" and an AMD architecture poster child, but it's also a good all-around projection of Vulkan/DX12 "API overhead" capabilities, and I'm sure that's why AMD leads with it. They're promoting the obvious: there are gaming engines out there that can unleash their particular architecture design. Nothing wrong with that...

It is interesting that what's been AMD's second-tier mainstream offering is working over (sure, in the one title) a part Nvidia has promoted as more-or-less entry enthusiast, while such AMD "70-series" parts have been tasked as "entry mainstream", more akin today to the GTX 1660. If the RX 5700 is actually 20% behind a 2070, that still puts it somewhere between the Vega 56/64, and I would expect there's still a full-die RX 5800 out there.

I figured Computex was just the top-level design and architectural keynotes, and I'm honestly not taking the "marketing jargon" or any of this at face value. That said, I think AMD has a good blueprint and executed this Navi release with a fairly clear-cut strategy, while holding to the schedule (we'll wait to see how well they can fill the channel). They'll have more information to drop at E3 (June 11-13), but I don't think we'll learn a lot with 3 weeks until the actual NDA release on July 7th.
Posted on Reply
#127
GoldenX
Looks like it's a "GCN compatible" arch, so, it's GCN, with new stuff.
Posted on Reply
#128
medi01
GoldenX, post: 4055991, member: 160319"
Looks like it's a "GCN compatible" arch, so, it's GCN, with new stuff.
One would have to be crazy to dump GCN as an ISA.
Posted on Reply
#129
Vayra86
Casecutter, post: 4055969, member: 94772"
So let me get this straight: this RX 5700 could be the spiritual successor to the AMD "70" series (aka the 570), which is normally built from a cut-down ("gelded") version of the full mainstream silicon?

Sure, Strange Brigade is a "ringer" and an AMD architecture poster child, but it's also a good all-around projection of Vulkan/DX12 "API overhead" capabilities, and I'm sure that's why AMD leads with it. They're promoting the obvious: there are gaming engines out there that can unleash their particular architecture design. Nothing wrong with that...

It is interesting that what's been AMD's second-tier mainstream offering is working over (sure, in the one title) a part Nvidia has promoted as more-or-less entry enthusiast, while such AMD "70-series" parts have been tasked as "entry mainstream", more akin today to the GTX 1660. If the RX 5700 is actually 20% behind a 2070, that still puts it somewhere between the Vega 56/64, and I would expect there's still a full-die RX 5800 out there.

I figured Computex was just the top-level design and architectural keynotes, and I'm honestly not taking the "marketing jargon" or any of this at face value. That said, I think AMD has a good blueprint and executed this Navi release with a fairly clear-cut strategy, while holding to the schedule (we'll wait to see how well they can fill the channel). They'll have more information to drop at E3 (June 11-13), but I don't think we'll learn a lot with 3 weeks until the actual NDA release on July 7th.
No, there is no AMD '70' series as you see it with Nvidia. AMD released Polaris and then started incremental updates to that design, mostly in terms of more power > more perf. Polaris was designed as a midrange chip from the get-go. Vega fed the upper half of their stack; development-wise it's disconnected from what happens with Polaris. Of course features exist on both product lines, but the design is not the same; Vega has HBM, Polaris does not. Vega got other improvements, Polaris did not.

This also means your interpretation of the AMD naming scheme is not correct. Since the release of the RX 480, there have been new names, but most of that has been rebrands or very minor improvements. Navi's new naming has no relation whatsoever to performance or place in the stack, really; it's just taking a look at Nvidia and slotting in on the right number. There will be a bigger Navi, but what it will do is a mystery, and AMD no longer has a structure you can rely on in their product stack. Gone are the HD-xx50 / xx70 days.

The key point being, we have no idea what the bigger chip will perform like.

With Nvidia, prior to Turing (but even now, really), while they do use multiple SKUs, these are almost all straight-scaled versions of each other. Sometimes some trickery is applied (asymmetrical VRAM setups, usually found on midrange parts, and not just the GTX 970; Fermi and Kepler had them too), but you won't see a split halfway down the stack using radically different tech. Turing is the exception with its RT components.
Posted on Reply
#130
HenrySomeone
cucker tarlson, post: 4054796, member: 173472"
Strange Brigade only? It must be really, really bad.
Yup - it'll be way behind the 2070 in real life, probably behind the 2060 in many games as well, and almost always when both are OCed, while costing more and having a lot higher power draw too, on 7nm no less, lmao! :D AMD RTG living up to its name once again - Another Massive Disappointment in Real Time Graphics
Posted on Reply
#132
HenrySomeone
Well, all their work on GPUs in the last couple of years is realistically only fit to throw out the window anyway, so no big loss there :p
Posted on Reply
#133
Casecutter
Vayra86, post: 4055999, member: 152404"
No, there is no AMD '70' series as you see it with Nvidia.
I was asking that as a question...

I agree the Nvidia 70-series has always been in a completely different product stack. And sure, we can't say that the RX 5700 is akin to what has been the mainstream cut-down chip/offering (aka the RX 570, R7 270, 7850), but what if that's what it is?

I believe we have no idea where the "RX 5700" aligns in AMD's product stack, or if it's supposed to be a part that actually contests Nvidia's "entry enthusiast" offering. The number means nothing; it's just a placeholder for some version of Navi that scrimmages with a 2070 in Strange Brigade... It means nothing until we know.
Posted on Reply
#134
Vayra86
Casecutter, post: 4056036, member: 94772"
I was asking that as a question...

I agree the Nvidia 70-series has always been in a completely different product stack. And sure, we can't say that the RX 5700 is akin to what has been the mainstream cut-down chip/offering (aka the RX 570, R7 270, 7850), but what if that's what it is?

I believe we have no idea where the "RX 5700" aligns in AMD's product stack, or if it's supposed to be a part that actually contests Nvidia's "entry enthusiast" offering. The number means nothing; it's just a placeholder for some version of Navi that scrimmages with a 2070 in Strange Brigade... It means nothing until we know.
Yes, I think the same, and sorry for misinterpreting your question as a conclusion :)

There is really no telling. I'm quite sure they can pull a bigger Navi 20 out of this node that performs a good margin above this one, but how big of a margin? 30%? 50%? Even an optimistic scenario would give them only slightly under or over 2080 Ti performance. On the other hand, we haven't seen any AMD GPU surpass GTX 1080 Ti performance, and that card has been out there for quite some time now. So far, even Navi stalls completely at the same-ish perf level as Vega 56. In reality, all we've really seen thus far is rebadged Vega performance; even the Radeon VII is just a Vega shrink. Navi's biggest achievement is the move to GDDR6.

This is the pessimistic version of AMD's roadmap though, and it's based on what we've seen the past few years. Given Zen's success, who knows, things may get better.
Posted on Reply
#135
Valantar
Frick, post: 4055706, member: 23907"
This really depends on if you're quoting the Bible or Händel.

Anyway. Is this brand new or not? I see many conflicting arguments. Me I'm cautiously optimistic, in this context defined as "might perform almost as good as they say in a best case scenario".
Whichever one says I'm right, obviously ;)

For the second part, I guess we'll know in a couple of weeks? I've got my fingers crossed.
Posted on Reply
#136
bug
Vindicator, post: 4055926, member: 187993"
https://www.anandtech.com/show/11003/hdmi-21-announced-8kp60-48gbps-cable
It was announced publicly in January 2017, nearly 2½ years ago, and that's to the public. Who knows how long it was developed and discussed behind the scenes prior to this. I am extremely bummed HDMI 2.1 is apparently not supported by these cards. I expected the PCIE 4.0 announcement to be the perfect time for AMD to really jump ahead with their GPU feature support. Such a missed opportunity imo.
It was announced in January, but it wasn't set in stone until November that year. Still, it could have been implemented.
The thing is, HDMI 2.1 comes with VRR. And since that's probably incompatible with whatever magic AMD worked to implement their own VRR over HDMI, it could be a problem to implement.
Posted on Reply
#137
Casecutter
Casecutter, post: 4055969, member: 94772"
It is interesting that what's been AMD's second-tier mainstream offering,
I think this is where I went off course; I should have said, "It would be interesting to see what could be AMD's second-tier mainstream offering."

Vayra86, post: 4056043, member: 152404"
Even an optimistic scenario would give them only slightly under or over 2080 Ti performance. On the other hand, we haven't seen any AMD GPU surpass GTX 1080 Ti performance, and that card has been out there for quite some time now.
I'm not expecting any "big" Navi that is meant to contest Nvidia's top-shelf pro-enthusiast offerings. I think they'll work to gain market share with, at some point, two Navis that best make use of their 7 nm wafer starts and yields (the bigger the chip, the worse those get). AMD has sidelined that "upper echelon" pursuit, and will bench it for, say, another year and use Arcturus (or whatever's next) to get back into that segment.
Posted on Reply
#138
FordGT90Concept
"I go fast!1!11!1!"
bug, post: 4056059, member: 157434"
It was announced in January, but it wasn't set in stone until November that year. Still, it could have been implemented.
The thing is, HDMI 2.1 comes with VRR. And since that's probably incompatible with whatever magic AMD worked to implement their own VRR over HDMI, it could be a problem to implement.
There's a lot of sticking points on the GPU side:
-"Ultra High Speed"--GPU has to be able to produce 48 Gbps signal (Navi doesn't target that market).
-Dynamic HDR--don't think DisplayPort supports this. It will take a lot of R&D to implement.
-Enhanced Audio Return Channel--not sure how difficult this is to implement. Dolby isn't exactly popular on computers: huge preference towards uncompressed PCM which is lossless. It might require paying Dolby too which could mean NVIDIA/AMD/Intel will never be compliant here.

I think VRR, at least for AMD, is an easy one. They can probably make it HDMI 2.1 compliant with a driver patch, because GCN apparently has a lot of granular control over its HDMI protocol. Everything else is theoretically pretty easy (low latency) or already done (DSC).


Remember, GPUs in general are usually quite a ways behind TVs in implementing HDMI standards. HDMI was always designed to put the burden of design on the source, not the destination.
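The 48 Gbps figure in that first point lines up with simple uncompressed-video arithmetic. A minimal sketch, ignoring blanking intervals and FRL encoding overhead (so real link requirements run somewhat higher), with RGB 8-bit assumed for illustration:

```python
# Rough uncompressed video bandwidth: pixels/second x bits per pixel.
# Blanking intervals and HDMI FRL encoding overhead are ignored here,
# so actual link requirements are somewhat higher than these numbers.

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel-data rate in Gbit/s for an uncompressed video mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = video_gbps(3840, 2160, 60)    # ~11.9 Gbps, fits HDMI 2.0's 18 Gbps
uhd8k_60 = video_gbps(7680, 4320, 60)  # ~47.8 Gbps, right at HDMI 2.1's 48 Gbps
print(f"4K60 RGB 8-bit: {uhd_60:.1f} Gbps, 8K60 RGB 8-bit: {uhd8k_60:.1f} Gbps")
```

This is why 8K60 over HDMI 2.1 leans on DSC or 4:2:0 subsampling in practice: the raw pixel rate alone already sits at the link's nominal ceiling.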
Posted on Reply
#139
Vindicator
FordGT90Concept, post: 4056088, member: 60463"
There's a lot of sticking points on the GPU side:
-"Ultra High Speed"--GPU has to be able to produce 48 Gbps signal (Navi doesn't target that market).
-Dynamic HDR--don't think DisplayPort supports this. It will take a lot of R&D to implement.
-Enhanced Audio Return Channel--not sure how difficult this is to implement. Dolby isn't exactly popular on computers: huge preference towards uncompressed PCM which is lossless. It might require paying Dolby too which could mean NVIDIA/AMD/Intel will never be compliant here.

I think VRR, at least for AMD, is an easy one. They can probably make it HDMI 2.1 compliant with a driver patch because GCN apparently has a lot of granularity control over its HDMI protocol. Everything else is theoretically pretty easy (low latency) or already done (DSC).


Remember, GPUs in general are usually quite a ways behind TVs in implementing HDMI standards. HDMI was always designed to put the burden of design on the source, not the destination.
I'm looking at this a very different way. I fully believe they can do it but are holding out to justify selling cards down the road that have little to no performance increase.

Only one of the 4 HDMI 2.1 connectors on the 2019 OLEDs has eARC compatibility, which should mean this doesn't have to be on the list for GPU manufacturers to support 2.1.
Worried about 48 Gbit/s? Well, that fancy PCIe 4.0 connector can do 256 Gbit/s, so I highly doubt bandwidth is the problem holding this back.
HDR is already supported, and there's no differentiation between different HDR styles in Windows yet, so this is something that, considering it's software-level, could plausibly arrive in a software update; the connector itself wouldn't be held back in the meantime.

GPUs, in the past, have been far, far ahead of what the vast majority of hardware on the market is capable of. I still remember my TNT2 Ultra that supported 240 Hz, and that was in the late 90s. It could also do 1920x1200. In the 90s. The first 1080p TVs (that I remember) came out just before the PS3 in 2006. That means GPUs were ahead by at least 7-8 years back then compared to where TVs were at.
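For reference, the 256 Gbit/s figure is the raw signaling rate of a PCIe 4.0 x16 link; a quick sketch of where it comes from, with the 128b/130b line-encoding overhead included (though note this is a host-interconnect rate, not a display-link rate):

```python
# PCIe 4.0 x16 raw and effective throughput.
# 16 GT/s per lane, 16 lanes, 128b/130b line encoding.

LANES = 16
GT_PER_LANE = 16        # gigatransfers/s per lane (PCIe 4.0)
ENCODING = 128 / 130    # 128b/130b encoding efficiency

raw_gbps = LANES * GT_PER_LANE        # 256 Gbit/s raw signaling
effective_gbps = raw_gbps * ENCODING  # ~252 Gbit/s payload
effective_gbs = effective_gbps / 8    # ~31.5 GB/s
print(f"raw: {raw_gbps} Gbit/s, effective: {effective_gbps:.0f} Gbit/s "
      f"(~{effective_gbs:.1f} GB/s)")
```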
Posted on Reply
#140
FordGT90Concept
"I go fast!1!11!1!"
Vindicator, post: 4056111, member: 187993"
I'm looking at this a very different way. I fully believe they can do it but are holding out to justify selling cards down the road that have little to no performance increase.
Look how long it took AMD to implement HDMI 2.0. People were disappointed Fiji shipped with HDMI 1.4. Polaris (June 29, 2016) was the first to support HDMI 2.0 which was 2.75 years after the specification was released (September 4, 2013).

Vindicator, post: 4056111, member: 187993"
Worried about 48 Gbit/s? Well, that fancy PCIe 4.0 connector can do 256 Gbit/s, so I highly doubt bandwidth is the problem holding this back.
They are completely unrelated technologies.

Vindicator, post: 4056111, member: 187993"
GPUs, in the past, have been far, far ahead of what the vast majority of hardware on the market is capable of.
On the DisplayPort side, yes, because DisplayPort puts GPU design first. HDMI puts display and media manufacturers first which makes GPU support convoluted.

Vindicator, post: 4056111, member: 187993"
I still remember my TNT2 Ultra that supported 240 Hz, and that was in the late 90s. It could also do 1920x1200. In the 90s.
VGA (analog) didn't have hard limits like digital signals do today. TVs were all built to NTSC or PAL standards, which were basically 4:3 with 480 or 576 visible interlaced lines (respectively)... It was the drive to digital ATSC/DVB that created HDMI.

Again, Arcturus will most likely support HDMI 2.1. Navi will not.
Posted on Reply
#141
GoldenX
And that's why VGA is the best output.
Posted on Reply
#142
EarthDog
GoldenX, post: 4056152, member: 160319"
And that's why VGA is the best output.
lol, for a potato resolution. :p

I don't think it can do much over 2K (2048x1080)... pretty sure it can't reach 2560x1440 at 60 Hz?
Posted on Reply
#143
FordGT90Concept
"I go fast!1!11!1!"
GoldenX, post: 4056152, member: 160319"
And that's why VGA is the best output.
For CRTs, yes; for LCDs, no. The VGA-in on LCDs has to approximate everything. A 4K analog image on an LCD would lose its sharpness. It simply can't convey that much data with the clarity necessary, so... everything gets muddy.

CRTs were never digital in the first place so the signal had to be converted to analog at some point (either in the GPU or in the display).

There's no reason an 8K CRT couldn't be made today that accepts a DisplayPort or HDMI connector and RAMDACs it into VGA internally. It would look better than sending analog over VGA anyway, because of less noise.
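On the earlier question of what VGA could actually drive: the practical limit is the RAMDAC's pixel clock. A rough sketch, where the flat 20% blanking overhead is an assumption (CVT reduced-blanking modes need less, classic GTF timings need more):

```python
# Approximate pixel clock needed for an analog VGA mode:
# active pixels x refresh rate, plus blanking overhead.
# The 20% overhead is a rough assumption, not an exact CVT timing.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.20):
    """Estimated pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

qhd = pixel_clock_mhz(2560, 1440, 60)    # ~265 MHz
wuxga = pixel_clock_mhz(1920, 1200, 60)  # ~166 MHz
print(f"2560x1440@60: ~{qhd:.0f} MHz, 1920x1200@60: ~{wuxga:.0f} MHz")
```

So 1920x1200@60 fits comfortably under a TNT2 Ultra's 300 MHz RAMDAC, and 2560x1440@60 was within reach of later 400 MHz RAMDACs on paper; analog signal quality at those clocks is another matter.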
Posted on Reply
#144
GoldenX
It was sarcasm...
Anyway, I don't expect it, but those Zen+ APUs could have Navi inside.
Posted on Reply
#145
eldakka
FordGT90Concept, post: 4054803, member: 60463"
It was announced in January, too late to put into Navi. Arcturus might have it.


I want to know how many transistors it has.
What is the "it" you refer to?

If you mean HDMI 2.1, it was announced in January 2017 and released in November the same year.

HDMI 2.1 TVs are on the market right now - LG "9" series.
Posted on Reply
#146
Valantar
Vindicator, post: 4056111, member: 187993"
I'm looking at this a very different way. I fully believe they can do it but are holding out to justify selling cards down the road that have little to no performance increase.
Has anyone in the world ever bought a new GPU with a 0% performance increase just because it has a new display output? That seems like a particularly silly idea.
bug, post: 4056059, member: 157434"
It was announced in January, but it wasn't set in stone until November that year. Still, it could have been implemented.
The thing is, HDMI 2.1 comes with VRR. And since that's probably incompatible with whatever magic AMD worked to implement their own VRR over HDMI, it could be a problem to implement.
VRR in HDMI 2.1 is AFAIK an adaptation of the VESA DisplayPort Adaptive-Sync standard. "FreeSync over HDMI" should already be compliant with this - at worst it'll need a driver update.

And as Ford said above, the time from when a new HDMI standard launches until it reaches PC hardware has always been very long. That seems to be how the HDMI consortium works.
GoldenX, post: 4056183, member: 160319"
It was sarcasm...
Anyway, I don't expect it, but those Zen+ APUs could have Navi inside.
No, they don't. They're already out in laptops. The die is known, the GPU spec is known, and it's Vega 10 with a clock bump. If they were Navi, this would show in drivers (in particular: in needing entirely bespoke drivers). Of course, the fact that they haven't launched the desktop APUs yet makes me slightly hopeful that they'll just hold off until the next generation of MCM APUs are ready some time in (very) late 2019 or early 2020 - once there's a known good Navi die that will fit the package available in sufficient quantities that it won't gimp GPU sales. Frankly, I'd prefer that over a clock-bumped 3200G/3400G. Maybe they could even bring the model names and CPU architectures in line by doing this?
Posted on Reply
#147
FordGT90Concept
"I go fast!1!11!1!"
3200G/3400G are on 12nm, yeah? Makes sense that a Zen 2 would get a Navi GPU in either 3300G/3500G or bumping it up to 4200G/4400G on 7 nm. PS5's CPU practically already is this without SMT (8-core with Navi) and probably with a different memory architecture (probably 16 GiB GDDR6).
Posted on Reply
#148
Valantar
FordGT90Concept, post: 4056279, member: 60463"
3200G/3400G are on 12nm, yeah? Makes sense that a Zen 2 would get a Navi GPU in either 3300G/3500G or bumping it up to 4200G/4400G on 7 nm. PS5's CPU practically already is this without SMT (8-core with Navi) and probably with a different memory architecture (probably 16 GiB GDDR6).
Well, technically 3200G/3400G don't exist (yet?), but the mobile 3000-series APUs are all 12nm Zen+. It would sure be interesting if they launched them as low-end options, and then surprised us with, say, an R5 3500G (6c12t + Navi 16-20?) and R7 3700G (8c16t + Navi 20-24?) later in the year. I doubt we'd see these before B550 motherboards, though, as most X570 boards seem to lack display outputs.
Posted on Reply
#150
Valantar
P4-630, post: 4057472, member: 22154"
AMD Radeon RX 5000 is hybrid with elements of GCN - "pure" RDNA only in 2020 - Sweclockers
https://www.sweclockers.com/nyhet/27618-amd-radeon-rx-5000-ar-hybrid-med-inslag-av-gcn-renodlad-rdna-forst-ar-2020

Navi die:

https://nl.hardware.info/nieuws/65723/computex-rx-5000-serie-van-amd-wordt-hybride-van-gcn-en-rdna-pure-rdna-komt-met-navi-20
Interesting!

Worth clarifying for the non-Swedophones(?) out there: according to this, Navi 20 ("big Navi") is supposed to be "pure" RDNA, and launch in early 2020. In other words, this is not a "half now, half next generation" situation as the title might make it seem. Still odd to make a hybrid like this, but I guess the architectures are modular enough to plug-and-play the relevant blocks on a driver level as well. This also clarifies the kinda-weird mismatch between RDNA being the architecture for "gaming in the next decade" while there being a "next-gen" arch on the roadmaps for 2020.

I wonder what implications this might have for performance and driver support. One might assume that these first cards will lose driver support earlier, but then again considering how prevalent GCN is I can't see that being for another 5 years or so anyway, by which time they'll be entirely obsolete. Performance enhancements and driver tuning might taper off more quickly, though, unless the relevant parts are RDNA and not GCN.
Posted on Reply