
AMD RDNA2 Graphics Architecture Detailed, Offers +50% Perf-per-Watt over RDNA

Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
It's certainly interesting reading the two threads side by side: one is "haha, never gonna happen, leather jacket man", the other is "...awesome, take that, leather jacket man".

Nice features though, welcome to 2018.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
You really think so?
Going by the YouTube analysis from various techies, yes, I think so.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Going by the YouTube analysis from various techies, yes, I think so.
Will you elaborate on what these YouTubers said to make you feel this way?

....especially in light of the link I just provided?

If we know their Navi/RDNA/7nm is less efficient than Nvidia now... assuming both of those articles are true... why would they be worried about maintaining their efficiency lead over AMD GPUs?

Which is more realistic to you for the 50% increase? A new arch with a die shrink, or an updated arch on the same process? I think both will get there, however Nvidia isn't worried about this.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,747 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It's certainly interesting reading the two threads side by side: one is "haha, never gonna happen, leather jacket man", the other is "...awesome, take that, leather jacket man".

Nice features though, welcome to 2018.

Of course, it's been like that for a while here at TPU: Nvidia is the company people love to hate, while AMD, as the underdog, gets off light. There are a fair few examples floating around where similar things happen or are claimed; Nvidia gets sh*t on and AMD gets excitement and praise.

I realllllly want to see AMD pull the rabbit out of the hat on this one. I want the competition to be richer, and I am craving a meaningful upgrade to my GTX 1080 that has RTRT and VRS. I will buy the most compelling offering from either camp, it just has to be compelling. Really not in the mood for another hot, loud card with coil whine and driver issues. If I can buy a card with 2080 Ti performance or higher for ~$750 USD or less that ticks those boxes, happy days.

Truly AMD, I am rooting for you, do what you did with Zen!
 
Joined
May 31, 2016
Messages
4,324 (1.50/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
1. I was referring to Radeon VII
2. I was referring to perf/watt.
3. The power consumption difference between GDDR6 (~2.5 W per chip x 8 chips at 16 Gbps) and HBM2 (e.g. ~20 W on the 16 GB Vega Frontier Edition) is minor compared to the GPUs involved.

16 GB of HBM2 draws less power than 16 GB of GDDR6 in 16-chip clamshell mode, which is irrelevant for the RX 5700 XT's 8 chips of GDDR6-14000.
Not so sure about that. HBM2 uses about half the power of GDDR6 at the same capacity. If in your eyes that is minor, then fine, but it is still a difference you haven't considered. I'm saying your comparison is not accurate. Also, you are not comparing chip vs chip but card vs card, and that is an entirely different thing.
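(For what it's worth, here is a rough back-of-the-envelope sketch using the per-chip and per-card wattages thrown around above; these are the thread's estimates, not measured numbers:)

    # Rough memory-power comparison using the numbers quoted above (estimates, not measurements)
    gddr6_per_chip_w = 2.5        # assumed draw per GDDR6 chip at 16 Gbps
    hbm2_16gb_w = 20.0            # assumed draw for a 16 GB HBM2 card (the Vega Frontier figure above)

    navi10_8gb_w = 8 * gddr6_per_chip_w        # RX 5700 XT: 8 chips -> ~20 W
    clamshell_16gb_w = 16 * gddr6_per_chip_w   # 16 GB of GDDR6 in clamshell mode: 16 chips -> ~40 W

    print(navi10_8gb_w, clamshell_16gb_w, hbm2_16gb_w)   # 20.0 40.0 20.0
    # HBM2 only pulls clearly ahead at the 16 GB capacity, which is why the difference looks
    # minor on an 8 GB card but significant on a 16 GB one.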
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Will you elaborate on what these YouTubers said to make you feel this way?

I think the words 'great' and '50%' were used in the same video.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The only thing that would worry Nvidia is if their next generation somehow gets delayed, but there are no indicators of that yet.

Still, up to 50% is damn impressive without a node change (remember what changed from 14nm to the tweaked "12nm"? Yeah, near nothing). Here's hoping the minimum increase (for common workloads) is well above 30%. 40% would still make for a very good ~275W card (especially if they use HBM), though obviously we all want as fast as possible :p
As I pointed out, it depends on how you compare. If you selectively compare against a previous chip with higher clocks, then you can get numbers like this easily.
Achieving a 50% efficiency gain on average between Navi 1x and Navi 2x would be a huge achievement, and is fairly unlikely. It's hard to predict the gains from a refined node; we have seen in the past that refinements can deliver good improvements, like Intel's 14nm+/14nm++, but still nowhere near 50%.

And as always, any node advancements will be available to Nvidia as well.
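(As a rough illustration of what the 40-50% perf/W figures above would mean for a hypothetical ~275W card; all of the values below are the thread's assumptions, nothing official:)

    # What a given perf/W gain could mean for a larger card (all values are thread assumptions)
    navi10_power_w = 225.0     # approx. RX 5700 XT board power
    navi10_perf = 1.0          # normalize Navi 10 performance to 1.0

    def scaled_perf(perf_per_watt_gain, target_power_w):
        return navi10_perf * perf_per_watt_gain * (target_power_w / navi10_power_w)

    print(round(scaled_perf(1.40, 275.0), 2))  # ~1.71 -> a +40% perf/W part at 275 W is ~70% faster
    print(round(scaled_perf(1.50, 275.0), 2))  # ~1.83 -> at the full +50% claim, ~83% faster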
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
As I pointed out, it depends on how you compare. If you selectively compare against a previous chip with higher clocks, then you can get numbers like this easily.
... which is why I said I hoped for relatively high minimum perf/W gains also, and not just peak.
Achieving a 50% efficiency gain on average between Navi 1x and Navi 2x would be a huge achievement, and is fairly unlikely. It's hard to predict the gains from a refined node; we have seen in the past that refinements can deliver good improvements, like Intel's 14nm+/14nm++, but still nowhere near 50%.
Preaching to the choir here, man. Though there haven't been any real efficiency gains on Intel's 14nm since Skylake, just clock scaling improvements (and the later node revisions actually sacrifice efficiency to achieve that). Still an achievement hitting those clocks, but the sacrifices involved have been many and large.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Will you elaborate on what these YouTubers said to make you feel this way?

....especially in light of the link I just provided?

If we know their Navi/RDNA/7nm is less efficient than Nvidia now... assuming both of those articles are true... why would they be worried about maintaining their efficiency lead over AMD GPUs?

Which is more realistic to you for the 50% increase? A new arch with a die shrink, or an updated arch on the same process? I think both will get there, however Nvidia isn't worried about this.

I'm not going to dig into all his videos to find the various quotes he mentions, but this is one YouTuber who claims this based on sources. Probably an exaggeration, but RDNA2 IS going to challenge Nvidia, which will affect Nvidia's overall sales. So in that respect, I am sure they are curious about this Big Navi.
Moore's Law Is Dead
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I'm not going to dig into all his videos to find the various quotes he mentions,
Kind of a shame. You made some claims, but put the effort on others to find them? I'll pass. :)

I am sure they are curious about this Big Navi.
Curious...sure. Always. You have to keep an eye on the competition. But that is quite a bit different than "worried". ;)
 
Joined
Feb 13, 2012
Messages
522 (0.12/day)
And expecting AMD to double and then triple the performance in two years wasn't a clue either? :p

Well, it wasn't a clue because I thought it was doable. Navi 1x is a 250mm² chip, which is small considering you could probably go up to 750-800mm² (unlikely though). But then 5nm EUV should be around by that time.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I'm not going to dig into all his videos to find the various quotes he mentions, but this is one YouTuber who claims this based on sources. Probably an exaggeration, but RDNA2 IS going to challenge Nvidia, which will affect Nvidia's overall sales. So in that respect, I am sure they are curious about this Big Navi.
Moore's Law Is Dead
I hope you're not basing your expectations of RDNA2 on this random nobody. This guy claimed last year that AMD were holding big Navi back because they didn't need to release it (facepalm), claimed that AMD were renaming chip codenames to excuse his mispredictions (which they would never do), and claimed that Navi 12 was coming in 2019 to crush the RTX 2080 Super, and that was just from a single one of his BS videos.

Don't get me wrong though, I hope RDNA2 is as good as possible. But please don't spread the nonsense these losers on YouTube are pulling out of their behinds. ;)

Well, it wasn't a clue because I thought it was doable. Navi 1x is a 250mm² chip, which is small considering you could probably go up to 750-800mm² (unlikely though). But then 5nm EUV should be around by that time.
It's also a 250mm² chip that draws ~225W ;)

Building big chips is not the problem; building big chips that also hit high clocks would require a much more efficient architecture.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Kind of a shame. You made some claims, but put the effort on others to find them? I'll pass. :)

Curious...sure. Always. You have to keep an eye on the competition. But that is quite a bit different than "worried". ;)
When I comment with such information, you should take it as fact. I have no reason to BS. And I was watching YouTube on my big screen TV after work one day and heard the individual say what I stated. I'm not going to take a notepad and start writing down what I hear. Lol
Would you?

I hope you're not basing your expectations of RDNA2 on this random nobody. This guy claimed last year that AMD were holding big Navi back because they didn't need to release it (facepalm), claimed that AMD were renaming chip codenames to excuse his mispredictions (which they would never do), and claimed that Navi 12 was coming in 2019 to crush the RTX 2080 Super, and that was just from a single one of his BS videos.

Don't get me wrong though, I hope RDNA2 is as good as possible. But please don't spread the nonsense these losers on YouTube are pulling out of their behinds. ;)


It's also a 250mm² chip that draws ~225W ;)

Building big chips is not the problem; building big chips that also hit high clocks would require a much more efficient architecture.
I've also heard the RedTagGaming and Gamer Meld YouTube channels, which seem quite excited about RDNA2 based on what their sources have hinted. I'm keeping my expectations conservative, though I have a strong gut feeling RDNA2 is the real deal and not just another Vega-like GPU.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I've also heard the RedTagGaming and Gamer Meld YouTube channels, which seem quite excited about RDNA2 based on what their sources have hinted. I'm keeping my expectations conservative, though I have a strong gut feeling RDNA2 is the real deal and not just another Vega-like GPU.
Which are yet more channels that fall into the bucket of less "competent" "tech" YouTube channels. I would advise avoiding such channels unless you do it for amusement or are looking for sources of false rumors. These channels serve one of two purposes: serving people the "news" they want to hear (the echo chambers), or shaping public opinion. If you listen to more than a few episodes you'll see they are all over the place, are inconsistent with themselves, and fail to master any deeper technical knowledge. Some of them provide their own "leaks", while others just recite pretty much everything they can scrape off the web.

Speculation is of course fine, and many of us enjoy discussing potential hardware, myself included, but speculation should be labeled as such, not as "leaks" when it's not. Whenever we see leaks we should always check whether they pass some basic "smell tests":
  • Who is the source, and does it have a good track record? Always see where the leak originates: if it's from WCCFTech, VideoCardz, FudZilla or somewhere random, it's fairly certainly fake; random Twitter/forum posts are often fake, but can occasionally be true, etc. "Leaks" from official drivers, compilers, official papers etc. are pretty solid. Some sources are also known to have a certain bias, even though they can have elements of truth to their claims.
  • Is the nature of the "leak" something that can be known, or is likely to be known, outside a few core engineers? Example: clock speeds are never set in stone until the final stepping shortly ahead of a release, so when someone posts a table of clock speeds for CPUs/GPUs 6-12 months ahead, you know it's BS.
  • Is the specificity of the leak something that is sensitive? If the details are only known to a few people under NDA, then those leaking them risk losing their jobs and potential lawsuits; how many are willing to do that to serve a random YouTube channel or webpage? What is their motivation?
  • Is the scope of the leak(s) likely at all? Some of these channels claim to have dozens of sources inside Intel/AMD/Nvidia; seriously, a random guy in his basement has such good sources? Some even claim to have single sources who provide sensitive NDA'ed information from both Intel and AMD about products 1+ years away; there is virtually no chance that is true, and it is an immediate red flag to me.

Unfortunately, most "leaks" are either qualified guesses or pure BS, sometimes a mix of both (intentionally or not). Perhaps sometime you should look back after a product release and evaluate the accuracy and the timeline of the leaks. The general trend is that early leaks are only true about "big" features, while early "specific" leaks (clocks, TDP, shader counts for GPUs) are usually fake. Then there is a spike in leaks around the time the first engineering samples arrive, with various leaked benchmarks etc., but clocks are still all over the place. There is another spike when board partners get their hands on it, and the accuracy increases a lot, but there is still some variance. Then, usually a few weeks ahead of release, we get pretty much precise details.

Edit:
Rumors about Polaris, Vega, Vega 2x and Navi 1x have pretty much all started out the same way: very unrealistic initially, then pessimistic close to the actual release. Let's hope Navi 2x delivers, but please don't drive the hype too high.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It's also a 250mm² chip that draws ~225W ;)

Building big chips is not the problem; building big chips that also hit high clocks would require a much more efficient architecture.
Not that difficult - there's not much reason to push a big chip that far up the efficiency curve, and seeing just how much power can be saved on Navi by downclocking just a little, it's not too big a stretch of the imagination to see a 500mm² chip at, say, 200-300MHz less stay below 300W, especially if it uses HBM2. Of course AMD did say that they would be increasing clocks with RDNA2 while still improving efficiency, which really makes me wonder what kind of obvious fixes they left for themselves when they designed RDNA (1). Even with a tweaked process node, that is a big ask.
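(A very rough sketch of why a modest downclock buys so much power headroom; the frequency and voltage points below are illustrative assumptions, not Navi measurements:)

    def rel_power(freq_ghz, volts, base_freq=1.9, base_volts=1.2):
        # Dynamic power scales roughly with f * V^2
        return (freq_ghz / base_freq) * (volts / base_volts) ** 2

    stock = rel_power(1.9, 1.20)         # 1.00 -> baseline operating point (assumed)
    downclocked = rel_power(1.65, 1.05)  # ~0.66 -> roughly a third less power per CU (assumed)
    print(round(stock, 2), round(downclocked, 2))

    # Doubling the CU count at the downclocked point lands near 2 * 0.66 * 225 W, i.e. ~300 W,
    # which is how a much larger chip could plausibly stay within a ~300 W budget.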
 

Xmpere

New Member
Joined
Jul 22, 2019
Messages
14 (0.01/day)
This Super XP guy is just an AMD fanboy. Anyone who is a fanboy of or biased towards a company has their statements rendered invalid.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
This Super XP guy is just an AMD fanboy. Anyone who is a fanboy of or biased towards a company has their statements rendered invalid.
You claiming I am a fanboy renders your statement invalid. Not to mention, I've been here since 2005. YOU?

Which are yet more channels that fall into the bucket of less "competent" "tech" YouTube channels. I would advise avoiding such channels unless you do it for amusement or are looking for sources of false rumors. These channels serve one of two purposes: serving people the "news" they want to hear (the echo chambers), or shaping public opinion. If you listen to more than a few episodes you'll see they are all over the place, are inconsistent with themselves, and fail to master any deeper technical knowledge. Some of them provide their own "leaks", while others just recite pretty much everything they can scrape off the web.

Speculation is of course fine, and many of us enjoy discussing potential hardware, myself included, but speculation should be labeled as such, not as "leaks" when it's not. Whenever we see leaks we should always check whether they pass some basic "smell tests":
  • Who is the source, and does it have a good track record? Always see where the leak originates: if it's from WCCFTech, VideoCardz, FudZilla or somewhere random, it's fairly certainly fake; random Twitter/forum posts are often fake, but can occasionally be true, etc. "Leaks" from official drivers, compilers, official papers etc. are pretty solid. Some sources are also known to have a certain bias, even though they can have elements of truth to their claims.
  • Is the nature of the "leak" something that can be known, or is likely to be known, outside a few core engineers? Example: clock speeds are never set in stone until the final stepping shortly ahead of a release, so when someone posts a table of clock speeds for CPUs/GPUs 6-12 months ahead, you know it's BS.
  • Is the specificity of the leak something that is sensitive? If the details are only known to a few people under NDA, then those leaking them risk losing their jobs and potential lawsuits; how many are willing to do that to serve a random YouTube channel or webpage? What is their motivation?
  • Is the scope of the leak(s) likely at all? Some of these channels claim to have dozens of sources inside Intel/AMD/Nvidia; seriously, a random guy in his basement has such good sources? Some even claim to have single sources who provide sensitive NDA'ed information from both Intel and AMD about products 1+ years away; there is virtually no chance that is true, and it is an immediate red flag to me.

Unfortunately, most "leaks" are either qualified guesses or pure BS, sometimes a mix of both (intentionally or not). Perhaps sometime you should look back after a product release and evaluate the accuracy and the timeline of the leaks. The general trend is that early leaks are only true about "big" features, while early "specific" leaks (clocks, TDP, shader counts for GPUs) are usually fake. Then there is a spike in leaks around the time the first engineering samples arrive, with various leaked benchmarks etc., but clocks are still all over the place. There is another spike when board partners get their hands on it, and the accuracy increases a lot, but there is still some variance. Then, usually a few weeks ahead of release, we get pretty much precise details.

Edit:
Rumors about Polaris, Vega, Vega 2x and Navi 1x have pretty much all started out the same way: very unrealistic initially, then pessimistic close to the actual release. Let's hope Navi 2x delivers, but please don't drive the hype too high.
Thanks for the information. Most of the so-called rumors from Wccftech are regurgitated from VideoCardz, and most VideoCardz rumors come from Twitter.
As for Fudzilla, I would take them a lot more seriously than the two mentioned. Fudzilla used to be part of Mike Magee's group, which wrote for TheInquirer.net (no longer around). Charlie Demerjian of SemiAccurate was also part of Mike Magee's group. My point is that Mike had real industry sources and was well respected in the computer tech industry. I believe he's been retired for years now. So while Fudzilla & SemiAccurate may not get it right all the time, they get pretty close to the actual truth, because no rumor ever comes out 100% accurate. Companies always make last-minute changes to products.

Not that difficult - there's not much reason to push a big chip that far up the efficiency curve, and seeing just how much power can be saved on Navi by downclocking just a little, it's not too big a stretch of the imagination to see a 500mm² chip at, say, 200-300MHz less stay below 300W, especially if it uses HBM2. Of course AMD did say that they would be increasing clocks with RDNA2 while still improving efficiency, which really makes me wonder what kind of obvious fixes they left for themselves when they designed RDNA (1). Even with a tweaked process node, that is a big ask.
RDNA1 was just about getting a new 7nm hybrid graphics chip that competes well out the door, testing the waters of the RDNA design. One example: with GCN, one instruction is issued every 4 cycles; with this RDNA hybrid, one instruction is issued every cycle, making it much more efficient.
RDNA2 is the real deal according to AMD. I believe they will release a 280W max version that can still achieve at least a 25%-40% performance improvement over the RTX 2080 Ti. RDNA2 is an Ampere competitor.
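(To put that issue-rate difference in rough numbers; the wavefront widths are the publicly stated GCN/RDNA figures, everything else is a simplification:)

    # Simplified view of the issue-rate difference mentioned above.
    # GCN: a wave64 runs on a 16-lane SIMD, so each instruction takes 4 cycles to issue/complete.
    # RDNA: a wave32 runs on a 32-lane SIMD, so each instruction issues and completes in 1 cycle.
    def cycles_for_dependent_chain(n_instructions, cycles_per_instruction):
        # Back-to-back dependent instructions cannot overlap, so latency adds up directly
        return n_instructions * cycles_per_instruction

    print(cycles_for_dependent_chain(100, 4))  # GCN:  400 cycles
    print(cycles_for_dependent_chain(100, 1))  # RDNA: 100 cycles
    # Total ALU width per CU is unchanged (64 lanes), but the much lower instruction latency is a
    # big part of why RDNA extracts more real-world performance per clock than GCN.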
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You claiming I am a fanboy renders your statement invalid. Not to mention, I've been here since 2005. YOU?


Thanks for the information. Most of the so-called rumors from Wccftech are regurgitated from VideoCardz, and most VideoCardz rumors come from Twitter.
As for Fudzilla, I would take them a lot more seriously than the two mentioned. Fudzilla used to be part of Mike Magee's group, which wrote for TheInquirer.net (no longer around). Charlie Demerjian of SemiAccurate was also part of Mike Magee's group. My point is that Mike had real industry sources and was well respected in the computer tech industry. I believe he's been retired for years now. So while Fudzilla & SemiAccurate may not get it right all the time, they get pretty close to the actual truth, because no rumor ever comes out 100% accurate. Companies always make last-minute changes to products.


RDNA1 was just about getting a new 7nm hybrid graphics chip that competes well out the door, testing the waters of the RDNA design. One example: with GCN, one instruction is issued every 4 cycles; with this RDNA hybrid, one instruction is issued every cycle, making it much more efficient.
RDNA2 is the real deal according to AMD. I believe they will release a 280W max version that can still achieve at least a 25%-40% performance improvement over the RTX 2080 Ti. RDNA2 is an Ampere competitor.
Sorry... what does when you signed up to this site have to do with anything? Seems similar to equating knowledge with post count.... :(

Anyway, just getting to 2080 Ti FE speeds from their current 5700 XT flagship is a 46% jump. To go another 25-40% faster on top of that would be a 71-86% increase. Have we ever seen that in the history of GPUs? A 71% increase from the previous-gen flagship to the current-gen flagship?

You've sure got a lot of faith in this architecture when about the only thing going for it is AMD marketing...

If Ampere comes in like Turing did over Pascal (25%), that's the bottom end of your goal, with their new GPU performing 71% faster than its current flagship. That's a ton, period, not to mention on the same node.
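(Strictly speaking, those percentages compound rather than add, so the required jump is if anything a bit larger than the 71-86% quoted; a quick sketch using the same assumed figures:)

    # Percentage gains compound multiplicatively rather than adding up
    gap_to_2080ti = 1.46      # assumed: 2080 Ti FE is ~46% faster than the 5700 XT (figure above)
    extra_over_2080ti = 1.25  # assumed: another 25% on top of the 2080 Ti

    required = gap_to_2080ti * extra_over_2080ti
    print(round((required - 1) * 100, 1))  # ~82.5 -> ~82% faster than the 5700 XT at the low end
    # At the 40% upper target: 1.46 * 1.40 = 2.044, i.e. roughly double the 5700 XT.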
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Sorry... what does when you signed up to this site have to do with anything? Seems similar to equating knowledge with post count.... :(

Anyway, just getting to 2080 Ti FE speeds from their current 5700 XT flagship is a 46% jump. To go another 25-40% faster on top of that would be a 71-86% increase. Have we ever seen that in the history of GPUs? A 71% increase from the previous-gen flagship to the current-gen flagship?

You've sure got a lot of faith in this architecture when about the only thing going for it is AMD marketing...

If Ampere comes in like Turing did over Pascal (25%), that's the bottom end of your goal, with their new GPU performing 71% faster than its current flagship. That's a ton, period, not to mention on the same node.
The 5700 XT is a "flagship" GPU only in terms of being the fastest SKU made this generation. Otherwise it really isn't (and isn't meant to be) - not in die size, not in performance, not in power draw, and certainly not in price. The 5700 XT was designed to be an upper mid-range GPU, which is what it is. That they managed that with just 40 CUs and power headroom to spare tells us that they definitely have room to grow upwards unlike the previous generations (especially as RDNA is no longer architecturally limited to 64 CUs). So there's no reason to extrapolate AMD being unable to compete in higher tiers from the positioning of the 5700 XT - quite the opposite. They likely just wanted to make the first RDNA chips high volume sellers rather than expensive and low-volume flagship level SKUs (on a limited and expensive 7nm node). Now that the arch is further matured, Apple has moved on from 7nm freeing up capacity for AMD, and they have even more money to spend, there's definitely a proper flagship coming.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Sorry... what does when you signed up to this site have to do with anything? Seems similar to equating knowledge with post count.... :(

Anyway, just getting to 2080 Ti FE speeds from their current 5700 XT flagship is a 46% jump. To go another 25-40% faster on top of that would be a 71-86% increase. Have we ever seen that in the history of GPUs? A 71% increase from the previous-gen flagship to the current-gen flagship?

You've sure got a lot of faith in this architecture when about the only thing going for it is AMD marketing...

If Ampere comes in like Turing did over Pascal (25%), that's the bottom end of your goal, with their new GPU performing 71% faster than its current flagship. That's a ton, period, not to mention on the same node.
He called me a fanboy, which has absolutely no relevance to the topic at hand. Or perhaps he never knew I have a high-end Intel & Nvidia gaming laptop, because AMD graphics didn't cut it at the time I purchased it in 2018.

With regards to the 3080 Ti and Big Navi performance numbers, it's all up-in-the-air speculation. Some think RDNA2 (Big Navi) is going to compete with the 2080 Ti, and others believe AMD is targeting the 3080 Ti. In order for AMD to target Nvidia's speculative 3080 Ti, they are probably going to compare Nvidia's performance improvements per generation to get an idea of how fast RDNA2 needs to be. I don't think AMD will push it to the limits; I think they focused more on power efficiency and performance efficiency when they designed RDNA2. I know this is marketing, but Micro-Architecture Innovation = Improved Performance-per-Clock (IPC), Logic Enhancement = Reduced Complexity and Switching Power, and Physical Optimizations = Increased Clock Speed.
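(Marketing bullet points aside, those three levers multiply together; here is a hypothetical sketch of how they could stack up to a perf/W figure, with made-up values, nothing official:)

    # How the three marketing levers could multiply into a perf/W figure (illustrative values only)
    ipc_gain = 1.15      # "micro-architecture innovation": more work per clock (assumed)
    clock_gain = 1.10    # "physical optimizations": higher clock speed (assumed)
    power_factor = 0.85  # "logic enhancement": net power vs. before, after offsetting the higher clock (assumed)

    perf_gain = ipc_gain * clock_gain                # ~1.27x performance
    perf_per_watt_gain = perf_gain / power_factor    # ~1.49x perf/W
    print(round(perf_gain, 2), round(perf_per_watt_gain, 2))
    # Three modest individual improvements can multiply into roughly the +50% perf/W AMD is claiming.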

What do all these enhancements have in common? Gaming consoles.

The 5700 XT is a "flagship" GPU only in terms of being the fastest SKU made this generation. Otherwise it really isn't (and isn't meant to be) - not in die size, not in performance, not in power draw, and certainly not in price. The 5700 XT was designed to be an upper mid-range GPU, which is what it is. That they managed that with just 40 CUs and power headroom to spare tells us that they definitely have room to grow upwards unlike the previous generations (especially as RDNA is no longer architecturally limited to 64 CUs). So there's no reason to extrapolate AMD being unable to compete in higher tiers from the positioning of the 5700 XT - quite the opposite. They likely just wanted to make the first RDNA chips high volume sellers rather than expensive and low-volume flagship level SKUs (on a limited and expensive 7nm node). Now that the arch is further matured, Apple has moved on from 7nm freeing up capacity for AMD, and they have even more money to spend, there's definitely a proper flagship coming.
Agreed.
I have a suspicion that RDNA2 will have a similar effect on the market to what Zen 2 did. And it's a much-needed effect, as we need better competition to help drive reasonable GPU pricing once again.
 
Joined
Apr 8, 2010
Messages
991 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
I have a suspicion that RDNA2 will have a similar effect on the market to what Zen 2 did. And it's a much-needed effect, as we need better competition to help drive reasonable GPU pricing once again.
If it does what the HD4870/50 did, that will be incredible
 
Joined
May 31, 2016
Messages
4,324 (1.50/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Sorry... what does when you signed up to this site have to do with anything? Seems similar to equating knowledge with post count.... :(

Anyway, just getting to 2080 Ti FE speeds from their current 5700 XT flagship is a 46% jump. To go another 25-40% faster on top of that would be a 71-86% increase. Have we ever seen that in the history of GPUs? A 71% increase from the previous-gen flagship to the current-gen flagship?

You've sure got a lot of faith in this architecture when about the only thing going for it is AMD marketing...

If Ampere comes in like Turing did over Pascal (25%), that's the bottom end of your goal, with their new GPU performing 71% faster than its current flagship. That's a ton, period, not to mention on the same node.
Pack two 5700 XTs into one die :) ~500mm² and you should be OK. I know it may not work like that, but who knows? Besides, RDNA2 will offer a bit more horsepower due to some improvements, so it is possible. A 500mm² chip is not as big as NV's 754mm² 2080 Ti, though. I get what you are saying: the 5700 XT is AMD's flagship, the best released so far, but at 251mm² it is fairly small, wouldn't you say? The flagship released and the capabilities of the architecture are two different things.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
The 5700 XT is a "flagship" GPU only in terms of being the fastest SKU made this generation. Otherwise it really isn't (and isn't meant to be) - not in die size, not in performance, not in power draw, and certainly not in price. The 5700 XT was designed to be an upper mid-range GPU, which is what it is. That they managed that with just 40 CUs and power headroom to spare tells us that they definitely have room to grow upwards unlike the previous generations (especially as RDNA is no longer architecturally limited to 64 CUs). So there's no reason to extrapolate AMD being unable to compete in higher tiers from the positioning of the 5700 XT - quite the opposite. They likely just wanted to make the first RDNA chips high volume sellers rather than expensive and low-volume flagship level SKUs (on a limited and expensive 7nm node). Now that the arch is further matured, Apple has moved on from 7nm freeing up capacity for AMD, and they have even more money to spend, there's definitely a proper flagship coming.
He called me a fanboy, which has absolutely no relevance to the topic at hand. Or perhaps he never knew I have a high-end Intel & Nvidia gaming laptop, because AMD graphics didn't cut it at the time I purchased it in 2018.

With regards to the 3080 Ti and Big Navi performance numbers, it's all up-in-the-air speculation. Some think RDNA2 (Big Navi) is going to compete with the 2080 Ti, and others believe AMD is targeting the 3080 Ti. In order for AMD to target Nvidia's speculative 3080 Ti, they are probably going to compare Nvidia's performance improvements per generation to get an idea of how fast RDNA2 needs to be. I don't think AMD will push it to the limits; I think they focused more on power efficiency and performance efficiency when they designed RDNA2. I know this is marketing, but Micro-Architecture Innovation = Improved Performance-per-Clock (IPC), Logic Enhancement = Reduced Complexity and Switching Power, and Physical Optimizations = Increased Clock Speed.

What do all these enhancements have in common? Gaming consoles.


Agreed.
I have a suspicion that RDNA2 will have a similar effect on the market to what Zen 2 did. And it's a much-needed effect, as we need better competition to help drive reasonable GPU pricing once again.
Pack two 5700 XTs into one die :) ~500mm² and you should be OK. I know it may not work like that, but who knows? Besides, RDNA2 will offer a bit more horsepower due to some improvements, so it is possible. A 500mm² chip is not as big as NV's 754mm² 2080 Ti, though. I get what you are saying: the 5700 XT is AMD's flagship, the best released so far, but at 251mm² it is fairly small, wouldn't you say? The flagship released and the capabilities of the architecture are two different things.
Semantics of a flagship aside, what I see is a 225W 'flagship' 7nm part that is 2% (at 1440p) faster than a 175W 12nm part (RTX 2070).

The improvement they need to make to match Ampere, both in raw performance and perf/W (note that is matching Ampere assuming last generation's paltry 25% gain - remember they added ray tracing and tensor core hardware), is 71%. That's a ton. Only time will tell, and I hope your glass-half-full attitude pans out to reality, but I'm not holding my breath. I think they will close the gap, but will fall well short of Ampere's consumer flagship. At best I see it splitting the difference between the 2080 Ti and Ampere. I think it will end up a lot closer to the 2080 Ti than to Ampere. They have a lot of work to do.

Remember, both AMD and Nvidia touted 50% perf/W gains... if both are true, how can they catch up?
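(That last point is easy to see with a little arithmetic; the starting-gap figure below is just an illustrative assumption, not a measured number:)

    # If both vendors improve perf/W by the same factor, the relative gap does not close
    amd_ppw = 1.0      # normalize AMD's current perf/W
    nvidia_ppw = 1.3   # assumed: Nvidia currently ~30% ahead (illustrative, not a measured figure)
    gain = 1.5         # both camps touting +50% perf/W

    print(round((nvidia_ppw * gain) / (amd_ppw * gain), 2))  # 1.3 -> identical gains leave the gap unchanged
    # AMD only closes the gap if its +50% is measured against a lower baseline than Nvidia's,
    # or if Nvidia's real-world gain ends up smaller than advertised.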
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Pack two 5700 XTs into one die :) ~500mm² and you should be OK. I know it may not work like that, but who knows? Besides, RDNA2 will offer a bit more horsepower due to some improvements, so it is possible. A 500mm² chip is not as big as NV's 754mm² 2080 Ti, though. I get what you are saying: the 5700 XT is AMD's flagship, the best released so far, but at 251mm² it is fairly small, wouldn't you say? The flagship released and the capabilities of the architecture are two different things.
For that you'd also need a 512-bit memory bus, which... well, is expensive, huge, and power hungry. Not a good idea (as the 290(X)/390(X) showed us).
Semantics of a flagship aside, what I see is a 225W 'flagship' 7nm part that is 2% (at 1440p) faster than a 175W 12nm part.

The improvement they need to make to match Ampere, both in raw performance and perf/W (note that is matching Ampere assuming last generation's paltry 25% gain - remember they added ray tracing and tensor core hardware), is 71%. That's a ton. Only time will tell, and I hope your glass-half-full attitude pans out to reality, but I'm not holding my breath. I think they will close the gap, but will fall well short of Ampere's consumer flagship. At best I see it splitting the difference between the 2080 Ti and Ampere. I think it will end up a lot closer to the 2080 Ti than to Ampere. They have a lot of work to do.
What GPU are you comparing to? If we go by TPU's review, the average gaming power draw of the 5700 XT is 219W, with the 2070 at 195W and the 2060S at 184W. I'm assuming you're pointing to the 2070, as it's 2% slower in the same review. Nice job slightly bumping up AMD's power draw and lowering Nvidia's by a full 10%, though. That's how you make a close race (219W - 195W = 24W) look much worse (225W - 175W = 50W).

Edit: ah, I see you edited in the 2070 as the comparison. Your power draw number is still a full 20W too low though.
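(Using those review numbers, the actual perf/W gap works out to single digits rather than what the rounded figures imply; a quick check, with the 2% performance delta taken as given:)

    # Perf/W gap using the review figures quoted above (2% performance delta from the same review)
    xt_perf, xt_power = 1.02, 219.0            # 5700 XT: ~2% faster, ~219 W average gaming draw
    rtx2070_perf, rtx2070_power = 1.00, 195.0  # RTX 2070: baseline, ~195 W

    ratio = (xt_perf / xt_power) / (rtx2070_perf / rtx2070_power)
    print(round(ratio, 3))  # ~0.908 -> the 5700 XT trails by roughly 9% in perf/W
    # With 225 W vs 175 W the same math would imply a ~21% deficit, which is the point being made.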
 