Monday, July 26th 2021

Next-Gen AMD Radeon RDNA3 Flagship To Feature 15,360 Stream Processors?

AMD's next-generation RDNA3 graphics architecture could see a near-quadrupling in raw SIMD muscle over the current RDNA2, according to a spectacular rumor. Apparently, the company will deploy as many as 15,360 stream processors (quadruple the count of a Radeon RX 6800), spread across 60 WGPs (workgroup processors), and do away with the compute unit as an organizational unit. This is possibly because the RDNA3 compute unit won't be as independent as the ones on the original RDNA, or even RDNA2, which already sees groups of two CUs share common resources.

Another set of rumors suggests that AMD won't play NVIDIA's game of designing GPUs with wide memory buses, and will instead build on its Infinity Cache technology by increasing the on-die cache size and bandwidth while retaining "affordable" memory bus widths, such as 256-bit. As for the chip itself, it's rumored that the top RDNA3 part, the so-called "Navi 31," could feature a multi-chip module design (at least two logic dies), each with 30 WGPs. Each of the two is expected to be built on a next-generation silicon fabrication node, either TSMC N5 (5 nm) or a special 6 nm node TSMC is designing for AMD. Much like the next-generation "Lovelace" architecture by NVIDIA, AMD's RDNA3 could see the light of day only in 2022.
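As a quick sanity check on those numbers (a back-of-the-envelope sketch that simply takes the rumored WGP and stream-processor counts at face value):

$$2~\text{dies} \times 30~\tfrac{\text{WGPs}}{\text{die}} \times 256~\tfrac{\text{SPs}}{\text{WGP}} = 15{,}360~\text{SPs}$$

Note that 256 stream processors per WGP would be double the 128 (two CUs of 64 SPs each) of an RDNA2 WGP, which would be consistent with the compute unit disappearing as an organizational unit.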
Sources: kopite7kimi (Twitter), KittyYYuko (Twitter), Greymon55 (Twitter), WCCFTech

40 Comments on Next-Gen AMD Radeon RDNA3 Flagship To Feature 15,360 Stream Processors?

#1
tehehe
What I'm most interested in is how the current GPU pricing situation will influence future GPU prices. We may be in for a rude awakening...
#2
Vayra86
tehehe: What I'm most interested in is how the current GPU pricing situation will influence future GPU prices. We may be in for a rude awakening...
It's going to be closely related to fab capacity in a global sense. If they can make chips, they will want to sell them all, so it could quickly turn into a healthier environment too. But anytime soon? Hmmm. It just takes effin' long to increase production; we're still only 1-3 years into this situation, and meaningful changes happen in twice that time.
#3
Chomiq
With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
#4
Shou Miko
Vayra86: It's going to be closely related to fab capacity in a global sense. If they can make chips, they will want to sell them all, so it could quickly turn into a healthier environment too. But anytime soon? Hmmm. It just takes effin' long to increase production; we're still only 1-3 years into this situation, and meaningful changes happen in twice that time.
A new fab is in the process of being built. I don't think it's done yet, but even when it is, it will sadly take at least a couple of years, with luck, before it runs at 100% capacity.
#5
Mysteoa
At this stage, I can only say: Cool.
#6
z1n0x
Chomiq: With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
I haven't heard anything about GDDR7; it's most probably years away. 2023 at the earliest, more likely 2025.
#7
Hossein Almet
Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
#8
medi01
Most likely AMD is getting into "if NV can claim twice the number of shaders, so can we" territory.
Hossein Almet: Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
In my book it already did. Look at the memory configs across the NV lineup; apparently they were forced to drop a tier on their GPUs.
The crypto bubble is the reason it didn't hurt.
#9
jesdals
I can only say: show me the cards, not all these speculations...
#10
mechtech
That’s it?!
I want a whole 300mm wafer as a single chip

mmmmm. Transistors
#11
defaultluser
z1n0x: I haven't heard anything about GDDR7; it's most probably years away. 2023 at the earliest, more likely 2025.
Yeah, the earliest available validation spec was released last year:

hardforum.com/threads/h-exclusive-gddr7-hints.1994957/

It will probably end up in products the generation after this one.

I think the funniest part of this will be NVIDIA being forced to reuse another "X" GDDR tech revision for a second architecture (I hope they can at least solve the memory density and power issues).

I think that cache causes too many problems when you are performing random compute (so we will have to see if AMD continues their gaming-only strategy for these parts).
#12
Quicks
Performance speaks louder than specs these days. Look at Nvidia, which doubled or even tripled their shader counts, but it's weaker per shader. I'll wait for benchmarks to confirm.
#13
dragontamer5788
btarunr: Apparently, the company will deploy as many as 15,360 stream processors (quadruple the count of a Radeon RX 6800), spread across 60 WGPs (workgroup processors), and do away with the compute unit as an organizational unit.
RDNA never had compute units. RDNA has been WGP-based since NAVI 1.0 (5xxx series).

RDNA was originally marketed in WGPs (aka dual compute units). Maybe they're going to stop calling them "dual compute units" now (60 WGPs == 120 CUs); that was always a "bridge" term to help people from the GCN/Vega era understand the performance of RDNA+ designs.
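Running the numbers on that (a rough sketch, taking the rumored totals at face value):

$$60~\text{WGPs} \times 2~\tfrac{\text{CUs}}{\text{WGP}} = 120~\text{CUs}, \qquad \frac{15{,}360~\text{SPs}}{120~\text{CUs}} = 128~\tfrac{\text{SPs}}{\text{CU}}$$

That would be double the 64 SPs of a GCN/RDNA-era CU, which is another reason the CU stops being a useful unit of description.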
Quicks: Performance speaks louder than specs these days. Look at Nvidia, which doubled or even tripled their shader counts, but it's weaker per shader. I'll wait for benchmarks to confirm.
These shader counts / core counts are completely useless unless you're a very well-studied GPU programmer. An AMD 6-core Bulldozer is completely different from an AMD 6-core Zen 3, and GPU differences are even more pronounced.

NVidia changed the definition of "shader" in Volta/Turing: arguably double-counting because of the simultaneous INT32 + FP32 execution doo-hickey they implemented. It's all marketing. The programmers care, but users / computer builders don't really care; just look at FPS numbers and/or frame-times in benchmarks. The raw performance of the hardware doesn't even matter: NVidia's compiler is leagues better than AMD's, so AMD has to make it up with stronger underlying hardware.
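To illustrate the double-counting argument with round numbers (a hypothetical SM, not the spec of any particular NVidia part): if one SM can issue to 64 FP32 lanes and 64 INT32 lanes in the same clock, marketing can quote

$$64~(\text{FP32}) + 64~(\text{INT32}) = 128~\text{``shaders''}$$

even though peak FP32 throughput is still 64 FMAs per clock, so a pure-FP32 workload sees half the headline number.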
#14
napata
Hossein Almet: Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
Let's hope not. I feel underdog AMD was the only thing holding GPU prices down. Once they start outperforming Nvidia it's a race to the top with prices, especially when I look at how they handled the CPU market.
#15
ODOGG26
napata: Let's hope not. I feel underdog AMD was the only thing holding GPU prices down. Once they start outperforming Nvidia it's a race to the top with prices, especially when I look at how they handled the CPU market.
They are a business after all. The problem is people want AMD to remain second rate so they can keep prices down while they buy Nvidia GPUs instead. How does that help AMD create better products, if everyone wants them to remain the budget brand while still complaining that their products are second rate? People are so fickle. But if Nvidia and Intel can charge high prices while on top, then so can AMD.

I will never complain about AMD raising prices, because it's been done by the competitors for far too long now. Heck, people had a problem with a $50 price increase for a better product of the same tier. I never understood the rage about paying $50 more to go from a 3600X to a 5600X. Like, the performance difference is crazy.

Btw, I think they handled the CPU market quite fine. They need all the money they can get to battle 2 giants.
#16
yotano211
Chomiq: With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
Rumor is next week.

*my opinion
#17
Richards
They can't even supply the laptop OEMs with RDNA2 chips... and they are already rushing to launch RDNA3. With the grossly overpriced node, it's gonna be expensive.
#18
N3M3515
ODOGG26: They are a business after all. The problem is people want AMD to remain second rate so they can keep prices down while they buy Nvidia GPUs instead. How does that help AMD create better products, if everyone wants them to remain the budget brand while still complaining that their products are second rate? People are so fickle. But if Nvidia and Intel can charge high prices while on top, then so can AMD.

I will never complain about AMD raising prices, because it's been done by the competitors for far too long now. Heck, people had a problem with a $50 price increase for a better product of the same tier. I never understood the rage about paying $50 more to go from a 3600X to a 5600X. Like, the performance difference is crazy.

Btw, I think they handled the CPU market quite fine. They need all the money they can get to battle 2 giants.
AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.
#19
ZoneDymo
N3M3515: AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.
^
#20
MentalAcetylide
N3M3515: AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.
With that kind of reasoning, we could wind up with enough episodes of Dual Survival to last a few thousand seasons at least, if we applied it to everything else.
I think the price of vehicles is ridiculous for the miles per gallon we still get out of them, but should I blame that on everyone I see driving on the road while I ride around on a bicycle?
#21
InVasMani
I wonder if any of this is AMD trying to adjust and optimize the RDNA design better around RGBA and variable FP formats. I know AMD definitely has made adjustments to RDNA/RDNA2 to make the overall architecture more adaptive and flexible in the right areas, so it's possibly an extension of that, taking it a few steps further. If that's the case, it could lead to much more efficient variable rate shading, which should be a strong focus of improvement. The emphasis should be to eke out better performance and efficiency with less overall hardware. That'll put more GPUs in more consumers' systems, but it'll also improve other market segments: APU performance will increase and mobile parts will get better in turn. That's my optimistic hope for RDNA3.

It would be great if AMD actually launched the lower-end and mid-range RDNA3 cards first this time around. It might eat into RDNA2 sales for the RX 6600 series a bit, but it would lead to a wave of efficiency gains in the segment with the largest positive environmental impact, by upgrading the outdated graphics cards people have clung to longer than expected due to the mining boom. It would also slow the mining boom down a bit more than doing it the opposite way, as has been the case for a while.
#22
Unregistered
ODOGG26: The problem is people want AMD to remain second rate so they can keep prices down while they buy Nvidia GPUs instead.
Absolutely nailed it!
#23
N3M3515
MentalAcetylide: With that kind of reasoning, we could wind up with enough episodes of Dual Survival to last a few thousand seasons at least, if we applied it to everything else.
I think the price of vehicles is ridiculous for the miles per gallon we still get out of them, but should I blame that on everyone I see driving on the road while I ride around on a bicycle?
I think of it this way: if the market does not like the prices and the majority does not buy, then prices will go down. I, for example, won't buy a 6600 XT for more than $300, and that's after verifying that it is considerably faster (30-35%) than the 5600 XT.

Hey, if Nvidia/AMD/Intel set prices and a lot of people buy, what incentive do they have to lower them? You can't blame them. Like someone said, their business is earning as much money as they can.

Why do you think GPUs like the RTX 3090 exist? Because people buy them.

Why do you think more and more sexy women are creating OnlyFans accounts? Because there are TRUCKLOADS of men willing to pay. It's not their fault (to a certain extent). If men didn't pay for it, it would not exist.
#24
delshay
End of HBM on gaming cards. The Nano card is also dead. Oh dear.
#25
mtcn77
delshay: End of HBM on gaming cards. The Nano card is also dead. Oh dear.
HBM wasn't so bad, if not for AMD's backend-first strategy, which limited the cards' reach. It had to wait for optimized engines, which took the steam out of it.