
Next-Gen AMD Radeon RDNA3 Flagship To Feature 15,360 Stream Processors?

btarunr

Editor & Senior Moderator
AMD's next-generation RDNA3 graphics architecture could see a near-quadrupling in raw SIMD muscle over the current RDNA2, according to a spectacular rumor. Apparently, the company will deploy as many as 15,360 stream processors (quadruple that of a Radeon RX 6800), spread across 60 WGPs (Workgroup Processors), and do away with the compute unit. This is possibly because the RDNA3 compute unit won't be as independent as the ones on the original RDNA or even RDNA2, which already sees groups of two CUs share common resources.
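For what it's worth, the rumored figures are at least internally consistent. A quick back-of-the-envelope check (a minimal Python sketch; the per-CU ratio and RX 6800 CU count are public RDNA2 specs, the rest comes straight from the rumor):

```python
# Back-of-the-envelope check of the rumored RDNA3 figures (not confirmed specs).
RX6800_CUS = 60        # Radeon RX 6800 ships with 60 compute units
SP_PER_CU = 64         # RDNA/RDNA2: 64 stream processors per CU
RUMORED_WGPS = 60      # per the rumor
RUMORED_SPS = 15_360   # per the rumor

rx6800_sps = RX6800_CUS * SP_PER_CU   # 3,840 stream processors
print(RUMORED_SPS / rx6800_sps)       # 4.0 -> quadruple the RX 6800
print(RUMORED_SPS / RUMORED_WGPS)     # 256.0 SPs per WGP, vs. 128 on RDNA2
```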

Another set of rumors suggests that AMD won't play NVIDIA's game of designing GPUs with wide memory buses, and will instead build on its Infinity Cache technology by increasing the on-die cache size and bandwidth, while retaining "affordable" discrete memory bus widths, such as 256-bit. As for the chip itself, it's rumored that the top RDNA3 part, the so-called "Navi 31," could feature a multi-chip module design (at least two logic dies), each with 30 WGPs. Each of the two is expected to be built on a next-gen silicon fabrication node that's either TSMC N5 (5 nm) or a special 6 nm node TSMC is designing for AMD. Much like the next-generation "Lovelace" architecture by NVIDIA, AMD's RDNA3 could see the light of day only in 2022.
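To put the bus-width trade-off in numbers: peak GDDR bandwidth is just bus width times per-pin data rate. A rough sketch (the example speeds are illustrative figures from current cards, not anything the rumor claims):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus bits / 8) bytes per transfer x Gbps per pin."""
    return bus_width_bits / 8 * data_rate_gbps

# "Affordable" 256-bit GDDR6 (RX 6800 class) vs. wide 384-bit GDDR6X (RTX 3090 class).
print(peak_bandwidth_gbs(256, 16.0))   # 512.0 GB/s
print(peak_bandwidth_gbs(384, 19.5))   # 936.0 GB/s
# Infinity Cache tries to close that gap with on-die hits instead of a wider bus.
```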



View at TechPowerUp Main Site
 
What I'm most interested in is how the current GPU pricing situation will influence future GPU prices. We may be in for a rude awakening...
 
What I'm most interested in is how the current GPU pricing situation will influence future GPU prices. We may be in for a rude awakening...

It's going to be closely related to fab capacity in a global sense. If they can make chips, they will want to sell them all. So it can quickly turn into a healthier environment too, but anytime soon? Hmmm. It just takes effin' long to increase production; we're still only talking about 1-3 years of this situation, but meaningful changes take twice that time.
 
With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
 
It's going to be closely related to fab capacity in a global sense. If they can make chips, they will want to sell them all. So it can quickly turn into a healthier environment too, but anytime soon? Hmmm. It just takes effin' long to increase production; we're still only talking about 1-3 years of this situation, but meaningful changes take twice that time.

A new fab is in the process of being built. I don't think it's done yet, but even when it is, it will sadly take at least a couple of years, with luck, before it runs at 100% capacity.
 
With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
I haven't heard anything about GDDR7; it's most probably years away. 2023 at the earliest, more likely 2025.
 
Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
 
Most likely AMD is getting into "if NV can claim twice the number of shaders, so can we" territory.

Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
In my book it already did. Look at the memory configs on NVIDIA's lineup; apparently they were forced to drop their GPUs down a tier.
The crypto bubble is the reason it didn't hurt.
 
I can only say: show me the cards, not all this speculation...
 
That’s it?!
I want a whole 300 mm wafer as a single chip

mmmmm. Transistors
 
I haven't heard anything about GDDR7; it's most probably years away. 2023 at the earliest, more likely 2025.


Yeah, the earliest available validation spec was released last year.

It will probably end up in products the generation after this.

I think the funniest part of this will be NVIDIA being forced to reuse another "X" GDDR tech revision for a second architecture (I hope they can at least solve the memory density and power issues).

I think that cache causes too many problems when you are performing random compute (so we will have to see if AMD continues their gaming-only strategy for these parts).
 
Performance speaks louder than stats these days. Look at Nvidia, which doubled or even tripled their shader count, but it's weaker. I'll wait for benchmarks to confirm.
 
Apparently, the company will deploy as many as 15,360 stream processors (quadruple that of a Radeon RX 6800), spread across 60 WGPs (Workgroup Processors), and do away with the compute unit.

RDNA never had compute units. RDNA has been WGP-based since Navi 1.0 (5xxx series).

RDNA was originally marketed in WGPs (aka "dual compute units"). Maybe they're going to stop calling them "dual compute units" now (60 WGPs == 120 CUs). It was always a "bridge" term to help people from the GCN/Vega era understand the performance of RDNA+ designs.
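The conversion behind that bridge term is trivial; a quick sketch using the public RDNA/RDNA2 ratios:

```python
# "Dual compute unit" bridge math: 1 WGP = 2 CUs, 1 CU = 64 stream processors.
def wgps_to_cus(wgps: int) -> int:
    return wgps * 2

def wgps_to_sps(wgps: int, sps_per_cu: int = 64) -> int:
    return wgps_to_cus(wgps) * sps_per_cu

print(wgps_to_cus(60))   # 120 CUs in GCN/Vega-era terms
print(wgps_to_sps(60))   # 7,680 at today's ratios; 15,360 would imply doubled SIMDs per WGP
```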

Performance speaks louder than stats these days. Look at Nvidia, which doubled or even tripled their shader count, but it's weaker. I'll wait for benchmarks to confirm.

These shader counts / core counts are completely useless unless you're a very well-studied GPU programmer. A 6-core AMD Bulldozer is completely different from a 6-core AMD Zen 3. GPU differences are even more pronounced.

NVidia changed the definition of "shader" in Volta/Turing: arguably double-counting because of the simultaneous INT32 vs. FP32 execution doo-hickey they implemented. It's all marketing. The programmers care, but the users / computer builders don't really care. Just look at FPS numbers and/or frame-times / benchmarks. The raw performance of the hardware doesn't even matter: NVidia's compiler is leagues better than AMD's, so AMD has to make it up with stronger underlying hardware.
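For anyone curious why the counting convention matters: paper FP32 throughput is usually quoted as shaders x 2 ops (an FMA counts as two) x clock. A sketch with round, illustrative clocks rather than exact product specs:

```python
# Paper TFLOPS from marketing shader counts (clocks are illustrative, not exact).
def peak_fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1_000   # an FMA counts as 2 FP32 ops

print(peak_fp32_tflops(5_120, 2.25))    # ~23.0 TFLOPS (RX 6900 XT-class figures)
print(peak_fp32_tflops(10_496, 1.70))   # ~35.7 TFLOPS (RTX 3090-class figures)
# The 3090's big paper-FLOPS lead doesn't show up 1:1 in FPS, which is the point.
```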
 
Who knows, with RDNA3 AMD may do an 'Intel' to Nvidia?
Let's hope not. I feel underdog AMD was the only thing holding GPU prices down. Once they start outperforming Nvidia, it's a race to the top with prices, especially when I look at how they handled the CPU market.
 
Let's hope not. I feel underdog AMD was the only thing holding GPU prices down. Once they start outperforming Nvidia, it's a race to the top with prices, especially when I look at how they handled the CPU market.
They are a business, after all. The problem is people want AMD to remain second rate so they can keep prices down while they buy Nvidia GPUs instead. How does that help AMD create better products if everyone wants them to remain the budget brand while still complaining that their products are second rate? People are so fickle. But if Nvidia and Intel can charge high prices while on top, then so can AMD.

I will never complain about AMD raising prices, because the competitors have been doing it for far too long now. Heck, people had a problem with a $50 price increase for a better product at the same tier. I never understood the rage about paying $50 more to go from a 3600X to a 5600X. The performance difference is crazy.

Btw, I think they've handled the CPU market quite fine. They need all the money they can get to battle two giants.
 
With GDDR6 shortages and all that, I do wonder when they will switch to GDDR7. What's the ETA on the modules for that?
Rumor is next week.

*my opinion
 
They can't even supply the laptop OEMs with RDNA2 chips... and they are already rushing to launch RDNA3. With the grossly overpriced node, it's gonna be expensive.
 
They are a business, after all. The problem is people want AMD to remain second rate so they can keep prices down while they buy Nvidia GPUs instead. How does that help AMD create better products if everyone wants them to remain the budget brand while still complaining that their products are second rate? People are so fickle. But if Nvidia and Intel can charge high prices while on top, then so can AMD.

I will never complain about AMD raising prices, because the competitors have been doing it for far too long now. Heck, people had a problem with a $50 price increase for a better product at the same tier. I never understood the rage about paying $50 more to go from a 3600X to a 5600X. The performance difference is crazy.

Btw, I think they've handled the CPU market quite fine. They need all the money they can get to battle two giants.
AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.
 
AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.

^
 
AMD, Nvidia, and Intel are not to blame for high prices; the people who accept those ridiculous prices and buy the products are the ones who ruin it for everyone else.
With that kind of reasoning, we could wind up with enough episodes of Dual Survival to last a few thousand seasons, at the least, if we applied it to everything else.
I think the price of vehicles is ridiculous for the miles per gallon we still get out of them, but should I blame it on everyone I see driving on the road while I ride around on a bicycle?
 
I wonder if any of this is AMD trying to adjust and optimize the RDNA design better around RGBA and variable FP formats. I know AMD has definitely made adjustments in RDNA/RDNA2 to make the overall architecture more adaptive and flexible in the right areas, so this is possibly an extension of that, taking it a few steps further. If that's the case, it could lead to much more efficient variable rate shading, which should be a strong focus of improvement. The emphasis should be to eke out better performance and efficiency with less overall hardware. That'll put more GPUs in more consumers' systems, but it'll also lift other market segments: APU performance will increase and mobile parts will get better in turn. Those are my optimistic hopes for RDNA3.

It would be great if AMD actually launched the lower and mid-range RDNA3 cards first this time around. It might eat into RDNA2 sales for the RX 6600 series a bit, but it would bring a wave of efficiency increases to the area with the largest positive environmental impact, by upgrading the outdated graphics cards people have clung to longer than expected due to the mining boom. It would also slow the mining boom down a bit more than doing it the opposite way, which has been the case for a while.
 
With that kind of reasoning, we could wind up with enough episodes of Dual Survival to last a few thousand seasons, at the least, if we applied it to everything else.
I think the price of vehicles is ridiculous for the miles per gallon we still get out of them, but should I blame it on everyone I see driving on the road while I ride around on a bicycle?
I think of it this way: if the market does not like the prices and the majority does not buy, then prices will go down. I, for example, won't buy a 6600 XT for more than $300, and that's after verifying that it is considerably faster (30-35%) than the 5600 XT.

Hey, if Nvidia/AMD/Intel set prices and a lot of people buy, what incentive do they have to lower them? You can't blame them. Like someone said, their business is earning as much money as they can.

Why do you think GPUs like the RTX 3090 exist? Because people buy them.

Why do you think more and more sexy women are creating OnlyFans accounts? Because there are TRUCKLOADS of men willing to pay. It's not their fault (to a certain extent). If men didn't pay for it, it would not exist.
 