
AMD RDNA 4 GPU Memory and Infinity Cache Configurations Surface

RDNA3 had a pretty decent uplift over the 6900xt in RT. Where are you seeing the lack of improvement?

RDNA3 largely matched RTX 3090 in RT, while RDNA2 lagged behind it a fair bit.

Core counts and cache aren't everything that affects RT performance.
RDNA3 is an architecture; the 6900xt is a chip.

Which RDNA3 are you referring to? Because the 7800xt has roughly the same core config as a 6900xt, and near-identical raytracing performance in most games.

Each CU contains raytracing hardware; more CUs means more raytracing performance. Per CU, RDNA3 had almost no performance improvement. Go look at TPU's 7800xt performance review: an average of a whopping 3% faster in RT than the 6900xt.

 
Because the 7800xt has roughly the same core config as a 6900xt, and near-identical raytracing performance in most games.

It does not.

The RX 6900 XT has 5120 shaders.

The RX 7800 XT has 3840 shaders.

The RX 7800 XT performs nearly the same as the RX 6900 XT, but with just 75% of the shaders and roughly 80% of the power consumption.
 
The 7900 xtx in rasterization matches the 4090 within +/-5%.
The 7900 xtx with raytracing enabled is within +/-5% of a 3090.

Adding another raytracing unit does not seem wise, as RDNA3 has a hard time filling up its improved RT units with BVH traversal additions. Besides that, it doesn't even bother using its dual-issue-per-clock addition unless specifically coded for it.

7900XTX is much closer to 4080S raster than it is to the 4090.

In RT, it's between the 3090 and 3090 Ti, which have an RT gap of about 12% between them.
 
Which RDNA3 are you referring to? Because the 7800xt has roughly the same core config as a 6900xt, and near-identical raytracing performance in most games.

As the kids like to say, cap.
 
RDNA3 was supposed to improve ray tracing, but it failed to do so, with RT perf per CU being almost identical to RDNA2.

Now we're seeing that RDNA4 will have the same cache and memory config as RDNA3, with a similar core count.

Where is this improvement coming from?

If you look at the memory system vs. Nvidia, yeah, you see Nvidia gets a lot more out of it:
256-bit vs. 384-bit bus.
64 MB of L2 vs. 96 MB of L3.
Plus AMD has a big 6 MB L2 on top (you know... that's the size of the entire last-level cache on a 3090 Ti...).

And large L1/L0 caches too, so they're using bandwidth terribly badly; it's the architecture being extremely narrow, overfed, and badly utilized (rough numbers below).
Maybe getting a tile-based renderer working would help some scenarios, along with getting the dual-issue FP working better and in more situations, driver work, and matrix hardware to help with certain tasks.

So many things. They're lagging behind so badly now, and yet they're desperately trying to make a WGP/dual-CU do everything without adding anything to it; it's like they're trying to push and hammer data (memory bandwidth through cache and bus) into a CU and magically expecting it to work.
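To put rough numbers on that bandwidth point, here's a minimal sketch (Python), assuming the commonly published specs: 22.4 Gbps GDDR6X on a 256-bit bus for the RTX 4080, 20 Gbps GDDR6 on a 384-bit bus for the 7900 XTX:

```python
# Raw VRAM bandwidth from bus width and per-pin data rate:
# GB/s = (bus width in bits / 8) * data rate in Gbps
def vram_bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

bw_4080 = vram_bandwidth_gbs(256, 22.4)  # ~717 GB/s
bw_xtx = vram_bandwidth_gbs(384, 20.0)   # ~960 GB/s

print(f"RTX 4080:    {bw_4080:.0f} GB/s")
print(f"RX 7900 XTX: {bw_xtx:.0f} GB/s ({bw_xtx / bw_4080:.2f}x)")
```

That's roughly 1.34x the raw bandwidth (before even counting 96 MB of Infinity Cache vs. 64 MB of L2) for broadly similar raster performance, which is the utilization point being made above.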
 
It does not.

The RX 6900 XT has 5120 shaders.

The RX 7800 XT has 3840 shaders.

The RX 7800 XT performs nearly the same as the RX 6900 XT, but with just 75% of the shaders and roughly 80% of the power consumption.

Indeed, here are links for people who want to confirm for themselves:


It's not an earth-shattering increase in RT performance per core, but it's certainly noteworthy.
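For anyone who wants to sanity-check that per-core claim, a quick back-of-the-envelope sketch (Python), using the shader counts above and the ~3% RT delta quoted earlier in the thread; note this ignores the clock-speed difference between the two cards, so part of the gain is frequency rather than architecture:

```python
# Normalize the overall RT result by shader count:
shaders_6900xt = 5120
shaders_7800xt = 3840
rt_perf_ratio = 1.03  # 7800 XT ~3% faster overall in TPU's RT chart

per_shader_gain = rt_perf_ratio / (shaders_7800xt / shaders_6900xt)
print(f"RT performance per shader: {per_shader_gain:.2f}x")  # ~1.37x
```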
 
RDNA3 is an architecture; the 6900xt is a chip.

Which RDNA3 are you referring to? Because the 7800xt has roughly the same core config as a 6900xt, and near-identical raytracing performance in most games.

Each CU contains raytracing hardware; more CUs means more raytracing performance. Per CU, RDNA3 had almost no performance improvement. Go look at TPU's 7800xt performance review: an average of a whopping 3% faster in RT than the 6900xt.


Oh, sorry, I didn't specify which one. In the first line I was referring to the 7800xt, which has 3840 shaders vs. 5120 for the 6900xt, but the 7800xt is 4.5% faster in TPU's RT chart at 1920x1080. The second line was the 7900xtx against the 3090.

I think you're confused with regard to the shader counts of the RDNA3 lineup; there hasn't been a massive increase on that front.
 
They should exit the markets, then:
1. Consumer Ryzen;
2. Consumer Radeon;
3. Semi-custom Playstation and Xbox chips.

No: 1 is profitable; 2... they're scaling back. Still profitable, but easily the smallest of AMD's current businesses... 3 is contractual, not much money (NVIDIA turned this down), but it has served as a lifeline.

Also, regarding Mindfactory: a single store in the EU, an AMD-friendly market. I maintain my point.

What feature set? Running local LLM with 8GB VRAM or fake frames with "fake" reflections :wtf:

Yet fake frames with fake reflections are some of the star features they've chosen to copy...

You can't compare specs from Nvidia with AMD to gauge relative performance levels. The internet is full of benchmarks that make it clear the 7900 XTX is the competitor to the 4080. You are going to have to do some research, or you won't have a grasp of what is going on in the tech world, and you'll just waste everyone's time having obvious things explained to you.

The RTX 4090 came out first and it was clear they could not match it. This "it's positioned against the 4080" together with a price reduction is literally the only thing AMD could do in that situation.

The fact that the 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.

7900 xtx in rasterization matches +/- 5% 4090
7900 xtx enabled raytracing +/-5% to a 3090

Adding another raytracing unit does not seem wise as RDNA3 has a hardtimd filling up its improved RT units with BHV trassveral additions. Beside that it also doesn't even both using it 2 issue per-clock addition either ubless specifically coded for it.

The 7900 XTX does not come even close to the 4090 in raster performance. It's within 1-4% of the 4080, and within 1-2% of the 4080 Super.
 
What feature set? Running local LLM with 8GB VRAM or fake frames with "fake" reflections :wtf:
Like FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done: not broken like RDNA1, but offering no plus over RDNA2, and only raising prices just enough not to make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now), when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment, is the right move.
 
Like FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done: not broken like RDNA1, but offering no plus over RDNA2, and only raising prices just enough not to make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now), when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment, is the right move.
But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, only to get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $500-600 for a shit mid-gen GPU is not realistic in the real world. I have an RX 6800 I bought for £350, and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**. They can charge their overpriced BS money for the same performance class all they want, both of them; I won't be spending a dime on either.
 
No: 1 is profitable; 2... they're scaling back. Still profitable, but easily the smallest of AMD's current businesses... 3 is contractual, not much money (NVIDIA turned this down), but it has served as a lifeline.

Also, regarding Mindfactory: a single store in the EU, an AMD-friendly market. I maintain my point.



Yet fake frames with fake reflections are some of the star features they've chosen to copy...



The RTX 4090 came out first and it was clear they could not match it. This "it's positioned against the 4080" together with a price reduction is literally the only thing AMD could do in that situation.

The fact that the 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.



The 7900 XTX does not come even close to the 4090 in raster performance. It's within 1-4% of the 4080, and within 1-2% of the 4080 Super.
Yep, and costs double too.
 
But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, only to get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $500-600 for a shit mid-gen GPU is not realistic in the real world. I have an RX 6800 I bought for £350, and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**. They can charge their overpriced BS money for the same performance class all they want, both of them; I won't be spending a dime on either.

We've got members on this forum from Vietnam who will purchase one of these high-end GPUs every release. Unless he's run out of organs to pay for it.
 
But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, only to get bummed for another $2K when the 5090 comes out? This is not the majority of buyers, and Nvidia is taking the piss. Now more people are waiting multiple gens to upgrade, because $500-600 for a shit mid-gen GPU is not realistic in the real world. I have an RX 6800 I bought for £350, and nothing new comes close to it in terms of performance/£ even after 2 years of newer GPUs. I am going to hodl this until at least the Radeon 10 series or Nvidia 70**. They can charge their overpriced BS money for the same performance class all they want, both of them; I won't be spending a dime on either.

1. And have you asked yourself, for one single moment, why this is so? (Hint: it isn't Nvidia's greed.)
2. Best-selling GPU model of the generation. Outsold the entire Radeon stack. Demand is so extreme, there are shortages of some of its board components.
3. The RX 6800 is 4 years old, not two. So you got a previous-generation card at a discount; that doesn't really count. MSRP by MSRP (they launched at roughly the same cost), even with all the cuts in the 4070's configuration, the 4070 comes out far ahead as a product: it is faster, its drivers are of superior quality (a Studio branch for productivity is also offered at no extra cost), and it is more power efficient. In 2 years, it'll have the same age and be just as devalued as your RX 6800 is today. In other words, there is nothing special or praiseworthy about your graphics card; it's just an earlier-generation card past its prime that obviously still does its job, as it always has.

Yep, and costs double too.

Supply and demand
 
You can't compare specs from Nvidia with AMD to gauge relative performance levels. The internet is full of benchmarks that make it clear the 7900 XTX is the competitor to the 4080. You are going to have to do some research, or you won't have a grasp of what is going on in the tech world, and you'll just waste everyone's time having obvious things explained to you.
Just ignore the troll
 
Polaris-based cards were one of their most successful products of the last decade lol.
This is the unfortunate reality. Every time I have checked out the GPU list here for the past 5 years or more, the 580 is right there.



Steam Hardware Survey:



It's quite the scroll to find it but damn.

And of course any time I have some kind of graphics problem and need to default to something that just works: BAM!



This thing is everywhere. It's so everywhere that it became the new religion on the Chinese chopping block of chips for a number of years, and it's now so long in the tooth that we're just finally starting to see the same happen to other cards, whether because the performance now sucks, because DX feature lockout is kicking people out of the gaming hobby, or because the crap encoder is finally starting to force everyone out of service.

But hey, go and buy a 4070, which would have been a **60-class GPU, for $600 instead of the $300 it would have been before :kookoo: Or, if you want the best in class, shell out $2K for a 4090. Who the fuck wants to spend used-car money on a fricken GPU that will become obsolete in 2 years, only to get bummed for another $2K when the 5090 comes out?

This is the hard reality. At the end of the day, people don't give a damn about GPUs. They just want a display adapter that can play the game. That means not playing around with voltage settings for weeks trying to find the sweet spot where it doesn't crash (if there is one), not struggling to balance fans and noise when tessellation breaks and turns one squirrely mesh into pure nightmare fuel until reboot, and not dicking around with a bunch of Bitcoin-miner-market bullshit, playing momentum monkey with drops just trying to acquire ONE of these cards; and that's all assuming they're not fighting to diagnose one dud after another like I had to deal with on the 7000 series. What I'm saying is there isn't a good price.

The same fiasco has just kicked off with the 5000 series and will happen to the 6000 series very soon. I'm not saying it hasn't already; I'm talking MASS SCALE. This is the only thing keeping me in a loop of second-guessing, which keeps me out of the second-hand market for chips. It's bad enough that every delivery of the 7000 series seems to be a bomb. I can only hope that the 8000 series arrives as expected (functional) and delivers in good enough quantity and at a good enough price, because it took several months to find anything good from previous generations and I still don't have a card.

The performance should be worlds better than what I have now. That's the bar. I don't really care where they fit between the 7000-series SKUs, just that enough of them hit the shelves before all the scalpers figure out AMD pulled a massive disinfo campaign to finally get these cards into the hands of real customers.
 
Does this mean they aren't coming out with an 8th-gen counterpart to the 7900XTX? Or simply that they don't plan on focusing on increased performance as much?
 
Like FSR3 with framegen, AFMF, and selling 8GB products at a higher price than RDNA2 while offering no performance benefit?

RDNA3 has been the most boring release AMD has ever done: not broken like RDNA1, but offering no plus over RDNA2, and only raising prices just enough not to make everyone angry.
Focusing on the mid and low end (the market they literally abandoned right now), when it's obvious they are in no place to match XX90 NVIDIA products in hardware or software at the moment, is the right move.
RDNA3 is no jump over RDNA2? A quick look at the 7900 GRE review shows the 7900xtx being 43% faster than the 6900xt at 2560x1440. Hell, I was super tempted to switch over from a 3090 because it's just that much faster, but the lack of side-ported waterblocks was the only deterrent.

I also tend to play some Warzone nowadays, and for whatever reason RDNA3 is stupid fast in that game, faster even than the 4090 at most resolutions.

I would say being 43% faster is a plus.

Does this mean they aren't coming out with an 8th-gen counterpart to the 7900XTX? Or simply that they don't plan on focusing on increased performance as much?

I think they will at least match the 7900xtx, but most seem to think otherwise. I guess we'll see soon enough.
 
Yet fake frames with fake reflections are some of the star features they've chosen to copy.
Copy what exactly? There's certainly feature parity between the major vendors on the good/important(?) features at the top, and I doubt you'd get anywhere if that weren't the case. Did Intel copy AMD when they went the x64 route, with IMCs, chiplets, and pretty surely a million other things? Nvidia with Mantle/Vulkan, shared memory, Hairworks et al.?
 
You can't compare specs from Nvidia with AMD to understand a performance level comparison. The internet is full of benchmarks that make it clear that the 7900 XTX is the competitor to the 4080.

Nonsense. The 4080 uses DLSS to artificially lift its framerates, which is why its performance looks close. It's an oranges-vs-bananas comparison.

The real competitor for the RTX 4080 is the RX 7900 GRE.


RDNA3 is no jump over RDNA2

You must compare shader-to-shader: 6144 vs 5120.
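Running that shader-to-shader normalization against the 43% figure quoted above (a quick sketch; it still ignores the 7900xtx's higher clocks and extra bandwidth, which account for part of the uplift):

```python
# Normalize the 7900 XTX vs 6900 XT uplift by shader count:
shaders_xtx = 6144
shaders_6900xt = 5120
overall_uplift = 1.43  # 7900 XTX vs 6900 XT at 1440p, per the 7900 GRE review

per_shader_uplift = overall_uplift / (shaders_xtx / shaders_6900xt)
print(f"Per-shader uplift: {per_shader_uplift:.2f}x")  # ~1.19x before clocks
```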
 
It sure sounds to me like AMD is gonna stick to 7800xt performance and leave high-end buyers out to dry.
Most people don't need another $1000+ card. Most people want previous-gen high-end performance at a lower price.
What was AMD's outlook the last time they tried this, with the RX 580 and RX 5700 XT?
The RX 6000 series that followed RX 5000 was very successful. The outlook is that the RX 9000 series will do the same after RX 8000.
Polaris-based cards were one of their most successful products of the last decade lol.
As was the RX 6000 series, which followed the same kind of midrange-only RX 5000 lineup that RX 8000 seems to be.
The fact that the 7900 XTX is a much larger processor with a clearly higher bill of materials remains unchanged.
I assume you mean vs. the 4080(S)? You also have to remember that Nvidia manufactures their dies on a more expensive node. I would not confidently say that AMD's MCM approach had a higher BoM cost.
The 7900 XTX does not come even close to the 4090 in raster performance. It's within 1-4% of the 4080, and within 1-2% of the 4080 Super.
Considering it's 18% slower according to TPU's chart while costing only 60% as much as the 4090, I'd say that's not too bad. The loss in RT performance is much bigger, though; but in raster, in terms of price/performance, it's actually better than the 4090 (quick math below the chart):

[chart: performance per dollar, 3840x2160]
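The math behind that chart is easy to reproduce (a minimal sketch, assuming the $999 and $1,599 launch MSRPs and TPU's ~18% 4K raster gap):

```python
# Raster performance per dollar, 7900 XTX vs RTX 4090 at 3840x2160:
perf_xtx, price_xtx = 0.82, 999    # ~18% slower than the 4090
perf_4090, price_4090 = 1.00, 1599

ppd_ratio = (perf_xtx / price_xtx) / (perf_4090 / price_4090)
print(f"7900 XTX raster perf per dollar vs 4090: {ppd_ratio:.2f}x")  # ~1.31x
```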
 
Good. Add improved idle and video playback power consumption into the mix, and they've got a buyer.
 
Nonsense. The 4080 uses DLSS to artificially lift its framerates, which is why its performance looks close. It's an oranges-vs-bananas comparison.

The real competitor for the RTX 4080 is the RX 7900 GRE.

Really? Take a look at any TPU GPU review performance charts, for example:
XFX Radeon RX 7900 XTX Magnetic Air Review - Relative Performance | TechPowerUp

4080 is 23%, 28% and 35% faster than 7900GRE at 1080p, 1440p and 2160p respectively.
On the other hand, 7900XTX is 2%, 3% and 5% faster than 4080 at 1080p, 1440p and 2160p respectively.
Edit: 4090 is 13%, 19% and 23% faster than 7900XTX in the same graphs.

DLSS - or FSR for that matter - works on top of that.
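One note on reading those charts: relative-performance percentages chain multiplicatively, so the GRE-to-4080 and 4080-to-XTX gaps compose like this (a sketch using the 2160p figures above):

```python
# TPU relative-performance numbers compose multiplicatively:
gre_to_4080 = 1.35   # 4080 is 35% faster than the 7900 GRE at 2160p
x4080_to_xtx = 1.05  # 7900 XTX is 5% faster than the 4080 at 2160p

gre_to_xtx = gre_to_4080 * x4080_to_xtx
print(f"7900 XTX vs 7900 GRE at 2160p: {gre_to_xtx:.2f}x")  # ~1.42x
```

A ~42% gap makes it hard to call the GRE the 4080's "real competitor".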
 
Nonsense. The 4080 uses DLSS to artificially lift its framerates, which is why its performance looks close. It's an oranges-vs-bananas comparison.

The real competitor for the RTX 4080 is the RX 7900 GRE.




You must compare shader-to-shader: 6144 vs 5120.
The only competition that matters is in price. With that in mind, the only true competition to the 4080 is the 7900 XTX. If performance is on par within the same price bracket, then we've got some fair competition. If it isn't, then one is a better buy than the other. It's that simple. Shaders, memory bus and any other arbitrary number don't matter when one is talking about competition.
 
There is a pretty good RDNA2 > RDNA3 comparison out there - 6800 vs 7800XT, both are 60CU, 256-bit.
What complicates things is the dual-issue ALU thing, although it seems to have helped even less than the same feature did in Nvidia's case.
There's less Infinity Cache, but given the evolution/optimization of its size in both AMD's and Nvidia's newer generations, this has negligible impact.
Other than that - clocks are up 6-15% and VRAM bandwidth up 22%.

Based on the last TPU GPU review, the 7800XT is overall 22-23% faster than the 6800, which basically matches expectations based on specs.
In RT it's 27-34% faster, with the gap increasing with resolution. There's a nice little jump there; AMD clearly did improve RT performance (spec-based sanity check below).
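A rough spec-based sanity check of that comparison (a sketch; the 16 vs. 19.5 Gbps GDDR6 data rates are the commonly published figures, and actual clock behavior varies by game and power limit):

```python
# 7800 XT over RX 6800, both 60 CU / 256-bit:
clock_gain_low, clock_gain_high = 1.06, 1.15  # reported clock uplift range
bandwidth_gain = 19.5 / 16.0                  # GDDR6 data rate, ~1.22x

observed_raster = 1.225       # ~22-23% overall (TPU)
observed_rt_low, observed_rt_high = 1.27, 1.34

# Raster lands near the top of the clock range plus bandwidth headroom;
# RT exceeds both, which points at a genuine per-CU RT improvement.
print(f"clocks: {clock_gain_low:.2f}-{clock_gain_high:.2f}x, "
      f"bandwidth: {bandwidth_gain:.2f}x")
print(f"observed raster: {observed_raster:.2f}x, "
      f"observed RT: {observed_rt_low:.2f}-{observed_rt_high:.2f}x")
```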
 