
AMD Radeon RX 7900 XTX

No, it's Bulldozer. AMD was promising higher performance than what the reviews show. I think they were hoping to improve their drivers considerably in this last month, but they didn't. Or we will have to accept that their marketing numbers were best-case scenarios bordering on lies. Zen 1 was a new architecture that offered tremendous IPC gains over Bulldozer (which was probably the easy part) while also taking a first step into chiplet design. Zen 1 was a clear jump into the future. RDNA3 is a big "What The Hell?".
Bulldozer was in some cases slower than the previous-gen Phenoms. That was the majority of the disappointment behind Bulldozer. Is RDNA3 slower in some cases than RDNA2?
 
Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?
 
Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?
I'd reason it's because CP2077 is notoriously heavy on both CPU and GPU.
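
For what it's worth, here's a rough Python sketch of why a single title can skew perf/W versus a suite average (the FPS and board-power numbers are made up for illustration, not from the review):

# Made-up numbers for illustration only, not from the review.
games = {
    "Cyberpunk 2077": {"fps": 42,  "watts": 356},
    "Game B":         {"fps": 144, "watts": 300},
    "Game C":         {"fps": 98,  "watts": 330},
}

# Efficiency per game, then the suite average of those efficiencies.
eff = {name: d["fps"] / d["watts"] for name, d in games.items()}
avg = sum(eff.values()) / len(eff)

print(f"CP2077 only  : {eff['Cyberpunk 2077']:.3f} fps/W")
print(f"Suite average: {avg:.3f} fps/W")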
 
Well, it's... okay, I guess? However, with these numbers Huang can either simply continue the giga-milking mode or deliver a fatality just by dropping the 4080 to $1,000. Unfortunate for everyone on the consumer side. GG to AMD for enabling better margins for themselves if they're really saving by going with chiplets.
 
No, it's Bulldozer.

Here we have an architecture that probably fails to be utilized at its full potential.
The closest comparison then would be Vega 64.
If you ask me, the 7900 XT and XTX should've been priced at $699 and $799 respectively.
This performance is nowhere near worth $1K, when going from a 5700 XT to a 6800 XT netted you a 2x increase in performance for $650 MSRP.
Here you go from a 6800 XT to a 7900 XT and get... 30% more fps for $250 more - an abysmal generational leap.
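
Putting those numbers into a quick Python sketch (taking the claimed 2x / +30% gains and the MSRPs at face value, with the 5700 XT normalized to 100):

# Chained performance index using the claims above: 5700 XT = 100.
perf = {"5700 XT": 100, "6800 XT": 200, "7900 XT": 260}  # 2x, then +30%
price = {"6800 XT": 650, "7900 XT": 900}                 # MSRPs as cited

for card in ("6800 XT", "7900 XT"):
    print(f"{card}: {perf[card] / price[card]:.3f} perf-index per $")
# -> 0.308 for the 6800 XT vs 0.289 for the 7900 XT: perf/$ actually regressed.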
 
Not sure what people were expecting, given that so many comments are talking about disappointment :S

It's slightly faster than a 4080, while cheaper. Did you think AMD was retaking the performance crown while charging half as much? lol

Nice card IMO.

Compared to death, a bloody nose is an improvement; neither is a good outcome.
 
Remember that even Nvidia didn't improve on RT performance. Ampere had around a 40-45% drop in performance when you turned on RT, and so does Ada.

Neither company improved upon RT performance compared to last gen. The only reason things look better is simply because of the rasterization performance gains. I think it's sad that neither company made any improvements with RT... although it doesn't really matter to me because I don't care about RT. I never use it in any games that support it because, for me, it really doesn't make any difference. You're moving too fast through the games to really take in the small visual changes that RT brings.
It did, but there are no games using SER by default yet...

Sad to say, but RDNA2 was more competitive with Ampere than RDNA3 is with Ada (probably just because of TSMC).
 
To sum it up: pretty much underwhelming. I'm not an electrical engineer, but I was already skeptical about dual-issue SIMD on a modular GCD/MCD design, which I suspect introduces additional latency or cache-related issues. Still, RT performance is quite good despite the lack of dedicated hardware, though sadly it's a feature I never use. Good job, though, for a first try at chiplet design.
 
A higher power limit plus tweaking the voltage and boost curve will probably provide quite a decent performance boost and maybe get it close to the 4090. Some AIB cards have three 8-pin connectors, so ~500 W versions are possible within spec.

They'll probably be pretty expensive and obviously hot though...
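
The within-spec math checks out (per the PCIe power specs, each 8-pin connector is rated for 150 W and the x16 slot for 75 W):

# Max board power within spec for a three 8-pin card.
EIGHT_PIN_W = 150  # PCIe spec rating per 8-pin PEG connector
SLOT_W = 75        # PCIe x16 slot power
print(f"Spec limit: {3 * EIGHT_PIN_W + SLOT_W} W")  # -> 525 W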
 
To sum it up: pretty much underwhelming. I'm not an electrical engineer, but I was already skeptical about dual-issue SIMD on a modular GCD/MCD design, which I suspect introduces additional latency or cache-related issues. Still, RT performance is quite good despite the lack of dedicated hardware, though sadly it's a feature I never use. Good job, though, for a first try at chiplet design.

They are on a newer node, so that should also explain some of the difference, I think.
 
Did we look at the same review? It's matching the 3090 Ti quite often, and a couple of times sits between the 4080 and 4090 in RT. The 6000 series is languishing miles away in RT performance, not even remotely "identical" performance. Judging by results such as Far Cry, Resident Evil, and Watch Dogs, it looks like a solution can be found in driver optimizations.
That's not the point. Look at the % FPS drop vs. raster. It's almost exactly the same as the previous gen. Of course it's 50% faster than a 6900 XT in RT, because it's also 50% faster in raster.

[Chart: Control RT performance, 3840x2160]


6900XT - 61% drop with RT on.
7900XTX - 58% drop with RT on.

Where is the performance increase? There is none, it's in the margin of error. AMD is not even at the RTX 2000 level yet. Embarrassing.
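
The drop percentages come from a calculation like this (the FPS values here are placeholders picked to reproduce the quoted ~61%/~58%, not the review's exact Control numbers):

# RT "cost" = relative FPS drop when enabling RT.
def rt_drop(raster_fps, rt_fps):
    return 1 - rt_fps / raster_fps

# Placeholder FPS chosen to match the quoted drops, not exact review data.
print(f"6900 XT : {rt_drop(47.0, 18.3):.0%} drop")  # ~61%
print(f"7900 XTX: {rt_drop(71.0, 29.8):.0%} drop")  # ~58%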
 
That's not the point. Look at the % FPS drop vs. raster. It's almost exactly the same as the previous gen. Of course it's 50% faster than a 6900 XT in RT, because it's also 50% faster in raster.

[Chart: Control RT performance, 3840x2160]


6900XT - 61% drop with RT on.
7900XTX - 58% drop with RT on.

Where is the performance increase? There is none, it's in the margin of error. AMD is not even at the RTX 2000 level yet. Embarrassing.
Do you understand that if you increase raster by 50% and RT performance stays the same, you would still have zero improvement in RT frame rate?

The RT improvements are in line with the raster improvements; this does not mean zero improvement in RT lmao.
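
To make the disagreement concrete, a tiny sketch with made-up FPS (same ~60% RT hit in both generations):

# If raster goes up 50% and the relative RT hit stays the same,
# RT frame rate also goes up 50% -- i.e. the RT stage scaled with raster.
# An RT stage with zero improvement would make the % drop grow instead.
hit = 0.60
old_raster, new_raster = 100.0, 150.0

old_rt = old_raster * (1 - hit)  # 40 fps
new_rt = new_raster * (1 - hit)  # 60 fps
print(f"RT fps: {old_rt:.0f} -> {new_rt:.0f} (+{new_rt / old_rt - 1:.0%})")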
 
LOL, the 7900 XT is a JOKE of a card even compared to the 6950 XT - 10% faster than a 6950 XT. What kind of generational leap is this??? A fucking joke.
If it cost $700 - OK, that'd be a great card, but at $900? This is dogshit.
 
Realistically I expected the 7900 XTX to be a bit faster; I certainly expected it to be 10% better than the 4080 on average. It seems it's much closer to the 4080, anywhere from 15% slower to 25% faster depending on the game, and on average just 5% faster than the 4080. Still a great result, mind you, as it costs $1,000, which is at least $200 less than the 4080, and considering most custom 4080 models are around $1,400, that is pretty much $400 cheaper and about 5% faster!

So, a great result overall, but slightly disappointing as well because I thought it would be faster than it ended up being.

There are clearly performance issues in certain games that are getting really poor results, so hopefully AMD can sort these out in the coming weeks and improve performance in those titles! With AMD and how they operate, I'm fully expecting these GPUs to gain at least 5% more performance over the next 2-3 months, and the issues with power draw during media playback and multi-monitor use to be fully resolved!
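
A quick relative-value check using those figures (the ~5% average gap and the prices as stated above, 4080 MSRP as the 1.00x baseline):

# Prices and the ~5% gap as stated above; 4080 MSRP = 1.00x value.
cards = {
    "RX 7900 XTX":       (105, 1000),
    "RTX 4080 (MSRP)":   (100, 1200),
    "RTX 4080 (custom)": (100, 1400),
}
base_perf, base_price = cards["RTX 4080 (MSRP)"]
for name, (perf, price) in cards.items():
    rel = (perf / price) / (base_perf / base_price)
    print(f"{name:19s}: {rel:.2f}x value")  # XTX ~1.26x, custom 4080 ~0.86x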
 
This is a disaster for the consumer.

Instead of soundly beating NVIDIA at least in raster, AMD offers comparable performance at a comparable price while not offering unique distinguishing features like e.g. DLSS 3.0. And RTRT performance is again hugely lacking, though AMD has managed to reach... Ampere levels of performance.

I hate both NVIDIA and AMD. It looks like both companies are in cahoots and are no longer interested in advancing the gaming industry and graphics.

You want more performance? You pay proportionally more money. This is not how the GPU industry worked for the previous 20 years. This is just disgusting.

And what's up with multi-monitor power consumption? They had 5 years to perfect the architecture and we're looking at 103 W at idle? WTF, AMD?

This card doesn't disrupt anything. It's a mockery of competition.

This is a bloody duopoly.


There needs to be a price fixing investigation, don't know how they have gotten away with this for so long.
 
I'm surprised at just how big the differences are between the CPUs at 1080p compared to 4K. Makes me wonder how low my scores would be with this GPU, as I have a Ryzen 5 2600X.
 
That chiplet bullshit seems to benefit cost cutting more than it brings anything revolutionary like performance or lower power draw.
Basically, Nvidia beat them everywhere: performance, power efficiency, productivity. The only place left where AMD can claim to compete is price.
And again with the drivers: that 103 W multi-monitor power consumption is crazy, and 88 W for video playback.
 
Do you understand that if you increase raster by 50% and RT performance stays the same, you would still have zero improvement in RT frame rate?

The RT improvements are in line with the raster improvements; this does not mean zero improvement in RT lmao.
Keep pretending you don't understand what I'm saying if that makes you feel better.
 
Look at how close the 7900 XT is to the 6950 XT :twitch: 126 vs 113 is practically the same performance.
XTX vs 6950XT ;)
 
@W1zzard Nice job with the review.

No real surprises for me, except noise-at-load which was disappointing just because the 6900XT was so incredible in this regard.
 
Incredibly disappointing. They really needed to beat the 4080 in raster across the board for it to even have a chance, because once we start counting RT it's a complete bloodbath, never mind DLSS 3. Seeing RDNA 3 not improve the % performance loss from RT compared to RDNA 2 is just baffling. GG
 
This means no price disruptions.
That can be fixed easily: stop giving Nvidia money, regardless of what they have to offer. That's the thing, people want AMD to produce a 4090 killer but offer it at US$500, just so Nvidia is "forced" to cut prices and the rabid followers can still give Nvidia their money.
Instead of soundly beating NVIDIA
Aah yes, you want a 4090 killer that only sells for US$500.
AMD offers comparable performance at the comparable price
Maybe I read another review, but AMD is cheaper with equal-to-faster performance.
not offering unique distinguishing features like e.g. DLSS 3.0
Those are not "unique features", they are "lock-in features", which I have always avoided rather than desired.
And RTRT performance is again hugely lacking
The number of games that support RT is simply minuscule compared to the immense library of games (that many of us haven't played yet) that don't have it and don't need it.

Talking about the ones that have it: very, very few of the games with RT give you the visual reward it's supposed to deliver, and even then at an insane performance hit.

Wait, aren't you the same "birdie" that is always trashing AMD and kissing Nvidia's behind over at Phoronix?

Anyway, AMD should rename the 7900 XT to 7800 XT, and both GPUs (7900 XT and XTX) should be priced lower. But then again, it seems nobody really complained about how high the 3090 Ti and 4090 MSRPs were, so why not ask for the moon at this point?
 