
AMD Radeon RX 7900 XTX

CPU bottlenecks very hard in some games, so the ~45% uplift vs the 6900 XT @ 4K is pessimistic. For any game utilising the GPU properly it gets closer to 60%, and in some games even 80% over the 6900 XT (per other online reviews). So the up-to-54% higher efficiency claim holds up.
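
For anyone who wants to sanity-check the efficiency claim themselves, here's a minimal sketch of the maths (Python; the FPS and power figures below are illustrative placeholders, not TPU's measured numbers):

Code:
# Rough perf-per-watt comparison; the numbers are illustrative placeholders,
# not the review's measured values.
old_fps, old_watts = 100.0, 300.0   # e.g. a 6900 XT in a GPU-bound title
new_fps, new_watts = 160.0, 312.0   # e.g. a 7900 XTX in the same title

perf_uplift = new_fps / old_fps                                     # raw performance ratio
efficiency_uplift = (new_fps / new_watts) / (old_fps / old_watts)   # perf-per-watt ratio

print(f"Performance: {perf_uplift:.2f}x")        # 1.60x
print(f"Perf/Watt:   {efficiency_uplift:.2f}x")  # ~1.54x

The point is simply that the efficiency figure only holds in GPU-bound scenarios; a CPU bottleneck drags the FPS ratio (and therefore the perf/W ratio) down.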
 
I'm testing a video file, not a stream, so I have perfect repeatability, independent of internet speed, YT shenanigans, codec changes, etc. But yes, it should be similar for YT as well.

88 W @ 8K? Or even 88 W @ 1080p?
 
Did we look at the same review? It's matching the 3090 Ti quite often, and a couple of times it sits between the 4080 and 4090 in RT. The 6000 series is languishing miles away in RT performance, not even remotely "identical" performance. Judging by results such as Far Cry, Resident Evil, and Watch Dogs, it looks like a solution can be found in driver optimizations.
Look at the (% changes) for the 6950 XT versus the 7900 XTX. The 7900 XTX's efficiency in raytracing is basically on par with the 6000 series. This is disappointing, especially as they specifically said they worked on raytracing performance for this architecture. It looks like the architectural changes to increase raytracing performance didn't translate well into real-world game raytracing performance.
 
I definitely think it's in a good spot in the terrible performance/value high-end segment... that being said, used 6900 XTs and 3080s / 3080 Tis are looking amazing right now.
 
The performance difference is smaller than stated. With a good overclock, the RTX 4080 can be brought up to the raster performance of this card.
Pros are more up-to-date video outputs, more memory and more efficient power consumption. A future driver update should remove the excess multi-monitor power draw. There may be some increase in performance as well.
 
We should probably keep in mind that while AMD has experience with chiplets in CPUs, GPUs are surely a bit more complex.
Since this is a totally new architecture (GPU chiplets), I feel like a lot of driver optimizations are still to come for it.
 
No GPU is bad if it is priced accordingly. But these are not priced well.
Knock $150+ off and we are sold on AMD regardless of the unacceptable RT numbers.
 
If AIBs don't overclock far better than the reference cards to compensate for the shitty generational leap, then this is another Vega 64 moment for AMD. So fucking disappointing.
"Up to 1.7x faster than the 6950 XT" my fucking ass
It's your fault, don't blame AMD, or you're trolling
 
Pricing aside, AMD did it right... performance and RT are competitive (RT is a bit irrelevant for me, as it brings nothing imho, but it's still good to see them making progress in that direction). The reference cards look drop-dead gorgeous: 2.5-slot, no 12VHPWR connector needed.

Imho, they did better than Nvidia and certainly better than I expected, and if I hadn't already done right by buying an RX 6700 XT this year when the price dropped significantly, I would definitely go for an XT or XTX, although not at their prices... but I would have been forced to, because I suspect the RX 7700 XT will not be a worthy successor (suffering the same syndrome as the 4070: reduced specs from one generation to the next). The RX 7700 XT and 4070 feel like they belong to the 7600 XT and 4060 line in terms of specs (to me, the 7900 XT should be the 7700 XT).


In terms of price, the XTX should be $850 and the XT $750 at most... (well, even if they were at that MSRP, in Switzerland they would be 900 (XT) and 1000 (XTX) anyway :laugh: I guess I will see them around 1100 (XT) and 1200 (XTX))

I can skip a gen again (or more... after all, I stayed 5-6 yrs with my 1070 :laugh:)
 
AMD really messed up with their presentation. They had a lot of people believing they would be around 1.5-1.7x the performance of a 6950 XT, when in reality it was closer to 1.35-1.4x. Based on their presentation figures, I'd expected them to be further ahead of the 4080 in raster performance (around 15%), and in RT performance closer to a 3090/3090 Ti and well behind the 4080, which ended up being the case.

Based on the reviews, and the relative performance of the 6000/3000 series GPUs at the prices you can currently pick them up for, the 7900 XTX should be an $899 product and the 4080 a $999 one.
 
I'd say, why is the 4090 so damn fast? That's a helluva card (too pricey for me), and it was always going to be a reach for AMD to catch it.

I think the XTX is good but the XT would be better at $800 max (or ideally, £799 UK equiv.)

I don't think the 4090 is fast - it's normal for a generational architecture update and that number of transistors.
But:
4090 - 78 B transistors for 122% performance
7900 XTX - 58 B transistors for 100% performance

Actually, the 7900 XTX has higher performance per transistor, just too few of them.
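
Working that out explicitly (a quick sketch using the rough figures above, not official transistor counts):

Code:
# Perf-per-transistor from the rough figures quoted above
perf_4090, transistors_4090 = 122, 78e9   # relative performance (%), transistor count
perf_xtx,  transistors_xtx  = 100, 58e9

print(perf_4090 / transistors_4090 * 1e9)  # ~1.56 perf-points per billion transistors
print(perf_xtx  / transistors_xtx  * 1e9)  # ~1.72 perf-points per billion transistors

So per transistor, Navi 31 actually does slightly more work; it just has roughly 25% fewer transistors to throw at the problem.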

The 7900 XTX is bad for other reasons, too:
- too high power consumption at idle (13 watts), multi-monitor (103 watts), video playback (88 watts) and with V-Sync (127 watts).
- extremely high price tag.
 
88 W @ 8K? Or even 88 W @ 1080p?
See the test notes, it's "4K 30 FPS video that's encoded with H.264 AVC at 64 Mbps bitrate"

edit: just tested it, YT 4K = ~100 W, 1440p = 100 W too, 1080p = 50 W, 720p = 21 W
edit2: added to conclusion, great question
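
If anyone wants to reproduce this kind of measurement at home on Linux, here's a minimal sketch (Python; assumes an amdgpu card exposing the standard hwmon power1_average attribute, which reports board power in microwatts - path discovery and sampling are deliberately simplistic):

Code:
# Minimal sketch: log average GPU board power once per second via amdgpu hwmon.
# power1_average is reported in microwatts by the amdgpu driver.
import glob, time

sensors = glob.glob("/sys/class/hwmon/hwmon*/power1_average")
if not sensors:
    raise SystemExit("no hwmon power1_average found (is this an amdgpu card?)")

while True:
    microwatts = int(open(sensors[0]).read())
    print(f"{microwatts / 1e6:.1f} W")
    time.sleep(1)

Start the video, let it run for a minute or so, and average the readings; software power reporting isn't as accurate as the review's dedicated measurement setup, so treat the numbers as ballpark.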
 
A very disappointing effort from AMD so far. I was expecting the 7900 XTX to be around 90% of a 4090 in traditionally rasterized games, but it is barely beating the 4080. On top of that, the horrendous multi-monitor and video playback power consumption is infuriating. I hope the latter is a driver bug. Given that RDNA3 is very different from RDNA2, maybe driver optimizations will increase the distance from the 4080 over time. All in all, I think I'll be snagging a 6800 XT on the cheap instead of buying a 7700 XT.
 
No GPU is bad if it is priced accordingly. But these are not priced well.
Knock $150+ off and we are sold on AMD regardless of the unacceptable RT numbers.

'Unacceptable RT performance' - that's quite something.

First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there. Secondly, RT is still a gimmick; Nvidia has made you believe it is worth paying a premium for 16% more of it.
 
I definitely think it's in a good spot in the terrible performance/value high-end segment... that being said, used 6900 XTs and 3080s / 3080 Tis are looking amazing right now.

Tell me about it, I went from a GTX 980 Ti to an RTX 3080 FE for a total of £450. Very happy :p
 
Expected better raster performance vs 4080.

@W1zzard In the GPU-Z screenshot, Revision C8 indicates that there were many revisions, correct? Compared to RDNA2's C0/C1 for most of the cards. Or is it just some arbitrary info?
 
'Unacceptable RT performance' - that's quite something.

First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there. Secondly, RT is still a gimmick; Nvidia has made you believe it is worth paying a premium for 16% more of it.
3000-series cards being released in late 2022 with a $1000 price tag would be unacceptable.

Maybe people are getting tired of AMD settling for matching NVIDIA's previous gen every release.
 
@W1zzard In the GPU-Z screenshot, Revision C8 indicates that there were many revisions, correct? Compared to RDNA2's C0/C1 for most of the cards.
No. Since AMD has started to run out of device IDs (the 744C number), they are separating some SKUs by the revision ID. It's an arbitrarily selected number that's just different on some cards, e.g. XT and XTX are the same device ID but a different revision. Different SKUs, like the Pro cards, still get their own device ID, and their revision ID is 0 or some other seemingly random value. It has absolutely nothing to do with respins/ASIC revisions or similar for recent AMD cards.

I hope that makes sense, happy to explain more
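
For anyone curious where GPU-Z gets those two numbers from, here's a minimal sketch (Python, Linux-only; it reads the standard PCI sysfs attributes, and the 0x1002 AMD vendor filter is the only assumption) that prints each AMD device's PCI device ID and revision ID:

Code:
# List PCI device ID and revision ID for AMD (vendor 0x1002) devices on Linux.
# These are the same two fields GPU-Z shows as "Device ID" and "Revision".
import glob

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        vendor = open(f"{dev}/vendor").read().strip()         # e.g. 0x1002 for AMD
        if vendor != "0x1002":
            continue
        device = open(f"{dev}/device").read().strip()         # e.g. 0x744c (Navi 31)
        revision = open(f"{dev}/revision").read().strip()     # e.g. 0xc8 on the XTX
        print(dev.rsplit("/", 1)[-1], device, revision)
    except FileNotFoundError:
        continue

Same device ID plus a different revision byte = same silicon, different SKU, exactly as described above.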
 
This reminds me of Bulldozer. RDNA3 looks like Bulldozer to me. That dual-ALU(?) architecture (please correct me here, I am probably describing it wrong) doesn't seem to work as expected and probably needs much more work from the software team to optimize for it. That's why I bring up Bulldozer. Bulldozer had the same issue: dual ALUs and one FPU. It was losing even to Phenom in some cases, or it wasn't offering what someone would expect from an "eight-core" CPU. Same here: we see performance closer to 6144 SPs in some cases than what we would expect from 12288 SPs. For example, F1 22 at 4K: 6900 XT at 132 fps, 7900 XT at 152 and 7900 XTX at 183 fps. Who wasn't expecting the 7900 XTX at 200+ minimum before the reviews? Those power consumption problems with multi-monitor and video playback also scream "rushed, immature drivers". The software team probably had other priorities (meaning performance, obviously) than power consumption.

Let's hope they can fix it, or Nvidia will be selling RTX 4070s like hot cakes. Even the RTX 4080 might start selling better after today's reviews.
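
Putting rough numbers on the F1 22 example (fps figures from the post above; the shader counts are the paper specs, with RDNA3's figure counting the dual-issue lanes):

Code:
# F1 22 @ 4K scaling vs. paper shader counts (figures from the post above)
fps_6900xt, fps_7900xtx = 132, 183
sp_6900xt,  sp_7900xtx  = 5120, 12288   # RDNA3 counts dual-issue lanes

print(fps_7900xtx / fps_6900xt)   # ~1.39x actual uplift
print(sp_7900xtx / sp_6900xt)     # 2.4x naive SP-count uplift

That gap between ~1.39x delivered and 2.4x on paper is exactly the Bulldozer-style impression the post describes: the extra ALU throughput often goes unused.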
 
AMD has abandoned the ray-tracing optimisations. 55% faster means a generation or two ahead o_O

 
This reminds me of Bulldozer. RDNA3 looks like Bulldozer to me. That dual-ALU(?) architecture (please correct me here, I am probably describing it wrong) doesn't seem to work as expected and probably needs much more work from the software team to optimize for it. That's why I bring up Bulldozer. Bulldozer had the same issue: dual ALUs and one FPU. It was losing even to Phenom in some cases, or it wasn't offering what someone would expect from an "eight-core" CPU. Same here: we see performance closer to 6144 SPs in some cases than what we would expect from 12288 SPs. For example, F1 22 at 4K: 6900 XT at 132 fps, 7900 XT at 152 and 7900 XTX at 183 fps. Who wasn't expecting the 7900 XTX at 200+ minimum before the reviews? Those power consumption problems with multi-monitor and video playback also scream "rushed, immature drivers". The software team probably had other priorities (meaning performance, obviously) than power consumption.

Let's hope they can fix it, or Nvidia will be selling RTX 4070s like hot cakes. Even the RTX 4080 might start selling better after today's reviews.

This is more like Zen 1 IMO - they're sacrificing efficiency for the first-gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line. The first Zen had the same issues.

If they had just die-shrunk and optimized RDNA 2 they would have gotten better results, for sure, but they're opting for the Zen strategy.

I definitely think it's priced OK relative to the 4080 -- I don't agree with the reviews saying "Nvidia has to lower prices", since they do have more features and better RT performance; to me the 4080 and 7900 XTX are at parity.
 
It's your fault, don't blame AMD, or you're trolling
???
1.7x the performance of the 6950 XT at 4K in Cyberpunk would at least put the 7900 XTX on par with the 4090, yet it's only 1.42x faster - same story in most other games.

[attached charts]


113 / 84 = 1.34x the fps (i.e. 34% more)
 
This is more like Zen 1 IMO - they're sacrificing efficiency for the first-gen modular design, but it has the potential to help them scale (or realize massive efficiencies) down the line. The first Zen had the same issues.

If they had just die-shrunk and optimized RDNA 2 they would have gotten better results, for sure, but they're opting for the Zen strategy.

Down the line now means 2 years in the future, 4 years in the future, 6 years in the future.
I bet they will change the architecture, and no one will give them that much time.

Remember that AMD has only 8% market share today, and that will soon be overtaken by Intel Arc.
 
Will be interesting to see how it undervolts; if the 4090 can achieve its default FPS under 340 W at 0.95 V, the 4080 can probably go below 300 W too without much performance loss.

It has crossed my mind a couple of times to sell the 4090 if I don't start using it for anything other than gaming, get a 7900 XTX and pocket the 700-800€ difference, turn on FreeSync and play at 120 Hz instead of 144 Hz, and hope I don't regret it if/when some good RT games come out that I'll play :laugh:
 
Down the line now means 2 years in the future, 4 years in the future, 6 years in the future.
I bet they will change the architecture, and no one will give them that much time.

Remember that AMD has only 8% market share today, and that will soon be overtaken by Intel Arc.

I'm not sure Arc is even on the radar. The only competitor to Nvidia is AMD - Arc is 3 gens behind.
 