
AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full AD103 die (336 TMUs/TCs, up from 304), raise the clocks from 2505 MHz to 2610 MHz (the RTX 3080 12GB's boost clock) and the TDP to 350 W, and it would be just 10% slower than the RX 7900 XTX in classic raster while staying faster in raytracing; that will probably be enough given Nvidia's brand awareness.
It seems RDNA3's SPs having double the FP32 TFLOPS per clock (by being dual-issue) is yielding a smaller performance uplift than hoped (of course AMD will tell you that it's early days, and that with driver updates and newer games optimized for RDNA3's architecture it will get better...)
In any case, even if 6nm Navi 33 can hit the same 3 GHz clocks as the 5nm models and the reference model boosts to, say, 2.85 GHz, it likely won't be more than 1.5X the 6600 XT at FHD, so it won't match the 6900 XT at FHD, and at QHD the RX 6800 XT will be much faster (in 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800 XT $535 right now on Newegg, and both are 16 GB cards. I would advise anyone looking for a ≤$499 card to buy during the Black Friday/Cyber Monday offers; in 4K raster performance per dollar they will likely be a wash or better versus the Q1 2023 releases (full Navi 32 at $649, for example), comparing the upcoming RDNA3 models' SRPs against RDNA2 Black Friday deals.
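Rough break-even math in Python, for what it's worth (the $479 price is today's Newegg listing quoted above; the $649 full Navi 32 SRP is my guess, not a confirmed figure):

```python
# Rough perf-per-dollar break-even sketch.
# RX 6800 price is today's Newegg listing; the $649 full Navi 32 SRP
# and its relative 4K raster performance are hypothetical.
rx6800_price = 479
rx6800_perf = 1.00          # use RX 6800 4K raster as the baseline

navi32_price = 649          # assumed SRP from the post above

# How much faster than an RX 6800 would full Navi 32 need to be at 4K
# just to match the RX 6800's performance per dollar?
break_even_uplift = navi32_price / rx6800_price
print(f"Navi 32 needs ~{break_even_uplift:.2f}x RX 6800 4K raster to break even on perf/$")
# -> ~1.35x; anything less and the Black Friday RDNA2 deal wins on value.
```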
 
Same, my 3080 continues to impress me 2 years in, but the desire for more performance can never be truly quenched. I'll be looking closely at the 7900 XTX after release for sure, hoping to see it shake up the market a bit and force more compelling prices from Nvidia around that price point too.
I'll also be looking at the XTX pretty closely. The number of titles I play that actually have meaningful raytracing is just two, and it's not as if RDNA3 can't raytrace; it's just not going to be 4090-tier (or possibly even 4080-tier) when it comes to full lighting/shadow/occlusion raytracing. If you stick to raytraced reflections only, the AMD hardware is basically pretty competitive.

My interest will be in getting an XTX and tuning it to see if I can run it at 200-250 W without losing too much performance. If the 7900 XTX can't manage that, perhaps the 7800 XT will do. My HTPC is currently rocking a 6700 10GB which sips about 125 W under full load after an undervolt, and I really don't think my case or slim furniture can handle anything more than 200-250 W. The irony is that it's right underneath the only 4K display in the house, so it needs the most GPU power.
 
It seems RDNA3's SPs having double the FP32 TFLOPS per clock (by being dual-issue) is yielding a smaller performance uplift than hoped (of course AMD will tell you that it's early days, and that with driver updates and newer games optimized for RDNA3's architecture it will get better...)
And there we have it again; claims of better performance with better drivers and games.
It's the same old claim that AMD (or Intel) will catch up to Nvidia with better software over time, but it never happens. If driver overhead were holding back performance, we would see a progressively growing overhead with the higher-tier cards, holding them back to the point where high-end cards become almost pointless. The performance figures we've seen so far do not indicate this, and when reviews arrive, we can probably discredit that claim completely.
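A crude way to picture that argument (every number here is invented purely for illustration): if the CPU/driver path capped frame delivery, the faster the GPU, the bigger the share of its potential it would lose, so the high-end cards would visibly flatten out in the charts.

```python
# Toy model: frame rate is limited by whichever side is slower.
# All numbers are invented for illustration, not measured figures.
def delivered_fps(gpu_fps: float, cpu_driver_fps: float) -> float:
    """FPS is capped by the slower of the GPU and the CPU/driver path."""
    return min(gpu_fps, cpu_driver_fps)

cpu_driver_cap = 150          # hypothetical CPU+driver limit in some game
for name, gpu_fps in [("mid-range", 90), ("high-end", 140), ("flagship", 200)]:
    fps = delivered_fps(gpu_fps, cpu_driver_cap)
    lost = 1 - fps / gpu_fps
    print(f"{name}: {fps} fps delivered, {lost:.0%} of GPU potential lost to the cap")
# If overhead were the bottleneck, the flagship would lose the most
# (25% here) - a pattern the published figures don't show.
```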

And no, (PC) games are not optimized for specific GPU architectures. Games are written using DirectX or Vulkan these days, and neither is tailored to a specific GPU architecture. Games may have some exclusive features requiring specific API extensions, but these don't skew the benchmark results.

BTW, did anyone catch the review embargo?

RX 6800 is $479 and 6800XT $535 right now in Newegg and are 16GB cards, I would advise anyone looking for ≤$499 cards to buy at Black Friday/Cyber Monday offers, likely it will be a wash or better offers vs Q1 2023 releases (Full Navi32 at $649 for example) regarding 4K raster performance/$ (RDNA3 upcoming model's SRPs vs RDNA2 Black Friday deals)
I agree, I expect some good deals from both makers, so people had better set some price notifications to grab the best deals.
 
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full AD103 die (336 TMUs/TCs, up from 304), raise the clocks from 2505 MHz to 2610 MHz (the RTX 3080 12GB's boost clock) and the TDP to 350 W, and it would be just 10% slower than the RX 7900 XTX in classic raster while staying faster in raytracing; that will probably be enough given Nvidia's brand awareness.
It seems RDNA3's SPs having double the FP32 TFLOPS per clock (by being dual-issue) is yielding a smaller performance uplift than hoped (of course AMD will tell you that it's early days, and that with driver updates and newer games optimized for RDNA3's architecture it will get better...)
In any case, even if 6nm Navi 33 can hit the same 3 GHz clocks as the 5nm models and the reference model boosts to, say, 2.85 GHz, it likely won't be more than 1.5X the 6600 XT at FHD, so it won't match the 6900 XT at FHD, and at QHD the RX 6800 XT will be much faster (in 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800 XT $535 right now on Newegg, and both are 16 GB cards. I would advise anyone looking for a ≤$499 card to buy during the Black Friday/Cyber Monday offers; in 4K raster performance per dollar they will likely be a wash or better versus the Q1 2023 releases (full Navi 32 at $649, for example), comparing the upcoming RDNA3 models' SRPs against RDNA2 Black Friday deals.
Maybe you underestimate the gap between the 4090 and the 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time Nvidia went all out to keep the crown with their halo GPU, and the rest of the lineup will get annihilated in performance, power draw and value for money (saved only in RT, and only in the games that fully utilise it). My 5c.
 
What we can extrapolate is that the 7900 XTX should have around 50-60% more performance than the 6900 XT in pure rasterization, so without RT and all that FSR/DLSS bullshit. So look at some 6900 XT benchmarks and take a guess. While its RT performance might not be up to par with the 4090, since it's about a generation behind in that respect, it's still up to 50% greater than before, which probably puts it in the ballpark of the 3090/3090 Ti in RT, still far below the 4090.
Then take the price into account: it would probably be far superior to the 4080 in most games barring RT, and it's also cheaper, $999 vs $1199, so it's definitely the better choice IMO. The 7900 XTX might not be targeting the 4090, especially at its price point, but rather the 4080.
This right here is what I've suspected and been telling friends and coworkers. I believe the 7900 XTX is going to be more on par with the 4080. Now, if it does happen to come somewhat closer to the 4090, then Nvidia is going to be in a bunch of trouble due to pricing and DisplayPort 2.1. I still have zero clue as to why Nvidia skipped out on including DP 2.1, which in itself is a selling point.

BTW the 8K reference is not really 8K (7680x4320 = 33.2M pixels) but a widescreen 8K (7680x2160 = 16.6M pixels)
Yes, you are correct! 8K is not 4K x2, that is not how it works, and I see how someone could be tricked into thinking it is, which is clearly what AMD is doing here. 8K is 4K x4.
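Quick pixel math to back that up:

```python
# Pixel-count check: the "8K" in AMD's marketing vs real 8K.
uhd_4k  = 3840 * 2160        # ~8.3 MP
real_8k = 7680 * 4320        # ~33.2 MP = 4x the pixels of 4K
wide_8k = 7680 * 2160        # ~16.6 MP = only 2x 4K ("8K ultrawide")

print(real_8k / uhd_4k)      # 4.0
print(wide_8k / uhd_4k)      # 2.0
```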
 
Maybe you underestimate the gap between the 4090 and the 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time Nvidia went all out to keep the crown with their halo GPU, and the rest of the lineup will get annihilated in performance, power draw and value for money (saved only in RT, and only in the games that fully utilise it). My 5c.
The theoretical difference between the 4090 and a full AD103 (I don't mean the raw FP32 diff of 512/336) is close to +40%, I agree; let's say +39% as an example.
My assumption is that on TPU's 5800X testbed with the current game selection, the RTX 4090 realizes around 10% less than its potential.
For example:
RTX 4090 theoretical 4K: 139%
RTX 4090 realized 4K: 125% (-10%)
Full AD103 at the quoted clocks: 100%
I may be wrong; we will see what performance the current RTX 4080 configuration actually achieves (304 TMUs/TCs vs 336, and clocked ~4% lower than my proposed AD103 spec) relative to the RTX 4090.
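For what it's worth, here's the same back-of-envelope laid out in Python; the +39% theoretical gap and the 10% "realization loss" are my own assumptions from above, not measurements:

```python
# Back-of-envelope scaling estimate. The +39% theoretical gap between a
# full AD103 and the 4090, and the 10% realization loss on the TPU
# 5800X testbed, are assumptions, not measured data.
full_ad103_theoretical = 1.00
rtx4090_theoretical    = full_ad103_theoretical * 1.39   # ~139%

realization_factor = 0.90                                # 4090 loses ~10% to CPU/game limits
rtx4090_realized   = rtx4090_theoretical * realization_factor

print(f"RTX 4090 realized vs full AD103: {rtx4090_realized:.0%}")   # ~125%
```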
 
I'm waiting to see the AMD flagship that sells for $1000 and offers the performance of the 4090. It would be a pleasant surprise, but it's not Lisa Su's style.

I don't know why so many people don't believe that AMD can undercut quite substantially with good pricing. After all, the chiplets were made exactly to cut costs.
AMD has a ~300 sq. mm die vs Nvidia's roughly 2x larger die. Of course AMD's product is around 60-70% of the cost of Nvidia's.
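To put a very rough number on that, here's a dies-per-wafer sketch. The die areas are approximate public figures; the wafer price and defect density are pure placeholders, and it ignores Navi 31's N6 MCDs, packaging and memory, so treat it as illustration only:

```python
import math

# Very rough dies-per-wafer / cost-per-die sketch.
# Die areas are approximate public figures; wafer price and defect
# density are placeholders. Ignores the six N6 MCDs, packaging, memory
# and binning, so this is illustration only.
WAFER_DIAMETER_MM = 300
WAFER_PRICE_USD   = 15000      # hypothetical N5-class wafer price
DEFECT_DENSITY    = 0.0007     # defects per mm^2 (assumed)

def gross_dies(area_mm2: float, d: float = WAFER_DIAMETER_MM) -> float:
    """Classic dies-per-wafer approximation (area term minus edge loss)."""
    return math.pi * (d / 2) ** 2 / area_mm2 - math.pi * d / math.sqrt(2 * area_mm2)

def cost_per_good_die(area_mm2: float) -> float:
    """Cost per good die with a simple Poisson yield model."""
    yield_rate = math.exp(-DEFECT_DENSITY * area_mm2)
    return WAFER_PRICE_USD / (gross_dies(area_mm2) * yield_rate)

for name, area in [("Navi 31 GCD (~300 mm^2)", 300), ("AD102 (~608 mm^2)", 608)]:
    print(f"{name}: ~{gross_dies(area):.0f} gross dies, ~${cost_per_good_die(area):.0f} per good die")
# The smaller die gets ~2.2x the candidates per wafer *and* yields better,
# which is the whole point of going chiplet.
```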
 
Impressive if it's really like that, given the 95 W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a while...
 
I really can't say the prices are great for AMD, since the 7900 XT is going to be $899. That is a lot. NV is just a kick in the teeth for consumers, and that is not the end of the story, since there will be a 4090 Ti I suppose. Anyway, I will wait for the reviews; RT performance is still not a deal breaker, but not a winning deal either. We are getting there, but we are not there yet.
I only hope the AMD GPUs are as fast as advertised.

Impressive if it's really like that, given the 95 W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a while...
Some reviewers claim it is already a big joke and a cash grab.
 
AMD has already stated that the 7900 XTX is a video card to challenge the RTX 4080 and not the RTX 4090.
 
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full AD103 die (336 TMUs/TCs, up from 304), raise the clocks from 2505 MHz to 2610 MHz (the RTX 3080 12GB's boost clock) and the TDP to 350 W, and it would be just 10% slower than the RX 7900 XTX in classic raster while staying faster in raytracing; that will probably be enough given Nvidia's brand awareness.
Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.

Yes, you are correct! 8K is not 4K x2, that is not how it works, and I see how someone could be tricked into thinking it is, which is clearly what AMD is doing here. 8K is 4K x4.
It is either 8K half-height or 4K ultrawide. Pick your poison! :P
 
AMD has already stated that the 7900 XTX is a video card to challenge the RTX 4080 and not the RTX 4090.
They did the same for the 6900 XT vs the 3080, but it ended up matching the 3090 at 1440p & 4K.
 
The theoretical difference between the 4090 and a full AD103 (I don't mean the raw FP32 diff of 512/336) is close to +40%
No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166% (1 / 0.60 ≈ 1.67).
 
Exactly, and that (7950 XTX...?) will probably arrive around the same time as the 4090 Ti.
And yes, the RT performance of the 7900 XTX is known (per AMD's claims) to be ~1.8x that of the 6950 XT, which places it around the 3090/Ti.
It's just math: +50% per CU and +20% more CUs (quick check after the ranking below):
1.0 x 1.5 = 1.5, then 1.5 x 1.2 = 1.8x
7950/7970 XTX >|~|< 4090 Ti > 4090 ~/a little > 7900 XTX > 4080 16GB >|~|< 7900 XT > "Unlaunched" 4080 12GB.
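Quick check of that compounding, using AMD's ~1.5x per-CU claim and the actual CU counts (96 vs 80); note the uplifts multiply, they don't add:

```python
# The per-CU RT uplift (~1.5x, AMD's claim) and the CU-count increase
# (96 CUs on the 7900 XTX vs 80 on the 6950 XT = 1.2x) compound multiplicatively.
per_cu_uplift = 1.5
cu_count_uplift = 96 / 80

total = per_cu_uplift * cu_count_uplift
print(total)                 # 1.8 -> ~1.8x the 6950 XT's RT throughput
# Note: adding the percentages (50% + 20% = 70%) would understate it.
```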

It will be very compelling to see how it all plays out once they have all been released.
 
I beg to differ, don't fall for the memes.
However, it's objective to say that Nvidia's drivers are worse on Linux.
It shows you've been retired for a while now, because this is absolute nonsense.

That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology: people think one is completely free and open and the other is proprietary and evil, when the reality is both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated, heavily abstracted, full of workarounds and a complete mess.
...and this is the truth.

Maybe you underestimate the gap between the 4090 and the 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900 XT. This time Nvidia went all out to keep the crown with their halo GPU, and the rest of the lineup will get annihilated in performance, power draw and value for money (saved only in RT, and only in the games that fully utilise it). My 5c.
That is exactly why it appears they cancelled the 4080 12GB. Initially I thought they had to reposition because of their OWN marketing (after all, such a large gap between two 4080 cards, plus a different VRAM capacity to boot, meant these just weren't two similar cards in any way), but with the 7900 XTX performance estimates out the door, we can easily defend the idea that the 4080 12GB turd was pulled back because otherwise AMD would have had a much better story at the high end all the way through. After all, if they drop the number to an x70, AMD's 'faster' cards are no longer all competing with (and in many cases performing above) 4080s. It's a better marketing reality.

It is also highly likely the 4080 16GB will get repositioned - in MSRP. $200 or even $300 extra just for somewhat better RT performance is steep. Too steep - and that's giving Nvidia the benefit of the doubt that the 4080 won't get eclipsed by AMD's 7900 XT (yes, XT). I honestly think the 4080 is going to be only situationally equal, and overall lower in raster performance, and even the 7900 XT will be highly competitive with it, seeing the tiny gap between the XTX and it.
 
Since you started the "nobody" talk, let me deliver my suggestion: nobody should pay this much for a gaming device, only for professional reasons. And 1440p is a great resolution for everyone. 4K will not become mainstream even in 10 years, since most people (>80%) aren't, and won't be, willing to spend so much on the monitor and GPU combo needed.
Well, I for one find my new 4K monitor fantastic compared to 1080p. That said, I wonder how much of a difference it would have been vs 1440p. I just don't think your statement is valid; just because you can't justify spending so much on something for gaming doesn't mean others can't. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS2020, which will be amazing once I get a card that can run 4K smoothly.
 
Well, I for one find my new 4K monitor fantastic compared to 1080p. That said, I wonder how much of a difference it would have been vs 1440p. I just don't think your statement is valid; just because you can't justify spending so much on something for gaming doesn't mean others can't. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS2020, which will be amazing once I get a card that can run 4K smoothly.
It's what you settle for in the end; we all make our choices. But there are also just the laws of physics and ergonomics; 4K is not required in any way to get high graphical fidelity. I run a 3440x1440 panel (close enough, right...;)) and it's really the maximum height that's comfortable to view, while the width is already 'a thing' and only a curve makes it a good fit - 4K has 400 extra pixels of width and 720 of height.

4K struggles with efficiency because you're basically wasting performance on pixels you'll never notice at the supposed ideal viewing distance. You'll make a choice between a performance sacrifice for no uptick in graphical fidelity, versus sitting closer to see it all and killing your neck/back. At longer viewing distances, you can make do with a lower resolution for the exact same experience. Another issue is scaling: 4K requires it, or small text is just unreadable.

It's a thing to consider ;) Not much more; the fact remains 4K is becoming more mainstream, so there's simply more on offer, specifically also OLED. But the above is where the statement '1440p is enough' really comes from. It's a sweet spot, especially for a regular desktop setting. Couch gaming follows a different ruleset, really. But do also consider the advantages: I can still play comfortably at 3440x1440 on a GTX 1080... (!) 4K is going to absolutely murder this card, though. Jumping on 4K is tying yourself to a higher expense to stay current on GPU, or sacrificing more FPS for the wanted IQ.
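To put numbers on the "wasted pixels" point (the 34" and 32" diagonals are just assumed example sizes, not anyone's actual monitors):

```python
import math

# Pixel density vs render load. The 34" ultrawide and 32" 16:9 4K
# diagonals are assumed example sizes for illustration.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

uw  = (3440, 1440, 34.0)   # ultrawide QHD
uhd = (3840, 2160, 32.0)   # 16:9 4K

print(f'3440x1440 @ 34": {ppi(*uw):.0f} PPI, {uw[0] * uw[1] / 1e6:.1f} MP to render')
print(f'3840x2160 @ 32": {ppi(*uhd):.0f} PPI, {uhd[0] * uhd[1] / 1e6:.1f} MP to render')
print(f"Render load ratio: {uhd[0] * uhd[1] / (uw[0] * uw[1]):.2f}x")
# ~110 vs ~138 PPI, but ~1.67x the pixels to push every frame.
```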

Some reviewers claim it is already a big joke and a cash grab.
Nvidia has every opportunity to tweak the lineup, and the better half of it isn't even out yet... They always run the risk of misfires because they release first.
 
4K is becoming more mainstream

Only the PC environment is lagging behind reality; 4K completely dominates the markets in which it has been allowed to develop.
Also, AMD already markets an "8K", or rather 4K-ultrawide, experience with the Radeon RX 7900 series GFX.

A local store with offers (quantity of offers is in brackets):

[attached screenshot of the local store's offers]
 
Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.
All GPCs are active in the RTX 4080; Nvidia just disabled some SMs. All they have to do is re-enable them on the AD103 dies that can be fully utilized, while the rest go into future cut-down AD103-based products (and also increase the clocks for the full-AD103 parts).
And anyway, my point wasn't what Nvidia will do, but what it could achieve based on AD103's potential...

No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166% (1 / 0.60 ≈ 1.67).
According to the leak, even an OC'd, cut-down RTX 4080 (304 TCs enabled vs the 336 of my higher-clocked full-AD103 config...) appears to be only ~20% slower than the RTX 4090 in the 3DMark Time Spy Performance preset and ~27% slower in the Extreme 4K preset...
You do your math, I will do mine!
For example, the theoretical shading-performance delta alone is useless for extracting the performance difference between two models; it's much more complex than that...
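One nit on the percentage framing, since it trips people up: "20% slower" and "27% slower" translate into bigger "X% faster" numbers the other way around:

```python
# Converting "X% slower than the 4090" into "the 4090 is Y% faster".
# The 20%/27% Time Spy deltas are the leaked figures quoted above.
for preset, slower in [("Time Spy Performance", 0.20), ("Time Spy Extreme 4K", 0.27)]:
    faster = 1 / (1 - slower) - 1
    print(f"{preset}: 4090 is {faster:.0%} faster than that 4080 config")
# -> 25% and 37% respectively; both well short of the raw +52% FP32/TMU delta (512/336).
```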

 
Only the PC environment is lagging behind reality; 4K completely dominates the markets in which it has been allowed to develop.
Also, AMD already markets an "8K", or rather 4K-ultrawide, experience with the Radeon RX 7900 series GFX.

A local store with offers (quantity of offers is in brackets):

[attached screenshot of the local store's offers]


Context, man, you might need to look that word up.

These posts make no sense whatsoever. The 4080 isn't in the correct place in that chart, obviously, and 'local store offers' tell you just about jack shit about where 4K is for gaming. It's marketing; you can find IoT devices like a fridge with '4K support'.

Resolution was, is and always will be highly variable, especially now with FSR/DLSS. There is also a resolution for every use case; it's not true that the only way is up. Enough is enough.
 