
AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

Joined
Jun 6, 2022
Messages
622 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
It could just be that AMD is pricing its hardware for a market in recession, where demand for high-end graphics cards is at an all-time low: people are going out more and gaming less at home, and the second-hand market is still flooded with sub-$1000 3090s and lesser GPUs that deliver far more performance than any game really demands, even at 4K.

When Nvidia announced their RTX 40 pricing, many people accused them of living in an echo chamber. It's not really fair to doubt AMD's new hardware solely because they didn't price it inside that same echo chamber.
I'm waiting to see the AMD flagship that sells for $1,000 and offers the performance of the 4090. It would be a pleasant surprise, but it's not Lisa Su's style.
 
Joined
May 26, 2021
Messages
122 (0.12/day)
Thanks for the explanation, but that methodology makes no sense; it's clearly a 4K ultra-wide under the hood.
I said the same when they called 2160p "4K", even though it has 2K rows.

The methodology now is to measure columns, so columns are what we measure. Nobody has marketed rows since the introduction of 4K and higher resolutions.
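A minimal Python sketch of that column-based naming convention (the two example resolutions and the round-to-nearest-thousand rule are my illustrative assumptions, not any official standard):

```python
# Marketing names like "4K"/"8K" count columns (horizontal pixels);
# names like "2160p" count rows (vertical pixels).
resolutions = {
    "UHD 16:9":   (3840, 2160),   # sold as "4K", is "2160p" by rows
    "Ultra-wide": (7680, 2160),   # sold as "8K", still only 2160 rows
}

for name, (cols, rows) in resolutions.items():
    k_by_columns = f"{round(cols / 1000)}K"   # column-based label
    p_by_rows = f"{rows}p"                    # row-based label
    print(f"{name} ({cols}x{rows}): {k_by_columns} by columns, {p_by_rows} by rows")
```

So a 7680x2160 ultra-wide gets marketed as "8K" by columns even though, by rows, it is the same 2160p as a standard UHD panel.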
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
At worst 10% worse performance than a 4090.
Do we have enough data to conclude that? Have you considered whether the titles shown were cherry-picked?

I expect it to beat RTX 4090 in performance per dollar though.
no shitty nvidia drivers
Just for the record, AMD has never offered better drivers than Nvidia. That's not to say Nvidia is perfect, though.
 

hpr484

New Member
Joined
Nov 4, 2022
Messages
1 (0.00/day)
Way to steal the exact graphs and charts from Linus’s video with estimated performance projections. Couldn’t even bother to modify the colors to make it look like your own charts or give LTT a shoutout?

Hopefully it needs a lot less power to get those numbers, too.

Who bought a launch price 4090?
The reference-model 7900 XT has a power draw of 300 watts on a 2x 8-pin connector setup; the 7900 XTX draws 330 watts, also on 2x 8-pin.

I mean, for $999 if it's even only 75% the performance of a 4090 it's way better performance/$...
As the trend seems to go generation to generation, I'd expect top-of-the-line AMD to compete with second-best Nvidia, but priced at mid-range; that is, the 6900 XT competing with 3080 performance at 3070 (Ti) pricing. I expect the 7900 XTX to beat the 4080, with the 7900 XT about on par with the 4080.
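As a rough sanity check on those numbers, here's a hedged Python sketch: the 150 W per 8-pin connector and 75 W slot figures are the PCIe spec limits, while the 75% performance ratio is just the hypothetical from the post above:

```python
# Power-budget sanity check: each 8-pin PCIe connector is rated for
# 150 W, and the PCIe slot itself supplies up to 75 W.
connectors = 2
budget_w = connectors * 150 + 75            # 375 W ceiling
for card, draw_w in [("7900 XT", 300), ("7900 XTX", 330)]:
    print(f"{card}: {draw_w} W of {budget_w} W -> {budget_w - draw_w} W headroom")

# Performance per dollar under the hypothetical "75% of a 4090" figure above.
xtx_price, rtx4090_price = 999, 1599        # launch MSRPs in USD
relative_perf = 0.75                        # assumed ratio, from the post above
perf_per_dollar_ratio = (relative_perf / xtx_price) / (1.0 / rtx4090_price)
print(f"Relative perf/$ vs the 4090: {perf_per_dollar_ratio:.2f}x")   # ~1.20x
```

Even under that pessimistic 75% assumption, the XTX comes out about 20% ahead on performance per dollar.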
 
Last edited:

Koth87

New Member
Joined
Oct 23, 2021
Messages
3 (0.00/day)
"Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090"

Lmao, that would be a neat trick, considering it has only 59% of the cores of a 4090. The 7900 XTX will *slaughter* the 4080 in raster, might even come close in RT, and it costs $200 less.
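For what it's worth, that 59% figure checks out against the published CUDA core counts; the linear-scaling step below is a naive assumption of mine, since real performance scales sub-linearly with core count:

```python
# Published CUDA core counts for the two Ada cards.
cores_4090 = 16384
cores_4080_16gb = 9728

ratio = cores_4080_16gb / cores_4090
print(f"4080 has {ratio:.1%} of the 4090's cores")   # ~59.4%

# Naive linear-scaling gap; real scaling is sub-linear (clocks, bandwidth,
# occupancy), so the true performance gap should be smaller than this.
print(f"Naive performance gap: {1 - ratio:.0%}")     # ~41%
```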
 
Joined
Aug 3, 2006
Messages
83 (0.01/day)
Location
San Antonio, TX
System Name Geil
Processor Ryzen 6900HS
Memory 16GB DDR5
Video Card(s) Radeon 6700S
Storage 1TB SSD
Display(s) 120Hz 2560x1600
Software Windows 11 home premium
Just for the record, AMD has never offered better drivers than Nvidia. That's not to say Nvidia is perfect, though.
I beg to differ; don't fall for the memes.
However, it's objectively true that Nvidia's drivers are worse on Linux.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I beg to differ; don't fall for the memes.
However, it's objectively true that Nvidia's drivers are worse on Linux.
That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by the ideology that one driver is completely free and open while the other is proprietary and evil, when the reality is that both are partially open. The truth is that the "open" Mesa/Gallium drivers are bloated, heavily abstracted, full of workarounds, and a complete mess.
 

s1nn3r86

New Member
Joined
Nov 5, 2022
Messages
1 (0.00/day)
Lol, what? The 4080 is way more than 10% behind the 4090; that card is so gimped compared to the 90 that it will be closer to 20-30% slower. The XTX will eat it alive in raster, and I don't think the XTX will be too far behind in ray tracing, though that's definitely in the 4080's favour. But if you're like me and don't care about ray tracing, the XTX is by far the better card. The XT will probably be closer to the 4080 while being $300 cheaper. All Nvidia needs to do is lower the price to $1,000, though, and the XT will be irrelevant because of the ray-tracing advantage.
 
Joined
Nov 26, 2021
Messages
1,340 (1.53/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by the ideology that one driver is completely free and open while the other is proprietary and evil, when the reality is that both are partially open. The truth is that the "open" Mesa/Gallium drivers are bloated, heavily abstracted, full of workarounds, and a complete mess.
I can't speak for anyone else, but my GTX 670 gave me many headaches on Linux, whereas my Vega 64 has always worked fine.
 
Joined
Feb 18, 2013
Messages
2,180 (0.53/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Windows 11 Professional x64 (Update 23H2)
If Moore's Law Is Dead's sources are right, the only paper launch here will be the RTX 4080: 20-40% less day-one stock and fewer resupplies than the RTX 4090, LOL. :oops: Assuming it's not getting "unlaunched" (like the RTX 4080 12GB) to save face from getting slapped around by cheaper AMD cards, lol.

Assuming it's right, of course.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,752 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Does one need to jump into a volcano to verify if it is indeed hot?
I'm curious whether you don't realize that's a logical fallacy, or whether you're being purposely intellectually dishonest.

Things like how a game looks and feels are incredibly subjective, and yes, you should absolutely see it with your own eyes and feel the controls to form an opinion that's actually worth something.

So I ask for a reason. People who have zero experience with it and choose to be negative about it, I put those opinions in one pile; if they have constructive thoughts to share, I'll listen. People who bought a 4090 obviously run the risk of confirmation bias, but their opinions still carry far more merit because they're grounded in experience. Optimal, to me, would be unbiased people understanding how it works, then checking it out for themselves and giving a subjective assessment.
 
Joined
Apr 30, 2011
Messages
2,651 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
That pricing and power tuning for the RDNA3 GPUs was a great decision by AMD. They learned from the Zen4 launch and the criticism of pushing wattage unreasonably high just to take the crown. They should keep a "Rage" mode for their CPUs but set stock power limits below 170W, where the efficiency curve would show the CPUs as all-round great against the very power-hungry ones from Intel. So, they decided to limit the reference GPUs to reasonable power levels, losing ~10% to the mega-GPU that is the 4090 while winning in every other metric (efficiency, value for money, size). RT will remain a gimmick for years, though with upscaling and frame-generation techs it will become the norm sooner for some people. Also, the RTX 4080 and below (including the RTX 30 series) are DOA at the prices AMD set for its new GPUs. Moreover, we don't yet know the performance of the RDNA3 GPUs at 1080p and 1440p; if they perform like RDNA2 there, they will win over Nvidia again. Finally, the AIBs will go for clocks close to 3GHz, which will battle hard against the 4090.
 
Joined
Nov 15, 2020
Messages
868 (0.69/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Way to steal the exact graphs and charts from Linus’s video with estimated performance projections. Couldn’t even bother to modify the colors to make it look like your own charts or give LTT a shoutout?


The reference-model 7900 XT has a power draw of 300 watts on a 2x 8-pin connector setup; the 7900 XTX draws 330 watts, also on 2x 8-pin.


As the trend seems to go generation to generation, I'd expect top-of-the-line AMD to compete with second-best Nvidia, but priced at mid-range; that is, the 6900 XT competing with 3080 performance at 3070 (Ti) pricing. I expect the 7900 XTX to beat the 4080, with the 7900 XT about on par with the 4080.
The 6900XT competing with a 3080? What drugs are you on? The 6800XT is the natural competitor. Your theory is bonkers.

We will see, but I expect there will be a paddock of room between Nvidia's 4090 and 4080, which both the 7900XT and 7900XTX will sit comfortably in.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
What drugs are you on?

That is so rude. :oops: :(

They are right: the RTX 3080 10 GB is a competitor to both the RX 6800 XT 16 GB and the RX 6900 XT 16 GB as far as the performance charts show, since the performance deltas are small.

And that's without knowing about, or doing a deeper analysis of, the Nvidia shenanigans where texture quality is lowered because of insufficient VRAM in some games under certain maxed-out settings.
 
Joined
Apr 8, 2008
Messages
328 (0.06/day)
Honestly the fact they didn't compare it directly to the 4090 shows you it's beneath it. And the aggressive pricing tells the story of the bad ray tracing performance. Pretty much another Nvidia win across the board this generation. Sorry AMD.
Honestly, RT is not everything. Most people like the visuals, but IRL most don't use it; you can see for yourself in the NV reddit comments.
Personally, I love it, but not for gaming: for rendering with OptiX (which uses both the RT hardware and HW-accelerated denoising to accelerate 3D rendering), the improvements are massive. The 7900 XTX performs below the 3090 Ti in RT, and the 4090 is massively faster, so a 4070/4080 will be very good, but overpriced.
 
Joined
Nov 23, 2018
Messages
15 (0.01/day)
AMD must deliver for the sake of the buyers and the market. But these are very bold claims considering much lower TFLOPS, frequency, etc. Reviews can't come soon enough!
AMD hasn't lied or fudged a single benchmark of their products at launch for the last 5+ years; every single number they published has been independently verified, year after year.

That pricing and power tuning for the RDNA3 GPUs was a great decision by AMD. They learned from the Zen4 launch and the criticism of pushing wattage unreasonably high just to take the crown. They should keep a "Rage" mode for their CPUs but set stock power limits below 170W, where the efficiency curve would show the CPUs as all-round great against the very power-hungry ones from Intel. So, they decided to limit the reference GPUs to reasonable power levels, losing ~10% to the mega-GPU that is the 4090 while winning in every other metric (efficiency, value for money, size). RT will remain a gimmick for years, though with upscaling and frame-generation techs it will become the norm sooner for some people. Also, the RTX 4080 and below (including the RTX 30 series) are DOA at the prices AMD set for its new GPUs. Moreover, we don't yet know the performance of the RDNA3 GPUs at 1080p and 1440p; if they perform like RDNA2 there, they will win over Nvidia again. Finally, the AIBs will go for clocks close to 3GHz, which will battle hard against the 4090.
Nobody is buying these cards for 1080p gaming. That resolution is useless with current gen hardware and doesn't even belong in benchmarks anymore. The only resolution that matters going forward is 4k.

I'm curious whether you don't realize that's a logical fallacy, or whether you're being purposely intellectually dishonest.

Things like how a game looks and feels are incredibly subjective, and yes, you should absolutely see it with your own eyes and feel the controls to form an opinion that's actually worth something.

So I ask for a reason. People who have zero experience with it and choose to be negative about it, I put those opinions in one pile; if they have constructive thoughts to share, I'll listen. People who bought a 4090 obviously run the risk of confirmation bias, but their opinions still carry far more merit because they're grounded in experience. Optimal, to me, would be unbiased people understanding how it works, then checking it out for themselves and giving a subjective assessment.
I have a 4090 and a 6900XT. I also have a 10GB 3080. My daily driver until the 4090 released was the 6900XT, which is considerably better than the 3080. I only had a 3090Ti for a short while, and I honestly couldn't tell the difference in performance between it and the 6900XT. The 4090 is obviously better at the moment, but I'll be getting a 7900XTX too. I have an LG C2 42 as my display and pretty much only game at 4K.
 
Joined
Feb 20, 2019
Messages
7,285 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
If you buy a 4090 at this point you're a fool.
Honestly, buying a 4090 prior to December 5th, when the review embargo on these cards lifts, was foolish anyway. For the first couple of months the 4090 was always going to be scalped, overpriced, and hard to find in stock, and you were buying blind, with no idea whether it would be the best solution this generation. Impatience and zealotry are the only virtues by which 4090s have sold, so far.

The only reason to actually buy a 4090 at the moment is for CUDA application support, where there's very genuine potential for it to be cost-effective over the 3090 and/or Quadro RTX 6000/8000 cards. That's only if your income depends on GPU performance, and even in a company where we have people who need those cards, we don't buy many of them, because they're hard to justify compared to just farming the work out to a group of lesser cards. The caveats are literally "something that requires a large contiguous VRAM allocation" and "is needed ASAP for a deadline or submittal": a niche within a niche within the 3D rendering industry. I don't know exactly how niche that is, but it's definitely not a mainstream scenario IME.
 
Joined
Apr 30, 2011
Messages
2,651 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Nobody is buying these cards for 1080p gaming. That resolution is useless with current gen hardware and doesn't even belong in benchmarks anymore. The only resolution that matters going forward is 4k.
Since you started the "nobody" talk, let me offer my own: nobody should pay this much for a gaming device, only for professional reasons. And 1440p is a great resolution for everyone. 4K will not become mainstream even in 10 years, since most people (>80%) aren't, and won't be, willing to spend that much on the monitor and GPU combo needed.
 
Joined
Nov 23, 2018
Messages
15 (0.01/day)
Honestly, buying a 4090 prior to December 5th, when the review embargo on these cards lifts, was foolish anyway. For the first couple of months the 4090 was always going to be scalped, overpriced, and hard to find in stock, and you were buying blind, with no idea whether it would be the best solution this generation. Impatience and zealotry are the only virtues by which 4090s have sold, so far.

The only reason to actually buy a 4090 at the moment is for CUDA application support, where there's very genuine potential for it to be cost-effective over the 3090 and/or Quadro RTX 6000/8000 cards. That's only if your income depends on GPU performance, and even in a company where we have people who need those cards, we don't buy many of them, because they're hard to justify compared to just farming the work out to a group of lesser cards. The caveats are literally "something that requires a large contiguous VRAM allocation" and "is needed ASAP for a deadline or submittal": a niche within a niche within the 3D rendering industry. I don't know exactly how niche that is, but it's definitely not a mainstream scenario IME.
You could sell a 4090 used for higher than MSRP. I have one, and while it's a great card, I doubt I'll keep it long-term; the power draw is too high, and the power-cable issues are scary. I'm waiting to see third-party reviews of the 7900XTX, but if it's in the 4090's ballpark, I see zero reason to keep the 4090.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
OK, got it.

My humble perspective: the 7900XTX/XT, if the claimed figures are true, is an amazing, wipes-the-floor-with-the-competition product that is actually NOT a direct competitor to the 4090, as:

1) It is quite a bit smaller (300mm2 of 5nm plus 6 x 37mm2 of 6nm totals around 522mm2, well down from the 600mm2 of the 4090; see the sketch after this list)
2) It is about 2 times cheaper (the "$1600" 4090 is 2300 Euro here, cough; even that 1.6k price point is BS, it should cost around 1900)
3) It is way cheaper to produce (a 600mm2 monolith vs a 300mm2 die plus 6 small chiplets on N6)
4) It is fed by only two 8-pin connectors
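A quick check of the die-area arithmetic from point 1 (sizes as quoted in the list above; this totals area only, since real cost would also depend on per-wafer pricing and yields):

```python
# Navi 31 is chiplet-based: one 5 nm graphics die (GCD) plus six 6 nm
# memory/cache dies (MCDs). Sizes in mm^2, as quoted in the list above.
gcd_mm2, mcd_mm2, mcd_count = 300, 37, 6
navi31_total = gcd_mm2 + mcd_mm2 * mcd_count    # 300 + 222 = 522 mm^2
ad102_mm2 = 600                                 # monolithic, as quoted

print(f"Navi 31: {navi31_total} mm^2 total vs AD102: {ad102_mm2} mm^2")
print(f"Only {gcd_mm2} mm^2 of Navi 31 sits on the expensive 5 nm node")
```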


Its direct competitors are the 4080 16GB and the 4080 12GB (RIP :D).
(This product has literally KILLED one of its two competitors.)

So, with all that in mind, it would have made sense to present a "vs 4080" angle in the presentation.
Except that there is no 4080 out yet.
The AMD rep is essentially confirming that the 7900XTX is a 4080 wipe-the-floor-er; it wasn't necessarily meant as a 4090 competitor (heck, why would it be, at half the price?).

 

doc7000

New Member
Joined
Mar 6, 2022
Messages
17 (0.02/day)
What we can extrapolate is that the 7900XTX should have around 50-60% more performance than the 6900XT in pure rasterization, so without RT and all that FSR/DLSS bullshit. So look at some 6900XT benchmarks and take a guess. While its RT performance might not be up to par with the 4090, since it's about a generation behind in that regard, it's still up to 50% greater than before, which probably puts it in the ballpark of 3090/3090Ti RT performance, still far below the 4090.
Then take the price into account: it would probably be far superior to the 4080 in most games barring RT, and it's also cheaper, $999 vs $1199, so it's definitely the better choice IMO. The 7900XTX might not be targeting the 4090, especially at its price point, but rather the 4080.
I would add that it is a 50% improvement per compute unit, and the 7900XT has 4 more compute units than the 6900XT while the 7900XTX has 16 more, so ray-tracing gains may be higher than suggested.

I think this was pretty promising. Also, the one huge advantage AMD has over Nvidia here is that while Nvidia's AD102 die is 608mm2, the graphics portion of the 7900XT/X is only 300mm2; besides being a solid cost advantage, it also means AMD can release something with a much bigger GPU portion of the SoC and stack the chiplets to take the cache from 96MB to 192MB.
 
Joined
Sep 3, 2019
Messages
2,979 (1.76/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 150W PPT limit, 79C temp limit, CO -9~14
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F37h, AGESA V2 1.2.0.B
Cooling Arctic Liquid Freezer II 420mm Rev7 with off center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MHz 1.42V CL16-16-16-16-32-48 1T, tRFC:288, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~465W (387W current) PowerLimit, 1060mV, Adrenalin v24.3.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v23H2, OSB 22631.3155)
I would add that it is a 50% improvement per compute unit, and the 7900XT has 4 more compute units than the 6900XT while the 7900XTX has 16 more, so ray-tracing gains may be higher than suggested.

I think this was pretty promising. Also, the one huge advantage AMD has over Nvidia here is that while Nvidia's AD102 die is 608mm2, the graphics portion of the 7900XT/X is only 300mm2; besides being a solid cost advantage, it also means AMD can release something with a much bigger GPU portion of the SoC and stack the chiplets to take the cache from 96MB to 192MB.
Exactly, and that (7950XTX..?) will probably arrive around the time of the 4090Ti.
And yes, the RT performance of the 7900XTX is claimed (by AMD) to be ~1.8x that of the 6950XT, which would place it around the 3090/Ti.
It's just math: +50% per CU and +20% more CUs, so 1.5 x 1.2 = 1.8x.
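Spelled out with the actual CU counts (80 on the 6950XT, 96 on the 7900XTX; the +50% per-CU figure is AMD's claim, so this is an estimate, not a measurement):

```python
# AMD's claimed +50% RT uplift per compute unit, times the CU-count ratio.
per_cu_uplift = 1.5          # AMD claim, not an independent measurement
cus_6950xt = 80
cus_7900xtx = 96             # 96/80 = 1.2, i.e. +20% more CUs

rt_scaling = per_cu_uplift * (cus_7900xtx / cus_6950xt)
print(f"Estimated RT uplift over the 6950XT: {rt_scaling:.1f}x")   # 1.8x
```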

What I'm interested in, way more than who eventually takes the crown (I couldn't care less), is whether we get solid performance gains at low power (<250W) in the sub-$600 segment, like a 7700XT.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,752 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Impatience and zealotry are the only virtues by which 4090s have sold, so far.
Only a Sith deals in absolutes. I wouldn't be so sure those are the only reasons, but I'll agree they're significant factors in 4090 sales.
 
Joined
Mar 28, 2020
Messages
1,643 (1.11/day)
why the fuck is the heat not leaving my case out the port side??? all the heat is being dumped in the case with this design... oh shit... my poor cpu... fuck... imo heat should leave out the port side and the top... not just be forced one way or the other...


Because if you pick up any GPU and look at the cooler fin orientation, it's not going to let hot air escape from the rear vents anyway. The only GPU models that vent hot air out the rear are the blower type.

Anyway, it's good to see that AMD is keeping up the pressure on Nvidia.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,752 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I have an LG C2 42 as my display and pretty much only game at 4K.
Same; my 3080 continues to impress me two years in, but the desire for more performance can never truly be quenched. I'll be looking closely at the 7900XTX after release for sure, hoping to see it shake up the market a bit and force more compelling prices from Nvidia around that price point too.
 