
Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
My dude, I said my piece months ago when I had the card. If you only have 60Hz screens and/or a single monitor, there's nothing to note. I'm not repeating the whole write-up again; go look back in the owners' club or my project log.
If you looked as hard as you did for issues and only came up with the three I just read, I wouldn't call that a shocking amount relative to Intel and Nvidia GPUs, but that's me. Fan stop and multi-monitor idle power really seemed to bother you above all else, and one of those doesn't even register on my radar most of the time. As for stuttering, I don't use your setup, you do, and some of that stuttering was down to your personal set-up, i.e. cables. Glad you got it sorted by removing the card entirely, but for many other silent users, and some vocal ones, few of your issues applied.

And more importantly, while this is made by the same people, it isn't your card, so personally I think your issues might not apply to this rumour.
 
Joined
Nov 11, 2016
Messages
3,147 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
On par with the 4070 Ti, so 10% slower than the 7900 XT. How many cards do they plan on releasing to stack them every 10%?!

I guess there's no 7800 XTX, or it would just be hilarious :roll:

The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi 31, so they have no choice but to sell cut-down Navi 31 as the 7800XT, or even the 7700XT :roll:
 
Joined
Dec 31, 2020
Messages
797 (0.63/day)
Processor E5-2690 v4
Motherboard VEINEDA X99
Video Card(s) 2080 Ti
Storage NE-512 KingSpec
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
There is no 10%; RDNA scales pretty linearly. 33% fewer units means roughly 33% less performance, which lands slightly above a 4070, but it could force Nvidia to release the 16GB model. As for power savings, most people surely set the driver to prefer performance mode, where it probably uses the higher 3D clock, as opposed to power-saving mode, where the low clock is applied instead.
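A quick back-of-the-envelope of that linear-scaling argument in Python (the cut-down CU count below is a hypothetical example, not a confirmed spec):

# toy linear-scaling estimate: assume performance tracks CU count at similar clocks
cu_navi31_full = 96      # Navi 31 full die (7900 XTX)
cu_cut_down = 64         # hypothetical part with a third fewer CUs
relative_perf = 100 * cu_cut_down / cu_navi31_full
print(f"estimated performance vs 7900 XTX: ~{relative_perf:.0f}%")  # ~67%, i.e. roughly 33% slower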
 
Joined
Jun 21, 2013
Messages
549 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
I won't even consider any AMD GPU until they fix the low load power draw.
 
Joined
Jan 8, 2017
Messages
9,127 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Obviously we are exchanging opinions. And as Harry Callahan has said... :p

I don't know what a GPU needs to do to keep feeding a high refresh rate monitor or a dual monitor setup, but close to 100W power consumption in video playback is unacceptable in 2023. While that is an opinion, I also believe it is logical to say it. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10W power consumption. In fact, even AMD's own APUs play video while keeping the power consumption of the whole SoC below 15W. So, something has gone wrong with RDNA 3 here. And even if we consider this still an opinion, then there is this chart
View attachment 302130
where a 6900XT is at half the power consumption.
AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Power Consumption | TechPowerUp

I own a 7900XT, I never see it using more than 30-35W during video playback with a ton of other things opened in the background and I use two monitors as well.
 
Joined
Sep 17, 2014
Messages
21,311 (5.99/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi 31, so they have no choice but to sell cut-down Navi 31 as the 7800XT, or even the 7700XT :roll:
Yep...
 
Joined
May 17, 2021
Messages
3,005 (2.68/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
88W to watch a video is insane. :kookoo:
 
Joined
Jan 17, 2018
Messages
389 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
The 3GHz, it's not. This power consumption in multi monitor and video playback is unacceptable in 2023.
View attachment 302128
This is from the original review of TechPowerUp but I think it still remains a problem even today.


Probably.

In my opinion it is. Looking at the 6650XT and 7600 specs side by side and then seeing no performance change in games, that's a problem.
Why are you referencing the original review on TPU when you can look at the most recent GPU review and see that those power consumption numbers for multi-monitor and video playback are around half of what they were at release?

There's nothing wrong with the 7600, aside from price (it should be around the $200-225 range, max). Even its naming scheme left out the "XT" to differentiate it from the previous generation. Sure, it's basically on par with the 6650XT in terms of performance, but it's also around 20% more efficient at that performance.
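For reference, the rough perf-per-watt arithmetic behind a figure like that (the power numbers below are ballpark board-power values for illustration, not measured results):

# illustrative efficiency comparison at roughly equal performance
perf_6650xt, power_6650xt = 100, 180  # normalised performance, approximate board power in watts
perf_7600, power_7600 = 100, 150      # assumed: same performance at lower power
efficiency_gain = (perf_7600 / power_7600) / (perf_6650xt / power_6650xt) - 1
print(f"perf-per-watt gain: ~{efficiency_gain:.0%}")  # ~20%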
 

R_Type

New Member
Joined
Jun 17, 2023
Messages
5 (0.01/day)
Can't blame anyone who chose a 4090 over any AMD GPU: it's faster, more feature-complete, better maintained; it's simply a product that AMD currently can't hope to match. They don't have either the engineering or the software development required to release a product of its caliber at this time.



I thought the same until I saw the RX 7600, and nope, RDNA 3 is practically irredeemable at this point. It's just bad beyond belief if you stack it against Ada Lovelace. And from my standpoint as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster and equal RT performance, at the cost of losing the whole feature set of the RTX ecosystem and Nvidia's superior driver software, make it not worth even considering as a replacement. This generation has failed, and short of massive price cuts, I don't think any of these are worth buying.

Unfortunately no heads will roll. Radeon needs new blood, innovative people to develop its products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.

You don't think chiplet gpus are innovative?
 
Joined
Apr 17, 2021
Messages
530 (0.46/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
The RTX 4090 is a status symbol for someone who can't afford a Tesla or Mercedes. If you want to spend 70 percent more money for 20 percent more performance, go brag to your mommy.

Meanwhile we are all waiting for further improved performance/dollar, and AMD is always the leader there. Hopefully the 7800 XT can impress. Almost fall 2023 here, time for refreshed products, never mind the first release.

I own a 7900XT, I never see it using more than 30-35W during video playback with a ton of other things opened in the background and I use two monitors as well.
Have to spend most of your time responding to the imaginary product in their heads, not the one you actually own.

----------

Sad to realize that the most hated GPU release from NVidia, the 4060 Ti, is the only card with decent perf/dollar, building on the last gen but lacking VRAM. EVERY other NVidia GPU is much worse. Oh well.
 
Joined
Jan 8, 2017
Messages
9,127 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
You don't think chiplet gpus are innovative?

Another thing people are overlooking is that AD102 has 30% more transistors, and Navi 31 also dedicates more of its transistor budget to cache. GPUs scale very close to linearly, so once you realize that, there is absolutely nothing impressive about it.

More transistors = more compute units = more performance. Performance per compute unit is comparable between Ada and RDNA3.
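If anyone wants to check that transistor claim, the publicly listed figures are roughly these (approximate, with Navi 31 counted as the GCD plus six MCDs):

# approximate transistor budgets from public spec listings
ad102_transistors = 76.3e9   # AD102 (RTX 4090)
navi31_transistors = 57.7e9  # Navi 31 GCD + 6 MCDs combined
print(f"AD102 has ~{ad102_transistors / navi31_transistors - 1:.0%} more transistors")  # ~32%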
 
Joined
May 17, 2021
Messages
3,005 (2.68/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
The 7800XT will more than likely tie with the 6900XT. AMD has excess volume of Navi 31, so they have no choice but to sell cut-down Navi 31 as the 7800XT, or even the 7700XT :roll:

But will people buy more just because it's called 7800XT? I don't think so. Unless of course they make a big price gap, but in that case no one will buy the 7900, obviously, so I think they have the same problem but are now getting even less money.
 
Joined
Feb 3, 2017
Messages
3,564 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
The problem isn't RDNA3, it's the competitive edge. The whole RDNA3 'stack' (three SKUs, lol) just doesn't lead and also fails to cover a relevant stack position in price/perf. Ada realistically has it beat; it's ever so slightly more power efficient, has a bigger feature set, and its weaker raster perf/$ compared to RDNA3 is compensated for by DLSS3, if that's what you're willing to wait for. All aspects that command a premium. It should have competed on price. The only advantage it has is VRAM, but RDNA2 has that too.
I would actually argue that RDNA3 was not intended to have a VRAM advantage. AMD successfully failed into that.

OK, RTX4090 is Nvidia's successful halo product at the top but RTX 4080 should not be competing with 7900XT, much less 7900XTX. It should be competing with the upcoming 7800XT.
76SM vs 84CU, 256-bit vs 320-bit memory bus, 64MB vs 80MB of LLC.
For whatever reason AMD had to move the product stack down a notch and as a result ended up with more VRAM at same product stack/price/performance levels.

It's high. But there are other examples with strange gaps. Look at the 3090 Ti compared to a 3090: 13W apart at the same VRAM capacity. Mkay?
Double the GDDR6X chips: 12x2GB vs 24x1GB. That comes with a nasty tradeoff in power consumption.
 
Joined
May 17, 2021
Messages
3,005 (2.68/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I would actually argue that RDNA3 was not intended to have a VRAM advantage. AMD successfully failed into that.

OK, RTX4090 is Nvidia's successful halo product at the top but RTX 4080 should not be competing with 7900XT, much less 7900XTX. It should be competing with the upcoming 7800XT.
76SM vs 84CU, 256-bit vs 320-bit memory bus, 64MB vs 80MB of LLC.
For whatever reason AMD had to move the product stack down a notch and as a result ended up with more VRAM at same product stack/price/performance levels.

AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and they have kept it up ever since. It's not a new move or a miscalculation of RDNA3.
 
Joined
Dec 25, 2020
Messages
5,007 (3.96/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
You don't think chiplet gpus are innovative?

Since they don't measure up in performance and AMD currently offers me fewer features and worse software for my money than Nvidia, I don't care if it's monolithic, chiplet, 3D-stacked, whatever really.

I paid about the same price an RX 6750 XT costs nowadays for my Radeon VII, which I bought brand new in box (on sale). GPU prices are high because the companies realized that cards sell at those inflated prices regardless - and it's interesting to note that the VII was considered a "poor value" choice at $699 when you could have the 5700 XT with the RDNA architecture (but less memory and a sliver less performance at the time - though it's faster now) for less money indeed...
 
Joined
Sep 6, 2013
Messages
3,078 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
I own a 7900XT, I never see it using more than 30-35W during video playback with a ton of other things opened in the background and I use two monitors as well.
That's nice.
Why are you referencing the original review on TPU when you can look at the most recent GPU review and see that those power consumption numbers for multi-monitor and video playback are around half of what they were at release?

There's nothing wrong with the 7600, aside from price (it should be around the $200-225 range, max). Even its naming scheme left out the "XT" to differentiate it from the previous generation. Sure, it's basically on par with the 6650XT in terms of performance, but it's also around 20% more efficient at that performance.
Another person already pointed at the newer numbers, but you probably didn't read the later posts. That being said, the over-40W power consumption in TPU's newest review is still pretty high. Please read the rest of the posts before replying; I'm not going to repeat what is already written.

As for the 7600, you are missing the point about performance. More efficient? What do you base this on?
Have to spend most of your time responding to the imaginary product in their heads, not the one you actually own.
We are using those.... imaginary numbers from TPU reviews. Maybe the reviews are flawed? What do you say?
 
Joined
Feb 3, 2017
Messages
3,564 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and they have kept it up ever since. It's not a new move or a miscalculation of RDNA3.
Yes, but they genuinely added more VRAM in RDNA2 vs Ampere. That was an interesting comparison because both made different technical tradeoffs in search of a faster memory system.
- AMD went heavily into LLC and as a result was able to cut memory bus widths without a significant loss in performance.
- Nvidia wanted faster VRAM and went for GDDR6X - with the primary problem that it was only available (probably against predictions) in 1GB chips. This led to VRAM sizes on Ampere being lower than on RDNA2 despite wider memory buses.

In that comparison Nvidia either failed or maybe just had less luck this time around.
This generation - RDNA3 vs Ada - Nvidia followed AMD's example of adding a large cache and cutting the memory bus width.
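A toy model of why a big LLC lets you get away with a narrower bus (my own simplification with made-up numbers, not either vendor's published math): effective bandwidth is roughly the hit-rate-weighted mix of cache and DRAM bandwidth.

# toy effective-bandwidth model for a large last-level cache (illustrative numbers only)
dram_bw_gbs = 576     # e.g. a 256-bit GDDR6 bus at 18 Gbps
cache_bw_gbs = 1800   # assumed LLC bandwidth
hit_rate = 0.55       # assumed average hit rate at the target resolution
effective_bw = hit_rate * cache_bw_gbs + (1 - hit_rate) * dram_bw_gbs
print(f"effective bandwidth: ~{effective_bw:.0f} GB/s")  # ~1249 GB/s from a 576 GB/s bus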
 
Joined
Feb 24, 2023
Messages
2,307 (4.88/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
Would you call a 33% difference "marginal"?
I would. When the newer card is more expensive (as per current prices), getting +33% performance is marginal, and not a single living soul should care how small the price difference is. 33% can't justify anything.

RDOA3 has its name for a solid reason. More of the same is almost the complete opposite of what AMD should have been doing if they're aiming at gains and not losses. The 7900XTX is a complete mess driver-wise and RT-wise. The 7900XT is even more of a nonsense because it shares the 7900XTX's problems and is even worse money-wise (even though you would never have expected that to be possible). And... the 7600 is a marginally overclocked 6650XT marketed as something new. Since the whole line-up is a 10-outta-10 failure, how do you expect these 33% to justify anything? Those who already have a 3090 will either buy a 4090 or, more likely, wait until something can really beat it effortlessly, aka 2.5x the performance. 2.5x, not 1.33x.

Those whose best card is at most a 3070/6700 XT will still upgrade (if they decide to do it now, which is senseless, but I take it) to a 4070 Ti or 4090, because the former has DLSS and better RT and the latter really provides massive performance gains over an old GPU.

Nothing, not even big discounts, can help RDOA3. The x700 and x800 area is doomed because almost everyone who wanted a card of that performance has already got one. And the leather jacket boy will make even worse products in his RTX 5000 line-up: an 8-lane PCIe 5080 and an $800 5060, just because nothing competes.
 
Joined
Jun 2, 2017
Messages
8,212 (3.20/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Yep, this is definitely an AMD-titled GPU post; you would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000-series card can: both of them can run 4K 144Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor and video playback power issues are so small that people forget the message before the cards launched was to get a nice beefy PSU to avoid any power issues, and I hope none of the people whining about power draw own an OLED panel. There is no foreknowledge of this card, so saying it will be X, Y or Z is just conjecture. I will say that the last time AMD showed confidence the Internet barbecued them, and some of the negative hype comes directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at anything 1440p anyway. Let's keep in mind that the 6700XT is not far from a 6800 at 1440p.
 
Joined
Feb 24, 2023
Messages
2,307 (4.88/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
Yep, this is definitely an AMD-titled GPU post; you would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000-series card can: both of them can run 4K 144Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor and video playback power issues are so small that people forget the message before the cards launched was to get a nice beefy PSU to avoid any power issues, and I hope none of the people whining about power draw own an OLED panel. There is no foreknowledge of this card, so saying it will be X, Y or Z is just conjecture. I will say that the last time AMD showed confidence the Internet barbecued them, and some of the negative hype comes directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at anything 1440p anyway. Let's keep in mind that the 6700XT is not far from a 6800 at 1440p.
You don't get it. nVidia has put a negative effort, yet their Ada products are still better than RDOA3. This is enough to ignore every other detail.
 
Joined
Feb 3, 2017
Messages
3,564 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
If the 7900 XTX hadn't failed, it would have been matching the 4090, just as the 6900 XT once matched the 3090.
No. Simply, no. GPUs are not so complicated that you can't estimate their performance.

The 6900XT and 3090 were roughly equal (leaving aside the SKU shuffling, where AMD seems to have reacted with the 6900XT):
- 80CU vs 82SM, roughly the same number of transistors and shader units. Nvidia had a slight disadvantage from being half a node behind.
- AMD bet on LLC to make up for its 256-bit memory bus vs the 384-bit bus on the 3090. A successful bet, in hindsight.

This is simply not the case for the 4090 vs 7900XTX: 128SM vs 96CU on the same process node, the same memory bus width, and similar enough LLC.
There are definitely cases where the 7900XTX can get close, mostly when power or memory becomes the limiting factor.
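For what it's worth, the unit-count arithmetic (treating one SM and one CU as roughly comparable throughput, which is itself an assumption):

# shader-array comparison, per-unit throughput assumed roughly equal
sm_rtx4090 = 128  # SMs enabled on AD102 in the RTX 4090
cu_7900xtx = 96   # CUs on the full Navi 31 (7900 XTX)
print(f"unit-count advantage: ~{sm_rtx4090 / cu_7900xtx - 1:.0%}")  # ~33%, in the same ballpark as the gap discussed above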
 
Joined
Jun 2, 2017
Messages
8,212 (3.20/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I would. When the newer card is more expensive (as per current prices), getting +33% performance is marginal, and not a single living soul should care how small the price difference is. 33% can't justify anything.

RDOA3 has its name for a solid reason. More of the same is almost the complete opposite of what AMD should have been doing if they're aiming at gains and not losses. The 7900XTX is a complete mess driver-wise and RT-wise. The 7900XT is even more of a nonsense because it shares the 7900XTX's problems and is even worse money-wise (even though you would never have expected that to be possible). And... the 7600 is a marginally overclocked 6650XT marketed as something new. Since the whole line-up is a 10-outta-10 failure, how do you expect these 33% to justify anything? Those who already have a 3090 will either buy a 4090 or, more likely, wait until something can really beat it effortlessly, aka 2.5x the performance. 2.5x, not 1.33x.

Those whose best card is at most a 3070/6700 XT will still upgrade (if they decide to do it now, which is senseless, but I take it) to a 4070 Ti or 4090, because the former has DLSS and better RT and the latter really provides massive performance gains over an old GPU.

Nothing, not even big discounts, can help RDOA3. The x700 and x800 area is doomed because almost everyone who wanted a card of that performance has already got one. And the leather jacket boy will make even worse products in his RTX 5000 line-up: an 8-lane PCIe 5080 and an $800 5060, just because nothing competes.
Yep, that's why the 6700XT is the best-selling card on Newegg Canada. All this card has to beat is the 6800XT; it is not a 7900XT. And what driver issue are you talking about? You must mean the three months AMD spent making sure that console ports sing with RDNA3. I guess you would have to own one to appreciate it. Just read some of the posts in the 7000 Owners Club and you will understand. It is all about pricing.

You don't get it. nVidia has put a negative effort, yet their Ada products are still better than RDOA3. This is enough to ignore every other detail.
Yep, the 4070 is the same price in Canada as the 7900XT.



So which would a knowledgeable gamer buy in a world of 4K benchmarks?

Nvidia has gone all-in on greed and is paying the price. In some ways it is the same as Intel. The issue is the hubris of Nvidia fanboys who quote high power draw in a world of burning connectors and use desultory words to describe something they have no real experience with.

 
Joined
Feb 24, 2023
Messages
2,307 (4.88/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
what driver issue are you talking about?
I'm talking about the aforementioned absurdly high wattage during video playback and multi-monitor usage. It doesn't matter if it's fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to take money from customers for the privilege of pre-alpha testing when you're miles ahead of your competition, not when you're a stone age behind and aren't competitive at all.

The 7900 XTX was launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB of VRAM does nothing since the 4080 can turn DLSS3 on and pull away from the 7900 XTX. The "sooper dooper mega chiplet arch" also does nothing when the 4080 can hold 60ish framerates with RT on whilst the 7900 XTX manages a shambolic 30ish FPS with constant stuttering. More raw power per buck? WHO CARES?
 
Joined
Feb 3, 2017
Messages
3,564 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Another thing people are overlooking is that AD102 has 30% more transistors, and Navi 31 also dedicates more of its transistor budget to cache. GPUs scale very close to linearly, so once you realize that, there is absolutely nothing impressive about it.

More transistors = more compute units = more performance. Performance per compute unit is comparable between Ada and RDNA3.
In both RDNA3 and Ada, the purpose of the LLC is not to augment the compute units; it is to augment the memory controllers and increase effective bandwidth. Memory bus width is the same for both the 4090 and 7900XTX, so yes, AD102 dedicates less of its budget to LLC. On the other hand, the transistor budget that the 7900XTX dedicates to LLC is the part that does not matter - that cache sits next to the memory controllers on the MCDs.
 
Joined
Jun 2, 2017
Messages
8,212 (3.20/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I'm talking about the aforementioned absurdly high wattage during video playback and multi-monitor usage. It doesn't matter if it's fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to take money from customers for the privilege of pre-alpha testing when you're miles ahead of your competition, not when you're a stone age behind and aren't competitive at all.

The 7900 XTX was launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB of VRAM does nothing since the 4080 can turn DLSS3 on and pull away from the 7900 XTX. The "sooper dooper mega chiplet arch" also does nothing when the 4080 can hold 60ish framerates with RT on whilst the 7900 XTX manages a shambolic 30ish FPS with constant stuttering. More raw power per buck? WHO CARES?
That's the thing with a 7900XT: you don't need any of those. Just turn the colour and contrast up, enable FreeSync, and you are good. Please tell me that is not the case.
 