
AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Uh, again totally off track with F1, yes pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions, and DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal benchmark result is truly laughable ...

Well if 720p upscaled to 4K has equal IQ to native 4K, I say go for it.
Every hardware editorial seems to agree that if DLSS is available, use it. They didn't say anything about trading visual quality for performance (at least with DLSS in Quality mode).

The only thing with DLSS is that it's unfair to AMD, who has no answer yet. But it's a competition out there, not some fun race between friends.

Let's think of it this way: while AMD continues to optimize rasterization performance for their GPUs, Nvidia's effort goes into incorporating DLSS into new games. But somehow benching DLSS is too unfair to AMD; it makes no fricking sense.

Also, equal settings and equal conditions were never actually equal; some games just have optimizations for a specific brand, like Far Cry 5 featuring rapid packed math.
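As a quick sanity check on the scale of that claim, here is the raw pixel arithmetic for 720p versus native 4K (plain math, nothing DLSS-specific):

```python
# Pixel counts behind the "720p upscaled to 4K" comparison.
# Both resolutions are standard; the ratio is simple arithmetic.

def pixels(width: int, height: int) -> int:
    """Total pixels in a frame."""
    return width * height

native_4k = pixels(3840, 2160)  # 8,294,400 pixels
hd_720p = pixels(1280, 720)     #   921,600 pixels

ratio = native_4k / hd_720p
print(f"Native 4K renders {ratio:.0f}x the pixels of 720p")  # 9x
```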
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
7,966 (1.12/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX-850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 2
Software Win 10 Pro x64
Well if 720p upscaled to 4K has equal IQ to native 4K, I say go for it.
Every hardware editorial seems to agree that if DLSS is available, use it. They didn't say anything about trading visual quality for performance (at least with DLSS in Quality mode).

The only thing with DLSS is that it's unfair to AMD, who has no answer yet. But it's a competition out there, not some fun race between friends.

Let's think of it this way: while AMD continues to optimize rasterization performance for their GPUs, Nvidia's effort goes into incorporating DLSS into new games. But somehow benching DLSS is too unfair to AMD; it makes no fricking sense.

Also, equal settings and equal conditions were never actually equal; some games just have optimizations for a specific brand, like Far Cry 5 featuring rapid packed math.
Okay, again, for the last time... That's not how benchmarks work! You compare every card under equal conditions or else it's not a valid benchmark, PERIOD. If AMD had a similar feature it ALSO could not be used... No one's denying some games are optimized better for one brand, but the cards are run at DEFAULT settings and results are based on their raw performance at those settings, as equal as possible. I'm glad you really like DLSS, but it's a "feature", and nobody gets to run features in benchmarks. All cards are run stock/default, always. Why do you think W1zzard has a single test bench where the only difference is the card he's testing and the drivers installed for it? So every card is tested under EQUAL conditions.
 
Joined
Nov 11, 2016
Okay, again, for the last time... That's not how benchmarks work! You compare every card under equal conditions or else it's not a valid benchmark, PERIOD. If AMD had a similar feature it ALSO could not be used... No one's denying some games are optimized better for one brand, but the cards are run at DEFAULT settings and results are based on their raw performance at those settings, as equal as possible. I'm glad you really like DLSS, but it's a "feature", and nobody gets to run features in benchmarks. All cards are run stock/default, always. Why do you think W1zzard has a single test bench where the only difference is the card he's testing and the drivers installed for it? So every card is tested under EQUAL conditions.

I have been saying this all along: the 5700XT is being benched with a PCIe 4.0 config in HUB's testing, while the 2070S is PCIe 3.0, get it?
Are those equal testing conditions?
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
I have been saying this all along: the 5700XT is being benched with a PCIe 4.0 config in HUB's testing, while the 2070S is PCIe 3.0, get it?
Are those equal testing conditions?
It will be tested against every other card at the same standard. The test is card to card, not card 1 under this standard and card 2 under this standard plus some hardware the other can't have.

Where you WILL see 4.0 tested with bells and whistles is in the AAA game performance reviews that W1zz does occasionally. In those he DOES test what a game can do with the special abilities or features that different cards have, because it is not a card-to-card comparison.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
I have been saying this all along: the 5700XT is being benched with a PCIe 4.0 config in HUB's testing, while the 2070S is PCIe 3.0, get it?
Are those equal testing conditions?
And we're back to apples and pumpkins. It does NOT affect the results in any way. Did you read the 3080 PCIe scaling tests? I saw 1 FPS difference across all resolutions; that is literally margin of error. W1zzard's current test bench is 4.0, but again it will make absolutely zero difference. You're trying to invent a metric that doesn't exist to justify your favourite new pet feature somehow making for valid benchmarks. You know where PCIe 4.0 would be an unfair advantage? Testing NVMe drives... something that can actually utilize the extra bandwidth. GPUs still don't fully utilize 3.0. It was also pointed out that in the HUB reviews, if DLSS was available the results were listed but would NEVER be included in the benchmark results.
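For a sense of how small that delta is, a rough sketch of a 1-2 FPS gap as a percentage of some illustrative baseline frame rates (the baselines here are hypothetical, not W1zzard's actual numbers):

```python
# Relative size of a 1-2 FPS gap at common benchmark frame rates.
# Frame-rate values are made up purely to illustrate the scale.

def percent_diff(baseline_fps: float, delta_fps: float) -> float:
    """Delta expressed as a percentage of the baseline result."""
    return 100.0 * delta_fps / baseline_fps

for fps in (60.0, 100.0, 144.0):
    for delta in (1.0, 2.0):
        print(f"{delta:.0f} FPS at a {fps:.0f} FPS baseline = "
              f"{percent_diff(fps, delta):.2f}%")
```

At a 100 FPS baseline, 1-2 FPS works out to 1-2%, which is why reviewers treat it as run-to-run noise.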
 
Joined
Sep 26, 2012
Messages
862 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
I have been saying this all along: the 5700XT is being benched with a PCIe 4.0 config in HUB's testing, while the 2070S is PCIe 3.0, get it?
Are those equal testing conditions?

I'm not sure why you have such a hard-on for this; testing shows negligible to no difference, and all Nvidia GPUs are now at PCIe 4.0.

Furthermore, DLSS and PCIe are not comparable. One is a lossy upscaling technology only available in a few games; the other is a ubiquitous connection standard.

But if you really want to split hairs, everyone testing on an Intel system tested at PCIe 3.0.
 
Joined
Nov 11, 2016
And we're back to apples and pumpkins. It does NOT affect the results in any way. Did you read the 3080 PCIe scaling tests? I saw 1 FPS difference across all resolutions; that is literally margin of error. W1zzard's current test bench is 4.0, but again it will make absolutely zero difference. You're trying to invent a metric that doesn't exist to justify your favourite new pet feature somehow making for valid benchmarks. You know where PCIe 4.0 would be an unfair advantage? Testing NVMe drives... something that can actually utilize the extra bandwidth. GPUs still don't fully utilize 3.0. It was also pointed out that in the HUB reviews, if DLSS was available the results were listed but would NEVER be included in the benchmark results.

Holy jeez, since when was I talking about TPU benchmarks?
I was responding to a post about HUB testing, not TPU.
In the HUB testing, Steve also said that PCIe 4.0 contributed a few % net gain for the 5700XT, which the 2070S and 2060S were left out of.

This is the vid I was responding to, not some PCIe scaling benchmark at TPU.

Take a hint, will you.
 
Joined
Sep 26, 2012
2.4 GHz but slower than the 3080, especially when RT is turned on. Insta-Passss!

Sauce? Since you have links to prerelease benchmarks across the product stack.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Holy jeez, since when was I talking about TPU benchmarks?
I'm just informing you that TPU is now benchmarking on PCIe 4.0 as well, so you'd better clutch your pearls at all the benches done here too then. You literally can't accept that PCIe 4.0 offers no advantage. All the Ampere cards are 4.0; I'll be sure to cry foul when they have an unfair advantage over your 2080 Ti... :rolleyes:
 
Joined
Nov 11, 2016
I'm just informing you that TPU is now benchmarking on PCIe 4.0 as well, so you'd better clutch your pearls at all the benches done here too then. You literally can't accept that PCIe 4.0 offers no advantage. All the Ampere cards are 4.0; I'll be sure to cry foul when they have an unfair advantage over your 2080 Ti... :rolleyes:

So you are comparing apples to pumpkins and accusing me of doing so. TPU's benchmark suite is different from HUB's: different games, different testing config.
If TPU's testing shows no difference, then it must be the same for every other testing condition?
Pretty small-minded, aren't you? And you call yourself an enthusiast?

Well, since you have an X570 and a 5700XT, you might as well test them yourself against HUB



Yeah, sometime next month once I get a 5950X + 3090 I might do some PCIe scaling benchmarks for you :D
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
So you are comparing apples to pumpkins and accusing me of doing so. TPU's benchmark suite is different from HUB's: different games, different testing config.
If TPU's testing shows no difference, then it must be the same for every other testing condition?
Pretty small-minded, aren't you? And you call yourself an enthusiast?

Well, since you have an X570 and a 5700XT, you might as well test them yourself against HUB

View attachment 172317
The problem is YOU cannot let go of this literally insignificant deviation, as if 4.0 is cheating the 3.0 cards somehow. Still not seeing any advantage here? W1z has always done PCIe scaling tests and he's tested this as well, and I trust his methods, and he literally used the most recent flagship 4.0 card. I just picked one bench at 4K, but you can pick any title or resolution he tested and they all look like this... I don't use HUB for hardware reviews or benchmarks, so no, I don't know their methods, but I do know W1zzard's: there is no more than 2 FPS difference in every resolution and game he tested with a 4.0 3080, and if any card could possibly use the extra bandwidth it would be that one. You're chasing a metric that has zero impact on benchmark scoring.
 
Joined
Nov 11, 2016
The problem is YOU cannot let go of this literally insignificant deviation, as if 4.0 is cheating the 3.0 cards somehow. Still not seeing any advantage here? W1z has always done PCIe scaling tests and he's tested this as well, and I trust his methods, and he literally used the most recent flagship 4.0 card. I just picked one bench at 4K, but you can pick any title or resolution he tested and they all look like this... I don't use HUB for hardware reviews or benchmarks, so no, I don't know their methods, but I do know W1zzard's: there is no more than 2 FPS difference in every resolution and game he tested with a 4.0 3080, and if any card could possibly use the extra bandwidth it would be that one. You're chasing a metric that has zero impact on benchmark scoring.
View attachment 172318

I literally talked about how flawed HUB's testing was, yet here you are defending TPU?
When did I even talk about TPU?
:roll:
OK sure, mister "zero impact", "equal testing condition" :D, whatever you say
 
Joined
Sep 26, 2012
Let's just pretend for a sec that PCIe 4.0 made more than a statistical anomaly of a difference.

Why should a PCIe 4.0 card be tested at 3.0 when it's a review of its performance? It's like testing USB 3 drives against a USB 2 drive in a USB 2 slot; it doesn't make any sense.

And before you utter DLSS: PCIe 4.0 is A: a standard available regardless of game, and B: isn't a lossy upscaling solution rendering below the benchmarked resolution.
 
Joined
Nov 11, 2016
Let's just pretend for a sec that PCIe 4.0 made more than a statistical anomaly of a difference.

Why should a PCIe 4.0 card be tested at 3.0 when it's a review of its performance? It's like testing USB 3 drives against a USB 2 drive in a USB 2 slot; it doesn't make any sense.

And before you utter DLSS: PCIe 4.0 is A: a standard available regardless of game, and B: isn't a lossy upscaling solution rendering below the benchmarked resolution.

C: should you disable DLSS where it is available to you?

It's like DX12 vs DX11: you use whichever API gives you better FPS and frametimes. Forcing a DX12 benchmark onto Nvidia cards where it is detrimental is quite unfair, no? Same as disabling DLSS.

I believe the point of benchmarking is that it must resemble real-world usage?
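The "use whichever API performs best" idea above can be sketched like this; the cards, APIs, and FPS numbers are all made up purely for illustration:

```python
# For each (hypothetical) card, keep the result from its faster API
# rather than forcing one API on every card. Values are illustrative only.

results = {
    # card: {API: average FPS}
    "Card A": {"DX11": 95.0, "DX12": 88.0},
    "Card B": {"DX11": 90.0, "DX12": 101.0},
}

def best_api(per_api_fps: dict[str, float]) -> tuple[str, float]:
    """Return the (API, FPS) pair with the highest average FPS."""
    return max(per_api_fps.items(), key=lambda kv: kv[1])

for card, per_api in results.items():
    api, fps = best_api(per_api)
    print(f"{card}: best result is {api} at {fps:.1f} FPS")
```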
 
Joined
Sep 26, 2012
C: should you disable DLSS where it is available to you?

It's like DX12 vs DX11: you use whichever API gives you better FPS and frametimes. Forcing a DX12 benchmark onto Nvidia cards where it is detrimental is quite unfair, no?


A lossy image isn't the same as a lossless image, and suggesting as much reminds me of the Quake 3 driver shenanigans from ATi and Nvidia back in the day. Or should we benchmark AMD cards running at the same internal resolution as DLSS, apply sharpening with a native render target, and call it comparable?
 
Joined
Nov 11, 2016

A lossy image isn't the same as a lossless image, and suggesting as much reminds me of the Quake 3 driver shenanigans from ATi and Nvidia back in the day. Or should we benchmark AMD cards running at the same internal resolution as DLSS, apply sharpening with a native render target, and call it comparable?


DLSS vs Fidelity FX
 
Joined
Nov 13, 2007
Messages
10,233 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Let's just pretend for a sec that PCIe 4.0 made more than a statistical anomaly of a difference.

Why should a PCIe 4.0 card be tested at 3.0 when it's a review of its performance? It's like testing USB 3 drives against a USB 2 drive in a USB 2 slot; it doesn't make any sense.

And before you utter DLSS: PCIe 4.0 is A: a standard available regardless of game, and B: isn't a lossy upscaling solution rendering below the benchmarked resolution.

Except drives are limited by the bandwidth offered by USB 2.0, whereas the 5700XT doesn't come anywhere near tapping out the bandwidth of a PCIe 2.0 slot, never mind 3.0 or 4.0. It's more like testing a USB 1 drive in a 2 or 3 slot.

Also, PCIe devices are spec'd to be backwards compatible, or are supposed to be. So if the 5700XT can't properly operate at 3.0 even though it has more than enough bandwidth, then there is something wrong with the design/implementation of that standard on the card.
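For reference, the theoretical x16 link bandwidth per PCIe generation follows directly from the spec's per-lane transfer rates and line coding; a quick sketch:

```python
# Theoretical x16 link bandwidth per PCIe generation (GB/s), derived
# from the spec's transfer rate (GT/s) and line-code overhead.

GENERATIONS = {
    # gen: (GT/s per lane, payload bits, total bits)
    "2.0": (5.0, 8, 10),     # 8b/10b encoding
    "3.0": (8.0, 128, 130),  # 128b/130b encoding
    "4.0": (16.0, 128, 130),
}

def x16_bandwidth_gbps(gts: float, payload: int, total: int) -> float:
    """Usable GB/s for a 16-lane link after encoding overhead."""
    per_lane = gts * payload / total / 8.0  # GT/s -> GB/s per lane
    return per_lane * 16

for gen, params in GENERATIONS.items():
    print(f"PCIe {gen} x16: {x16_bandwidth_gbps(*params):.2f} GB/s")
```

That works out to roughly 8, 15.75, and 31.5 GB/s for 2.0, 3.0, and 4.0 x16 respectively, which is why a card that can't saturate a 2.0 slot sees no gain from 4.0.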
 
Joined
Sep 26, 2012

DLSS vs Fidelity FX

So now one lossy solution isn't allowed to be benchmarked, but another one is? At least be consistent in your bullshit.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
I literally talked about how flawed HUB's testing was, yet here you are defending TPU?
When did I even talk about TPU?
:roll:
OK sure, mister "zero impact", "equal testing condition" :D, whatever you say
How many more charts showing zero difference do you need to see before you drop it as an advantage? You keep typing to justify DLSS as one, and it is not.
 
Joined
Nov 11, 2016
So now one lossy solution isn't allowed to be benchmarked, but another one is? At least be consistent in your bullshit.

Sure, with equal image quality, let's benchmark DLSS Performance (or Ultra Performance) vs FidelityFX
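For context, the internal resolutions DLSS works from at a 4K output, using the commonly cited per-axis scale factors for each mode (the Balanced factor is approximate):

```python
# Approximate internal render resolutions behind a 4K DLSS output,
# using the commonly cited per-axis scale factors for each mode.

DLSS_SCALES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58% per axis (approximate)
    "Performance": 1 / 2,        # 50% per axis
    "Ultra Performance": 1 / 3,  # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Per-axis render resolution before upscaling."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in DLSS_SCALES.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{mode}: renders at {w}x{h}, upscaled to 3840x2160")
```

So at 4K, Performance mode works from 1080p and Ultra Performance from 720p, which is where the "720p upscaled to 4K" framing earlier in the thread comes from.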
 
Joined
Sep 26, 2012
Sure, with equal image quality, let's benchmark DLSS Performance (or Ultra Performance) vs FidelityFX

Bit hard to say it's equal image quality, since DLSS's processing of the image uses graphical techniques that aren't exposed by the game engine, making comparison difficult. Secondly, not all frames have equal graphical fidelity: when new data is introduced to the scene, the AI algorithm has to 'race' to gain info from multiple frames to reconstruct the image.

Is DLSS good? Yup. Will we probably play all games like this? Yup (if Nvidia bothers to make a solution that works across all games, which is another reason why it shouldn't be used in benchmarking scenarios). But it has fatal flaws that preclude it from being compared apples to apples, and benchmarking data with DLSS should be shown, but called out separately so the reader can make a value call on the limited support and performance subset...

Which both HWU & TPU do. So again, what's your problem?
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Sure, with equal Image Quality let benchmark DLSS Performance (or Ultra Performance) vs FidelityFX
Again, neither of those things would be used in any benchmark except comparing them to each other, and you literally found the only example where it's even possible, because it's the only game that has both. It still gives zero weight to your argument for DLSS. But hey, I can actually argue for FidelityFX being the better of the two in that game: I own it, I finished it, and I used FidelityFX, but that comparison shows DLSS with some obvious visual issues that FidelityFX didn't have, and overall it looked better. I know my performance in the game was fantastic and it looked totally amazing. Still, can't use it in benchmarks.
 
Joined
Nov 11, 2016
Bit hard to say it's equal image quality, since DLSS's processing of the image uses graphical techniques that aren't exposed by the game engine, making comparison difficult. Secondly, not all frames have equal graphical fidelity: when new data is introduced to the scene, the AI algorithm has to 'race' to gain info from multiple frames to reconstruct the image.

Is DLSS good? Yup. Will we probably play all games like this? Yup (if Nvidia bothers to make a solution that works across all games, which is another reason why it shouldn't be used in benchmarking scenarios). But it has fatal flaws that preclude it from being compared apples to apples, and benchmarking data with DLSS should be shown, but called out separately so the reader can make a value call on the limited support and performance subset...

Which both HWU & TPU do. So again, what's your problem?

DLSS is like a specific customization for a game; that doesn't mean it should be excluded though.
I was responding to a recent HUB test that did not include DLSS results, so there is that.
TPU did some DLSS 2.0 testing, where is that?

Again, neither of those things would be used in any benchmark except comparing them to each other, and you literally found the only example where it's even possible, because it's the only game that has both. It still gives zero weight to your argument for DLSS. But hey, I can actually argue for FidelityFX being the better of the two in that game: I own it, I finished it, and I used FidelityFX, but that comparison shows DLSS with some obvious visual issues that FidelityFX didn't have, and overall it looked better. I know my performance in the game was fantastic and it looked totally amazing. Still, can't use it in benchmarks.

Kinda funny that your conclusion is entirely contrary to what the author was saying. But hey, I was saying HUB was pretty unfair too :D, reviewer vs normal user, yeah?
You can apply FidelityFX to any game: just create a custom resolution and use the GPU scaler to upscale it to fit the screen, a 5-second custom job.
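The custom-resolution trick described above amounts to rendering at a reduced resolution at the display's native aspect ratio and letting the GPU scaler stretch it; a sketch of the arithmetic (the 77% scale is just an arbitrary example, not a recommended setting):

```python
# Derive a custom reduced resolution at the display's aspect ratio,
# for GPU-scaler upscaling. The 0.77 scale factor is illustrative only.

def custom_resolution(native_w: int, native_h: int, scale: float) -> tuple[int, int]:
    """Scaled resolution, snapped down to even pixel counts."""
    w = int(native_w * scale) // 2 * 2
    h = int(native_h * scale) // 2 * 2
    return w, h

w, h = custom_resolution(3840, 2160, 0.77)
print(f"Render at {w}x{h}; the GPU scaler upscales to 3840x2160")
```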
 
Joined
Nov 4, 2019
Messages
234 (0.14/day)
DLSS is like a specific customization for a game; that doesn't mean it should be excluded though.
I was responding to a recent HUB test that did not include DLSS results, so there is that.
TPU did some DLSS 2.0 testing, where is that?

DLSS is terrible and doesn't work in 99 percent of the games I play. Next.

No, because polls from Hardware Unboxed 3 months back suggest that only 7% of card owners were having severe issues like what you were describing (and let's be honest here, a number of those users are likely Nvidia owners checking the box that makes AMD look worse). You are suggesting that 100% of 5700 / 5700 XTs had issues, and that simply isn't true. I shouldn't have to say that, though; if 100% had issues, drivers or otherwise, AMD would have been decimated. Why you assume that 5/5 cards having issues is "likely"... No, in no universe is it. None of the data supports that, and we haven't ever seen anywhere near 100%.

My prior statement stands: either you are extraordinarily unlucky or there is something else going on.



No one's saying AMD shouldn't improve its drivers. Navi clearly had / has issues.

The problem is when people come into threads making claims like 100% of AMD cards have issues. You can't complain about partisanship when you yourself added to it.

I didn't contribute to any partisanship. I bought 5 RX 5700 XT cards over 13 months for 5 different clients and all of them had problems. 5/5 had problems. I'm pointing out it was not rare at all. There was mass misery online, with the people knowledgeable enough about the problem trying to bring it to AMD's attention. Many people using those cards didn't even know about it if they didn't play the same games, or didn't watch Netflix on Windows, for example.

Lived experiences are better than random people talking about something they haven't experienced.

AGAIN: RMA of a broken card vs. unfixable crashing from driver issues... those two things are totally different.
 