
AMD Radeon RX 6600 XT PCI-Express Scaling

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,037 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
When the Radeon RX 6600 XT launched with an interface limited to PCI-Express 4.0 x8, lots of discussion emerged about how AMD crippled the bandwidth, and how much it affects the gaming experience. In this article, we're taking a close look at exactly that, comparing 22 titles running at PCIe 4.0, 3.0, 2.0, and even 1.1. Frametimes are included, too.

 
Joined
Jun 19, 2010
Messages
401 (0.08/day)
Location
Germany
Processor Ryzen 5600X
Video Card(s) RTX 3050
Software Win11
Thank you very much for your article, well done.
It doesn't struggle as much as the 4 GB versions of the 5500 (XT), right?
Infinity Cache helping a lot, but leaving those exotic corner cases as you mentioned?
 
Joined
Jan 24, 2011
Messages
272 (0.06/day)
Processor AMD Ryzen 5900X
Motherboard MSI MAG X570 Tomahawk
Cooling Dual custom loops
Memory 4x8GB G.SKILL Trident Z Neo 3200C14 B-Die
Video Card(s) AMD Radeon RX 6800XT Reference
Storage ADATA SX8200 480GB, Inland Premium 2TB, various HDDs
Display(s) MSI MAG341CQ
Case Meshify 2 XL
Audio Device(s) Schiit Fulla 3
Power Supply Super Flower Leadex Titanium SE 1000W
Mouse Glorious Model D
Keyboard Drop CTRL, lubed and filmed Halo Trues
Thank you very much for your article, well done.
It doesn't struggle as much as the 4 GB versions of the 5500 (XT), right?
Infinity Cache helping a lot, but leaving those exotic corner cases as you mentioned?
Infinity Cache doesn't have anything to do with PCIe bandwidth bottlenecks; it's just helping alleviate memory bandwidth pressure due to the smaller memory bus. The 6600 XT doesn't run into the same performance issues as the 5500 XT because it's not running out of VRAM and trying to shuttle data back and forth from main memory.
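To put rough numbers on why the cache helps VRAM bandwidth rather than PCIe (a back-of-the-envelope sketch with an assumed hit rate, not AMD's official figures):

# Rough sketch of why Infinity Cache helps GDDR6 bandwidth, not PCIe.
# Assumptions (mine, not AMD's official figures): 128-bit bus @ 16 Gbps GDDR6,
# and a hypothetical 50% cache hit rate at 1080p.

bus_width_bits = 128
gddr6_gbps_per_pin = 16
raw_bw = bus_width_bits * gddr6_gbps_per_pin / 8   # GB/s to/from VRAM
hit_rate = 0.50                                    # assumed; varies per game and resolution

# Requests that hit the 32 MB cache never touch GDDR6, so the DRAM only
# serves (1 - hit_rate) of the traffic -> higher effective bandwidth.
effective_bw = raw_bw / (1 - hit_rate)

print(f"raw GDDR6 bandwidth:  {raw_bw:.0f} GB/s")
print(f"effective with cache: {effective_bw:.0f} GB/s")
# None of this changes the PCIe link: data that isn't in VRAM at all
# still has to cross PCIe at x8 speeds.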
 
Joined
Aug 4, 2020
Messages
1,570 (1.16/day)
Location
::1
Probably the most important question for many of you is "How much performance is lost due to AMD cutting the PCIe interface in half, down to x8 from x16?"
Given this previous data:

I am pretty confident that it is safe to assume that the delta would be nil. (As PCIe 4.0 x8 = PCIe 3.0 x16.)
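Rough per-lane math backs that up (approximate usable bandwidth figures, mine, not from the review):

# Quick sanity check of the "PCIe 4.0 x8 = PCIe 3.0 x16" equivalence.
# Approximate usable bandwidth per lane, per direction (GB/s), after encoding overhead.
per_lane = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

for gen, lanes in [("4.0", 8), ("3.0", 16), ("3.0", 8), ("2.0", 16), ("1.1", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{per_lane[gen] * lanes:.1f} GB/s")
# 4.0 x8 and 3.0 x16 both land around ~15.8 GB/s, which is why the older
# 3.0 x16 scaling data should be a reasonable proxy for this card.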

Also, small other thing (unrelated) that's been annoying me for awhile now: Currently, GPU reviews are structured like this:
Introduction > Individual Gamespam > Performance Summary > Conclusion.
Given how many games are tested, I'd like to suggest this structure instead:
Introduction > Performance Summary > Conclusion > Appendices for the Individual Gamespam
The individual games are of course relevant and should be published, but it's really annoying to click through them just to reach the Performance Summary and Conclusion pages. For the vast majority of users, that's what they're after, since the grand total is reasonably representative (plus perhaps a few individual titles they themselves play). The individual games should instead be relegated to the appendices, with each one linked individually in a list there, like this
* Game A
* Game B
* Game C
et cetera
 
Low quality post by london

Outback Bronze

Moderator
Staff member
Joined
Aug 3, 2011
Messages
1,896 (0.41/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 32GB Hynix
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Gigabyte 34" Curved
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
Hey W1z great review as usual!

Any chance of a 68/6900 XT scaling review?
I see the latest AMD card tested was the 5700 XT, and from that review there didn't seem to be much variance in the PCI-E scaling compared to, say, the RTX 2080 Ti & 3080. It would be interesting to see what the latest high-end AMD cards deliver. I fully understand if this can't be achieved.
Also, on the opening page you have "RTX 980", which I'm pretty sure should be GTX 980 : )

Thanks.
 
Joined
Feb 20, 2019
Messages
7,293 (3.87/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Cool, so at this class of card more bandwidth is always better but we're getting real close to diminishing returns at 8GB/s (so PCIe 3.0 x8 and PCIe 4.0 x4) and outside of a few games, even PCIe 3.0 x4 isn't leaving too much on the table.

another SAD attempt to push this utter crap GPU
Seems like a pretty decent GPU to be honest - cheap to make, power-efficient, and capable well into 1440p for AAA gaming.
If you're moaning about the price, join the queue; in 2019 this class of card would have been a $299 product, sure, but this isn't 2019.
 
Joined
Mar 28, 2020
Messages
1,643 (1.11/day)
Price aside, I think the card is decent in performance. However, what I don't like is that for a card meant for "budget" gamers, who are mostly on PCI-E 3.0, the fact that you may not be able to get the most out of the GPU is quite annoying, even if it's not a common issue. I wonder if the main driver for AMD to cut down the number of PCI-E lanes was cost and power savings.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.19/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
First thought before reading anything:

STICK IT IN A 1x SLOT

Edit: wow, the loss is actually quite small. <20% on the really outdated 1.1 x8 is impressive, and the 2.0 results are almost unnoticeable in general use.

another SAD attempt to push this utter crap GPU
How is it crap? It's a great 1080p/1440p budget card, and the prices slaughter Nvidia in many regions.

 
Joined
Jun 27, 2011
Messages
6,684 (1.43/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
I had to go look at Hitman's results to see the worst-case scenario. That is pretty severe in my opinion. @W1zzard, how much more FPS would you estimate Hitman 3 would have if it were PCIe 4.0 x16? I am wondering if the worst-case scenario would still be bottlenecked even at full PCIe 4.0.
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
no test done with RT enabled? I guess 6600XT is a "1080p non-RT GPU" then :roll:
 
Joined
Apr 11, 2021
Messages
214 (0.19/day)
How is it crap? It's a great 1080p/1440p budget card, and the prices slaughter Nvidia in many regions.
The prices slaughter Nvidia probably because demand is relatively weak; ergo, the market considers it relatively crappier? Performance is solid at 1080p, but 1440p is a stretch, and even AMD knows it and appropriately markets it as a 1080p card (Cyberpunk particularly stands out at less than 40 fps, but even another relatively recent title like Valhalla can't hit 60 fps). In addition, a relatively worse video encoder (I understand that streaming is very popular among 1080p esports players) and purely theoretical ray tracing capabilities make for an overall worse card compared to a 3060 Ti, and even the lower-performing 3060 can be an overall more attractive option for some folks. Of course it's a decent option if one is desperate for a replacement, for instance because the old GPU has bought the farm, and can't afford to spend more money, but I wouldn't really consider it otherwise.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
7,966 (1.12/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 2
Software Win 10 Pro x64
no test done with RT enabled? I guess 6600XT is a "1080p non-RT GPU" then :roll:
Well considering the RT performance of the 3060 I think it’s pretty pointless at this level…
 
Joined
May 21, 2019
Messages
12 (0.01/day)
Interesting. Some game engines are definitely streaming in assets on the fly (from CPU/RAM) via PCIe for the GPU to render or display.

PCIe bandwidth has never really been a huge issue. You can get around it by copying data to GPU VRAM during level load. Seamless open-world games don't really have this opportunity, though, especially as you get farther away from the preloaded/cached area; it has to be streamed in. Optimizing texture quad sizes for stream-in can help maximize available PCIe bandwidth without bogging down or hitching. Most engine designers probably expect a minimum of PCIe 3.0 these days. That shows in the small difference between 3.0/4.0 but much larger discrepancies for 2.0/1.1.
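As a rough sketch of what a per-frame streaming budget looks like (all numbers below are assumptions, not taken from any particular engine):

# Rough per-frame streaming budget sketch (my assumptions, not from any
# specific engine): how much asset data can be uploaded over PCIe each
# frame without starving everything else on the link.

def stream_budget_mb(link_bw_gbs, fps=60, pcie_share=0.25):
    """MB of texture/geometry data per frame if streaming may use
    'pcie_share' of the link (the rest is left for draw calls, readbacks, etc.)."""
    frame_time_s = 1.0 / fps
    return link_bw_gbs * 1000 * pcie_share * frame_time_s

for name, bw in [("PCIe 4.0 x8", 15.8), ("PCIe 3.0 x8", 7.9),
                 ("PCIe 2.0 x8", 4.0), ("PCIe 1.1 x8", 2.0)]:
    print(f"{name}: ~{stream_budget_mb(bw):.0f} MB/frame at 60 fps")
# Chopping textures into smaller tiles/quads lets the engine spread these
# uploads across frames instead of hitching on one big copy.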
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,037 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
how much more FPS would you estimate Hitman 3 would have if it was PCIE 4 x16?
Not a lot. If you look at any of our RX 6600 XT reviews and check Hitman 3 at 1080p, you'll see the 6600 XT isn't way down relative to other cards. Maybe a few percent, if even that much.

no test done with RT enabled? I guess 6600XT is a "1080p non-RT GPU" then :roll:
6600 XT is not usable with RT at 1080p imo and it's not worth sacrificing other settings just for RT

Most engine designers probably expect a minimum of PCIe 3.0 these days.
I'd say that most engine designers expect a console ;) Some slack off and call it done at that point; others do test on PC.

It's interesting to understand why Techspot/HWUnboxed's Doom testing had such a big FPS gap between 4.0 and 3.0. Early drivers, maybe?
View attachment 213984
Could be test scene. Did they specifically search for a worst case or is that their typical standard test scene?
 
Joined
Apr 21, 2005
Messages
170 (0.02/day)
Great testing. Would be interested to see if PCIe 4.0 x4 gets the same results as PCIe 3.0 x8. I expect it would, but you never know.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
It's interesting to understand why Techspot/HWUnboxed's Doom testing had such a big FPS gap between 4.0 and 3.0. Early drivers, maybe?
View attachment 213984

It depends on where you test. For example, I still do not understand how W1zzard doesn't see regressions in DOOM Eternal with 8 GB VRAM cards. My RTX 2080 and RX 5700 XT both had issues at 4K, and even some (though much fewer) at 1440p, due to VRAM amounts.

So apparently, W1zzard is testing in an area that is just much, MUCH lighter on memory, HU tests in an area that is much harder on memory, and I test the DLC areas, which are huge and even more demanding on VRAM (also, actually fun to play, unlike the easy initial three levels). Also, I play for at least 5 minutes, while a short benchmark run may not be nearly as harsh on VRAM by design.
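Toy numbers (assumptions on my part, not measurements) show why the scene choice matters once the working set spills out of VRAM:

# Toy numbers: if a scene's working set exceeds the card's 8 GB, the
# overflow sits in system RAM and every touch of it crosses the PCIe link.

vram_gb = 8
working_set_gb = 9            # hypothetical heavy DLC area
overflow_gb = max(0.0, working_set_gb - vram_gb)

reuse_per_second = 2          # assume the spilled data is re-read twice per second
for name, link_bw in [("PCIe 4.0 x8", 15.8), ("PCIe 3.0 x8", 7.9)]:
    transfer_s = overflow_gb * reuse_per_second / link_bw
    print(f"{name}: ~{transfer_s * 1000:.0f} ms per second spent moving spilled data")
# On the faster link that's a hiccup; on a slower one it starts eating a
# real slice of every second, which is why the test scene matters so much.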

So that really is it. IMHO the biggest mistake ever was doing Witcher 3 CPU tests in White Orchard's forest area... that is like doing a CPU test while staring at the sky in GMOD.
 
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
6600 XT is not usable with RT at 1080p imo and it's not worth sacrificing other settings just for RT

Could be test scene. Did they specifically search for a worst case or is that their typical standard test scene?

Ultra vs High settings is just as arbitrary as RT ON/OFF, though I bet more people could notice RT (Reflections) than High vs Ultra settings during gameplay.

Could you include Doom Eternal in the RT benchmark suite? The game gets insanely high FPS and is already in the benchmark suite anyway. BTW, not only HUB is seeing a big perf drop-off in Doom Eternal with PCIe Gen 3; every other 6600 XT review I have seen says the same thing. Did you disable the Resolution Scaling option in Doom Eternal?

6600.jpg
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,037 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
disable? you mean enable?

I just checked in-game settings and the option is called Resolution Scaling Mode, which should be set to OFF instead of Dynamic to have a fair comparison between GPUs
 
Joined
Sep 17, 2014
Messages
20,917 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
yeah but we can't ban people just for having bad opinions, at least anyone seeing his post has a good rebuttal and knows he's incorrect

Bans are not the answer. What the new cultures on the internet need is a straight-up reminder that their opinion is irrelevant and facts - only facts - are relevant.

Every time, in their face, they need to be told they're wrong, and that they're idiots for sticking with nonsense (perhaps in a nicer way). Keep hammering it in until there is no escape. That goes for all those new "alternative facts" we see. And if the facts are wrong, present a fact that proves it. That's how you get proper discussions.
 
Joined
Jan 27, 2008
Messages
25 (0.00/day)
System Name Home
Processor Ryzen 5 5600X@4.8Ghz
Motherboard Gigabyte Aorus Pro AXi 550B
Cooling Custom water cooling CPU+GPU (dual 240mm radiators)
Memory 32GB DDR4 3600Mhz CL16
Video Card(s) Gigabyte RTX2080 Super Waterforce
Storage Adata M.2 SX8200Pro 1TB + 2TB Crucial MX500 2.5" SSD + 6TB WD hdd
Display(s) Acer Nitro XF252Q FullHD 240hz + 1440p 144hz AOC
Case CoolerMaster NR200 white
Audio Device(s) SteelSeries Arctis 9 Wireless headset
Power Supply Corsair SF600 Platinum
Mouse Logitech G Pro Wireless Superlight
Keyboard Logitech G915 TKL
Software Windows 10
This is a great eGPU card! Those setups are usually constrained to low PCIe link widths because of the adapter.
 