
AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 3600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 470 Nitro+ 4GB
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (gateron milky yellow)
Software W10
Yep.

Lots of engineers on these forums these days, whose skills obviously exceed those of AMD's best professionals out there.
You mean the same AMD professionals that made the 5500 XT 4GB?
 
Joined
Feb 20, 2019
Messages
7,188 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though...
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
AMD are clearly taking the piss with this card. I thought nGreedia were bad, but this card is bordering on the unusable.
Unusable? No. It'll work just fine.
Even if the x4 interface limits it slightly, it'll be way faster than the integrated graphics or GTX 750Ti it's replacing, for example.

It's just that some people will complain that they were promised 60 fps in Fortnite on max settings and they're only getting 56 fps; they'll have a valid complaint.
 
Joined
Mar 21, 2016
Messages
2,194 (0.75/day)
I don't think it'll have a major impact overall, but the cost savings are probably more readily noticeable. It's probably worth the trade-off from AMD's perspective, and in the end worth it for the consumer as well, even though on paper it looks less ideal from a technical standpoint. I do wish they would've used the untapped PCIe slot bandwidth for M.2 devices rather than just leaving those lanes vacant. It would be worth the added cost to integrate and include 2-3 of those, especially for SFF ITX and Micro ATX builds. There is even more to be gained with PCIe 5.0 in terms of putting unused lanes to work. It would be a shame if Intel didn't take advantage of that on its GPUs, depending on how much bus bandwidth the hardware actually needs. The sad part is that AMD has already demonstrated a GPU with an M.2 slot on board.
 

Bongo_

New Member
Joined
Jan 7, 2022
Messages
1 (0.00/day)
The point is that people who use it on a PCIe 3.0 or perhaps even 2.0 board will also be limited to an x4 link, but with much less bandwidth than 4.0 x4 would provide. Obviously 4.0 x4 is just fine for this card, but it may not be for 3.0 or 2.0 users.

Based on TPU's GPU database, and assuming the 6500 XT has roughly the performance of a GTX 980, it could lose up to 14% on 2.0 and up to 6% on 3.0: https://www.techpowerup.com/review/nvidia-gtx-980-pci-express-scaling/21.html
The GTX 1080 loses 8% of its performance when run at PCIe 3.0 x4
The GTX 980 loses 5% of its performance when run at PCIe 3.0 x4

It's fair to say that the 6500 XT stands to lose at least 5% of its performance when used in a PCIe 3.0 system, as it's likely to fall somewhere in the range between those two cards.

If you have a PCIe 3.0 system you plan to put a 6500 XT into, it's worth bearing in mind that you're not getting the advertised performance, but 92-95% of what you'd otherwise get is still good enough that it's not a deal-breaker.
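To make that estimate concrete, here's a minimal back-of-the-envelope sketch in Python. The 5% and 8% scaling figures are the ones quoted above from TPU's GTX 980 and GTX 1080 PCIe scaling reviews; the 60 fps baseline is a made-up number, and the whole thing assumes the 6500 XT really does land between those two cards:

```python
# Rough estimate of 6500 XT performance on a PCIe 3.0 x4 link,
# interpolated from TPU's GTX 980 / GTX 1080 PCIe scaling results.
# These are the figures quoted above, not new measurements.
RELATIVE_PERF_AT_3_0_X4 = {
    "GTX 980":  0.95,   # loses ~5% at PCIe 3.0 x4
    "GTX 1080": 0.92,   # loses ~8% at PCIe 3.0 x4
}

advertised_fps = 60.0   # hypothetical "full speed" PCIe 4.0 x4 result

low = advertised_fps * RELATIVE_PERF_AT_3_0_X4["GTX 1080"]
high = advertised_fps * RELATIVE_PERF_AT_3_0_X4["GTX 980"]
print(f"Expected on PCIe 3.0 x4: {low:.0f}-{high:.0f} fps "
      f"of an advertised {advertised_fps:.0f}")
# -> Expected on PCIe 3.0 x4: 55-57 fps of an advertised 60
```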
From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.
 
Joined
Jun 18, 2021
Messages
2,247 (2.22/day)
From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.

You misunderstood. The bandwidth of PCIe 4.0 x4 is in fact equivalent to 3.0 x8 and 2.0 x16, but this card will only be able to run at 4.0 x4, 3.0 x4 or 2.0 x4 (PCIe is backwards compatible, but lanes are lanes and can't be split; you can't turn x1 of PCIe 4.0 into x2 of 3.0).
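A quick numeric sanity check of that point, as a Python sketch. The per-lane figures are the commonly quoted approximations after encoding overhead, so treat the output as ballpark, not spec-sheet numbers:

```python
# Approximate usable one-way bandwidth per PCIe lane, in GB/s
# (after 8b/10b or 128b/130b encoding overhead).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-way bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# Bandwidth equivalence: all three land around 7.9-8 GB/s.
for gen, lanes in [("4.0", 4), ("3.0", 8), ("2.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth(gen, lanes):.1f} GB/s")

def negotiated(card, slot):
    """PCIe negotiates the lower generation AND the lower lane count;
    lanes can't be traded for generation, so an x4 card stays x4."""
    (card_gen, card_lanes), (slot_gen, slot_lanes) = card, slot
    return min(card_gen, slot_gen), min(card_lanes, slot_lanes)

# An x4 PCIe 4.0 card in a 3.0 x16 slot runs at 3.0 x4 (~3.9 GB/s):
gen, lanes = negotiated(("4.0", 4), ("3.0", 16))
print(f"Negotiated link: PCIe {gen} x{lanes} = "
      f"{link_bandwidth(gen, lanes):.1f} GB/s")
```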
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
Yep.

Lots of engineers on these forums these days, whose skills obviously exceed those of AMD's best professionals out there.
You mean the people that basically invented Vulkan and are still trying to figure it out? Yeah, sure.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Ultimately it's a little thing, but those who over the years wished the "evil competition" would disappear are slowly learning that the good guys can quickly become monsters.
 
Joined
Mar 28, 2020
Messages
1,632 (1.12/day)
In my opinion, the specs for this card are way too gimped for it to be good. But "good" really depends on the price. AMD's MSRP sounds about right for a budget card in times like this; the problem is whether it will turn up at MSRP or at 2x the MSRP. Anything more than MSRP is not worth it, at least for me. The thing that annoys me most is the removal of AV1 decode. How much can it cost to add that feature when cheap ARM SoCs can support AV1 decode?

From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.
It is if you are using it in a PCIe 4.0 slot. The problem is that budget gamers may want to just upgrade their GPU, and I believe most people are still using a system with PCIe 3.0 support. Even AMD themselves are selling Zen 3 APUs with PCIe 3.0-only support. I wonder whether they save a lot of money by cutting the PCIe lane count in half? Are they that tight-fisted that x8 is not possible?
 
Joined
Aug 4, 2020
Messages
1,567 (1.18/day)
Location
::1
This is AMD intentionally being cheap. The PCIe 4.0 angle is just a charade and a strawman; nobody who can afford a PCIe 4.0 platform will buy this card. The best budget options are the 10100F and 10400F, and will in all likelihood remain so for at least another year; the H610 boards are just too expensive. The 10100F and maybe the 10400F are what you'd want to pair with a card like this, except, because fuck you that's why, now we're stuck on 3.0 x4 and losing ~4% of performance. Because AMD can.

Yeah.
Fuck you too AMD.
 
Joined
Oct 15, 2011
Messages
1,920 (0.42/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
WTF? AV1 is not the proprietary stuff that H.264 and HEVC are.
 
Joined
Dec 6, 2016
Messages
748 (0.28/day)
Ultimately it's a little thing, but those who over the years wished the "evil competition" would disappear are slowly learning that the good guys can quickly become monsters.

If they are monsters to you, what is the "evil competition" then? Morgoth? Even with stupid decisions like this, they are still the least evil of the trio, and it seems it will stay that way for a long time ...
 
Joined
Aug 12, 2020
Messages
1,115 (0.84/day)
The best budget options are the 10100F and 10400F, and will in all likelihood remain so for at least another year; the H610 boards are just too expensive.
Nah, I ordered an i3-12100F + Gigabyte B660M Gaming mATX for roughly the same price as I'd pay for a 10400F + a similarly featured B560 mobo. I'll have to wait an extra week for delivery, but that's not a problem for me.

That said, the missing codecs thing is definitely concerning for this card.
 
Joined
Oct 10, 2009
Messages
786 (0.15/day)
Location
Madrid, Spain
System Name Rectangulote
Processor Core I9-9900KF
Motherboard Asus TUF Z390M
Cooling Alphacool Eisbaer Aurora 280 + Eisblock RTX 3090 RE + 2 x 240 ST30
Memory 32 GB DDR4 3600mhz CL16 Crucial Ballistix
Video Card(s) KFA2 RTX 3090 SG
Storage WD Blue 3D 2TB + 2 x WD Black SN750 1TB
Display(s) 2 x Asus ROG Swift PG278QR / Samsung Q60R
Case Corsair 5000D Airflow
Audio Device(s) Evga Nu Audio + Sennheiser HD599SE + Trust GTX 258
Power Supply Corsair RMX850
Mouse Razer Naga Wireless Pro / Logitech MX Master
Keyboard Keychron K4 / Dierya DK61 Pro
Software Windows 11 Pro
Don't be so sure, the 6600 XT clearly saturates the bus occasionally, with a measurable performance drop at PCIe 3.0 x8:

[Chart: RX 6600 XT PCIe scaling results]

The 6500 XT is half the performance, but PCIe 3.0 x4 is also half the bandwidth, implying that the 6500 XT will very likely saturate the bus.

The performance drop from putting a 6500 XT into a PCIe 3.0 slot could easily match the drop from 98% to 93% in the chart above if everything scales linearly (it doesn't, but the factors that scale non-linearly might cancel each other out - we'll have to wait for real-world PCIe scaling tests like the one above to know for sure).
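A minimal sketch of the linear-scaling argument in that quote. The "half the performance" figure is an assumption, and the bandwidth values are the usual approximations; the point is only that the bandwidth-to-performance ratio comes out nearly identical:

```python
# The "half the performance needs half the bandwidth" argument in numbers:
# if PCIe traffic scales with frame rate, bandwidth per unit of performance
# is what predicts saturation. All inputs here are assumptions.
bandwidth_gbps = {"3.0 x8": 7.9, "3.0 x4": 3.9}

perf_6600xt = 1.00          # normalised baseline
perf_6500xt = 0.50          # assumed: roughly half a 6600 XT

headroom_6600 = bandwidth_gbps["3.0 x8"] / perf_6600xt
headroom_6500 = bandwidth_gbps["3.0 x4"] / perf_6500xt
print(f"6600 XT @ 3.0 x8: {headroom_6600:.1f} GB/s per perf unit")
print(f"6500 XT @ 3.0 x4: {headroom_6500:.1f} GB/s per perf unit")
# Near-identical ratios -> the 6500 XT should hit the bus wall about as
# often as the 6600 XT does at 3.0 x8, i.e. a similar few-percent drop.
```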
Well, a 1080p card for Apex and Fortnite. They are banking on the likelihood that this card ends up in your kid's PC, where nobody knows anything about computers, so making cuts here and there is something no one at that level would even notice - 4-5 fps less, maybe?

That said, the missing codecs thing is definitely concerning for this card.
And the price point. I can live without a few fps in games; cutting video decoding is just a plain bad decision. These cards could have been reused as HTPC cards in the future when the kid's PC gets rebuilt, and this just gimps their reusability. What low-end cards need is precisely all the wacky non-gaming uses - that's how a lot of 9300 GTs survived as PhysX cards back then.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
If they are monsters for you, what is the "evil competition" then? Morgoth? Even with stupid decisions like this they are still the least evil of the trio, and it seems it will stay like that for a long time ...

Please don't defend them.
 
Joined
Apr 19, 2018
Messages
957 (0.44/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Unusable? No. It'll work just fine.
Even if the x4 interface limits it slightly, it'll be way faster than the integrated graphics or GTX 750Ti it's replacing, for example.

It's just that some people will complain that they were promised 60 fps in Fortnite on max settings and they're only getting 56 fps; they'll have a valid complaint.
A second-hand gaming laptop at double the price of this card would perform the same, and it's a whole, portable system. This card has no future: the kids who want it will be pissed at the performance, as it will be slower than their friends' consoles, and they'll be finished off by finding out they can't stream video of their gameplay. That makes it a hard pass, an almost useless card. For once, I'd pay nGreedia the $50 more and get a real graphics card.
 
Joined
Oct 23, 2020
Messages
671 (0.54/day)
Location
Austria
System Name nope
Processor I3 10100F
Motherboard ATM Gigabyte h410
Cooling Arctic 12 passive
Memory ATM Gskill 1x 8GB NT Series (No Heatspreader bling bling garbage, just Black DIMMS)
Video Card(s) Sapphire HD7770 and EVGA GTX 470 and Zotac GTX 960
Storage 120GB OS SSD, 240GB M2 Sata, 240GB M2 NVME, 300GB HDD, 500GB HDD
Display(s) Nec EA 241 WM
Case Coolermaster whatever
Audio Device(s) Onkyo on TV and Mi Bluetooth on Screen
Power Supply Super Flower Leadx 550W
Mouse Steelseries Rival Fnatic
Keyboard Logitech K270 Wireless
Software Deepin, BSD and 10 LTSC
What a junk card, 64-bit and only x4.
But at $200, a 3050 8GB with its $249 MSRP looks very cheap against this 6500.

Primarily the card would be a choice for users with older hardware, and then they only get PCIe 3.0 x4,
a slower connection than a GT 1030's.
 
Joined
Dec 6, 2016
Messages
748 (0.28/day)
Please don't defend them.
I'm not defending them, I just stated the facts. GPP, vendor lock-in, anticompetitive buyouts, etc. are far more damaging (and longer lasting) to the ecosystem than selling a crippled card that should cost $100 for $200. Prove me wrong.

I really won't care too much if you blast the card, but please don't make it look like AMD's design decisions here are comparable to the super shady stuff the "evil competition" does. It just makes the "evil competition" look better and enables them to do more shady stuff.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I fully expected you to do a " yeah, but Nvidia" post and you delivered.
 
Joined
Aug 4, 2020
Messages
1,567 (1.18/day)
Location
::1
Nah, I ordered an i3-12100F + Gigabyte B660M Gaming mATX for roughly the same price as I'd pay for a 10400F + a similarly featured B560 mobo. I'll have to wait an extra week for delivery, but that's not a problem for me.

That said, the missing codecs thing is definitely concerning for this card.
Well, you're an enthusiast (on a budget, but nonetheless).
Average people on a budget like this will just score the cheapest, bottom-of-the-barrel board and call it a day. At this price bracket w/ a 65W part, there's nothing wrong w/ that either. And it'll be quite a bit cheaper than a B660.
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Proper troll fest now, eh? Lol, looks like a hate fest in here; meanwhile, no reviews, no tests, just hyperbolic BS.

I await reviews. And, like most commenters, I already have a better GPU, so I wouldn't be buying it; it wouldn't matter to me, and I'd experience no butthurt.
 
Joined
Jan 14, 2019
Messages
9,723 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
"This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection" - In bandwidth yes, but one needs to be careful with PCI-e 3.0 and 2.0 motherboards, as those will still run the card at x4.

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding." - I can't understand or endorse AMD's decision on that. Even the HTPC market will look away now.
 
Joined
Aug 12, 2020
Messages
1,115 (0.84/day)
Well, you're an enthusiast (on a budget, but nonetheless)
Average people on a budget like this will just score the cheapest, bottom-of-the-barrel board and call it a day. At this price bracket w/ a 65W part, there's nothing wrong w/ that either. And it'll be quite a bit cheaper than a b660.
Fair enough about myself.
But if someone is looking at bottom-of-the-barrel stuff, they shouldn't even be thinking about this card for a while to begin with.
Because, let's face it, it WILL be price-inflated, even if it won't make a good miner. Demand WILL be crazy, and sellers WILL take advantage of it regardless. At best, I expect it to match the price/performance of those already inflated "miner approved" cards, and for some time after launch maybe even worse.
 
Joined
Dec 16, 2017
Messages
2,720 (1.19/day)
Location
Buenos Aires, Argentina
System Name System V
Processor AMD Ryzen 5 3600
Motherboard Asus Prime X570-P
Cooling Cooler Master Hyper 212 // a bunch of 120 mm Xigmatek 1500 RPM fans (2 ins, 3 outs)
Memory 2x8GB Ballistix Sport LT 3200 MHz (BLS8G4D32AESCK.M8FE) (CL16-18-18-36)
Video Card(s) Gigabyte AORUS Radeon RX 580 8 GB
Storage SHFS37A240G / DT01ACA200 / WD20EZRX / MKNSSDTR256GB-3DL / LG BH16NS40 / ST10000VN0008
Display(s) LG 22MP55 IPS Display
Case NZXT Source 210
Audio Device(s) Logitech G430 Headset
Power Supply Corsair CX650M
Mouse Microsoft Trackball Optical 1.0
Keyboard HP Vectra VE keyboard (Part # D4950-63004)
Software Whatever build of Windows 11 is being served in Dev channel at the time.
Benchmark Scores Corona 1.3: 3120620 r/s Cinebench R20: 3355 FireStrike: 12490 TimeSpy: 4624
But if someone is looking at bottom of the barrel stuff, they should not even be thinking about this card for a while to begin with.
... I'd argue they wouldn't even look at graphics cards but rather whatever IGP is available in Intel or AMD's CPUs
 