Friday, January 7th 2022

AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

The recently announced AMD Radeon RX 6500 XT only features a PCIe 4.0 x4 interface, according to specifications and images of the card published on the ASRock site. This is equivalent in bandwidth to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection, and is a step down from the Radeon RX 6600 XT, which features a PCIe 4.0 x8 interface, and the Radeon RX 6700 XT with its PCIe 4.0 x16 interface. This detail is only specified by ASRock, with AMD, Gigabyte, ASUS, and MSI not mentioning the PCIe interface on their respective product pages. The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards, omitting H.264/HEVC encoding and AV1 decoding.
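For context, the equivalence between those three link widths follows from the per-lane data rate of each PCIe generation roughly doubling its predecessor. A quick sketch of the math, using the usual rounded effective per-lane rates (after link-encoding overhead):

```python
# Approximate effective per-lane PCIe bandwidth in GB/s, one direction.
# PCIe 2.0 runs at 5 GT/s with 8b/10b encoding; 3.0 and 4.0 run at
# 8 and 16 GT/s with 128b/130b encoding.
PER_LANE_GBPS = {"2.0": 0.500, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Effective one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# The three links described as equivalent all land near ~8 GB/s:
for gen, lanes in [("4.0", 4), ("3.0", 8), ("2.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth(gen, lanes):.1f} GB/s")
```

On a PCIe 3.0 board, the card's x4 link therefore falls to roughly 3.9 GB/s, half of what it gets in a 4.0 slot.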
Sources: ASRock (via VideoCardz), 3DCenter

114 Comments on AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

#26
Chrispy_
The GTX 1080 loses 8% of its performance when run at PCIe 3.0 x4
The GTX 980 loses 5% of its performance when run at PCIe 3.0 x4

It's fair to say that the 6500 XT stands to lose at least 5% of its performance when used in a PCIe 3.0 system, as it's likely to fall somewhere in the range between those two cards.

If you plan to put a 6500 XT into a PCIe 3.0 system, it's worth bearing in mind that you're not getting the advertised performance, but 92-95% of what you'd otherwise get is still good enough that it's not a deal-breaker.
Posted on Reply
#27
Mothertrucker19
Am I the only one who's more bothered by only 2 video outputs?
Posted on Reply
#28
ExcuseMeWtf
We don't even get a hypothetical 6500 XT with x16, so it's not even possible to test how much this card would lose, if anything, lol.
I just trust that AMD's engineers know what they're doing and will base my decision on final performance numbers and of course pricing/availability (so almost certainly nope, due to the latter :roll: )
Posted on Reply
#29
mechtech
DeeveoI'd be more worried about the cut video processing capabilities than the PCIE lane amount. This makes a huge difference for anyone planning to use this for HTPC environment.
Agreed

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so, that's a major ding against it right there. Small cards like this should be strong on codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.
ShurikNx16 is definitely not needed, but x4 is just straight up insulting. Put it in 3.0 system and you got yourself a quarter of the PCI bandwidth of an RX470. A card launched 4 and a half years ago.
But the memory bandwidth is also almost 1/4, due to the 64-bit memory interface vs the 256-bit interface of the RX 470. One thing is probably certain: miners won't be buying it, and probably neither will anyone else lol (except OEMs). And with the possible omission of hardware codecs, it's looking like a waste of silicon and a PCB.
Posted on Reply
#30
ixi
mechtechAgreed

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so that's a major ding against it right there. Smaller cards like this should be strong with codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.
Yep, the codecs are a big bummer... Don't like it either :/.
Posted on Reply
#31
Chrispy_
mechtech"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so that's a major ding against it right there. Smaller cards like this should be strong with codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.
HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.

I'd be surprised if Nvidia don't eventually release a desktop variant of GA107, which at its fully-enabled spec is called the "Laptop 3050 Ti". In laptops it is configurable from 35-80 W, so presumably it would make a good candidate for a low-profile, slot-powered HTPC card.

To the best of my knowledge GA107 has all the features of the Ampere lineup with nothing cut.
Posted on Reply
#32
Tartaros
ShurikNx16 is definitely not needed, but x4 is just straight up insulting. Put it in 3.0 system and you got yourself a quarter of the PCI bandwidth of an RX470. A card launched 4 and a half years ago.
I doubt it will saturate the bus even with that downgrade.
HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.
A GT 3030 or 1630 or whatever they want to call it. But at this point we might have to wait for Intel to see good options in the $100 space; nowadays everything is overpriced and old.
Posted on Reply
#33
Chrispy_
TartarosI doubt it will saturate the bus even with that downgrade.
Don't be so sure; the 6600 XT clearly saturates the bus occasionally, with a measurable performance drop at PCIe 3.0 x8:

[chart: RX 6600 XT PCIe scaling]
The 6500 XT is half the performance, but PCIe 3.0 x4 is also half the bandwidth, implying that the 6500 XT will very likely saturate the bus.

The performance drop from putting a 6500 XT into a PCIe 3.0 slot could easily match the 98%-to-93% drop in the chart above, if everything scales linearly (it doesn't, but the factors that scale non-linearly might cancel each other out; we'll have to wait for real-world PCIe scaling tests like the one above to know for sure).
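That "half the performance, half the bandwidth" reasoning can be put into a toy model. Everything below is an illustrative assumption (the demand figure is chosen purely so the bigger card retains roughly 98% on 3.0 x8, matching the scaling result quoted above), not a measurement:

```python
# Toy model: assume a GPU's PCIe traffic scales with its frame rate and
# performance is capped once the link can't keep up. All inputs are
# illustrative assumptions, not measured values.
def retained_perf(demand_gbps: float, link_gbps: float) -> float:
    """Fraction of full performance retained on a given link."""
    return min(1.0, link_gbps / demand_gbps)

GEN3_X8 = 7.9  # GB/s, roughly PCIe 3.0 x8
GEN3_X4 = 3.9  # GB/s, roughly PCIe 3.0 x4

demand_6600xt = 8.1                # hypothetical: gives ~98% on 3.0 x8
demand_6500xt = demand_6600xt / 2  # half the performance, half the traffic

print(retained_perf(demand_6600xt, GEN3_X8))  # ~0.98
print(retained_perf(demand_6500xt, GEN3_X4))  # ~0.96
```

Because both the demand and the link are halved, the relative penalty stays in the same ballpark, which is why the chart's 98%-to-93% range is a plausible bracket rather than a guarantee.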
Posted on Reply
#34
holyprof
Chrispy_HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.

I'd be surprised if Nvidia don't eventually release a desktop variant of GA107, which at it's fully-enabled spec is called the "Laptop 3050 Ti". In laptops it is configurable from 35-80W, so presumably it would make a good candidate for a low-profile, slot-powered HTPC card.

To the best of my knowledge GA107 has all the features of the Ampere lineup with nothing cut.
I fully agree with what you wrote. I'm currently using my old GTX 960 as an HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting a 4K video signal to the TV set.
I don't expect the 75 W HTPC cards to ever return. AMD's current APUs (and future Intel ones) are good enough and have made that segment obsolete.
I tested a PC with an AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Anyone who's happy with FHD at medium-low settings can skip the old $100-$200 GPU range, because the APUs are good enough.
Posted on Reply
#35
Chrispy_
holyprofI fully agree with what you wrote. I'm currently using my old GTX960 as a HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting 4k video signal to feed to the TV set.
I don't expect the 75W HTPC cards to return anymore. AMD's current APUs (and future Intel) are good enough and made that segment obsolete.
I tested a PC with AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Any person that's happy with FHD at medium-low settings can skip the old $100-$200 GPU range because the APUs are good enough.
I hate to say it but waiting for an expensive, unicorn GPU to appear that may never appear is a far less sensible approach than just rebuilding your HTPC to accommodate dual-slot cards. There are now enough good HTPC cases that can take either an SFX or full ATX PSU and still provide enough ventilation for 150W GPUs or more. I got bored of waiting for a good low-end card to replace my passively-cooled 7600GT and just bought a new case (Silverstone GD04) that could accommodate bigger cards. I've used the Fractal Node 202 quite a lot for HTPC builds as it fits in an AV/media rack alongside consoles* and surround receivers.

My solution is to get a more expensive card than necessary and then massively reduce the TDP. I've restricted my 3060 to around 120 W, and whilst I'm obviously losing some performance, it's near-silent while still being moderately capable.

*yeah, I'm not sure what Microsoft was thinking this generation.
Posted on Reply
#36
Logoffon
puma99dk|This here could make partners do graphics cards that has a physical PCI-E x4 and x8 slot on the cards I remember some partners made a lower end Nvidia GeForce GPU with physical PCI-E x1 port on their card.

I think one was Zotac back in the day for their GT 520 or 710.
Zotac actually made an x1 GT 730, and there are quite a few others as well.
www.zotac.com/jp/product/graphics_card/geforce%C2%AE-gt-730-pcie-x1
www.techpowerup.com/gpu-specs/?mobile=No&interface=PCIe%202.0%20x1&sort=generation

Oh wait, how about some "modern" GPUs on the 133 MB/s PCI interface?
www.techpowerup.com/gpu-specs/geforce-gt-610-pci.c914
www.techpowerup.com/gpu-specs/radeon-hd-7350-oem-pci.c2365
Posted on Reply
#37
ShurikN
[chart: PCIe scaling results]

And this was with x8; the 6500 XT is x4.
Now I really wanna see how this card performs in a 3.0 vs 4.0 comparison. I doubt Infinity Cache will help it that much.
MusselsIf its OEM only, and only for systems with PCI-E 4.0 out of the box that more than enough bandwidth
The 6400 is OEM-only. The 6500 XT is not.
Posted on Reply
#38
Lionheart
I doubt this card needs heaps of bandwidth anyway, but at the same time, that's gross, AMD.
Posted on Reply
#39
TheoneandonlyMrK
Is PCIe 3.0 still even a thing?!:p

I jest, but of the four computers I have ATM, Intel and AMD alike and all released in the last two years, none have PCIe 3.0 main slots?!

So
Posted on Reply
#40
kruk
nguyen6500XT is so bad that it's good, for people desperate enough :D
Please don't make fun of people who can't afford better hardware - it's disgusting (and it's basically hardware elitism). In these hard times, when GPU prices have skyrocketed, PC gamers should stick together and encourage each other. We can make fun of the card, but don't mock people who can't afford anything better.
Posted on Reply
#41
Mathragh
krukPlease don't make fun of people that can't afford better hardware - it's disgusting (and it's basically Hardware Elitism). In these hard times when the prices of GPUs have skyrocketed, PC Gamers should stick together and encourage each other. We can make fun of the card, but don't mock people that can't afford anything better.
I think it's quite a good move by AMD in times like these, when there are such big shortages all around.
The GPU seems to be exceptionally frugal in all the important ways, lowering die area and even the SMD (= surface-mounted device; think capacitors, resistors, power regulators) part count, many of which are in short supply as well.
This will allow them to make the absolute most of their resources and get as many cards into people's hands as possible. I'm quite certain most people in this segment would rather choose a card with its features slightly gimped over no card at all.

It's probably gonna be quite a big success, even with its somewhat gimped features, especially since they'll mostly be sold with a new PC/laptop that will have PCIe 4.0 anyway. Furthermore, the lack of encoding/decoding hardware can usually be worked around with software decoding, something desktop PCs should have no problem with, while laptops will have decoding hardware in the iGPU anyway.
Posted on Reply
#42
ixi
holyprofI fully agree with what you wrote. I'm currently using my old GTX960 as a HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting 4k video signal to feed to the TV set.
I don't expect the 75W HTPC cards to return anymore. AMD's current APUs (and future Intel) are good enough and made that segment obsolete.
I tested a PC with AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Any person that's happy with FHD at medium-low settings can skip the old $100-$200 GPU range because the APUs are good enough.
I was using the 4650G as my CPU before I got an RTX 3060 Ti. DayZ at full HD with low settings: 70-100 FPS :). Vanilla WoW is smooth as well. GTA V at full HD with low/medium settings stays above 60 FPS. Can't wait to get a CPU with RDNA2 :}.
Posted on Reply
#43
Chrispy_
TheoneandonlyMrKIs pciex 3 still Even a thing?!.

I jest but of the four computers I have ATM two Intel three and and all released in the last two years None have pciex 3 main slots?!.

So
Intel were stuck on PCIe 3.0 until Rocket Lake, which is only 9 months old and launched to pretty negative or lukewarm reviews. I don't think there are that many Rocket Lake chips out in the wild, TBH; high-end Rocket Lake was a dumpster fire, as the 11900K lost two cores and was a step back in most ways, and cheaper Rocket Lake models offered far lower performance/$ because Comet Lake was selling for far less. I'd still recommend the 10400F today on a budget, and it's still readily available.

AMD got PCIe 4.0 over 2 years ago, but it existed only on high-end boards (X570) that make up a pretty small proportion of AMD's overall market. It only really arrived for mainstream buyers with Zen 3 and B550, barely a year ago (Nov 2020), and even then the cheapest point of entry was the 5600X, which is a good $140 more expensive than the equivalent Zen 2 or Comet Lake configuration.

Given that there are a lot more Intel machines out there than AMD machines, I think it's fair to assume that there are a huge number of modern PCIe 3.0 motherboards that will be wanting a GPU update before they're retired.

Let's face it, if you're in the market for a 6500XT you're probably not rocking a recent flagship motherboard and CPU, making its PCIe 3.0 performance even more important!
Posted on Reply
#44
ShurikN
TheoneandonlyMrKIs pciex 3 still Even a thing?!.
Zen, zen+ and all 400 chipsets are 3.0. Those platforms are not that old, and a lot of people are still on them.
Posted on Reply
#45
TheoneandonlyMrK
ShurikNZen, zen+ and all 400 chipsets are 3.0. Those platforms are not that old, and a lot of people are still on them.
Come on, I said I jest, i.e. it was a joke.
Posted on Reply
#46
Chrispy_
TheoneandonlyMrKCome on, I said I Jest. IE joke.
You jest, yet it did make me actually think about how many PCIe 4.0 machines were really out there in the wild, and it's not that many!
Posted on Reply
#47
GoldenX
4GB and 4x at "200 usd".
What a joke.
Posted on Reply
#48
stimpy88
AMD are clearly taking the piss with this card. I thought nGreedia were bad, but this card is bordering on the unusable.
Posted on Reply
#49
TheoneandonlyMrK
Chrispy_You jest, yet it did make me actually think about how many PCIe 4.0 machines were really out there in the wild, and it's not that many!
Your points were well taken.

And after thinking about it, one of mine is PCIe 3.0; I forgot for a moment there that pre-X570 boards were.

I think people should be told: most of these will end up in entry-level OEM gaming rigs, and for those, PCIe 4.0 x4 will be fine.
Others might have something to complain about, though some here are definitely being way too dramatic; tests show PCIe 4.0 x4 has enough bandwidth.
Posted on Reply
#50
Jism
FourstaffGoing to wait for W1zzard's numbers before passing judgement. I don't think they purposely gimp it unless it doesn't matter anyway.
Yep.

Lots of engineers on these forums these days, whose skills obviously exceed those of most of AMD's professionals out there.
Posted on Reply