
AMD Radeon RX 6500 XT PCI-Express Scaling

Ouch.

I was guessing 5-8% loss from PCIe 3.0 in threads last week.

I didn't expect 12-18%. That's bad.
 
Gamers should not buy this GPU even at $200, with hardware encoding absent and only 4 GB of VRAM, even if they own a PCIe 4.0 board. AMD should price it closer to $150. But we see people buying $2000 GPUs, so...
Nah! The 6500 XT should only cost $99 MSRP (if MSRP even existed). No encoding, a puny Gen 4 x4 link (Gen 3 will suffer; good luck, Chinese LGA 2011 Xeon users), this GPU is not for you. Keep the RX 570 4GB instead? LOL. Nice job, AMD, just nice... not!
 
Does anyone have an educated guess as to how much is being saved with this x4 bus compared to, let's say, an x8 bus? I'm specifically interested in raw materials (PCB), SMDs on the PCB, and perhaps die space. I wonder whether the shortage of so many components and materials also had something to do with this decision.
Hardly anything. The price of copper has increased from $2.5/lb (Jan 2019) to $4.5/lb (Jan 2022). Even if this GPU came with a huge 1 kg solid-copper heatsink, that would only have added about $5 per GPU versus pre-lockdown prices. With a cheap aluminium heatsink and copper used only in PCB traces and interconnects, you're looking at not even a $0.10 saving on copper. As for die costs, I don't have the maths, but as a reality check, even the 2016-era $109 GTX 1050 2GB (non-Ti) had both a 3.0 x16 bus and the Shadowplay encoder, so "we did this to this $300 GPU to save money" translates to "we love to insult your intelligence"...
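If anyone wants to sanity-check that copper claim, here's a quick back-of-the-envelope sketch (the 1 kg of copper is a deliberately generous assumption; the per-pound prices are the ones quoted above):

```python
# Back-of-the-envelope: extra copper cost per GPU from the 2019 -> 2022 price rise.
LB_PER_KG = 2.20462

price_2019 = 2.5          # USD per lb, Jan 2019 (figure quoted above)
price_2022 = 4.5          # USD per lb, Jan 2022 (figure quoted above)
heatsink_copper_kg = 1.0  # deliberately generous: a solid 1 kg copper heatsink

extra_cost = heatsink_copper_kg * LB_PER_KG * (price_2022 - price_2019)
print(f"Extra copper cost per GPU: ${extra_cost:.2f}")  # roughly $4.40
```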
 
Given 32-bit cards... wait, did those ever exist in the first place?
they do now
[screenshot attachment]
 
Well, the only games I play on that massive list are The Witcher 3 and BL3. In The Witcher 3 the PCIe version makes no difference, and in BL3 the difference is minimal, which is surprising in a good way.

Now, having said that, there is absolutely no reason to replace my 8 GB RX 480 with that card.

And if it doesn't support hardware encode/decode of all modern video codecs, then, as I said before, it's a waste of RAM, PCB, and silicon.

With 4 GB of RAM, this should have been priced in the RX 460/560 bracket at $120.

edit. Lol ya nope
[image attachment]
 
Should have put 3D V-Cache on that Infinity Cache, lol, and at least a 4.0 x8 link.
So the only features you really get are FSR and RSR, with no decoder or encoder. Hmm, I think you're better off with a 5500 XT 8 GB at that point. Don't FSR and RSR work on those 5500 XTs too?
 
And the award for the worst graphics card launch ever goes to: THIS!

Absolute shit show from AMD :kookoo::rolleyes::banghead:
:twitch:

It is DBE.

don't-buy-edition :D
 
Now watch it sell out to the very last unit....
 
x8 would have fixed the biggest of the criticisms. Sure, 8 GB wouldn't have hurt either, and most gamers aren't using the encode/decode blocks, but x4 pushes this well over the line into DOA territory.
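For context on why x4 stings so much, a rough sketch of theoretical one-direction link bandwidth (nominal per-lane figures after encoding overhead; real-world throughput is lower):

```python
# Rough per-lane usable bandwidth in GB/s (one direction), after 128b/130b encoding.
PER_LANE_GBS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, per_lane in PER_LANE_GBS.items():
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

# A 4.0 x4 link (~7.9 GB/s) roughly matches 3.0 x8; drop the card into a 3.0 slot
# and its x4 link falls to ~3.9 GB/s, which is where the big losses show up.
```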

But how are they out of stock??
 
What an utter piece of garbage!
 
Looks like some of AMD's engineers are worse than I thought. I gave them the benefit of the doubt, and they did not deliver. It doesn't even conclusively beat the 5500 XT.
 
Making a silk purse from a sow's ear is even more difficult than it sounds.
 
People just seem to be looking at this product in the wrong way. I was waiting for this hot take. AMD might just be misunderstood.

 
The 1440p relative results outperformed 1080p and 4K, which is a bit of an oddity. The card is weak overall, though. Even beyond the PCIe matter, other parts of the card are really cut down from the 5500 XT. The cache and GDDR are better, but other parts of the hardware are more vital. If I had to guess, AMD will sell a lot of these to OEMs in bulk. It's not a total dud of a card, but it is underwhelming for the DIY enthusiast market and below expectations.
 
That's not the only issue by a long shot. How about "it's not only slower than the 570, but it's also $200 AND is gimped on non-AMD platforms"?

Because even if it WERE faster than a 570, it would still offer worse price/performance than a $200 RX 480 from 6 YEARS AGO. And it would still scale very poorly on anything that isn't Rocket Lake, Alder Lake, or an AMD 500-series chipset, i.e. for the majority of buyers looking for a low-end card like this. Few are going to buy a $200 GPU to pair with a new $500 CPU, after all, and anyone with a PCIe 3.0 platform (AMD from K10 to Zen 2, Intel from Ivy Bridge to Comet Lake) is going to lose even more performance.

At $200 this thing would need to consistently outperform the 1660 Super, and even then it wouldn't be a very good value. Add on that pathetic 4 GB VRAM buffer and this thing should be a sub-$100 GT 1030 competitor. And don't forget, this thing draws 100 watts of power when gaming, compared to the ~140 watts pulled by the 6-year-old 14 nm RX 480, the ~70 watts pulled by the 12 nm 1650, or the ~95-100 watts pulled by the 1650 Super, which occasionally outperforms this 6 nm RDNA 2 card.

This thing is an atrocious GPU.
It's not just on Intel platforms that it's gimped. If one is using a Ryzen APU, i.e. the 4000G or 5000G series, you are limited to PCIe 3.0 and will face the same issue. There are rumors that AMD may release 4000G APUs to compete with Intel in the low-end segment. So budget systems get bad performance due to a bad design decision.
While this is a good review for understanding the impact of moving from PCIe 4.0 to 3.0, using the average performance loss in the conclusion is, to me, the wrong call. Instead, the worst-case scenario should be highlighted, because some games will lose significant performance. A potential buyer should not read that the average loss is 13%, buy the card, and plan to upgrade to a PCIe 4.0 platform later, because they may be leaving a lot of performance untapped. And average numbers can be skewed by the games tested.
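To illustrate the averaging point, a tiny sketch; the per-game numbers below are invented, not taken from the review:

```python
# Hypothetical per-game performance losses (%) when dropping to PCIe 3.0.
loss_pct = {"Game A": 3, "Game B": 8, "Game C": 12, "Game D": 30, "Game E": 12}

average = sum(loss_pct.values()) / len(loss_pct)
worst_game, worst = max(loss_pct.items(), key=lambda kv: kv[1])

print(f"Average loss: {average:.0f}%")            # 13% looks tolerable...
print(f"Worst case:   {worst}% ({worst_game})")   # ...but one title loses 30%
```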
 
Me: Hell yeah, gonna use the timezone advantage and get in early.

Also me: 50 comments here and 150 on the card review before I even start. Popular topic, much?


Final thoughts:

Ouch. Yeah, this is a repurposed laptop card that shouldn't exist outside budget prebuilts.

Being equal to an RX 470/480 from 500 years ago, yet requiring PCIe 4.0 to reach even those levels, makes it a terrible choice.
The only upside is if they have a lot of stock during the GPU shortage.
 
I feel that going forward with new systems, the chances of seeing PCIe 3.0 are slim. PCIe 5.0 is the current standard, so I expect even budget builds to come with at least PCIe 4.0, mostly. I am aware that Intel's latest H610 still supports PCIe 3.0. Overall, if one is getting this card to run on a PCIe 4.0 platform, it may not be a deal breaker, provided you don't need things like support for more than two monitors, an updated video decoder, or any video encoder.

In any case, I feel AMD's GPU range from the 6700 XT down is very lacklustre. Each step down from the RX 6800 series involves a very hefty, roughly 50% cut somewhere. The RX 6700 XT lost 50% of the CUs, down to 40, and when you consider that the RX 6600 XT has 32 CUs, I can't help but think AMD cheaped out too much and hampered performance. The next step down is the RX 6600 XT, which I feel is not as bad: the biggest downgrade is the Infinity Cache dropping to 32 MB, more than a 50% cut. In addition, the PCIe link was also halved to x8, where the performance hit on PCIe 3.0 is not as severe, but again, some titles suffer more than others. Lastly comes the disastrous RX 6500/6400 series, where the axe fell hard: almost everything is halved, and features like the video encoder/decoder are as good as completely axed. In my opinion, AMD should have given this card a 96-bit memory bus so that it could accommodate 6 GB of VRAM. But to save cost, they cheaped out once again and went with a meagre 64-bit bus. The cache is not "infinite" as the name suggests, and the silly decision to couple low VRAM with an x4 connection is a deal breaker.
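To spell out the bus-width/capacity coupling, here's a minimal sketch assuming the common 16 Gbit (2 GB) GDDR6 packages on 32-bit channels:

```python
# Each GDDR6 package sits on a 32-bit channel, so bus width fixes the chip count,
# and the chip count (times per-chip capacity) fixes the VRAM size.
CHANNEL_BITS = 32
CHIP_GB = 2  # assuming 16 Gbit GDDR6 packages

for bus_bits in (64, 96, 128):
    chips = bus_bits // CHANNEL_BITS
    print(f"{bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_GB} GB")

# 64-bit -> 4 GB (what we got), 96-bit -> 6 GB, 128-bit -> 8 GB
# (capacities can be doubled with clamshell mode, at extra cost).
```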

Sure, it will sell well to OEMs, but people will buy it either because they do not know about this card's limitations or because they are desperate to get their hands on a GPU. Neither is a good reason, and it's a poor showing from AMD. I don't believe there will be a stock issue, because if you look at Amazon you can easily find the RX 6600 available for sale. The only problem is that the price is not attractive, which I believe will be the case for this card as well. If the RX 6600 is not popular, you can imagine this one doing roughly three times worse.
 
From a modder's or just-for-fun standpoint, I wonder how it would perform with a 256-bit memory bus, x16 PCIe lanes, and 8 GB of top-speed GDDR6, then overclocked to the nuts, just to see what this itty-bitty piece of silicon could actually have done. Maybe it would tie the 6600 XT, lol.
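And just for fun, the raw memory bandwidth that widening would buy, assuming the 6500 XT's 18 Gbps GDDR6 stayed the same:

```python
# Peak bandwidth = effective data rate (Gbit/s per pin) * bus width (pins) / 8.
DATA_RATE_GBPS = 18  # the 6500 XT ships with 18 Gbps GDDR6

for bus_bits in (64, 128, 256):
    print(f"{bus_bits}-bit @ {DATA_RATE_GBPS} Gbps -> {DATA_RATE_GBPS * bus_bits / 8:.0f} GB/s")

# 64-bit (stock) -> 144 GB/s; 256-bit -> 576 GB/s, before counting Infinity Cache hits.
```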
 
Yikes! "Radeon RX 6500 XT was killed by Radeon RX 580"->Me.

Looks like I'm more likely to OC my RX 580! It's on my A320 system with a Pinnacle Ridge.

For new cards, with possibly a few exceptions, it looks like Nvidia for me. Even the GTX 1650 Super looks better than this!

And with the ray tracing being ruled a fail, in that department it feels more like a fake card! It at least has a "fake card" vibe! WTF! LOL!
 
To me, ray tracing is not viable for cards in the mid to low range. I would rather have higher frame rates than RT turned on. We can claw some performance back using DLSS or FSR for sure, but having to game at a lower resolution that is upscaled from an even lower internal resolution doesn't sound great to me.
 