
AMD Radeon RX 6500 XT PCI-Express Scaling

Perhaps this is one of the reasons for the odd design choices made with this card:
AMD might be combining the Navi 24 chip, on which the 6500 XT is based, with a "Core Complex Die" and an "IO Die" in order to build a chiplet-based APU. Who knows... :rolleyes:
They would have to build a whole new IO die to support the bandwidth requirements of the CPU + GPU.

The situation is only getting worse, and it will bite them back. You know the universal law: what goes around comes around. It will bite them in their backsides... Hopefully it will be as painful for them as it is for us now...
Oh, it will be painful for them when the crypto market dumps millions of cards for cheap on the second-hand market. Nvidia's consumer division is especially going to suffer, since they don't have an iGPU/console market to fall back on. AMD RTG at least knows that if the PlayStation 6 and whatever the next Xbox is called happen, they are going to get the contract to design the chip using RDNA 4/5/whatever.
 
Sorry, this card is worth no more than $50 including all taxes.
People who are going to buy this junk are probably on a low budget, running mobos with PCI-E 3.0, so there's no way the card is worth more than $50.
The $200 MSRP is beyond callous for this silicon waste dump.
 
Sorry, this card is worth no more than $50 including all taxes.
People who are going to buy this junk are probably on a low budget, running mobos with PCI-E 3.0, so there's no way the card is worth more than $50.
The $200 MSRP is beyond callous for this silicon waste dump.

The realistic price floor for any newly designed card is probably about $100, which I'd argue is fair. And margins will be slim at that price, if they're even positive.
 
There is no reason for this card to exist outside of additional display outputs for office PCs. There will be quite a few disappointed kids, having paid $300+ for one of these only to discover they aren't as good as entry level cards from 5+ years ago.

I suppose the one upside may be scalpers finally getting burned on their habits after demand for these dries up and no takers are found on eBay at $300-400...
 
OK, I know what I'm going to write is stupid, but:

If you have a reasonable board, you should be able to increase the PCI-E frequency by 4 to 12 MHz. Do you have any idea how much extra bandwidth that provides for a limited card like this? I suggest you try it out. Other than that: even though it's backwards compatible, it's obviously designed for use with PCI-E 4.0, not 3.0.
Oh, I missed this question:

On most systems you can get a max of 103 MHz out of the PCI-E bus before devices stop working.
You can sometimes go higher, at the cost of NVMe drives or various onboard devices stopping working, or the link dropping to a previous gen (3.0/2.0).
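To put rough numbers on that: PCIe link bandwidth scales linearly with the reference clock, so a 103 MHz bus buys you only about 3% more throughput. A back-of-the-envelope sketch using the standard per-generation transfer rates and encoding overheads (the linear-scaling assumption only holds on boards that don't run the PCIe clock asynchronously):

```python
def pcie_bandwidth_gbps(gen, lanes, bus_mhz=100.0):
    """Approximate effective PCIe bandwidth in GB/s for one direction."""
    # Per-lane raw rate in GT/s and line-encoding efficiency per generation.
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
    gts, eff = rates[gen]
    # The link clock is derived from the 100 MHz reference, so it scales with it.
    scale = bus_mhz / 100.0
    return gts * eff * lanes / 8 * scale  # bits -> bytes

print(round(pcie_bandwidth_gbps(3, 4), 2))       # 3.94 GB/s, Gen 3 x4 at stock
print(round(pcie_bandwidth_gbps(3, 4, 103), 2))  # 4.06 GB/s with a 103 MHz bus
print(round(pcie_bandwidth_gbps(4, 4), 2))       # 7.88 GB/s, Gen 4 x4
```

In other words, the bus overclock recovers only a sliver of the bandwidth lost by running the card's x4 link at Gen 3 instead of Gen 4.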
 
Damn, the RX 5500 XT is faster than this thing. Not by much, but still. AMD, you are gonna get smacked for this for sure.
 
Damn, the RX 5500 XT is faster than this thing. Not by much, but still. AMD, you are gonna get smacked for this for sure.
Reminds me of the AGP-era with Radeon 9500 and 9550!
 
the shoddy AMD marketing of the 6500 XT brand:
'supposedly' a replacement for the 5500 XT, but in reality it's more like the 5300 series

let's point out that the 5500 and 5300 have 20% more transistors than the 6500 XT!
then point out that the 5500 has a 128-bit and the 5300 a 96-bit memory bus, contrary to the 64-bit of the 6500 XT
note that the 5500 came with 8 GB VRAM, and only the mobile 5500M and the 5300 came with 4 GB VRAM
then realize that the 5500 and 5300 are equipped with PCIe 4.0 x8 lanes
then realize that video encode was tossed out of the 6500 XT
add insult to injury: the 6500 XT has half the outputs of the 5500 XT
and note that the 5500 and 5300 came out 28 and 26 months ago

now you want to wonder how a graphics card
with a cheaper GPU chip (fewer transistors, on the 6nm node)
with a simpler PCB
with cheaper and smaller memory
costs more than a $99 MSRP?

speaking of which, it is shocking that it needs more than 75 W...
AMD missed the chance to get rid of that extra power connector

if it had been named 6400 XT and priced lower at MSRP...
then maybe there would be far fewer negative reactions
 
I've never seen a new card beaten by its equivalent from the previous generation. That's ridiculous.
Thus, I'm now waiting to see what the specs are going to be for Nvidia's 3050!
 
Although the PCIe 4.0 x4 vs. PCIe 3.0 scaling here is very bad compared with the RTX 3000 series on PCIe 4.0 vs. 3.0, this card should be enough for any desktop PC used for daily office work and light gaming...
 
The bad thing is acting like it's the Radeon HD days and earlier, with no video encoding, which means you'd better prepare to OC your CPU cores just to record game video.
You may need an "early-grave" CPU Vcore amount just to record without lag, for all I know! Better get those CPU cores ready for x264!

We may see a lot of Ryzen owners do a manual all-core OC as a result, and we may see a lot of Ryzen CPU failures in the following months. I'm not joking or trolling here.
 
It's a mobile chip. AMD thought they could take a shortcut and make it a low-end GPU. They obviously ran into problems.
 
The bad thing is acting like it's the Radeon HD days and earlier, with no video encoding, which means you'd better prepare to OC your CPU cores just to record game video.
You may need an "early-grave" CPU Vcore amount just to record without lag, for all I know! Better get those CPU cores ready for x264!

We may see a lot of Ryzen owners do a manual all-core OC as a result, and we may see a lot of Ryzen CPU failures in the following months. I'm not joking or trolling here.

Ryzens have had 16 cores in them for over 2 years now and 8 cores for almost 5. So they're set for CPU encoding.

I'm totally sure that's what AMD was counting on when releasing the 6500XT for desktop...
 
How is your bus bandwidth usage so high? On a Gen 3 x16 slot, my 3080 has never used more than 19%. That's not even 4 lanes of Gen 3, and it's a more powerful GPU.

On principle, when a low-end card is released, the target market almost certainly will not have Gen 4 boards, so on that basis it's an odd choice. On the other hand, the results don't make much sense to me, as PCIe slots have typically been massively over-provisioned for GPUs.

Was it verified that the slot wasn't stuck in the 1.1 power-saving mode?
 
Ryzens have had 16 cores in them for over 2 years now and 8 cores for almost 5. So they're set for CPU encoding.

I'm totally sure that's what AMD was counting on when releasing the 6500XT for desktop...
Either that, or ramp up the RAM controller and RAM speeds for a fast single pass, but then I wouldn't be surprised if the files are bigger than what I get with the on-GPU encoder in ReLive.

Looks like more gamers will be using x264, and wondering what settings to use for it.
 
Either that, or ramp up the RAM controller and RAM speeds for a fast single pass, but then I wouldn't be surprised if the files are bigger than what I get with the on-GPU encoder in ReLive.

Looks like more gamers will be using x264, and wondering what settings to use for it.
I think GPU encoding is OK if you're streaming, but the file sizes are intolerable for recording. They're already huge with x264, and GPU encoding increases them enormously on top of that. To me the real bottleneck is storage space, hence I still use x264 recording when I can. Of course, for demanding games played locally on the PC, I switch over to GPU encoding because I have to, and I suppose I'll need to do some post-recording x264 encoding to get the sizes down.

I use the following for x264 recording.

CRF 24
keyframe 4
cpu preset faster
profile none
tune film
options level=4.2 vbv-bufsize=30000 vbv-maxrate=30000 profile=high422 bframes=5 rc-lookahead=20 threads=15 keyint_min=29

The quality will be similar to the medium CPU preset, but without medium CPU usage. My CRF is higher than many use; I did CRF 18 at one point but couldn't tolerate the file sizes, and I honestly cannot see a difference just from watching the videos. I did get a visible difference from the options I set, though.
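For a sense of the storage cost being discussed: at a sustained bitrate near the vbv-maxrate above (30,000 kb/s), recordings add up fast. A quick estimate (this is an upper bound; CRF-based recordings usually average well below the VBV cap):

```python
def recording_size_gb(bitrate_kbps, minutes):
    """Approximate file size for a recording at a sustained bitrate."""
    return bitrate_kbps * 1000 * minutes * 60 / 8 / 1e9  # kb/s -> GB

# One hour capped at vbv-maxrate=30000 (30 Mb/s):
print(round(recording_size_gb(30000, 60), 1))  # 13.5 GB
```

Even halving the average bitrate, a long play session easily eats tens of gigabytes, which is why re-encoding recordings afterwards is attractive.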
 
How is your bus bandwidth usage so high? On a Gen 3 x16 slot, my 3080 has never used more than 19%. That's not even 4 lanes of Gen 3, and it's a more powerful GPU.

On principle, when a low-end card is released, the target market almost certainly will not have Gen 4 boards, so on that basis it's an odd choice. On the other hand, the results don't make much sense to me, as PCIe slots have typically been massively over-provisioned for GPUs.

Was it verified that the slot wasn't stuck in the 1.1 power-saving mode?
The 3080 has 10 GB of VRAM, compared with 4 GB on the 6500 XT. You can get away with low VRAM, or you can get away with low PCI-e bandwidth, but you can't get away with both. That's the common theme of all the reviews I've seen. And yes, other reviewers have noted the PCI-e 3.0 issue; it isn't just a quirk with W1zzard's testing.

One of the morbidly amusing things about this release is that AMD erased their "4 GB is not enough" marketing page, just in time for their release of what is not just a 4 GB card--no, that would just be banal hypocrisy; AMD went a step further and released a 4 GB card that is uniquely constrained by its low VRAM, at least when you pair it with a PCI-e 3.0 rig. It's as if AMD went out of their way to prove their freshly disavowed marketing campaign correct.

Other 4 GB cards that appear in review benchmarks for this thing, many of them sporting extremely dated tech, actually tend to come out looking ok, lol.
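The VRAM/bus interaction above can be sketched with rough numbers. Effective PCIe throughput is about 0.985 GB/s per Gen 3 lane and double that for Gen 4, so spilling even a modest chunk of the working set out of VRAM costs very different amounts of frame time depending on the link (illustrative figures only, ignoring latency and protocol overhead):

```python
# Approximate effective one-way link bandwidth in GB/s per configuration.
links = {"PCIe 3.0 x4": 3.94, "PCIe 4.0 x4": 7.88, "PCIe 3.0 x16": 15.75}

overflow_gb = 1.0  # hypothetical texture working set that no longer fits in VRAM
for name, gbps in links.items():
    ms = overflow_gb / gbps * 1000  # time to shuttle the overflow once
    print(f"{name}: {ms:.0f} ms to move {overflow_gb:.0f} GB")
```

At ~254 ms per gigabyte on a Gen 3 x4 link versus ~127 ms on Gen 4 x4, it's easy to see why the 6500 XT falls apart specifically when its 4 GB runs out on an older platform, while a card with a wider link or more VRAM barely notices.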
 
It's a mobile card adapted for desktop use due to GPU shortage, folks.
Yup. It's not often we get access to mobile-optimized silicon in desktop.
It's a mobile chip. AMD thought they could take a shortcut and make it a low-end GPU. They obviously ran into problems.
Problems in marketing. Their marketing team dropped the ball BIG TIME. There are a half-dozen-plus 'angles' to sell this card on. Yet many reviewers didn't even get one for testing, and few touched on any of the card's use cases or strengths. Instead, the publicity is either VERY negative or dishonestly positive.
The fact it's 'available' is worth celebrating; 6nm die yields must be pretty good. The fact that the 6500 XT isn't a further cut-down Navi 23 implies a few conjectural possibilities. Otherwise, the 6400 (adopted, OEM only) and 6400 XT monikers would've been more fitting, as others have mentioned. The model number alone is overselling it, especially if you're comparing it to a 5500 XT. They're hit-and-miss 'equals', when we expect at least some perf/W and raw performance uplift.
 
The model number alone is overselling it, especially if you're comparing it to a 5500 XT. They're hit-and-miss 'equals', when we expect at least some perf/W and raw performance uplift.

It's not even the first time this has happened. What was it, the 5850 --> 6850 that also went backward? Something in the upper end of the HD 6xxx range, anyway.
 
Yup. It's not often we get access to mobile-optimized silicon in desktop.
Unfortunately, this isn't good news like it was in the late Socket 462 days!
 