
AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

Proper troll fest now, eh? Lol, looks like a hate fest in here; meanwhile, no reviews, no tests, just hyperbolic BS.

I await reviews, and like most commenters I already have a better GPU, so I wouldn't be buying it, it wouldn't matter to me, and I'd experience no butthurt.
If you're unable to make decisions yourself and need a review to tell you it is crap, then I have plenty of other crap I'd like to sell you... I guess I'll start on some "reviews" first.

I guess some people are born to ridicule public dissent and hand their autonomy over to something with a logo, while sneering at those clever enough to have seen it coming. We have seen a lot of these types over the last few years.
 
If you're unable to make decisions yourself and need a review to tell you it is crap, then I have plenty of other crap I'd like to sell you... I guess I'll start on some "reviews" first.

I guess some people are born to ridicule public dissent and hand their autonomy over to something with a logo, while sneering at those clever enough to have seen it coming. We have seen a lot of these types over the last few years.
Yeah, you're so smart, bro; keep righting those wrongs, eh?

As I said, I couldn't care less how this performs; I'm not buying it.

I sneer at those who think they know me from a paragraph of text or think they can derive greater meaning than I presented.

I laugh at fools who waste their time shitposting about things they'll never buy or want to.

And at those who choose to attack an individual over his opinion rather than the topic at hand, trying to sound smart.

You feeling defensive? Did you take the troll jab to heart, perhaps? Act on it; that'll prove you aren't one!
 
AMD seems to have dropped the ball here, because there's no AV1 encoding or decoding! It's a truly open format, unlike the horse hockey that goes with H.264 and H.265.

It's now possible that with the RX 6500 XT, even 1080p recording lags with fewer than 8 CPU cores, maybe fewer than 10 CPU cores for all I know!

A higher-numbered card that looks inferior to the 5500 XT starkly reminds me of the AGP-era, pre-AMD Radeons, like the 9000 Pro. :(
 
It's now possible that with the RX 6500 XT, even 1080p recording lags with fewer than 8 CPU cores, maybe fewer than 10 CPU cores for all I know!
The chart said "H264/4K", not "H264". To me that means 4K is blocked off, not H264. We'll have to wait until someone reviews the cards to make sure.


Though I do not understand why H.265 isn't available, even if only for 1080p. The now long-in-the-tooth RX 580 has that capability.
 
The chart said "H264/4K", not "H264". To me that means 4K is blocked off, not H264. We'll have to wait until someone reviews the cards to make sure.

Though I do not understand why H.265 isn't available, even if only for 1080p. The now long-in-the-tooth RX 580 has that capability.

I think we're going well into speculation and misunderstanding now. For one, any modern CPU will be able to power through software decoding. My old laptop with a Sandy Bridge (you read that right, Intel 2nd gen) quad core had "no problems" chugging along with H.265 on software alone (it did put some load on the CPU, but it worked).

Then there's DirectX: you might not have HW decoders, but you can still use DirectX to push the workload to the GPU, which will still be faster. Not as good as dedicated HW support, but still faster than CPU brute force.

Is it bad and silly that a 2022 GPU lacks basic encode/decode features? Yes, it's a damn disgrace, but it's not the end of the world.
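For anyone curious, here's a rough sketch of how you could measure the difference yourself (assuming ffmpeg is installed; the file name is just a placeholder, and d3d11va is the Windows hwaccel path):

```python
# Hedged sketch: compare pure software decode vs. D3D11VA-assisted decode with ffmpeg.
# Assumes ffmpeg is on PATH and "sample_hevc.mp4" exists -- both are placeholders.
import subprocess
import time

def time_decode(extra_args):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra_args, "-i", "sample_hevc.mp4",
         "-f", "null", "-"],  # decode only, throw the frames away
        check=True,
    )
    return time.time() - start

print("software decode took", time_decode([]), "seconds")
print("d3d11va decode took", time_decode(["-hwaccel", "d3d11va"]), "seconds")
```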
 
It won't be a problem on 3.0, and god help anyone still on 2.0. You're going to face UEFI issues on most 2.0 platforms before you ever have the opportunity to face bandwidth problems.
Not sure if God will help people on B450 and X470. I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card could work at the same bandwidth regardless of generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way, short of putting it in a larger PCIe 4.0 slot. Which is why 99% of people don't like graphics cards intentionally limited to the latest generation and 1/4 of the slot's maximum PCIe lanes.
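For context, a quick back-of-the-envelope (assuming the usual ~0.5/0.985/1.969 GB/s per lane per direction for Gen 2/3/4 after encoding overhead):

```python
# Rough per-direction PCIe bandwidth in GB/s, after 8b/10b or 128b/130b encoding overhead.
per_lane_gbps = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}
for gen, per_lane in per_lane_gbps.items():
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")
```

So 4.0 x4, 3.0 x8 and 2.0 x16 all land around ~8 GB/s per direction, while the same x4 link drops to ~3.9 GB/s on Gen 3 and ~2 GB/s on Gen 2.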
 
I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card could work at the same bandwidth regardless of generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way, short of putting it in a larger PCIe 4.0 slot

That's probably not an option. We don't really have a block diagram or anything, but the likely scenario is that they built only x4 lanes into the thing; if they had put in more lanes, we'd have access to them (i.e. x8 Gen 4). This is most likely because the card uses a mobile chip, which is usually limited to x4.
 
That's probably not an option. We don't really have a block diagram or anything, but the likely scenario is that they built only x4 lanes into the thing; if they had put in more lanes, we'd have access to them (i.e. x8 Gen 4). This is most likely because the card uses a mobile chip, which is usually limited to x4.
In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.

Proper troll fest now, eh? Lol, looks like a hate fest in here; meanwhile, no reviews, no tests, just hyperbolic BS.
I await reviews, and like most commenters I already have a better GPU, so I wouldn't be buying it, it wouldn't matter to me, and I'd experience no butthurt.
I laugh at fools who waste their time shitposting about things they'll never buy or want to.
So, you are laughing at yourself? :confused::confused:

AMD seems to have dropped the ball here, because there's no AV1 encoding or decoding! It's a truly open format, unlike the horse hockey that goes with H.264 and H.265.
AV1 encode/decode won't really matter right now. However, if AV1 encoding does take off for Twitch (the odds of that happening are pretty low), as they have stated they are working on supporting AV1 ingest, then it will basically kick this card off the relevant GPU list for many streamers. Then again, the 6500 XT is such a low-end card and AMD's GPU encoder is so horridly trash for streaming that I don't think it even matters; just get an NVIDIA GPU with Turing/Ampere (7th-gen) NVENC if you plan on streaming.
 
In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.



So, you are laughing at yourself? :confused::confused:


AV1 encode/decode won't really matter right now. However, if AV1 encoding does take off for Twitch (the odds of that happening are pretty low), as they have stated they are working on supporting AV1 ingest, then it will basically kick this card off the relevant GPU list for many streamers. Then again, the 6500 XT is such a low-end card and AMD's GPU encoder is so horridly trash for streaming that I don't think it even matters; just get an NVIDIA GPU with Turing/Ampere (7th-gen) NVENC if you plan on streaming.
I am now looking back.

Sometimes you think more than others

But every day, I learn

Sorry @stimpy88, I overreacted.
 
We don't even get a hypothetical 6500 XT with x16, so it's not even possible to test how much this card would lose, if anything, lol.
Technically you could just force the card into PCIe 2.0 x4 mode; that would be 1/4 of the max bandwidth.
 
In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.

Marketing, probably, just like x8 cards are still built with the full-length connector.

In terms of motherboard support, if you have a smaller slot that you plan to use for it, you can just cut open the back so the card fits. There are even slots that come with the back open for that exact reason, but it's also possible to DIY it with a Dremel or something.



We don't even get a hypothetical 6500 XT with x16, so it's not even possible to test how much this card would lose, if anything, lol.

We can just test the performance at PCIe 4.0 x4 as the card was designed, then test at PCIe 3.0 and PCIe 2.0 x4, and we'll see how much it loses, just like any card.

I don't think anyone is arguing that the card will be bottlenecked on PCIe 4.0, which is very unlikely for a low-end card like this; with older gens that's a different story, and that's the discussion here.
 
if you have a smaller slot that you plan to use for it, you can just cut open the back so the card fits. There are even slots that come with the back open for that exact reason, but it's also possible to DIY it with a Dremel or something

That is true; however, on certain motherboards there are other components right next to the slot, so it wouldn't be feasible for those boards. Boy, am I glad my motherboard has open-ended slots, lol.
 
Y'all say what you will. This is great! We finally have a GPU that's perfect for M.2 slot conversion, or for use in the chipset PCIe x4 slot. Since X570 has bifurcation on the 4.0 x16 slot, you could now use that slot for other devices with the proper risers (like MaxCloudOn).
 
Y'all say what you will. This is great! We finally have a GPU that's perfect for M.2 slot conversion, or for use in the chipset PCIe x4 slot. Since X570 has bifurcation on the 4.0 x16 slot, you could now use that slot for other devices with the proper risers (like MaxCloudOn).

Nothing stops you from doing that with x16 cards either; the situation is exactly the same, since this one also has the full-length connector on the PCB. The only difference is that this card is limited to x4, but nothing stops you from using only x4 lanes of an x16 card.
 
Don't need more than that for 1080p.
 
Not sure if God will help people on B450 and X470. I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card could work at the same bandwidth regardless of generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way, short of putting it in a larger PCIe 4.0 slot. Which is why 99% of people don't like graphics cards intentionally limited to the latest generation and 1/4 of the slot's maximum PCIe lanes.
The card should have the full 16 lanes no matter the PCIe gen.

The ASRock B550 Steel Legend mobo allows selection of the PCIe gen, but I'm forcing it to run PCIe 4.0 despite using an XFX R7 250X Ghost. It boots fine on UEFI, lol.
 
The card should have the full 16 lanes no matter the PCIe gen.

The ASRock B550 Steel Legend mobo allows selection of the PCIe gen, but I'm forcing it to run PCIe 4.0 despite using an XFX R7 250X Ghost. It boots fine on UEFI, lol.

What you select in the BIOS doesn't matter; the card will only run at the specs it supports, Gen 3.0 in your case.
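If you want to verify what the link actually negotiated, GPU-Z shows it on Windows; on Linux, a quick sketch like this reads it straight from sysfs (the PCI address is a placeholder, look yours up with lspci):

```python
# Read the negotiated vs. maximum PCIe link of a device from sysfs (Linux).
# "0000:03:00.0" is a placeholder PCI address; substitute your GPU's address.
from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:03:00.0")
for attr in ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width"):
    print(attr, "=", (dev / attr).read_text().strip())
```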
 
This could be one explanation for why Navi 24 has this weird design and limited PCIe lanes. Fewer lanes means a more efficient PCB design for laptops.
 
Oh the horror, a low-end card that is gimped in a way that won't make a difference to its overall performance. I guarantee that 99% of users who pay over-inflated scalper prices for these cards won't notice any PCIe performance drop.

I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16
It would indeed be impressive if they could force the card to use lanes that it physically does not have.
 
Yeah, you're so smart, bro; keep righting those wrongs, eh?

As I said, I couldn't care less how this performs; I'm not buying it.

I sneer at those who think they know me from a paragraph of text or think they can derive greater meaning than I presented.

I laugh at fools who waste their time shitposting about things they'll never buy or want to.

And at those who choose to attack an individual over his opinion rather than the topic at hand, trying to sound smart.

You feeling defensive? Did you take the troll jab to heart, perhaps? Act on it; that'll prove you aren't one!
Oh goodness. It's only a shitty graphics card...

But I just read your later post. No problems, and I'm sorry for the tone of my post too. Two reactions don't always make a good one!
 
Oh goodness. Read your own post, dude; stop rage-crying and take a chill pill. It's only a shitty graphics card.
I did, read further through the thread. :p

Have a go at being less insinuating and insulting yourself though; it might help.
 
I did, read further through the thread. :p

Have a go at being less insinuating and insulting yourself though; it might help.
I posted that before I saw your post apologizing. I then edited my rant to something a little different. Refresh the page...
 
Is this literally another card that's only good for DirectX 9 games?
 
Is this literally another card that's only good for DirectX 9 games?
We'll know when the card goes through reviews. At the very least, AMD is pushing its core clocks really close to their limit, so that should help a bit.

AMD's numbers aren't bad, but they're AMD's.
 