
ASUS Radeon RX 6500 XT TUF Gaming

AMD needs to issue a public apology for this "Insult Edition" GPU!

edit: And substantial price cut!
 
https://tpucdn.com/review/asus-radeon-rx-6500-xt-tuf-gaming/images/gpuz-overclocking.gif






This shows that the 6500 XT only has 16 ROPs.
 
What a turd. The 3050 is going to wipe the floor with this, and we know how that performs already because of the same silicon already being in the laptop market for several months.

Unfortunately this bodes extremely badly for RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.

The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.

EDIT:
I've learned new info since yesterday morning:
 
Man this card is listed for 360 usd on newegg and still sold out... What a time to be alive :laugh:

 
The good thing is, if there were any doubts that AMD is just like Intel and nVgreedia, this release definitely dispelled them.
tbf this just seems like an extreme case of the stupid; in general, AMD's Radeon department has been plagued with ineptitude as of late (dGPUs that serve as more than just a display output -looking at you, GT 1030- should have x16, period), unlike NV (aka Ngreedia) or Intel (aka pay-extra-for-OC), which usually exhibit extreme avarice (in different flavors) but seldom outright ineptitude (RKL notwithstanding)
 
Starts at 350 Euro in DE.
So, tell me how it's "too expensive" for what crap it is.
 
2022 hall of fame worst GPU's ever made. #1
 
tbf this just seems like an extreme case of the stupid; in general, AMD's Radeon department has been plagued with ineptitude as of late (dGPUs that serve as more than just a display output -looking at you, GT 1030- should have x16, period), unlike NV (aka Ngreedia) or Intel (aka pay-extra-for-OC), which usually exhibit extreme avarice (in different flavors) but seldom outright ineptitude (RKL notwithstanding)
AMD has been plagued with "the stupid" since their GPU division was ATi. Remember them leaving GPU driver development to OEMs with Ryzen APUs? The driver bug that cost GCN significant performance due to VRAM allocation? Evergreen being good, so AMD rebranded it, only to get blindsided by Thermi 2.0: Actually Good Edition, then abandoning Evergreen years before NVIDIA dropped Fermi? The atrocity that was the 2900 XT. The 5600 XT debacle. The 5500 XT x8 debacle. The hot, loud, late, expensive Vega. Rebrandeon. Frame time pacing. Fury X. Clock rates dropping. Black screens. Flickering. The list goes on.

Polaris and the 5700xt were flukes.
 
This RX 6500 XT sea of shit has a simple solution....................
...........an RX 6505 XT: add an 8 GB model, encode capabilities, a real PCIe interface, and a lower price

:)
 
This is a disappointing release. I had expected it to be as fast as an RX 580 at 1080p on average. However, in some games, such as F1 2021 and Watch Dogs: Legion, it falls behind even the RX 570. I think the 4 lanes of PCIe are a bigger deal than the 4 GB of VRAM. Also, the performance relative to the RX 6600 is all over the place.


Game | Release Year | RX 570 | RX 6500 XT | RX 6600 | 6500XT/6600 | 6500XT/RX 570
DOOM Eternal | 2020 | 72.5 | 75.1 | 181.1 | 41% | 104%
Watch Dogs: Legion | 2020 | 28.2 | 27.2 | 60.4 | 45% | 96%
F1 2021 | 2021 | 96.5 | 86.7 | 181.9 | 48% | 90%
CyberPunk 2077 | 2020 | 24.7 | 26.5 | 52.1 | 51% | 107%
Deathloop | 2021 | 38.9 | 40.8 | 78.7 | 52% | 105%
Control | 2019 | 35.5 | 39.6 | 75.1 | 53% | 112%
The Witcher 3 | 2015 | 54.3 | 64.1 | 106.2 | 60% | 118%
Far Cry 6 | 2021 | 45.3 | 58.3 | 87.9 | 66% | 129%
Far Cry 5 | 2018 | 64.3 | 82.6 | 117.3 | 70% | 128%
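The ratio columns can be reproduced from the raw FPS figures; a quick sketch using a few of the rows above (percentages rounded to whole numbers):

```python
# Average FPS from the table above: (RX 570, RX 6500 XT, RX 6600)
results = {
    "DOOM Eternal":       (72.5, 75.1, 181.1),
    "Watch Dogs: Legion": (28.2, 27.2, 60.4),
    "F1 2021":            (96.5, 86.7, 181.9),
}

for game, (rx570, rx6500xt, rx6600) in results.items():
    vs_6600 = round(100 * rx6500xt / rx6600)  # 6500 XT as % of the RX 6600
    vs_570 = round(100 * rx6500xt / rx570)    # 6500 XT as % of the RX 570
    print(f"{game}: {vs_6600}% of a 6600, {vs_570}% of a 570")
```

Running this gives 41%/104% for DOOM Eternal, 45%/96% for Watch Dogs: Legion, and 48%/90% for F1 2021, matching the table.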
 
To those saying this is a laptop GPU remodeled for desktop: in laptops, the RX 6500M will have to send a frame buffer back to the CPU 30+ times a second, which will further worsen the already very limited bus bandwidth situation. This is bad for laptops too.
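Some back-of-the-envelope numbers for that frame-buffer readback, assuming an uncompressed 1080p RGBA frame and roughly 3.9 GB/s of usable PCIe 3.0 x4 bandwidth (both figures are assumptions for illustration):

```python
# Rough cost of shipping rendered frames back over the bus.
# Assumptions: 1920x1080 RGBA at 4 bytes/pixel, PCIe 3.0 x4 ~= 3.9 GB/s usable.
frame_bytes = 1920 * 1080 * 4               # ~8.3 MB per uncompressed frame
fps = 60
readback_gb_s = frame_bytes * fps / 1e9     # ~0.5 GB/s of bus traffic
pcie3_x4_gb_s = 3.9
share = readback_gb_s / pcie3_x4_gb_s       # fraction of the link consumed
print(f"{readback_gb_s:.2f} GB/s readback = {share:.0%} of a PCIe 3.0 x4 link")
```

Under those assumptions the readback alone eats roughly an eighth of the x4 link, on top of all normal asset traffic, which is why the narrow link hurts more in a laptop scenario.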
 
Here's what gets me: this 6500 XT is often slower than a 570, though not always by much. By extension it's MAYBE 5-10% faster than an RX 560.

The RX 560 was a 75 watt card from 2017. I have one. How on earth, with that clock speed, a 6nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster than a 5-year-old 14nm card with 5 Gbps GDDR5 memory? I mean, jesus, there are wastes of space and then there is actual TALENT at making something utterly horrible.
I think you're forgetting that the RX 570 is much faster than the RX 560. This is from TechPowerup's review of the Sapphire RX 570 Pulse:

(relative performance chart from the RX 570 Pulse review)


The RX 560 is comparable to the GTX 1050. This means that the RX 570 is nearly twice as fast.

What a turd. The 3050 is going to wipe the floor with this, and we know how that performs already because of the same silicon already being in the laptop market for several months.

Unfortunately this bodes extremely badly for RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.

The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.

I was expecting the 3050 to dominate this even before the review. I suspect AMD expects that the 3050 will be much more expensive due to miners so they didn't price the 6500 XT according to its performance.
 
I'm seriously wondering how this happened. 1650 Super is 12nm, Turing. Capped at 100W. This card uses about the same power, about the same performance, probably similar real-world price. But this is on a newer architecture, smaller node. AMD bungled this horribly, whoever within RTG thought this was a good idea should be given permanent unpaid leave imo. This is technological regression.

And handicapped overclocking on top of that? The memory bus and PCIe bandwidth hold it back anyway, I honestly think this could run at 6GHz core and still not even improve more than 5%. Capping the sliders is a slap in the face especially when BIOS modding is near impossible on Navi afaik.
 
I'm seriously wondering how this happened. 1650 Super is 12nm, Turing. Capped at 100W. This card uses about the same power, about the same performance, probably similar real-world price. But this is on a newer architecture, smaller node. AMD bungled this horribly, whoever within RTG thought this was a good idea should be given permanent unpaid leave imo. This is technological regression.
They clocked it way beyond its sweet spot to match the RX 580 and 1650 Super.

Oh and in Canada, it's priced even more absurdly. This was the cheapest model sold by Newegg. That is equivalent to 240 USD before shipping.

(Newegg Canada listing screenshot)
 
W1zz, I may have missed it in your extensive review, but did you confirm which hardware encode/decode video codecs the 6500 XT actually supports?

And thanks for the rx570/580. :-)
 
As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
- sub 90W power consumption
- same encoding/decoding options as previous gen
- 6 GB VRAM
- 96-bit bus with 16 MB Infinity Cache
- PCI-E 8x
- sub $200 MSRP (that is basically the only thing that was delivered)

The thing that released is more of an RX 550 successor and it should really be named RX 6300 XT (also, the efficiency is horrible). I guess I'm waiting for next-gen and/or hoping Navi 23 gets a cut down version ...
100% this. Had they used a 96-bit bus and a few more CUs, they could have run much lower clocks while still getting better performance (1660 Ti level) and MUCH better efficiency. Not to mention 6 GB of VRAM, which would probably avoid the PCIe bottleneck even if they still needed to use 4 PCIe lanes.

This was a terrible debut for the N6 node, as the chip is clocked at insane levels, making it appear inefficient.

This RX 6500 XT sea of shit has a simple solution....................
...........an RX 6505 XT: add an 8 GB model, encode capabilities, a real PCIe interface, and a lower price

:)
That's not happening. Using a pathetic 64 bit bus eliminated any chance of an 8GB version.
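The capacity math behind that claim, assuming the usual layout of one 32-bit GDDR6 device per channel and today's common 16 Gb (2 GB) chip density, with no clamshell mode (all of these are assumptions for illustration):

```python
# Maximum GDDR6 VRAM from bus width.
# Assumptions: one 32-bit chip per channel, 2 GB (16 Gb) per chip, no clamshell.
def max_vram_gb(bus_width_bits, gb_per_chip=2):
    chips = bus_width_bits // 32   # one memory chip per 32-bit channel
    return chips * gb_per_chip

print(max_vram_gb(64))    # RX 6500 XT's 64-bit bus: 2 chips -> 4 GB
print(max_vram_gb(128))   # a 128-bit bus (e.g. RX 6600): 4 chips -> 8 GB
```

Clamshell mode (two chips sharing a channel) could in principle double that, but it adds cost and is rarely used on entry-level cards, so 4 GB is the practical ceiling on a 64-bit bus.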
 
I'm seriously wondering how this happened. 1650 Super is 12nm, Turing. Capped at 100W. This card uses about the same power, about the same performance, probably similar real-world price. But this is on a newer architecture, smaller node. AMD bungled this horribly, whoever within RTG thought this was a good idea should be given permanent unpaid leave imo. This is technological regression.

And handicapped overclocking on top of that? The memory bus and PCIe bandwidth hold it back anyway, I honestly think this could run at 6GHz core and still not even improve more than 5%. Capping the sliders is a slap in the face especially when BIOS modding is near impossible on Navi afaik.
The RX 6600 has the best performance per watt, and the RX 6500 XT looks so stupid. I expected more with such a high TDP, but the VRAM and the PCIe made this card terrible.
(performance-per-watt chart)
 
It's understandable that AMD fans are mad, but from a business standpoint they had no choice but to launch this GPU with limited RAM and bandwidth.

It's a difficult time for manufacturers. AMD had to apply these limitations to keep the price low. Honestly though, it's not that bad. Obviously it's not meant for anything higher than 1080p.
Now, if we compare this one with the RX 580 or even the RX 590, the performance is actually impressive, even with half of their shaders.

I believe AMD is doing a good job during this entire chip shortage, and we will see a strong comeback, maybe in 2H 2022 or the beginning of 2023!
 
As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
- sub 90W power consumption
- same encoding/decoding options as previous gen
- 6 GB VRAM
- 96-bit bus with 16 MB Infinity Cache
- PCI-E 8x
- sub $200 MSRP (that is basically the only thing that was delivered)

The thing that released is more of an RX 550 successor and it should really be named RX 6300 XT (also, the efficiency is horrible). I guess I'm waiting for next-gen and/or hoping Navi 23 gets a cut down version ...
IMHO, anything less than a 128-bit memory bus on today's graphics cards is silly; 128-bit should be the minimum nowadays. GDDR6 is fast, let it breathe even if the GPU can't handle it. A 64-bit memory bus on a GPU is like a 32-bit OS. Yeah, it works, but it's 2022. Why?
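For context on "let it breathe": peak GDDR6 bandwidth scales linearly with bus width, so the comparison is easy to sketch (the 96-bit configuration is the hypothetical one people upthread are asking for):

```python
# Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbps) / 8
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(64, 18))    # RX 6500 XT, 64-bit @ 18 Gbps: 144 GB/s
print(bandwidth_gb_s(96, 18))    # wished-for 96-bit @ 18 Gbps:  216 GB/s
print(bandwidth_gb_s(128, 14))   # RX 5500 XT, 128-bit @ 14 Gbps: 224 GB/s
```

Even with its faster 18 Gbps memory, the 64-bit bus leaves the 6500 XT well short of the previous-gen 5500 XT's raw bandwidth; Infinity Cache has to make up the difference.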
 
This is rubbish beyond words. Well, the word "scam" comes to mind.
 