
ASUS Radeon RX 6500 XT TUF Gaming

IMO the bad energy efficiency is caused by clocks that are too high. I don't care much about gaming performance with this kind of card. I'd rather see more video outputs and AV1 codec support.
 
Continuing with AMD scam day

Another piece of trash... for 230 USD.

And let me guess: they compare this scam of an incomplete GPU with the T600, which has encode capabilities and a full PCIe 3.0 x16 interface.


:)

Let's not bring Radeon Pro into this discussion; workstation cards are always more expensive. Not saying that it's great: even for AMD, which usually doesn't segment the two markets nearly as aggressively as NVIDIA, it's still bad, but just having that blue workstation colour makes a card more expensive.

What a turd. The 3050 is going to wipe the floor with this, and we already know how that performs, because the same silicon has been in the laptop market for several months.

Unfortunately this bodes extremely badly for RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.

The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.

This card is clocked extremely high; it doesn't correlate with how the APUs will perform. I'd think it actually bodes pretty well for the APUs, given they have similar compute unit counts.

It's understandable that AMD fans are mad, but from a business standpoint they had no choice but to launch this GPU with limited RAM and bandwidth.

It's a difficult time for manufacturers. AMD had to make these limitations to keep the price low. Honestly though, it's not that bad. Obviously it's not meant for anything higher than 1080p.
Now, if we compare this one with the RX 580 or even the RX 590, the performance is actually impressive with only half of their shaders.

I believe that AMD is doing a good job during this entire chip shortage thing, and we will see a strong comeback, maybe in the second half of 2022 or the beginning of 2023!

They had a choice. Bringing a new product to market costs an immense amount of money just in setting up production and certifications, along with all the design and engineering that went into it. They could have just re-branded the 5500 XT as a 6500 XT like they did in the past and saved all the engineering and certification costs, and maybe put some of that dough towards absorbing any supply chain hurdles. They could even have raised its initial price point from $169 to $199 because of inflation on everything; it would have been much better for everyone involved (including us!)
 
Scalpers who have bought the cards up will most likely feel sorry about their "investment".

I sure hope no one will touch those eBay listings, and ultimately the cards will be returned and/or sold at MSRP, which is still too high for this product. Considering its price and features, it should have been called RX 6400 XT or even RX 6300 XT and sold at $130 at most.

@W1zzard

Is it too much to ask to

1) Remove all the cards above RTX 3060 Ti (there's just no point showing them)
2) Add performance in PCI-E 3.0 mode

in the charts at https://www.techpowerup.com/review/asus-radeon-rx-6500-xt-tuf-gaming/31.html

In the future no one will check the PCIe scaling article, but these graphs will surely be cited over and over again, and PCIe 3.0 performance is crucially missing from them.

Thank you
 
My summary charts always cover the whole market. The individual game tests are usually cut off at 75% and 125%, but that's kinda difficult with the 6500 XT being so slow, and people want to see how it does in relation to other interesting cards. I found the deltas to Vega 64 quite surprising.

Still, PCIe 3.0 perf would be nice to see. The charts are already insanely long, so adding yet another line won't hurt. The vast majority of people considering this card don't have PCIe 4.0 motherboards.
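For anyone wondering just how much the x4 link gives up on an older board, here's a rough back-of-the-envelope sketch in Python (the per-lane rates are the standard PCIe figures; real-world throughput is a bit lower after protocol overhead):

```python
# Rough one-way PCIe link bandwidth: GT/s per lane x lanes x 128b/130b
# encoding efficiency, converted from gigatransfers to gigabytes.

def link_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    """Approximate usable one-way bandwidth in GB/s."""
    return gt_per_s * lanes * (128 / 130) / 8

print(f"PCIe 4.0 x4:  {link_bandwidth_gbs(16.0, 4):.1f} GB/s")   # ~7.9, what the 6500 XT gets on a new board
print(f"PCIe 3.0 x4:  {link_bandwidth_gbs(8.0, 4):.1f} GB/s")    # ~3.9, what it gets on most buyers' boards
print(f"PCIe 3.0 x16: {link_bandwidth_gbs(8.0, 16):.1f} GB/s")   # ~15.8, what a normal x16 card gets
```

With only 4 GB of VRAM, this card spills over the bus more often than most, which is exactly why the PCIe 3.0 numbers matter so much here.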
 
I don't know about you guys, but that 2060 12GB card looks like a fricking steal compared to this.

My disappointment is immeasurable

Price would make or break this, and at 350... I can only hope that the local pricing here is below that of the GTX 1650, or this will rot in stock.
 
Would be okay at MSRP. But let's not dream too much.

Just no! It's a mobile chip on a desktop board, with cut-down bandwidth AND very limited VRAM that hurts on PCIe 4.0 and even more on PCIe 3.0. It lacks a number of features that every low-end card in the last decade supported. AMD should be scolded for releasing a product in 2022 that in some titles gets outperformed by their previously released models (going back to 2017!) in the $200 bracket (RX 590, RX 5500 XT).
 
Wow, just wow. They put every piece of utter trash into a graphics card, and it still generates profit from the bottom of the trash can. These days anything sells like hot cakes, and since Intel can't make GPUs yet, it makes sense that this trend will continue for at least two more years.
 
The RDNA 2 strategy was to use a big cache plus very high frequency to make up for fewer CUs => smaller dies => more money. This has worked to some extent in the higher-end cards, because the Infinity Cache had a decent size of 32 MB or more, but here, 16 MB cannot hide the fact that this GPU is gimped in every single aspect: the bus width is super small, the PCIe lanes are very few, and the CU count is barely higher than an APU's. So the cache and the close-to-3 GHz frequency cannot make up for all of that. I just don't understand how AMD took the decision to launch this as a 6500 XT product when they clearly saw it performing very badly.
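To make that concrete, here's a toy model of the Infinity Cache idea (a minimal sketch; the hit rates below are illustrative guesses, not AMD's published figures):

```python
# Toy "effective bandwidth" model: accesses that hit the on-die cache
# never touch GDDR6, so the memory bus only has to serve the misses.

def raw_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Raw GDDR6 bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

def effective_bandwidth(raw: float, hit_rate: float) -> float:
    """Amplification assuming cache hits are effectively free."""
    return raw / (1.0 - hit_rate)

navi24 = raw_bandwidth_gbs(64, 18.0)    # 6500 XT: 144 GB/s raw
navi21 = raw_bandwidth_gbs(256, 16.0)   # 6900 XT: 512 GB/s raw

# A 16 MB cache holds a much smaller slice of a frame's working set
# than 128 MB does, so assume a far lower hit rate (illustrative only).
print(f"Navi 24: {effective_bandwidth(navi24, 0.30):.0f} GB/s effective")  # ~206
print(f"Navi 21: {effective_bandwidth(navi21, 0.58):.0f} GB/s effective")  # ~1219
```

Even being generous with the small cache, the gap to the big chips stays enormous, which lines up with the strategy falling apart at this die size.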
 
I read somewhere that GDDR6 has become quite expensive ($12 per GB), so an additional 2 or 4 GB of RAM might have been too much (for the price). Even still, while the 64-bit memory bus is somewhat tolerable, having only a PCIe x4 interface is simply unjustifiable. Go f**k yourself, AMD.
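Taking that quoted price at face value (I haven't verified the $12/GB figure), the memory cost argument works out to something like this:

```python
# Back-of-the-envelope BOM math using the $12/GB GDDR6 price quoted
# above (unverified) against the 6500 XT's $199 launch MSRP.

price_per_gb = 12.0  # USD, as quoted in the post above
msrp = 199.0         # USD, 6500 XT launch MSRP

for extra_gb in (2, 4):
    added = extra_gb * price_per_gb
    print(f"+{extra_gb} GB -> +${added:.0f} BOM, {added / msrp:.0%} of MSRP")
# +2 GB -> +$24 BOM, 12% of MSRP
# +4 GB -> +$48 BOM, 24% of MSRP
```

So more memory is a real per-unit cost, whereas the narrow PCIe link mostly saves die area at design time, which is why it feels like the harder choice to defend.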
 
I try not to be this cynical, but this product looks more and more like the answer to the question, "What can we shove out the door quickly with components we have on hand that will make it look like we're trying to serve the low end of the market?" Heck, maybe even laptop manufacturers are wondering why AMD's trying to sell them a chip with a memory bus that can't even handle its own shader capability.
 


In the future no one will check the PCIe scaling article, but these graphs will surely be cited over and over again, and PCIe 3.0 performance is crucially missing from them.

When replying to posts, make sure you've read everything.
 
Yikes, this makes my upgrade from an RX 570 4GB (which I had used since 2018) to a GTX 1070 in August 2021 for ~$400 feel like a good deal.
Especially since I'm still using a PCIe 3.0 mobo.
 
I try not to be this cynical, but this product looks more and more like the answer to the question, "What can we shove out the door quickly with components we have on hand that will make it look like we're trying to serve the low end of the market?"
Not cynical, you've hit the nail on the head.
 
So, it's essentially a somewhat mining-resistant RX 580, as long as you're at 1080p on PCIe 4.0. Otherwise it's an RX 570, and depending on the game, an RX 560...

Slots perfectly well into the current market, doesn't it...?
 
So, it's essentially a somewhat mining-resistant RX 580, as long as you're at 1080p on PCIe 4.0. Otherwise it's an RX 570, and depending on the game, an RX 560...

Slots perfectly well into the current market, doesn't it...?

Anything slots perfectly well into the current market as long as it supports DX11.
 
When replying to posts, make sure you've read everything.

I did read everything. From how you worded the comment I quoted, I didn't understand that you were asking for a comparison directly in the review, which it now looks like you were suggesting. I took your comment to mean you would like to see a PCIe 3.0 comparison, to which I linked the PCIe scaling review.

Depending on how you word your posts, they can be read differently by others from how you want them to come across.
 
Sorry AMD,

Piss poor release.
 
Damn, that performance is a real bummer. It's actually slower than the 5500 XT? That's inexcusable, even for a die that has clearly been shrunk as much as possible. And clearly AMD knew this - they obviously run both software simulations and hardware tests (easily done with a custom BIOS on a test card with a larger die) before committing to the silicon design. This just makes no sense to me.

I was really hoping this would be a return to form for AMD at the $200 mark and a worthy upgrade for the RX 570 in my secondary system, but clearly that isn't what we got.
 