
ASUS Radeon RX 9060 XT Prime OC 16 GB

To me, the fact these results just happen to show AMD in a bad light
This is where we disagree. I think the testing shown in this review clearly indicates AMD has come to play and is making great progress. I mean, their upper budget tier card is on par with my top-tier NVidia card (3080) from a few years ago in raytracing. That's impressive, to say the least. Then there's the fact that the 9060XT is on par with the 5060ti overall.

This is good!

You can just ignore games you are not interested in
That's what I do. In a review, I jump to the games I want to see the results for.
 
This is where we disagree. I think the testing shown in this review clearly indicates AMD has come to play and is making great progress. I mean, their upper budget tier card is on par with my top-tier NVidia card (3080) from a few years ago in raytracing. That's impressive, to say the least. Then there's the fact that the 9060XT is on par with the 5060ti overall.
I'm referring to the testing results from the two games they want to exclude, not the whole review in general. I should have made that clearer, but I was counting on the context of the discussion.

Added "two" to before the results to clarify.
 
For prices to even make sense today, they should have been
RTX 5060 at $199
RX 9060XT 8GB at $229
RTX 5060 Ti 8GB at $249
RX 9060XT 16GB at $279
RTX 5060 Ti 16GB at $299

The performance of these cards is too low to have a price over $300. And the performance difference between these cards is so minimal that the most expensive product shouldn't be over $100 more than the cheapest one. The difference in many cases resembles what we were getting from overclocking alone 10-15 years ago.

That being said, let's hope that AMD and Nvidia will copy Intel and have 160-bit and 192-bit cards for the low-end market. Or 80-bit and 96-bit cards with very fast VRAM, if going with 160-bit and 192-bit data buses makes these cards... expensive to make. Because if they remain at 128-bit, we might get more 8GB models in 2 years.
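For reference, a rough sketch of how bus width and memory speed multiply out into bandwidth; the bus/VRAM-speed pairings below are illustrative assumptions, not the exact specs of any of these cards:

```python
# Peak theoretical GPU memory bandwidth: bus width (bits) / 8 * effective data rate (Gbps) = GB/s.
# The configurations below are illustrative assumptions, not actual card specs.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "128-bit @ 20 Gbps GDDR6": (128, 20.0),
    "160-bit @ 20 Gbps GDDR6": (160, 20.0),
    "192-bit @ 20 Gbps GDDR6": (192, 20.0),
    "96-bit  @ 28 Gbps GDDR7": (96, 28.0),
    "80-bit  @ 32 Gbps GDDR7": (80, 32.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {mem_bandwidth_gbs(width, rate):.0f} GB/s")
```

A narrow 80-bit or 96-bit bus with fast enough GDDR7 lands in the same ~320 GB/s ballpark as 128-bit GDDR6, which is the point above.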
 
I'm referring to the testing results from the two games they want to exclude, not the whole review in general. I should have made that clearer, but I was counting on the context of the discussion.
Ah, gotcha.

For prices to even make sense today, they should have been
RTX 5060 at $199
RX 9060XT 8GB at $229
RTX 5060 Ti 8GB at $249
RX 9060XT 16GB at $279
RTX 5060 Ti 16GB at $299
Wouldn't THAT be nice? The real world sucks sometimes..
 
You can just ignore games you are not interested in, my man. There are lots of games in the review I really don't care about. E.g. Cyberpunk without RT is absolutely useless to me; I'm never going to play that game without PT on, but I'm not asking for its removal. That would be asinine.
It isn't about ignoring the games I'm not interested in, it's about bias against the card because someone wants their favorite game to have 400fps at 800x600, and PT is a whole other unrelated issue.
I looked at the averages and didn't cherry-pick any game. On average it's barely beating a 5-year-old 3070. Saying "so does the 5060ti" proves my point: they are both garbage.
I'm talking about the 1080p averages; if CS2 and Elden Ring weren't so heavily weighted against the card, it would be closer to the 5060Ti 16GB.
I wouldn't buy either of these cards for what they're retailing at, but it's interesting to see you and others bashing this card while you and others defended the 5060Ti; your reactions here, especially on price to performance, have proven my point.
It's $374 for the 8GB 5060ti.
So you'd rather pay the same for an 8GB as for a 16GB card? Didn't you say AMD was upcharging for VRAM? :laugh:
 
At its MSRP of $350 it's a reasonable GPU. It's not exactly an exciting GPU, but at least it's fairly close in value per dollar to the 580 if it actually sells at MSRP or close enough to it. It's somewhat mediocre, but that's the current GPU environment we're stuck in. It looks like it could use a bit more compute to go with the VRAM, ideally, since it starts to trail behind a bit at 4K, or maybe it would've benefited from a slightly wider bus; I didn't check the memory bandwidth. There are certainly worse options overall on value. We'll have to see if the MSRP is even legitimate, or no better than the other phony MSRPs that you might see 2 samples of for sale at launch and a handful at EOL, before the next generation launches, to clear out inventory rotting on shelves.
 
Wonder why they didn't want to remove Doom as well, since it's also a huge outlier. Do you have any idea why they are fine with Doom but not the other outliers?

[attached chart: doom-eternal-1920-1080.png]

On top of that, it isn't very hard to take the 5-6 games you care about the most and average them out across the 2-3 cards you are thinking about purchasing.
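As a trivial sketch of what that looks like (the cards, games and FPS numbers below are placeholders, not figures from this review):

```python
# Average FPS over a hand-picked subset of games for a few candidate cards.
# All numbers here are made-up placeholders, not actual review data.
fps = {
    "Card A 16GB": {"Cyberpunk 2077": 62, "Elden Ring": 58, "Doom Eternal": 210, "Starfield": 55},
    "Card B 16GB": {"Cyberpunk 2077": 66, "Elden Ring": 61, "Doom Eternal": 225, "Starfield": 57},
    "Card C 8GB":  {"Cyberpunk 2077": 55, "Elden Ring": 50, "Doom Eternal": 180, "Starfield": 47},
}

my_games = ["Cyberpunk 2077", "Elden Ring", "Starfield"]  # skip the games you don't care about

for card, results in fps.items():
    avg = sum(results[game] for game in my_games) / len(my_games)
    print(f"{card}: {avg:.1f} average FPS across my games")
```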

Being so fixated on where something sits on a chart that you want consumers to have less data is ridiculous.

The two main reasons to buy this card are the 16GB and that it's $30 cheaper than the 5060ti 8GB. Removing a few games wouldn't change anything: it's slightly worse than the 5060ti and slightly better than the 5060. Now it just comes down to real-world pricing.

I don't like either product, but they both have pros and cons, and which option is superior for a given person will depend on how they want to game.
 
No plans for Oblivion, no plans for COD. DOOM and Clair Obscur will be added in the next rebench, Spiderman 2 is already included
Makes total sense. Oblivion is an oddball game with UE5 visuals wrapped around the base game engine, and there's plenty of UE5 representation already. And with COD it would be almost impossible to get a consistent benchmark sequence outside of the campaigns (which don't represent normal gameplay).
 
It isn't about ignoring the games I'm not interested in, it's about bias against the card because someone wants their favorite game to have 400fps at 800x600, and PT is a whole other unrelated issue.
But what bias, man? Nobody used CS2 as a basis to hate this card. In fact, nobody but YOU brought up CS2. Nobody. You are the first to mention the game.

I'm talking about the 1080p averages; if CS2 and Elden Ring weren't so heavily weighted against the card, it would be closer to the 5060Ti 16GB.
I wouldn't buy either of these cards for what they're retailing at, but it's interesting to see you and others bashing this card while you and others defended the 5060Ti; your reactions here, especially on price to performance, have proven my point.
My first post was that this is the card to get, so how am I bashing this card? But the point is, if I were to use your standards, this card would be absolute garbage. I'm judging based on current market conditions, in which case it's passable. But then AMD having a passable card and daring to call Nvidia greedy is just delusional.
So you'd rather pay the same for an 8GB as for a 16GB card? Didn't you say AMD was upcharging for VRAM? :laugh:
I'm not buying either because they are both abhorrent. I have a 3060ti and neither of these cards is anywhere near a decent replacement. They are both garbage. BUT, in case someone is in need of a card (say my 3060ti breaks), I'm buying the 9060xt 16gb, assuming I find it at its MSRP. Happy?

Being so fixated on where something sits on a chart that you want consumers to have less data is ridiculous.
To play devil's advocate: if they weren't fixated on a specific company but actually looked at this objectively, the argument would be "remove CS2 because every card can play it and replace it with a different game." Which is actually my opinion; I think all cards can run CS2 plenty fast, so just replace it.
 
For prices to even make sense today, they should have been
RTX 5060 at $199
RX 9060XT 8GB at $229
RTX 5060 Ti 8GB at $249
RX 9060XT 16GB at $279
RTX 5060 Ti 16GB at $299
That's something that has been bugging me for a little while.

I agree that they should be cheaper, but then I ask: maybe they are at the correct price, given how much the dollar and other currencies have devalued and how we just had an insane jump in inflation?

So that's my conundrum.

Then again, historically, new hardware offered more, and many times it launched at the same MSRP as the previous generation but would then quickly fall in price.

But that is not something that we have experienced much in recent times either.
 
I'm not buying either because they are both abhorrent. I have a 3060ti and neither of these cards is anywhere near a decent replacement. They are both garbage. BUT, in case someone is in need of a card (say my 3060ti breaks), I'm buying the 9060xt 16gb, assuming I find it at its MSRP. Happy?

I honestly think it should be investigated how these meh products end up looking so similar.

It's almost like AMD/Nvidia got together 12-24 months ago and were like, "where are we sticking the 60 class this generation?" smh.

Same with the 70 class lmao.
 
For prices to even make sense today, they should have been
RTX 5060 at $199
RX 9060XT 8GB at $229
RTX 5060 Ti 8GB at $249
RX 9060XT 16GB at $279
RTX 5060 Ti 16GB at $299

The performance of these cards is too low to have a price over $300. And the performance difference between these cards is so minimal that the most expensive product shouldn't be over $100 more than the cheapest one. The difference in many cases resembles what we were getting from overclocking alone 10-15 years ago.

That being said, let's hope that AMD and Nvidia will copy Intel and have 160-bit and 192-bit cards for the low-end market. Or 80-bit and 96-bit cards with very fast VRAM, if going with 160-bit and 192-bit data buses makes these cards... expensive to make. Because if they remain at 128-bit, we might get more 8GB models in 2 years.
For these prices to be lower, or where they should be, Nvidia would have to stop being stagnant on VRAM and actually give a crap about the gaming market, and stop overcharging for their cards while gamers get the scraps from their massive AI dies. Before anyone says stop blaming Nvidia: they're the market leader, and they're the one who would have to take charge and lower prices.
As for 80-bit and 96-bit cards, I don't want Nvidia and AMD to nerf GPU bandwidth even more; cutting corners on things such as PCIe 5.0 x8 instead of x16 already causes a performance drop on older motherboards.
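To put rough numbers on that link-width point (the per-lane figures are the commonly cited approximations after encoding overhead, used here as assumptions):

```python
# Approximate usable PCIe bandwidth per lane in GB/s, after 128b/130b encoding overhead.
# These are the commonly cited ballpark figures, not measured numbers.
PER_LANE_GBS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969, "PCIe 5.0": 3.938}

for gen, per_lane in PER_LANE_GBS.items():
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

# An x8 card dropped into a PCIe 3.0 slot gets ~7.9 GB/s: a quarter of what the same
# card sees in a PCIe 5.0 x8 slot, and an eighth of a full PCIe 5.0 x16 link, which is
# where the performance drop on older boards comes from.
```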
 
For prices to even make sense today, they should have been
RTX 5060 at $199
RX 9060XT 8GB at $229
RTX 5060 Ti 8GB at $249
RX 9060XT 16GB at $279
RTX 5060 Ti 16GB at $299
If we exclude inflation and all that, and base it on 2020 Ampere:

5060 = $129
9060xt 8gb = $149
5060ti 8gb = $169 (nvidia tax)
9060xt 16gb = $199
5060ti 16gb = $229 (nvidia tax)

We have to remember, all of these cards are slower than or barely faster than a 5-year-old 3070 :roll:

I honestly think it should be investigated how these meh products end up looking so similar.
If I didn't know any better I'd think they are colluding :D
 
I agree that they should be cheaper, but then I ask: maybe they are at the correct price, given how much the dollar and other currencies have devalued and how we just had an insane jump in inflation?
Correct, inflation is the issue. In fact, it's personally surprising to me that the $300 bracket even exists anymore and hasn't been completely replaced by APUs, and that these cards offer improvements over the previous generation while staying at the same price.

Remember, your money continuously devalues over time.

There's a lot of wishful thinking about pricing, and obviously vendors are obliged to earn as much profit as they can, but the low-end cards have been remarkably consistent in pricing over the years, despite the expense of modern nodes, inflation, and other issues that affect price. People like to point at the cost of "muh VRAM" chips and pretend that's the only reasonable metric to examine pricing against, but also seem to enjoy ignoring literally every other economic factor.

Just about every product, let alone luxury products (and make no mistake, dedicated GPUs and PC gaming in general are a luxury hobby), has risen in price over the past few years: smartphones, cloud subscriptions, the actual games we use these cards to play, etc. But somehow GPU performance jumps for the same pricing need to remain as good as they were in the golden years? One metric has to change: the performance jumps, or the pricing. You don't get to have the best of both worlds, and these companies aren't charities, nor are consumers forced to buy their products.

I honestly think it should be investigated how these meh products end up looking so similar.

It's almost like AMD/Nvidia got together 12-24 months ago and were like, "where are we sticking the 60 class this generation?" smh.

Same with the 70 class lmao.
They exist in the same economic reality, and operate under the same economic rules, besides some economies of scale differences. Hence, similar outcomes.
 
A 50% performance gap between the 9070 and 9060XT is unacceptable; this card should have been born with at least 40 CUs. The price is good, but AMD is leaving a whole range uncovered, the 5070's, and it's a range where many cards are sold... Really inconceivable.

I only checked two game titles that are, from my point of view, not NVIDIA "sponsored".

The Radeon 9060XT is a weaker release of the previous Radeon 7700XT.

I'm kinda happy AMD did not totally ruin the current value of my Radeon 7800XT. A faster card would have caused a price drop for the 7800XT and similar cards: the weaker 6800 non-XT, the equal 6800XT, the better 69xx XT.

The 9060XT was just made as a placeholder for the 7700XT. AMD maybe wanted a 1:1 replacement.

16GB will future-proof you...

16GiB of VRAM is not really future-proof, especially with 2026 coming soon. I'll not write it on every review, but I still think it's time for a 24GiB VRAM baseline for calling a card future-proof. I like that some sponsored media even pick up that topic.
16GiB of VRAM was maybe future-proof in 2022.

I don't know if it's accidental or a lucky "cocktail" of games, but the average fps of the 6800XT being 2% lower than the 7800XT in 2025 feels a bit odd.

I also saw this. I bought the 7800XT and sold my Radeon 6800 non-XT. The 6800XT was always around 5-10 percent faster in the past.

--

NVIDIA fanboys, enjoy your 5 minutes of happiness. One bad Windows game crashed, one with very bad game mechanics.
Star Wars Jedi: Survivor crashed regularly with the 7800XT when I did the same for months. This is a Windows 11 Pro issue, a driver issue, or a game issue. Windows software and drivers do not have high quality. Whatever, that has been the case since Windows 95 as far as I remember.

edit:

Handicapping a modern processor to show the disadvantage of PCIe 3 is just silly.

No. Many will buy such entry-level graphics cards with AMD mainboards which only have PCIe 3.0.
E.g. B450 mainboard = PCIe 3.0, but you can use a decent processor and DRAM with such a card.
B550 = PCIe 4.0

--

Such graphics cards should not be tested with a high-bandwidth bus in the first place. You will not see the VRAM limitations in some games.

Just because Mafia 3 runs flawlessly does not mean every game title will run flawlessly. The test setup should have games and settings that show the limitations early, before purchase. Therefore I think PCIe 3.0 should be used for any test of a graphics card which is equal to or worse than a Radeon 7800XT, i.e. a low- to middle-class graphics card. (I wrote middle class so some will not be upset.)

edit

I think there is something going on with the relative performance charts when the 9060XT performs as well

I also dislike the relative performance chart. I think I wrote something similar ages ago.

The relative chart, the price-per-frame chart and suchlike are metrics which I try to ignore now.

I have to accept opinions, regardless of whether I like them or not.

Ages ago the tester had a topic of his own where everyone was invited to state their case on which games to remove or add. Please participate in such topics.

--

edit

whataboutism

300 euro / US dollars / beaver bucks is not enough money to make a graphics card

Pick any cheap processor like the Ryzen 7400F / 7500F / 7600 / 7600X or the even cheaper Ryzen 8000 desktop CPUs.
Add some glue logic and a PCB.

The price should be not more than €170. No one can argue with me that a desktop processor is not similar to a graphics card core. I do not see any expensive parts on a graphics card PCB. The PCB may cost some cash; let's be generous and make it €200 including tax.

I think I saw cheap AM5 processors for around 50 or 70 euros for many weeks.

edit: changed the text a bit
 
That's something that has been bugging me for a little while.

I agree that they should be cheaper, but then I ask: maybe they are at the correct price, given how much the dollar and other currencies have devalued and how we just had an insane jump in inflation?

So that's my conundrum.

Then again, historically, new hardware offered more, and many times it launched at the same MSRP as the previous generation but would then quickly fall in price.

But that is not something that we have experienced much in recent times either.

That's definitely part of it; the 5060 is $225 in 2016 money and $239 in 2019 money, and the last two generations the 60 class was pretty decent.
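The back-of-the-envelope version of that conversion, with rough cumulative US inflation factors plugged in as assumptions:

```python
# Deflate a current MSRP into older dollars using rough cumulative US inflation factors.
# The factors below (~33% since 2016, ~25% since 2019) are ballpark assumptions, not official CPI data.
CUMULATIVE_INFLATION = {2016: 1.33, 2019: 1.25}

msrp_today = 299  # RTX 5060 MSRP in current dollars

for base_year, factor in CUMULATIVE_INFLATION.items():
    print(f"${msrp_today} today is roughly ${msrp_today / factor:.0f} in {base_year} dollars")
```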

I still think the 60 class from both sides should be a much better product, but I also don't care if that means 10-20% higher MSRPs.

At the end of the day, this isn't a charity; they are going to maximize margins as much as they can. Anyone who thinks either company cares about anything besides separating $$$ from our wallets is delusional; they will always take the most profitable route forward.
 
They exist in the same economic reality, and operate under the same economic rules, besides some economies of scale differences. Hence, similar outcomes.

I get that, and I was mostly joking. I just find it mildly funny how two companies competing with each other for our dollars come up with very similar products tier for tier.

I think many people simply don't understand how bad it is.

On top of that, both of these companies are not going to sell us silicon for pennies on the dollar when they can sell it for 10x-20x to the professional market.

There is only one company making silicon good enough for both of these companies, and they are also maximizing their profits; they don't care if the silicon goes into a $300 GPU or a $40,000 one, they still charge the same.
 
I'm impressed. The card appears to match its positioning perfectly.

Approx. RTX 3070 ~ 7700XT in Raster
and
Better than 6900XT ~ 4060 Ti in RT

Price-to-performance at $350-360 is fantastic, but too much over MSRP and it starts to look less attractive.
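A quick way to see how street pricing erodes that value; the average-FPS figure here is a placeholder, not a number from the review:

```python
# Dollars per average frame at various street prices; the FPS value is a placeholder.
avg_fps_1080p = 100.0  # hypothetical 1080p average for this card

for price in (350, 360, 400, 450):
    print(f"${price}: {price / avg_fps_1080p:.2f} $/frame")

# At this performance level every $50 over MSRP adds ~$0.50 per frame,
# which is why the value case weakens quickly above the ~$360 mark.
```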
 
There is only one company making silicon good enough for both of these companies, and they are also maximizing their profits; they don't care if the silicon goes into a $300 GPU or a $40,000 one, they still charge the same.
TSMC's near monopoly on state-of-the-art silicon is heavily skewing tech prices up. Which makes me wonder if products in the low end could be backported to older nodes or if it would not be worth the engineering expenditure, even if it could allow moving more production through the channel.
 
About the same price I paid for my 6800, at the same performance but with better RT, etc. Hopefully e-tailers sell at MSRP…

@W1zzard, would it be possible to add the vanilla RX 6800 to the charts?
 
TSMC's near monopoly on state-of-the-art silicon is heavily skewing tech prices up. Which makes me wonder if products in the low end could be backported to older nodes or if it would not be worth the engineering expenditure, even if it could allow moving more production through the channel.

I've thought about that as well, but their last architecture that used a different node was the 30 series. Could they make a 3070ti 16GB-like product for less than a 5060ti 16GB? I'm not so sure, and it wouldn't have the technology appeal of the newer architecture.

I think the cost of designing an architecture specifically for N5/N6 would probably not save enough vs. just using the same node for all products. I also wouldn't be surprised if they get discounts for not using other manufacturers.

I don't think backporting is an option; look at Rocket Lake as the most recent example of how that goes.
 
I've thought about that as well, but their last architecture that used a different node was the 30 series. Could they make a 3070ti 16GB-like product for less than a 5060ti 16GB? I'm not so sure, and it wouldn't have the technology appeal of the newer architecture.

I think the cost of designing an architecture specifically for N5/N6 would probably not save enough vs. just using the same node for all products. I also wouldn't be surprised if they get discounts for not using other manufacturers.

I don't think backporting is an option; look at Rocket Lake as the most recent example of how that goes.
Maybe backporting was not the word I truly meant to use, but you got my idea: designing the lesser products from the ground up with older nodes in mind.
 
TSMC's near monopoly on state-of-the-art silicon is heavily skewing tech prices up.
TSMC isn't even the actual monopoly. They're just the best foundry, so they're always a few years ahead of Samsung and Intel with smaller nodes, and their quality and yields are better. ASML is the actual monopoly; they're the ONLY manufacturer of the photolithography machines for small nodes.
 
TSMC's near monopoly on state-of-the-art silicon is heavily skewing tech prices up. Which makes me wonder if products in the low end could be backported to older nodes or if it would not be worth the engineering expenditure, even if it could allow moving more production through the channel.
Maybe not now, but I do expect to see some 'choice' silicon from recent times and/or the near future end up that way.
Maybe backporting was not the word I truly meant to use, but you got my idea: designing the lesser products from the ground up with older nodes in mind.
Esp. if nVidia wants to exit the consumer graphics market, I could see nV 'backporting' their last gen of AI/MI GPGPUs onto a less 'congested' production node.
AI/MI tools internal to nVidia would probably help make short work of an otherwise very complicated process. As long as a 'sufficient' node had ample production space, I could see it happening.
 