
ASUS Radeon RX 9060 XT Prime OC 16 GB

At the moment, it depends on the physical capabilities of the user. I myself can very much tell the difference between 120Hz and 240Hz in an FPS game, so much so that black frame insertion would be noticeable to me, but for others not so much.

I can agree that 240+ Hz is diminishing returns depending on how well the game handles latency and input lag. Same with 2000Hz+ polling rate on devices.
There are places where they still run the original Doom and even Quake 3 or similar, which we know at this point would push a gazillion FPS on today's hardware.

If we include them here, it would definitely make a mess of the overall averages.

So we go back to the original point: is it fair to include them in the overall averages or not?
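Just to illustrate with made-up numbers (these are hypothetical FPS figures, not anyone's benchmark data): a single ancient engine pushing absurd frame rates dominates a plain FPS average, while a per-game relative comparison barely notices it.

```python
# Hypothetical FPS numbers for two cards -- invented for illustration only.
from statistics import geometric_mean

fps_a = {"ModernGame1": 60, "ModernGame2": 75, "Quake3": 900}
fps_b = {"ModernGame1": 70, "ModernGame2": 80, "Quake3": 700}

# Plain FPS average: the Quake3 result swamps everything else.
print(sum(fps_a.values()) / len(fps_a))   # 345.0
print(sum(fps_b.values()) / len(fps_b))   # ~283.3

# Per-game ratios combined with a geometric mean: the outlier counts
# the same as any other title, so the two cards come out roughly even.
ratios = [fps_b[g] / fps_a[g] for g in fps_a]
print(round(geometric_mean(ratios), 2))   # ~0.99
```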
 
Sure, but how many people are in the position of getting a $350 5800X3D or 5700X3D and a $400 5060 Ti, but can't spend another $80 for a cheap B550 mobo to get PCIe 4.0?
I don't see the point of replacing the motherboard if someone could upgrade an older board to an X3D; $80 is still too much to spend on a sidegrade, IMO.
In some circles it's called AMD Unboxed.
That hasn't been true for years; HUB changed their opinions when Nvidia tried to block them from getting review cards.
 
In some circles it's called AMD Unboxed.
Don't think they are; it's just that anti-Nvidia content generates more traffic (you know, the usual suspects consume everything anti-Nvidia). Some content creators have said it openly: if we don't trash talk Nvidia, we aren't getting clicks!!
 
Personal opinion: I'd rather take the slight performance loss than the financial loss of this sidestep. Case in point: I run a 7900XTX on an X470 board with a 5700X3D.
Which is a totally fair choice. And with a PCIe 4.0 x16 connection and 24GB of VRAM, you're barely losing anything anyway. I'm just saying that if someone is still holding onto a PCIe 3.0 motherboard, they shouldn't be complaining about performance loss on a new PCIe 5.0 GPU. AMD's extended support for AM4 motherboards is the oddity here, not Nvidia.
 
Simply because W1zzard had already said he’d replace it with The Dark Ages. CS2 is just too lightweight to be a meaningful GPU benchmark.
Again, CS2 isn't just in there because it's popular. It's there to provide representation for the Source 2 engine, which is important for many people, especially with Half-Life 3 rumors growing.

Also, sure it's lightweight and runs fast, but with a 9800X3D in the test bench you don't hit a CPU bottleneck even at 1080p. You can clearly see the expected relative performance of the cards, even if the base fps is very high.


It's obvious the CS2 performance is an anomaly; the RTX 50 series had the same thing at launch, and it's fixed now. IIRC the 5090 was below the 4080 Super or something in the initial review. The game just needs a bit of extra love from the driver team. Nobody is saying the 9060 XT is a bad GPU because of this obvious driver issue.

What is unacceptable is how AMD fanboys are crying and whining about it and making absurd arguments about how it doesn't belong in a benchmark suite just because AMD's driver team dropped the ball on press drivers. It shows how any sort of reasonableness and logic goes out the window when your preferred mega corporation gets egg on its face. (And just to address this in advance: yes, Nvidia fanboys do this as well. No, that doesn't have any relevance to this thread.)
 
Don't think they are; it's just that anti-Nvidia content generates more traffic (you know, the usual suspects consume everything anti-Nvidia). Some content creators have said it openly: if we don't trash talk Nvidia, we aren't getting clicks!!

I mean, their job is to get people to watch their content, but I fully believe they genuinely do not like the 5060ti 8G/5060. Neither does DF, who love all of Nvidia's stuff including framegen and lean way more heavily towards Nvidia, to the point that they demo their products pre-release.

No matter what a reviewer chooses to do, their results will always be skewed one way or another. If W1z chose 23 different games, the results might swing a different way, or the gaps might even widen.

Still, even if w1z removed CS I doubt the results would move more than 1-2%, so like I said in a previous post, this has 100% to do with fanboys wanting the 9060XT 16G above the 5060ti 8G on a chart at 1440p and zero to do with the games selected.

People can easily add up the fps from each game, remove the ones they don't like, divide by the chosen number of games, and get whatever average they are looking for...
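Quick sketch of what I mean, with placeholder numbers rather than the actual figures from the review:

```python
# Recompute your own average from per-game FPS after dropping the titles
# you don't want counted. All numbers here are placeholders, not review data.
per_game_fps = {
    "Counter-Strike 2": 480.0,
    "Cyberpunk 2077": 62.0,
    "Alan Wake 2": 48.0,
    "Elden Ring": 95.0,
}

excluded = {"Counter-Strike 2"}  # drop whatever games you disagree with

kept = [fps for game, fps in per_game_fps.items() if game not in excluded]
print(f"custom average over {len(kept)} games: {sum(kept) / len(kept):.1f} fps")
```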

This has everything to do with the 5060ti finishing on top and nothing to do with anything else.
 
People can easily add up the fps from each game, remove the ones they don't like, divide by the chosen number of games, and get whatever average they are looking for...
Setting aside the AMD vs Nvidia debate, it would be really cool to have an interactive version of the price/performance chart. You could plug in the prices you found for a GPU (especially with the used market or if you're in another country) and it would do the value calculations right in the browser.
 
There are places where they still run the original Doom and even Quake 3 or similar, which we know at this point would push a gazillion FPS on today's hardware.

If we include them here, it would definitely make a mess of the overall averages.

So we go back to the original point: is it fair to include them in the overall averages or not?
Comparing CS2 with old games like Quake 3 (or Quake Live now, which is capped to 250 fps BTW) is disingenuous. More people play CS than pretty much all other games in the benchmark suite combined. So yes, it's worthy of testing, and yes, it absolutely should be included in the averages. RDNA 4's inexplicably shit performance with Source 2 is a legitimate issue and should be a knock against them, regardless of anyone's personal feelings. If they (as in AMD) also feel it's an issue, there is nothing stopping them from reaching out to Valve to sort this out. If one isn't interested in CS, they can just look at performance in the games/engines relevant to them. Easy. But aggregate is aggregate, and omitting something makes no sense.
 
Setting aside the AMD vs Nvidia debate, it would be really cool to have an interactive version of the price/performance chart. You could plug in the prices you found for a GPU (especially with the used market or if you're in another country) and it would do the value calculations right in the browser.

The same for the relative performance chart; that would kill two birds with one stone, although the majority of people complaining just want their GPU brand of choice on top, so there is no placating them.
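Rough sketch of the value math such an interactive chart would run (a real version would live in the browser as JavaScript; Python here just to show the calculation, and every number below is made up):

```python
# Price/performance the way an interactive chart could compute it:
# the user enters the prices they actually found (local retail, used market)
# and the chart recomputes fps per dollar. Placeholder numbers only.
average_fps_1440p = {
    "RX 9060 XT 16GB": 100.0,
    "RTX 5060 Ti 8GB": 104.0,
    "RTX 5060 Ti 16GB": 106.0,
}

user_prices = {
    "RX 9060 XT 16GB": 360,
    "RTX 5060 Ti 8GB": 400,
    "RTX 5060 Ti 16GB": 450,
}

value = {card: average_fps_1440p[card] / user_prices[card] for card in average_fps_1440p}
for card, fps_per_dollar in sorted(value.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {fps_per_dollar:.3f} fps per dollar")
```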
 
And I just dare to remind you AMD promised us unbelievable value and an extraordinary shake-up. "We are here for gamers!"

For whom, excuse me? This bell doesn't even toll; it got stolen while everyone was busy trying to find at least one reason why RDNA4 is not a further status quo amplifier.

Like, a single-digit percent improvement over an INSANELY poor-value nVidia card, in some select scenarios where nVidia tech is irrelevant by design... I dunno, stinky. Reeks and sickens.
 
There are places where they still run the original Doom and even Quake 3 or similar, which we know at this point would push a gazillion FPS on today's hardware.

If we include them here, it would definitely make a mess of the overall averages.

So we go back to the original point: is it fair to include them in the overall averages or not?
And the point still stands: it is fair to include Counter-Strike 2 because it is still widely played today (it is the most popular game on Steam worldwide) and does have a "modern" engine, despite being easy to run.
 
And I just dare to remind you AMD promised us unbelievable value and an extraordinary shake-up. "We are here for gamers!"

For whom, excuse me? This bell doesn't even toll; it got stolen while everyone was busy trying to find at least one reason why RDNA4 is not a further status quo amplifier.

I mean, on one hand it's a big step up overall compared to RDNA2/3, from how it handles upscaling to its RT performance. The problem, and this goes for Blackwell (other than the 5090) as well, is that it's just not doing anything people couldn't buy 12-24 months prior.

For people who would rather die than buy Nvidia, good for them; they finally get a card similar to what Nvidia offered 1-2 years ago.

Personally, I find RDNA4 way more appealing than the last couple of generations from them; I just wish it was a full stack and not just competing with two of the worst cards from Nvidia and the third/fourth step down. Those are the money makers according to AMD; can't use up too much of that precious AI/professional market silicon for us pleb gamers.
 
Which is a totally fair choice. And with a PCIe 4.0 x16 connection and 24GB of VRAM, you're barely losing anything anyway. I'm just saying that if someone is still holding onto a PCIe 3.0 motherboard, they shouldn't be complaining about performance loss on a new PCIe 5.0 GPU. AMD's extended support for AM4 motherboards is the oddity here, not Nvidia.
Even in the heyday of fast-moving advancement, Socket 7 / Super Socket 7 'lasted' 6+ years in the consumer sector.
Sockets 370 and 478 lasted 5+ years.
Socket T/LGA775 lasted 7+ years.

AM4 (at this point in time) is just a little older than LGA775's run.
Not to mention, socketed platforms that have 'embedded' options typically have a 10+ year support lifecycle, even after retail availability has faded.

Add another $80-100 for DDR5 RAM.
But yeah Ngreedia, because they cut too many corners on their midrange cards.
DDR5 cost is a big barrier for the platform upgrade, and DDR4 Intel boards are a marked performance loss.

At this point, seeing how much bank nV is making on 'big data' products... can you blame them?
They have (nearly) zero motivation to retain 'gaming/consumer' market share.
 
it's a big step up overall compared to RDNA2/3
After 2 and a half years of RDNA2 experience, I can tell you even a five-week-old hot dog from the shadiest Bronx neighbourhood would be tastier than any Navi GPU.

Beating this obsolete-long-before-released stuff is not enough. What would be enough is establishing a real step up: making it so you don't even think about buying nVidia, making it so you can get something that will please you and not make you wait indefinitely until X is fixed or introduced. Too poisonous.

Far from being the case. What we see now is the value parity. Absolute dogshit value parity.

"They murdered a person with a knife. We are better, we used a longsword!"
 
9060XT 16G above the 5060ti 8G on a chart at 1440p

I'd def pick the 9060XT 16GB over the 5060Ti 8GB. The 5060Ti 16GB vs 9060 XT 16GB is a tougher choice.


 
Are stocks of 7800XT and 7900GRE depleted? Because that's where the 9070GRE will slot in raster-wise.
The 7800XT is still easy to find. The 7900GRE has been out of production for more than 5 months and I haven't seen it in retail for a while.
 
I'd def pick the 9060XT 16GB over the 5060Ti 8GB. The 5060Ti 16GB vs 9060 XT 16GB is a tougher choice.



Yeah, I was mostly talking about the end graph, the 8G card being on top, and fanboys not being able to handle it. I would also choose the 9060XT 16GB, assuming it costs less than $400 for a decent model.

There are clear downsides to owning an 8GB GPU in 2025, but my guess is 70-80% of them are going to be sold in prebuilt gaming PCs to people who have no clue what TPU is. Beyond that, it's likely just people who play esports games, want to hit 200-300 fps, and are on a 1060 or 2060 6G and won't touch an AMD GPU; AMD is going to need to continue making good products to change those people's minds... I am doing a 5060ti 8G build for someone who is just like that, even though I told them they should wait for the 9060XT 16G; they've had a ton of issues with an RX 5700 apparently. The only upside is I got some free DDR5-6000 CL30 with it; the downside is it's Silicon Power branded and I had to get their NVMe as well. Not saying it's a bad brand, just never used it.
 
I'm okay with this card: a decent price cut versus the 7700 XT, plus extra VRAM and better ray tracing.
But the 8GB version is where I'm concerned.
 
Beyond that, it's likely just people who play esports games, want to hit 200-300 fps, and are on a 1060 or 2060 6G
Yep. Both AMD and Nvidia have identified that the entry-level market is primarily prebuilt and laptop gamers who spend 95% of their time playing esports games that don't need 6GB, let alone 16GB. So they've prioritized cost control and power efficiency over gen on gen performance gains.
 
they've had a ton of issues with an RX 5700 apparently.

Yeah, they were prone to it and probably lost a few customers because of it.

Nothing wrong with 8GB cards; it's just that no reviewer I have seen recommends them, and for good reason. And I'll be honest, if one of my mates wanted to buy the 5060Ti 8GB version, I would spot him the extra $$$ just so he could get the 16GB version. It would have more longevity and, more importantly, better resale value, especially when nobody wants an 8GB card anymore.
 
I'm okay with this card: a decent price cut versus the 7700 XT, plus extra VRAM and better ray tracing.
But the 8GB version is where I'm concerned.

My final judgement on this card will come in two weeks... If it sticks close to $350 it becomes my default recommendation; if it jumps over $420, the 5060Ti 16G is the better option.

Yeah, they were prone to it and probably lost a few customers because of it.

Nothing wrong with 8GB cards; it's just that no reviewer I have seen recommends them, and for good reason. And I'll be honest, if one of my mates wanted to buy the 5060Ti 8GB version, I would spot him the extra $$$ just so he could get the 16GB version. It would have more longevity and, more importantly, better resale value, especially when nobody wants an 8GB card anymore.

Same: my brother was going to buy a 4060 and I ended up buying him a 4070, although he was the same and didn't want the 7800XT. He probably made the right choice given how gaming is going, though, given that he also games at 1080p.

This is for somebody I don't know, so I am not spotting them $120+ lol. They had a budget of $1200 with tax, which was already a challenge for me, and I refuse to build with parts I would not use myself unless they are free.
 