
ASUS Radeon RX 9060 XT Prime OC 16 GB

And while it is a meh product, they went out of their way to make it look worse than it is, using settings even I would never use with it, so AMD was rewarded with a review basically trashing the product lol.

The only thing I found interesting is how bad it performed on an 8700K based system.

Tells me that just using a fast system and then limiting the PCIe bandwidth doesn't show the full picture of how some of these cards behave on older PCIe versions.

HUB used settings that worked fine for the 16GB variant. Nothing wrong with that IMO. It shows the stark contrast between the versions and often the performance delta is a lot greater than the discount you get for going with 8GB of VRAM.

The 8700K tests were great though, I agree, especially given that GPUs at this price point can often be purchased by piecemeal upgraders who may be running a much older platform. The F1 25 test was pretty funny. 2fps vs 60 fps for the 8GB vs 16GB card on the 8700K.
 
Nvidia has a cult following these days
Yeap, it's quite well known that nvidia is the one with the cult following. Content creators that dare to criticize amd know full well how big of a cult following nvidia has :D
 
Yeap, it's quite well known that nvidia is the one with the cult following. Content creators that dare to criticize amd know full well how big of a cult following nvidia has :D

Are you referring to HUB and GN? Please, GN is one of the good ones.
 
The F1 25 test was pretty funny. 2fps vs 60 fps for the 8GB vs 16GB card on the 8700K.

HUB's explanation is that "F1 25 makes no attempt to reduce visual quality" when VRAM is over-saturated, to the point of broken gameplay.

I'm not defending the 8GB card; however, it's unusual for developers (especially of a racing game or simulator) not to have some fail-safe rendering method for when this happens.
Even opening the menu and navigating to exit may be impossible. So this should be submitted in a bug report.

But now that Codemasters is an EA Sports studio... o_O

 
Got the Microcenter email. Can’t tell what stock is like, but there are a few 8GB models at $299 and several 16GB models at $349. Some are marked up, but at least there are some at MSRP at launch.
 
HUB used settings that worked fine for the 16GB variant. Nothing wrong with that IMO. It shows the stark contrast between the versions and often the performance delta is a lot greater than the discount you get for going with 8GB of VRAM.

The 8700K tests were great though, I agree, especially given that GPUs at this price point can often be purchased by piecemeal upgraders who may be running a much older platform. The F1 25 test was pretty funny. 2fps vs 60 fps for the 8GB vs 16GB card on the 8700K.

I don't have an issue with the review if it was just a comparative piece, which it technically was, but it should have been a normal review with the same test suite as every other review he's done. Changing games and settings review to review makes it seem like proving a point matters more than showing how the GPU actually compares to its direct competitor, the 5060.

I like HUB and hate 8GB GPUs, but even I'm getting annoyed at the content not showing me anything I don't already know.
 
HUB's explanation is that "F1 25 makes no attempt to reduce visual quality" when VRAM is over-saturated, to the point of broken gameplay.

I'm not defending the 8GB card; however, it's unusual for developers (especially of a racing game or simulator) not to have some fail-safe rendering method for when this happens.
Even opening the menu and navigating to exit may be impossible. So this should be submitted in a bug report.

But now that Codemasters is an EA Sports studio... o_O

Failsafe? F1 2022 straight up crashed whenever it exceeded 8GB VRAM when I had a 3070. If 2025 only slows to a crawl but doesn't crash, at least the devs learned a little something.
 
I've only seen the Adrenaline review I linked earlier on the 8GB version, and I stand by what I concluded from it: at its price point, it is a no-brainer versus the 5060 non-Ti (both at ~R$2600 at Kabum).

I'll wait until it's below 2k prob.
 
Got the Microcenter email. Can’t tell what stock is like, but there are a few 8GB models at $299 and several 16GB models at $349. Some are marked up, but at least there are some at MSRP at launch.


If it goes like the 9070 launch they'll be sold out by Saturday morning, and you won't see those prices again until sometime late 2027 just before the next gen launches. I would expect them to wind up $50 above MSRP for the cheapest models.
 
You said 16GB is becoming the norm for low end graphics cards. That’s not true.

You might have misread!

Check again:

I think the biggest milestone is witnessing 16GB becoming the norm in the lower performance tier > finally hitting the sweet spot for the mainstream crowd.

lower "performance tier" aka 60-class GPUs! This was further cemented with "sweet spot for the mainstream crowd". With this being a 60-class thread and 16GB cards landing squarely in this tier, the intent is crystal clear. For the past two generations, in this tier, both AMD and Nvidia have offered 16GB models alongside their 8/12GB counterparts. So its reasonable to expect that 16GB will continue to be supported in this segment moving forward.

Low-end takes a different meaning.
 
You might have misread!

Check again:



lower "performance tier" aka 60-class GPUs! This was further cemented with "sweet spot for the mainstream crowd". With this being a 60-class thread and 16GB cards landing squarely in this tier, the intent is crystal clear. For the past two generations, in this tier, both AMD and Nvidia have offered 16GB models alongside their 8/12GB counterparts. So its reasonable to expect that 16GB will continue to be supported in this segment moving forward.

Low-end takes a different meaning.

I'm not so sure. We had 12GB in 2020 with the 3060, only to see regression the following two generations, and while they do seem to be offering 16GB now, it's at fake MSRPs, with both products likely being over 400 USD, at least in the States; other regions may fare better.
 
If it goes like the 9070 launch they'll be sold out by Saturday morning, and you won't see those prices again until sometime late 2027 just before the next gen launches. I would expect them to wind up $50 above MSRP for the cheapest models.
Very possible. I don't think they'll do MSRP for a second wave unless there's actually an abundance of these units just sitting on the shelves. There were rumors of 9070 stock hitting high numbers later this summer, but I'll believe it when I see it.
 
lower "performance tier" aka 60-class GPUs! This was further cemented with "mainstream crowd". With this being a 60-class thread and 16GB cards landing squarely in this tier, the intent is crystal clear. For the past two generations, in this tier, both AMD and Nvidia have offered 16GB models alongside their 8/12GB counterparts. So its reasonable to expect that 16GB will continue to be supported in this segment moving forward.
Those cards have not sold, though. Let's do some math.

Per the latest Steam hardware survey, 5.9% of all GPUs have 16GB of VRAM. First, we need to take out all the GPUs that only have 16GB options and are popular enough (at least 0.15% market share) to be listed separately. Those are the 4070-Ti Super (0.86%), 4080 Super (0.81%), 4080 (0.76%), 5070-Ti (0.39%), 7800 XT (0.37%), 6800 XT (0.30%), 6900 XT (0.20%), and the 6800 (0.20%).

We take those out of the 16GB number, and we're left with the remaining 16GB GPUs having 1.54% market share.

There's only two other GPUs with 16GB options listed separately, the 4060-Ti (3.11%) and the 5060-Ti (0.21% already!). Even if we say that that entire 1.54% is 4060-Ti 16GB (which it absolutely is not), that's still less than half of all 4060-Tis sold. In reality, I'd guess maybe ~1% or so of the remaining 16GB number are 4060-Ti 16GB cards at most, given how many possible other 16GB cards there are.

Also, I'm not quite sure how or if Steam integrates iGPU memory into that VRAM stat. Windows allocates half of system RAM to the GPU, so at least in Task Manager, if you have a laptop with 32GB of RAM, your iGPU will have 16GB of "GPU Memory." If Steam takes that same number, then any laptop with only an iGPU and 32GB of RAM will also report as a 16GB VRAM system, and there are LOTS of iGPU-only systems.

However, that's speculative, so at most, I'm willing to grant the 4060-Ti 16GB being ~1/3 of all 4060-Tis sold.

I really wish that Steam broke apart cards like the 4060-Ti or the Arc A770 that have two different VRAM configurations, so we could get an exact number.
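
If anyone wants to redo that subtraction when the next survey drops, here's a minimal sketch using only the eight 16GB-only SKUs quoted above (with just these eight the residual comes out somewhat higher than the 1.54% figure, so the exact exclusion list matters; treat the result as a rough estimate, not the survey's own breakdown):

```python
# Rough re-creation of the subtraction above, using the Steam survey figures quoted in this post.
# The residual is an upper bound on "unlisted" 16GB cards (4060 Ti 16GB, A770 16GB, etc.);
# Steam itself doesn't break VRAM variants of the same SKU apart.

total_16gb_share = 5.9  # % of all surveyed GPUs reporting 16GB of VRAM

# GPUs that only ship with 16GB and are listed separately (shares as quoted above)
listed_16gb_only = {
    "RTX 4070 Ti Super": 0.86,
    "RTX 4080 Super": 0.81,
    "RTX 4080": 0.76,
    "RTX 5070 Ti": 0.39,
    "RX 7800 XT": 0.37,
    "RX 6800 XT": 0.30,
    "RX 6900 XT": 0.20,
    "RX 6800": 0.20,
}

residual = total_16gb_share - sum(listed_16gb_only.values())
print(f"16GB share left over for everything else: {residual:.2f}%")
```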
 
At 1080p, serious FPS multiplayer gamers are already using 240Hz or higher refresh rates, while in the mainstream 1440p range, 144Hz is the bare minimum for a gaming monitor. Meanwhile this 9060 XT makes 200 fps at 1080p and 128 fps at 1440p…

Don't know why you think that no one will notice the difference, since the input lag difference is quite big; 300 fps has way better response time than 200 fps…
Are the "serious" multiplayer gamers using x60 tier cards? To me, seriously means they're playing professionally.
I can't personally notice any difference between 120 and 200fps, and anything past 144hz are very much diminishing returns, and I'd rather have higher performance in about every other game as tested than be concerned over a few outlier titles.
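
For what it's worth, the latency side of that argument is just frame time, which a quick back-of-the-envelope calculation puts into perspective (total input lag also includes CPU/GPU queueing and the display itself, so these deltas are the best case):

```python
# Frame interval at the refresh/frame rates being argued about.
# This is only the per-frame time; end-to-end input lag is larger.

for fps in (120, 144, 200, 240, 300):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# 200 fps is 5.00 ms/frame and 300 fps is 3.33 ms/frame, so the gap is
# roughly 1.7 ms per frame - which is why opinions differ on whether it's noticeable.
```
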
Are you referring to HUB and GN? Please, GN is one of the good ones.
GN is one of the good ones, one of the best even, but GN's reviews don't fit the bias of the team green fan base.
 
I couldn't tell from the HUB review; he just spent 20 minutes trashing it.
At least people will stop insisting HUB are AMD shills. :laugh:

But I agree. This review is actually even LESS favourable than what they gave the 5060 Ti 8GB and 5060 since they're not even doing the regular benchmarks, just instantly showing the issues.
 
Are the "serious" multiplayer gamers using x60 tier cards? To me, seriously means they're playing professionally.
I can't personally notice any difference between 120 and 200fps, and anything past 144hz are very much diminishing returns, and I'd rather have higher performance in about every other game as tested than be concerned over a few outlier titles.

GN is one of the good ones, one of the best even, but GN's reviews don't fit the bias of the team green fan base.

Anyone playing professionally in the big leagues (ESL, IEM majors, etc.) with multimillion-dollar prize pools is likely playing on their sponsor's dime and has a computer designed around an unlimited budget to do so. It's not exactly the kind of PC where you'll find anything below an RTX 4090, really. Reflex 2 with frame warp practically disqualifies any Radeon, present or future, until AMD has a competing solution and the performance to match if used in a latency-sensitive application - and 5 ms is precious time lost for an athlete when getting hit in the game might mean the difference between taking home a $1M bonus for the team or nothing.

Most eSports players will be satisfied with this level of hardware, though. They are not pro players, they just play this competitive stuff - that's why, to this day, the GTX 1060 is still one of the most popular GPUs on Steam; competitive games were pretty much designed around its level of performance. The reason you'll find the 1060 enduring in that list (although it has begun to diminish as of late) and practically no RX 400 and 500 series GPUs is that gamers bought 1060s, and miners bought Polaris.
 
Are the "serious" multiplayer gamers using x60 tier cards? To me, seriously means they're playing professionally.

That seems silly tbf.

There are an estimated 1.8 billion PC gamers as of 2024. There are between 5,000 and 10,000 "pro" game players who get paid and compete regularly.

That means "Pros" are 5.6×10⁻⁴ percent, whatever that is.

You are way more likely to be a pro football player than a pro gamer.
 
Just ordered one (Sapphire Pulse 16GB) from Overclockers UK. Postage at £7.99 or more was the only option, but the price of £314.99 wasn't too bad. Hopefully a decent upgrade from the trusty RX 580, which has served me extremely well indeed...
Personally, and unfortunately, I don't see prices going down much, if any. Looking at a few places, AMD are doing the rebate stuff with this launch, as they did with the 9070 series not long back.
I sort of wish CrossFire/SLI was still a thing in consumer graphics, alas it's gone by the wayside :(

Mate, I thought you might have cocked it up and picked up an 8GB model. Had to check myself and lo and behold, 16GB for £314.99 it is.... now, that's more like it!! I was convinced 16GB wouldn't dip below £350, so I didn't even bother looking locally since launch. Honestly thought these would hit the shelves starting north of £350, with the OC versions sinking their teeth into the £400+ range.

BIG Thanks for the heads-up. I've been holding back one of my nephews from upgrading his GTX 1660S for some time now. With the market taking an ugly turn, some credible 12GB+ options were either eliminated or their inflated prices were a deal-breaker. He's got a budget of £200 and I promised to pay the difference, provided he drops me the existing card, in hopes of grabbing something around the £300 mark. Adding another £15 on top for the 16GB model @ £314.99, that's an easy no problem!!.. All in quick time, I had a quick word with his old man, already wired over the top-up, dropped him an OCUK link, and he's literally minutes away from work to pull that earth-shattering trigger.

Inflation is the lazy man's answer to all price increases; it lets Corpos get away with it too by shifting blame to the gubmint.

I agree. Inflation is one thing, inflated prices are another. Economic realities vs profiteering think-tanks inflating balloons with hot air, just to see how big they can get before they POP.
 
Just saw the RX 9060 XT 16GB in my country at around 380 euro now.
There is no way on earth I'd pay 100 euro more for the RTX 5060 Ti 16GB now.
That is not worth it.
 
That seems silly tbf.

There are an estimated 1.8 billion PC gamers as of 2024. There are between 5,000 and 10,000 "pro" game players who get paid and compete regularly.

That means "Pros" are 5.6×10⁻⁴ percent, whatever that is.

You are way more likely to be a pro football player than a pro gamer.

A lot of the less informed watch these pro gamers etc. on Twitch, see that they have Nvidia cards, and want one.

At least 7 out of 10 builds I do are for people, usually teenagers on their parents' dime, who think they're gonna make it streaming and want the same hardware brands the streamers use. They are usually coming from a garbage prebuilt that has an RX 500 series or weak Nvidia GPU that they overspent on at Best Buy.

10 out of 10 times they have issues and they blame them on the Radeon card, even though it's a shit prebuilt that didn't have adequate cooling and they have crappy maintenance habits. But you know, trying to tell that to clueless parents and teenagers who know everything... good luck.

Just ordered one (Sapphire Pulse 16GB) from Overclockers UK. Postage at £7.99 or more was the only option, but the price of £314.99 wasn't too bad. Hopefully a decent upgrade from the trusty RX 580, which has served me extremely well indeed...
Personally, and unfortunately, I don't see prices going down much, if any. Looking at a few places, AMD are doing the rebate stuff with this launch, as they did with the 9070 series not long back.
I sort of wish CrossFire/SLI was still a thing in consumer graphics, alas it's gone by the wayside :(

Awesome man at that price, hell yeah!
 
The value of silicon per mm² has gone up drastically due to the highly obvious crypto/AI boom, but there is also massive growth in other areas that impacts it heavily.

Now let's talk about the increase in production method complications/requirements and time, and you can very quickly see where that price increase comes from. Let's also add the fact that silicon wafers have stayed at the same 300 mm diameter since 1999, and because of this, the ability to scale up certain aspects of production just isn't there.

If you use a wafer yield calculator, you will see that one Navi 44 costs around $80 from the factory. If you add $20 for all else - the PCB, VRAM, components - you see that from $100 to the asking price of $360 there is an extremely steep profit margin.
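
For anyone curious what such a calculator is actually doing under the hood, here's a rough sketch of the standard dies-per-wafer and yield arithmetic. The wafer price, the ~200 mm² die size for Navi 44, and the defect density below are my own placeholder assumptions for illustration, not figures from this thread:

```python
import math

# Back-of-the-envelope die cost estimate. All inputs are placeholder assumptions
# for illustration, not published numbers.
wafer_diameter_mm = 300      # standard wafer diameter
die_area_mm2 = 200           # roughly Navi 44-sized die (assumption)
wafer_price_usd = 16000      # assumed price for a leading-edge wafer
defect_density = 0.001       # assumed defects per mm^2 (0.1 per cm^2)

# Classic dies-per-wafer approximation, accounting for edge loss
r = wafer_diameter_mm / 2
dies_per_wafer = (math.pi * r**2 / die_area_mm2
                  - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Simple Poisson yield model: fraction of dies with zero defects
yield_fraction = math.exp(-defect_density * die_area_mm2)
good_dies = dies_per_wafer * yield_fraction

print(f"Gross dies per wafer: {dies_per_wafer:.0f}")
print(f"Estimated yield:      {yield_fraction:.0%}")
print(f"Cost per good die:    ${wafer_price_usd / good_dies:.0f}")
```

With those placeholders it lands in the same ballpark as the $80 figure, though the inputs (especially wafer price and defect density) swing the result a lot.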


 
If you use a wafer yield calculator, you will see that one Navi 44 costs around $80 from the factory. If you add $20 for all else - the PCB, VRAM, components - you see that from $100 to the asking price of $360 there is an extremely steep profit margin.



LMAO, even if it does cost $100 for an assembled card (it doesn't), who's paying for the years of driver updates? Research and development of the next product? Engineering, testing and development for the new features? Assuming absolutely ZERO issues with the production line too.

You people seem to think you're owed a product at-cost, because I always see this flawed argument thrown around.
 
If you use a wafer yield calculator, you will see that one Navi 44 costs around $80 from the factory. If you add $20 for all else - the PCB, VRAM, components - you see that from $100 to the asking price of $360 there is an extremely steep profit margin.
You're forgetting opportunity cost. Every wafer that gets turned into an RDNA chip is a wafer that's not turned into an Instinct accelerator with 100x the profit. From a pure financial perspective, AMD (and Nvidia) essentially lose money with every gaming GPU.

LMAO, even if it does cost $100 for an assembled card (it doesn't), who's paying for the years of driver updates? Research and development of the next product? Engineering, testing and development for the new features? Assuming absolutely ZERO issues with the production line too.

You people seem to think you're owed a product at-cost, because I always see this flawed argument thrown around.
Yeah, why doesn't he just call TSMC and buy a wafer directly to get it at the "true" price? Oh yeah, because he doesn't have trillions of $ to invest over decades into making a GPU.
 
You're forgetting opportunity cost. Every wafer that gets turned into an RDNA chip is a wafer that's not turned into an Instinct accelerator with 100x the profit. From a pure financial perspective, AMD (and Nvidia) essentially lose money with every gaming GPU.

If so, please do a favour and exit the gaming market.
 
If so, please do a favour and exit the gaming market.
lmao. "If I have to pay $350 for a GPU, nobody should get a GPU at all!" This is peak sibling-breaking-toy-when-mom-said-to-share energy.
 