
ASUS Radeon RX 9060 XT Prime OC 16 GB

The 8 GB model isn't an issue. AMD could price it at $250 and have the 16 GB at $300. The problem IS the performance. These cards can't replace anything but five-year-old entry-level cards like 6600s and 3060s. That's awful.
But the 9070 cards didn't replace anything either. They're AMD's version of a 4070 (Ti Super). Yet look at how they've been received.
 
AMD likely makes more money from consoles than dGPUs, and both console makers wanted better upscaling and better RT for the next generation. I still think that is the only thing pushing the AMD graphics division forward, although Nvidia basically set them up for a layup this generation. The 9070 XT seems to be doing really well on Newegg US, with three models in the top ten that consistently sell out.
They make good money on console chips, but, AFAIK, it's not anything crazy. Those are extremely low-margin products for AMD. Volume saves them, sure, but the bulk of peak demand for the PS5/Series X has passed. The PS5 Pro IS new, but I don't see much uptake on that, not at that price and not in the current economy. AMD seemingly wants to muscle in on NV turf with enterprise offerings, taking advantage of the opportunity created by NV being unable to fully meet demand due to allocation. I would imagine that's where their focus will be with UDNA.
 
I see like 10-15% OC potential on lots of NVIDIA Blackwell "middle" GPUs, so there's some good potential here for the "next" gen ;) Bump the voltage a bit and you get a solid 15%, plus +50% VRAM.
If you're an enthusiast, I don't see any reason not to OC Blackwell cards; otherwise you'd just be leaving performance on the table.

My hunch is that, knowing competition wouldn't be fierce, Nvidia did this on purpose to leave that ~10-20% headroom for Super models with +50% VRAM.
 
The rumour is the RX 9060 XT 16 GB will be around 380 euro in my country.
If this holds up, I will definitely get that instead of paying 100 euro more for the RTX 5060 16 GB version.

Hopefully more accurate price news will soon appear.
 
I think many people simply don't understand how bad it is.
I am one of them. I think price increases on GPUs are more a result of Nvidia's goal to cover the loss of the low-to-mid-range market. With integrated GPUs in CPUs killing the low end and APUs being a future (current) threat to mid-range GPUs, I believe Nvidia has had a goal of increasing pricing for over a decade. My belief that Nvidia was driving prices up is pretty old and goes back to the first Titan model. That model was advertised as a semi-pro model and at the same time as the ultimate gaming model. It was clear to me that Nvidia was trying to get enthusiasts used to a $1000 price tag. $1000 back then for a gaming card was ridiculous, but as I said, Nvidia was also promoting the Titan as a semi-pro card, something that was true, because the Titan had more features enabled at the driver level. But the goal was for gamers to start paying four digits for a gaming card, and the Titan made it happen. I don't blame Nvidia here. Nvidia saw what happened to Creative, so they probably set two goals 15 years ago: first, to increase prices, and second, to create exclusive features that would justify those higher prices.

Mining in the past and now AI, plus the fact that Nvidia is the de facto monopoly and AMD and Intel are unwilling or unable to start a price war, give Nvidia all the opportunities they need to increase pricing beyond inflation. There might be costs today that are hidden from us consumers. For example, manufacturing could be much more expensive for the same type of PCB, or more expensive PCBs might be needed for cards that look like older cards (same power consumption, same data bus, same memory chips, but different PCIe version). Maybe the production of electronics is much more expensive today compared to 10-15 years ago. Don't know. But it's not inflation alone.

Looking at the dollar, from 2016 (when the GTX 1060 was released) to 2025, it seems that $100 in 2016 equals $133 today. So someone could justify an RTX 5060 costing as much as $400. But of course the RTX 5060 is not what the GTX 1060 was. In fact, it is a GPU that, looking at the 1000 series, would have been placed between the GTX 1050 Ti and the GTX 1060 3GB. The GTX 1050 Ti was priced at $139 and the GTX 1060 3GB at $199, so a fair 2016 price to use as a comparison would probably be $179, which after inflation equals $239 today. I don't know, I think $239 looks much more acceptable as a price. And that's about the low-end RTX 5060. Comparing the price of the GTX 1080 Ti with the price of the RTX 5090, well, things look much worse there. We could say the price of the 5090 should be compared to the price of the Titan Xp, for example, but the Titan Xp is a semi-pro card.
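The inflation arithmetic above is easy to sketch. This is a hypothetical back-of-the-envelope calculation using the rounded 1.33 factor quoted in the post (not official CPI data, so results can land a dollar off the post's figures):

```python
# Rounded inflation factor quoted above: $100 in 2016 ~ $133 in 2025.
INFLATION_2016_TO_2025 = 1.33

# 2016 MSRPs mentioned in the post; "midpoint" is the suggested $179 comparison price.
msrp_2016 = {
    "GTX 1050 Ti": 139,
    "GTX 1060 3GB": 199,
    "midpoint": 179,
}

for card, price in msrp_2016.items():
    adjusted = price * INFLATION_2016_TO_2025
    print(f"{card}: ${price} in 2016 -> ~${adjusted:.0f} in 2025 dollars")
```

With the rounded factor, the $179 midpoint comes out at ~$238, within a dollar of the $239 quoted.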
 
But the 9070 cards didn't replace anything either. They're AMD's version of a 4070 (Ti Super). Yet look at how they've been received.
Because the 9070 is a decent enough upgrade for a lot of cards. These four x60 GPUs aren't.

Prices are up: 400 euros for the 8 GB 9060 XT.
 
And HL3 could be a real thing that matters this generation of cards. Depends on the release.

HL2 came bundled with ATi cards and in DX9 even the slowest ATi card was faster than the fastest FX series card. But people bought the FX.

This is hypothetical testing and merely a thought experiment.
Handicapping a modern processor to show the disadvantage of PCIe 3.0 is just silly.
Like putting 1980s tires on a 2020 Ferrari...

CPU+mobo combos that only offer PCIe 3.0 are now, what, 5 to 7 years old?
Intel 10th gen and AMD Ryzen 2000 series (Zen+)?
These systems have CPU constraints outside of just PCIe bandwidth.

For example, look at the bottom of the TPU chart:
[attachment 402482: TPU chart]

I'm sure there are plenty of people out there with a 300 or 400 series chipset that upgraded to Zen 3. That x16 link will matter to those on PCIe 3.
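For scale, here is a minimal sketch of what a full x16 link carries per direction, assuming the 128b/130b line coding that both PCIe 3.0 and 4.0 use:

```python
# Approximate per-direction PCIe bandwidth for a given generation and lane count.
RAW_GT_PER_LANE = {3: 8.0, 4: 16.0}  # giga-transfers per second per lane

def bandwidth_gbps(gen: int, lanes: int) -> float:
    # 128b/130b encoding: 128 payload bits delivered per 130 bits on the wire,
    # divided by 8 to convert bits to bytes.
    return RAW_GT_PER_LANE[gen] * lanes * (128 / 130) / 8

for gen in (3, 4):
    print(f"PCIe {gen}.0 x16: ~{bandwidth_gbps(gen, 16):.2f} GB/s per direction")
```

A Gen3 x16 slot tops out around 15.75 GB/s per direction versus ~31.5 GB/s for Gen4 x16, which is why a full x16 link softens the blow on older boards.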

Correct, inflation is the issue. In fact it's personally surprising to me that the $300 bracket even exists anymore and hasn't been completely replaced by APUs, and that these cards offer improvements over the previous generation while staying at the same price.

Remember, your money continuously devalues over time.

There's a lot of wishful thinking about pricing, and obviously vendors are obliged to earn as much profit as they can, but the low end cards have been remarkably consistent in pricing over the years, despite the expense of modern nodes, inflation and other issues that affect price. People like to point at the cost of "muh VRAM" chips and pretend that's the only reasonable metric to examine pricing against, but also seem to enjoy ignoring literally every other economic factor.

Just about every product, let alone luxury products—and make no mistake, dedicated GPUs and PC gaming in general are a luxury hobby—has risen in price over the past few years: smartphones, cloud subscriptions, the actual games we use these cards to play, etc. But somehow GPU performance jumps for the same pricing need to remain as good as they were in the golden years? One metric has to change: the performance jumps, or the pricing. You don't get to have the best of both worlds, and these companies aren't charities, nor are consumers forced to buy their products.


They exist in the same economic reality, and operate under the same economic rules, besides some economies of scale differences. Hence, similar outcomes.

Inflation is a crap excuse. Look at other components. CPUs have gotten cheaper over time. The problem is GPUs make less money for Nvidia/AMD, so they focus their fab space on higher-margin items.

I'm sure China is not happy about ASML being the sole supplier for EUV machines, but it's too complicated for them to steal and replicate, so they're stuck lol. And for the US and EU, it's better for their national security to keep ASML as the sole supplier, under their control. Blacklisting China and Russia from advanced EUV sales has crippled their domestic semiconductor industry. China has to resort to smuggling in whatever Nvidia chips they can get. So it works for the US and EU natsec.


Sure, but how many people are in the position of getting a $350 5800X3D or 5700X3D, a $400 5060-Ti, but can't spend another $80 for a cheap B550 mobo to get PCIe 4.0?

Probably zero. I'd bet a lot of Zen 3 upgrades were done with older motherboards. Otherwise they would have gone with a 500 series chipset to begin with.

Even in the heyday of fast-moving advancement, Socket 7 / Super Socket 7 'lasted' 6+ years in the consumer sector.
Socket 370, 478, lasted 5+
Socket T/LGA775 lasted 7+

AM4 (at this point in time) is just a little older than LGA775's run.
Not to mention, socketed platforms that have 'embedded' options typically will have a 10+ year (support) lifecycle, even after retail availability has faded.


DDR5 cost is a big barrier for the platform upgrade, and DDR4 Intel boards are a marked performance loss.

@ this point, seeing how much bank nV is making on 'big data' products... can you blame them?
They have (nearly) 0 motivation to retain 'gaming/consumer' marketshare.

Socket 7 was probably the most versatile socket ever. Socket 370 lasted three years, if you include the rare Tualatin CPUs. Socket 478 also lasted about three years. But you couldn't "mix and match" CPUs as they used different RAM and chipsets. LGA 775 may have been a physical socket for some time, but was a mess compatibility-wise. Just look it up on Wikipedia. Nobody has ever come close to AM4's versatility (since Socket 7) except maybe Socket AM2, which lasted quite a while.
 
Just ordered one (Sapphire Pulse 16GB) from Overclockers UK. Postage at £7.99 or more was the only option, but the price of £314.99 wasn't too bad. Hopefully a decent upgrade from the trusty RX 580, which has served me extremely well indeed...
Personally, and unfortunately, I don't see prices going down much, if any. Looking at a few places, AMD are doing the rebate stuff with this launch, as they did with the 9070 series not long back.
I sort of wish CrossFire/SLI was still a thing in consumer graphics; alas, it's gone by the wayside :(
 
Sure, but how many people are in the position of getting a $350 5800X3D or 5700X3D, a $400 5060-Ti, but can't spend another $80 for a cheap B550 mobo to get PCIe 4.0?

Me. The PC I have was originally a 2200G + 16GB RAM on a B350 motherboard, built in 2018 for around £600. Then I added a 6600 XT in 2021 so it had a bit more GPU grunt. I got into Civ, Stellaris and other sorts of games, so I got a 5800X3D in 2022 as a drop-in upgrade to help with those games. Now I have been on an ARPG kick and mainly play games via streaming from my PC to my Steam Deck, so I just ordered a 9060 XT at MSRP today.

The next build will be a scratch build, but I will probably hold out for another 3-4 years before doing that, and then this rig can be handed to one of my children or become the family PC.
 
Inflation is a crap excuse. Look at other components. CPU's have gotten cheaper over time. The problem is GPU's make less money for Nvidia/AMD, so they focus their fab space on higher margin items.

-THANK YOU for finally saying it.

Inflation is the lazy man's answer to all price increases; it lets corpos get away with it too, by shifting blame to the gubmint.

Plenty of things get cheaper over time. I can go out and buy a 4K TV today for $300 but 10 years ago the same TV would have been $5,000.

PC gaming is more popular than ever, the total addressable market is larger than ever, and more people are using GPUs for non-gaming things than ever. Even holding the price relatively steady and producing more would have resulted in higher profits thanks to more sales.

But Nvidia (and AMD follows) are margin driven companies, and Nvidia is chasing AI and AMD makes more margin on their CPUs than they ever will on their GPUs. TSMC having a monopoly on advanced nodes means they can set whatever price per wafer they want as well, which gets passed to consumers.

Then there are tariffs.

And of course, somewhere in there, inflation due to cost increases of inputs is making things pricier as well.

It's complicated and simply hand waving "inflation" is overly reductionist in this context.
 
Remember when the 9070 and 9070XT were gonna flood the market and save us from Nvidia's high prices?

 
200 fps vs. 300 fps, IMO no one would notice the difference unless looking at an FPS counter.

I disagree, the 5060Ti 16GB is way overpriced, these midrange budget cards shouldn't be above $300 from Nvidia and AMD. And even at 1080P, some games crash or performance worsens as the VRAM buffer fills up. 8GB of VRAM may be enough for those sticking to older games but I don't see the point in spending more than $200 on a GPU to play older titles.

At 1080p, serious multiplayer FPS gamers are already using 240 Hz or higher refresh rates, while in the mainstream 1440p range, 144 Hz is the bare minimum for a gaming monitor. Meanwhile this 9060 XT makes 200 fps at 1080p and 128 fps at 1440p…

Don't know why you think that no one will notice the difference, since the input lag difference is quite big; 300 fps has way better response time than 200 fps…
 
Inflation is a crap excuse.
No, it isn't. When food prices the world over have effectively doubled and the prices of a lot of everything else have gone up by 50% or more, all since the pandemic, it is a perfectly valid perspective.

Please stop telling others it's nonsense. Everyone has a right to an opinion not only you.
I'll do as I damn well please. Sure, they have the right to their opinion. And I have the right to point out how silly, narrow-minded and daft such an opinion is. It's how criticism works. People do it to me all the time.
There are many others that agree, this card isn't that good.
See? This is silly, narrow-minded and daft as a brush. The merits of the benchmarks PROVE such. These opinions are without merit, as they fly in the face of the measurements shown in this review. You all are welcome to your opinions and to state them. You are also welcome to look silly and have others point that fact out to you.

Cheers! :toast:

I see like 10-15% OC potential on lots of NVIDIA Blackwell "middle" GPUs, there's some good potential here for "next" gen ;) Bump the voltage a bit and you get a solid 15%, plus +50% VRAM
Are you speaking from what you've tried with your samples or from what you've seen elsewhere? Just wondering..
 
I'll do as I damn well please. Sure, they have the right to their opinion. And I have the right to point out how silly, narrow-minded and daft such an opinion is. It's how criticism works. People do it to me all the time.

See? This is silly, narrow-minded and daft as a brush. The merits of the benchmarks PROVE such. These opinions are without merit, as they fly in the face of the measurements shown in this review. You all are welcome to your opinions and to state them. You are also welcome to look silly and have others point that fact out to you.

Cheers! :toast:


Are you speaking from what you've tried with your samples or from what you've seen elsewhere? Just wondering..

The only one daft and silly here is you, and as you pointed out, many have proven that to you; unfortunately there's no hope for you.
 
There's nothing wrong with the 9060XT 16GB, but in terms of value, it's only a 5% discount compared to Nvidia.
This is EXACTLY how AMD lost all its marketshare over the last 5 years.

What are you talking about? The 9060 XT 16 GB, $350, is competing against the 16 GB 5060 Ti, which is a $425 MSRP GPU; that is a 17% discount. The 9060 XT 8 GB, $300, competes against the 8 GB 5060 Ti, which is $375. That is a 20% discount.
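Those percentages check out; a quick sketch of the arithmetic, using the MSRPs quoted in the post (the 16 GB gap comes out at ~17.6%, rounded down to 17% above):

```python
# Discount of the AMD card relative to the competing Nvidia card's MSRP, in percent.
def discount_pct(amd_price: float, nvidia_price: float) -> float:
    return (nvidia_price - amd_price) / nvidia_price * 100

print(f"9060 XT 16GB ($350) vs 5060 Ti 16GB ($425): {discount_pct(350, 425):.1f}%")
print(f"9060 XT 8GB  ($300) vs 5060 Ti 8GB  ($375): {discount_pct(300, 375):.1f}%")
```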
 
The only one daft and silly here is you, and as you pointed out, many have proven that to you; unfortunately there's no hope for you.
See, this is reacting with ego instead of debating the merits. This is how you know you've lost the debate. Instead of pointing out facts that support your position, you've just lashed out with an insult.

The benchmarks prove this card is solid and a good step forward for AMD and the gaming public. You can't refute that. Therefore your opinion is meritless.
 
HL2 came bundled with ATi cards and in DX9 even the slowest ATi card was faster than the fastest FX series card. But people bought the FX.



I'm sure there are plenty of people out there with a 300 or 400 series chipset that upgraded to Zen 3. That x16 link will matter to those on PCIe 3.



Inflation is a crap excuse. Look at other components. CPUs have gotten cheaper over time. The problem is GPUs make less money for Nvidia/AMD, so they focus their fab space on higher-margin items.



Probably zero. I'd bet a lot of Zen 3 upgrades were done with older motherboards. Otherwise they would have gone with a 500 series chipset to begin with.



Socket 7 was probably the most versatile socket ever. Socket 370 lasted three years, if you include the rare Tualatin CPUs. Socket 478 also lasted about three years. But you couldn't "mix and match" CPUs as they used different RAM and chipsets. LGA 775 may have been a physical socket for some time, but was a mess compatibility-wise. Just look it up on Wikipedia. Nobody has ever come close to AM4's versatility (since Socket 7) except maybe Socket AM2, which lasted quite a while.
Are you kidding with that?
 
But I have yet to see any proof of AMD not allowing reviews of 8 GB cards. I don't see it being relevant here either, and it's such an outlandish claim it sounds more like the fanatics in green skidmark pants upset because the 9060 XT is a better value than a 5060 Ti. If AMD were to block reviews, then all of the tech press would be posting articles.
So you don’t read the reviews you comment about??

Unlike other launches, AMD has changed the embargo rules. Today, only reviews for cards seeded by AMD can go live—OC or non-OC, doesn't matter. Tomorrow, all the remaining reviews, for cards provided by AMD's partners, can go live; we have three more reviews coming up.
 