Monday, April 14th 2025

AMD Radeon RX 9060 XT Reportedly Capable of Boosting Up To 3.3 GHz, New Leak Suggests "Navi 44 XT" GPU

AMD has not publicly announced its Radeon RX 9060 XT 16 GB and 8 GB graphics cards, but board partners have inadvertently "revealed" the existence of forthcoming custom designs. Team Red's RDNA 4 kick-off events did tease a second-quarter launch of Radeon RX 9060 Series cards, but the company has remained coy since its late-February celebrations concluded. Over a month ago, VideoCardz cited AIB insider knowledge regarding early specification details. In this morning's follow-up report, unnamed board partner moles theorized a possible public unveiling of Radeon RX 9060 XT models at next month's Computex 2025 trade show. Industry watchdogs believe that Team Red's lower-end RDNA 4 cards are specced to compete closely with Team Green's GeForce RTX 5060 Ti lineup. NVIDIA and involved AIBs are reportedly gearing up for a retail launch this week.

The latest leak suggests that standard Radeon RX 9060 XT designs are being readied with (reference) game clock frequencies set at 2620 MHz and boost clocks going up to 3230 MHz. In addition, VideoCardz has heard mutterings about "overclocked variants" boosting up to the 3.3 GHz mark. The much-rumored Navi 44 GPU die could sport 2048 stream processors, half of Navi 48's full SP count. Prior to this week, TechPowerUp's GPU database entry indicated the utilization of a speculative "Navi 48 LE" unit. Now amended, the Radeon RX 9060 XT listing mentions a tentative "Navi 44 XT" variant. Leaked guideline info allegedly specifies 500 W power supplies as the minimum requirement for incoming cards; a 550 W base level could be advised for overclocked/overengineered models. VideoCardz did not see any 16-pin power connector SKUs within the leaked material; "most specs" feature 8-pin power connectors.
Sources: VideoCardz, PC Gamer, Tom's Hardware, Wccftech

47 Comments on AMD Radeon RX 9060 XT Reportedly Capable of Boosting Up To 3.3 GHz, New Leak Suggests "Navi 44 XT" GPU

#1
RejZoR
RX 9060 XT, if it's really made of faulty RX 9070 chips, should clearly hit those clocks with a certain amount of shaders disabled, meaning it'll have a bigger surface area for cooling and fewer shaders to drive, which means lower power consumption.

I hope AMD will nail a sweet spot with it and that it will sell well. RX 9070 XT is a fantastic card and I really like gaming on it. If the RX 9060 series hits that mark in its own segment, it's gonna be pretty sweet for gamers. I just hope the whole market won't be yet again fked up because of America's childish behavior...
Posted on Reply
#2
Lionheart
Tsk tsk, DOA just like NVIDIA's 5060 series, but somehow they'll sell even with gimped memory configs. Should have been simply 12GB all around, no confusion for customers.
Posted on Reply
#3
alwayssts
We knew clock potential already by the die size.

Basically, since it's less than half, you could essentially ascertain it was built for the current clock capabilities of 9070 xt (20gbps), whereas N48 has the potential to clock higher (and use faster ram) with more power.
Probably also ~225w/16GB core equalized with the stock ram the same way N48 is ~304w; perhaps >225w for overclocked 16GB models, but also OC models with 8GB within 225w. Make sense?
It does (at least to me), and I think is a very intelligent design. No real use in making a card optimized (for performance; obviously you could lower PL) to less than 225W imho, but up to/over it is good.

N44, being a value product, will stick to 20gbps. Makes sense. If 20gbps is normalized to ~2.97ghz, and the max oc (allowed) for 20gbps is ~2800mhz (22.4gbps), it shouldn't need core higher than ~3327mhz.
And some/most ram may not clock that high, so it would make sense to have many models with perhaps lesser-binned (still 20gbps) ram closer to 3200-3300mhz, or ~2700mhz+ on the ram; a typical OC.
And again, probably stock/low-end models (like a Pulse) targeting stock ram clocks w/ 16GB, and staying within 225w.

Also, we knew about clearing up the confusion between naming of 9060XT (N44) versus the actual 9070 GRE (N48 LE) for a while, especially since W1z added the latter to GPU-Z. Videocardz confused everyone imo.
DGMW, I was making that mistake for a while myself given the naming was anything but a guarantee (especially not knowing if AMD would use the GRE title again); products have weird name splits sometimes.
I think they should just go back to XL for the ~3/4 design. Or even just use LE. Both of those make more sense and are easier to convey to people. I get those may sound 'low-end', but GRE is just odd.
Although it does 'sound' better/stronger, I suppose. Perhaps they should stay gold, them and the rabbit Radeon.

Given the responses to this article (already), some people are STILL confused. Don't be; N48 LE (9070 GRE) is about keeping 48fps where higher-end N48 is ~60fps in contemporary games. N44 (9060 XT) 30fps.
These are the GRE and '600' markets, respectively. They don't really change; we just get new ones as games that require (or rather can make use of) them arrive.
Posted on Reply
#4
Grebber01
Performance and price? 9060XT = 7800XT (+ FSR4 and better RT)??
Posted on Reply
#5
AlienIsGOD
Vanguard Beta Tester
Grebber01Performance and price? 9060XT = 7800XT (+ FSR4 and better RT)??
i would think closer to a 7700XT based on some other sites' speculations, e.g. Moore's Law Is Dead.
Posted on Reply
#6
rainzor
Performance probably closer to 7700xt, price no clue, maybe $380
Posted on Reply
#7
lexluthermiester
IF this is accurate, these seem like good options for AMD. I like them!
Posted on Reply
#8
yfn_ratchet
Given there's no in-between when it comes to 9070/9060 series, I would assume the 9060 XT to meet or beat the 5060Ti (in raster) for sure, probably wiggle around in 4070 Non-SUPER territory if we're being optimistic. That makes it a very attractive buy, especially with an extra bit of VRAM in its court, but the price will make or break it. Sub-$400 is this thing's home field, but knowing AMD the 9060 XT 16GB will be at exactly $400, get inflated even harder by the current GPU squeeze, and end up losing the leverage RDNA 4 has—value.
Posted on Reply
#9
alwayssts
Grebber01Performance and price? 9060XT = 7800XT (+ FSR4 and better RT)??
That would be the assumption. I think it should be $300 (16GB), which means they'll probably be about tree-fiddy. :p You should see Navi 4 (msrp) as either a ~40% perf upgrade or a 40% price cut imho.
Because higher clock, and sometimes less ram because balancing to the 1440p/1080p market. Hence 9070xt MSRP is $600 and not $1000. It's like a real gen upgrade; 'member those?!
No way they'll go above 5060ti 8GB ($379?) IMO, even with 16GB, because that's just asking for low sales (because again, some people are very stupid loyal to nVIDIA).

Most don't actually think about PPC increase of the ROPs/L2/RT etc from Navi 4. On paper, you would think 7700xt, in reality probably closer to 7800xt.
The PPC covers the unit difference from ~7700xt/4070, then the clock difference takes it to ~7800xt. Pretty close to that. People forget the 7800xt was clocked at 2430mhz! 3230mhz is 1/3 higher! OC that ram...
You wouldn't think 9070xt>7900xtx sometimes either, yet that is the reality, because where 7900xtx was tuned for ~1080p (nvidiaesque RT) and 4k raster, 9070xt is 1440p for both...so sometimes that equals >40%.

I would just look at games and think: What performance target on this graph makes sense for a card at this level? The answer is *usually* 1080p48-60 (edited for market clarity); same market as 7700xt I suppose.
But actually doing at 1080p what N48 does at 1440p across different use-cases (and peoples' varying system configs) slightly better than 7700xt. If N48 just 'good-enough' for 1440p, this for 1080p.

Again, *thereabouts*. Which is why I always think of cards in terms of not necessarily stock, but realistically tuned to hit a certain performance target. Just like I would tune a 9070/xt for 1440p48/60fps mins.
And this probably at least 1080p48, with overclocks bringing it closer to 60 (in many newer titles).

I think 9060xt will surprise a lot of people, especially those that already are making graphs about how they need the GRE to compete. Maybe in avg RT, but not in actual general acceptable performance (per rez).
These should, for people that are not idiots, compete with the 5060Ti extremely well. GRE probably the 5070 very well. And now you know how things line up; and why that 9070 bios exists (to compete w/ 5070ti).

They'll likely undercut nVIDIA for similar perf, again, which will surprise me absolutely zero, again, but some won't get it (because they...test differently), but those people can (rec to) overspend and/or upgrade faster.

Such is life. And I will continue to recommend you do not take their advice, nor become one of them. :p
Posted on Reply
#10
TheinsanegamerN
If the base models are more conservatively clocked, could we finally get a 16GB LP card to replace the LP 4060? We proved we can handle ~125 watt cards in such a form factor.
alwaysstsWe knew clock potential already by the die size...
I think your power use is way off; the RX 9070 factory OC models are sitting at 232 W maximum. I'd put the 9060xt at something closer to 140-150 watts.
Posted on Reply
#11
Nin
Industry watchdogs believe that Team Red's lower end RDNA 4 are specced to compete closely with Team Green's GeForce RTX 5060 Ti lineup.
That is some pretty impressive sleuthing by those industry watchdogs.
Posted on Reply
#12
Jism
RejZoRRX 9060 XT, if it's really made of faulty RX 9070 chips...
Any videocard you see out there starts with the best slice of the wafer (high end) down towards the lower end. The lower end are cut-down chips that did not meet quality guidelines. It is a waste to throw half of a wafer away because they don't match up to the best ones.

So you're complaining a bit while all fabs have been doing the same for years really; it's only a good thing. Less e-waste and such. More for us.

3.3 GHz is nothing spectacular. Some 9070XT models have reached that. I'd say that card is going to be a perfect all-round 1440p card.
Posted on Reply
#13
Tomorrow
RejZoRRX 9060 XT, if it's really made of faulty RX 9070 chips...
It will not. 9070 series uses Navi 48.
9060 series will use dedicated Navi 44.
Posted on Reply
#14
dartuil
i HOPE 7700xt minimum performance
Posted on Reply
#15
Quicks
I think this is going to fail hard with that 128-bit bus. Should have made it at least 192-bit. At least NVIDIA has GDDR7, but even at that, a 128-bit bus is a waste.
Posted on Reply
#16
john_
There is only one reason to boost a chip so high and that is to close the gap with the competition. I guess the 9600XT will be more cut down than what it should, to be able to compete with the 5060s, so AMD is pushing it as much as possible.
Posted on Reply
#17
LabRat 891
john_There is only one reason to boost a chip so high and that is to close the gap with the competition. I guess the 9600XT will be more cut down than what it should, to be able to compete with the 5060s, so AMD is pushing it as much as possible.
:laugh:

OG ATI Radeon 9600Pro and XT were quite cut down from the 9700Pro and 9800Pro/XT. -but, clocked higher.



Notice, the 9600 XT is clocked higher than even the 9800XT.

It's really not uncommon to have a 'thinner' faster-clocked lower SKU.
Posted on Reply
#18
john_
LabRat 891:laugh:

OG ATI Radeon 9600Pro and XT were quite cut down from the 9700Pro and 9800Pro/XT. -but, clocked higher.



Notice, the 9600 XT is clocked higher than even the 9800XT.

It's really not uncommon to have a 'thinner' faster-clocked lower SKU.
I have a 9600 Pro but unfortunately it throws artifacts on screen. I've been thinking for some time now about cutting a few potatoes and throwing it in the oven, hoping to fix it.
Posted on Reply
#19
Vayra86
Hey look, another 4060 Ti 16 GB that you can completely skip.
320 GB/s... what a fucking joke
Posted on Reply
#20
SilentPeace
QuicksI think this is going to fail hard with that 128bit bus. Should have made it atleast 192bit. Atleast Nvidia has GDDR7 but even at that 128bit bus is a waste.
Sometimes I think people would be happier if the bus was 192 or 256, even if the performance was exactly the same. It would just make them happier that the bus number is bigger.
Posted on Reply
#21
Vayra86
SilentPeaceSometimes I think people would be happier if the bus was 192 or 256, even if the performance was exactly the same. It would just make them happier that the bus number is bigger.
No, it's always a 'bus width' combined with the specified GDDR memory, which results in a speed + a width = bandwidth.

The GDDR speeds are generally the same within a generation. The bus width turns them into something you can actually use.
Posted on Reply
#22
SilentPeace
Vayra86Hey look, another 4060 Ti 16 GB that you can completely skip.
320 GB/s... what a fucking joke
Do you know how much the memory holds back cards in this class? I know that if I overclock the memory on my 4060 Ti by 11% (from 288 to 320 GB/s), it hardly improves performance at all.
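For the curious, the arithmetic behind those figures is just the standard bandwidth formula: bus width (in bytes) times per-pin data rate. A quick sketch, assuming the 4060 Ti's stock 18 Gbps GDDR6 and the described overclock to 20 Gbps:

```python
# Effective memory bandwidth in GB/s = (bus width / 8 bits per byte) * per-pin Gbps.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

stock = bandwidth_gbs(128, 18.0)  # RTX 4060 Ti stock: 128-bit GDDR6 at 18 Gbps
oc = bandwidth_gbs(128, 20.0)     # the memory overclock described above
print(stock, oc, f"+{oc / stock - 1:.1%}")  # 288.0 320.0 +11.1%
```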
Posted on Reply
#23
Vayra86
SilentPeaceDo you know how much the memory holds back cards in this class? I know if I overclock the memory on my 4060 Ti by 11% (from 288 to 320) it improves performance hardly at all.
Depends on what you're using it for, as usual. The fact remains: this kind of bandwidth was normal 10 years ago in a midrange card, but the x60 tier got stuck on it indefinitely.
Posted on Reply
#24
SilentPeace
Vayra86No, its always a 'bus width' combined with the specified GDDR memory...
Understand what I am asking:

Two cards equal in spec except one is 192-bit, GDDR6 and 448GB/s, the other is 128-bit, GDDR7 and 448GB/s. Which performs better?

They both perform identically. So why the obsession with bus width? It's boring.
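The premise checks out with the same formula: bandwidth depends only on the product of bus width and per-pin data rate, so the split between the two factors is irrelevant once the product is fixed. A quick sketch (the 28 Gbps GDDR7 figure here is an illustrative assumption, not a leaked spec):

```python
# Bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# A narrower bus with faster memory can match a wider bus with slower memory exactly.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

narrow_fast = bandwidth_gbs(128, 28.0)  # hypothetical 128-bit GDDR7 at 28 Gbps
required_rate = 448 * 8 / 192           # per-pin Gbps a 192-bit card needs to hit 448 GB/s
print(narrow_fast)                      # 448.0
print(round(required_rate, 2))          # 18.67
```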
Posted on Reply
#25
Vayra86
SilentPeaceUnderstand what I am asking: Two cards equal in spec except one is 192-bit, GDDR6 and 448GB/s, the other is 128-bit, GDDR7 and 448GB/s...
Agreed, in that context it's irrelevant.
Posted on Reply