
AMD's Upcoming UDNA / RDNA 5 GPU Could Feature 96 CUs and 384-bit Memory Bus

AleksandarK — News Editor, Staff member
According to one of the most reliable AMD leakers, Kepler_L2, AMD's upcoming UDNA (or RDNA 5) GPU generation will reintroduce higher-end GPU configurations with up to 96 Compute Units (CUs) in the top-end Navi 5X SKU, paired with a 384-bit memory bus. We still don't know what type of memory AMD will ultimately use, but an early assumption is that GDDR7 is on the table. In the middle of the stack, AMD plans a GPU with 64 CUs and a 256-bit memory bus, along with various CU and memory configurations around it. For the entry-level models, AMD could deliver a 32 CU configuration paired with a 128-bit memory bus. So far, memory capacities are unknown and may change as AMD finalizes its GPU lineup.
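As a rough sketch of what those bus widths could imply for capacity (my own arithmetic, not part of the leak): assuming 32-bit-wide GDDR7 devices at the 2 GB and 3 GB densities shipping today, the capacity options fall out directly from the module count.

```python
# Possible VRAM capacities implied by the rumored bus widths.
# Assumption: 32-bit-wide GDDR7 devices at 2 GB or 3 GB per module
# (current shipping densities); actual configs are unconfirmed.

def vram_options(bus_width_bits, densities_gb=(2, 3), device_width_bits=32):
    """Map each module density to the total capacity a given bus allows."""
    modules = bus_width_bits // device_width_bits
    return {d: modules * d for d in densities_gb}

for bus in (384, 256, 128):
    opts = vram_options(bus)
    line = ", ".join(f"{v} GB ({k} GB modules)" for k, v in opts.items())
    print(f"{bus}-bit bus -> {line}")
```

Under those assumptions a 384-bit card lands at 24 GB or 36 GB, 256-bit at 16 GB or 24 GB, and 128-bit at 8 GB or 12 GB; clamshell (two devices per channel) would double any of these.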

After the RDNA 4 generation, which left AMD without a top-end contender and fighting for mid-range market share, UDNA / RDNA 5 will be a welcome addition. We are looking forward to seeing what the UDNA design is capable of and what microarchitectural changes AMD has made. Mass production of these GPUs is expected in Q2 2026, so availability is anticipated in the second half of 2026. A clearer picture of the exact memory capacity and type will emerge as we approach launch.



View at TechPowerUp Main Site | Source
 
Unless it's some earth-shattering improvement, I'm afraid I'll probably still go with Nvidia. Pure raster performance is not that relevant anymore. Meanwhile, the power of the RTX/DLSS ecosystem and its market penetration are just impossible to ignore and worth the premium for me. But I guess it'll be a good option for people who have time for tweaking stuff and playing with tools such as OptiScaler.
 
All I need is a successor to the 9060 XT that actually meets/exceeds a 7800 XT. If UDNA can keep the costs down (at least in the low-midrange segment), I'm happy. The rest is just the rest.
 
128bit = 8GB VRAM for the "low end" costing $300?

Unless it's some earth-shattering improvement, I'm afraid I'll probably still go with Nvidia. Pure raster performance is not that relevant anymore. Meanwhile, the power of the RTX/DLSS ecosystem and its market penetration are just impossible to ignore and worth the premium for me. But I guess it'll be a good option for people who have time for tweaking stuff and playing with tools such as OptiScaler.
AMD FSR 4 Now Available in Any DX11/12 Title via OptiScaler | TechPowerUp
People who usually care about DLSS are those who CAN play with such tools.
 
128bit = 8GB VRAM for the "low end" costing $300?
Ideally the 8GB variant would be a cut down version of the full-size 32 CU die, thinking something like 24CU/8GB in the 75-100W range. That much could shave it down to a $230-250 card. But I'm not the one approving product designs/price stacks, so we'd have to wait and see what AMD has planned in that regard.
 
All I need is a successor to the 9060 XT that actually meets/exceeds a 7800 XT. If UDNA can keep the costs down (at least in the low-midrange segment), I'm happy. The rest is just the rest.
AMD needs to replace the RX 6600 with the RX 7600 8GB at <$200 and drop the RX 9060 XT 8GB to $249. The RX 9060 XT is even faster than the RTX 5060, so someone would say AMD needs to do nothing, but the average consumer will buy the Nvidia sticker, especially if it is cheaper. We have seen it before with the RX 6600 vs the RTX 3050.
 
cool, can we get something more than midrange please?
150-200 CUs please.
 
Ideally the 8GB variant would be a cut down version of the full-size 32 CU die, thinking something like 24CU/8GB in the 75-100W range. That much could shave it down to a $230-250 card. But I'm not the one approving product designs/price stacks, so we'd have to wait and see what AMD has planned in that regard.
I wouldn't expect a new product at $230, no matter how cut down it is. AMD will not waste TSMC N3E wafers on a low-end product. But we could hope for a cheaper RX 7600 8GB at $200, because I suppose TSMC's 6 nm is cheaper today than it was when the 7000 series was released.
 
Oh goody, my next upgrade... plenty of time to save up! :roll:
But for the next year at least, I'm happy with my 7900XTX factory OC card.
 
AMD FSR 4 Now Available in Any DX11/12 Title via OptiScaler | TechPowerUp
People who usually care about DLSS are those who CAN play with such tools.
It's not whether you CAN, it's whether you have TIME. I can barely scrape a few hours per week to play games themselves, then there is adding necessary mods/reshades. Deep diving into things like Optiscaler leaves very little time for playing itself (and that's aside from the fact it's not always perfect).
------------------------------
I'm also very glad some people are amused by my post, expected no less. I guess to me it's equally amusing that a lot of folks still operate on the "fake frames" & "upscaling bad" argument level in 2025 ;)
 
It's not whether you CAN, it's whether you have TIME. I can barely scrape a few hours per week to play games themselves, then there is adding necessary mods/reshades. Deep diving into things like Optiscaler leaves very little time for playing itself (and that's aside from the fact it's not always perfect).
You make it look like someone needs a couple of hours to enable FSR 4 with optiscaler in a game.
 
Unless it's some earth-shattering improvement, I'm afraid I'll probably still go with Nvidia. Pure raster performance is not that relevant anymore. Meanwhile, the power of the RTX/DLSS ecosystem and its market penetration are just impossible to ignore and worth the premium for me. But I guess it'll be a good option for people who have time for tweaking stuff and playing with tools such as OptiScaler.
No and no. They only need to bring down the costs and deliver better prices for mid-range products, and that's a win. RASTER is still king. Nobody cares about fake frames, distorted frames.
 
Personally I am still hoping for a 9080xt/9070xtx with 24gb vram.
 
cool, can we get something more than midrange please?
150-200 CUs please.
I was thinking that you were quite optimistic about the needed CU count. And then I remembered that the 5090 has 170 SMs :)
Still, that is a halo product, above high end. Calling anything below it mid-range isn't quite right.
 
I was thinking that you were quite optimistic about the needed CU count. And then I remembered that the 5090 has 170 SMs :)
Still, that is a halo product, above high end. Calling anything below it mid-range isn't quite right.
AMD doesn't need 150-200 CUs. The 9070 XT performs more or less like a 7900 XTX based on TPU's ranking (the 9070 XT is 6% slower). So a part with 96 CUs will probably land between the 5080 and the 4090. With 128 CUs it would be at 5090 level of performance.
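The estimate above is essentially linear extrapolation from the 9070 XT's 64 CUs. As a back-of-the-envelope sketch (the baseline normalization is an illustrative assumption, and real GPUs scale sub-linearly with CU count due to bandwidth, clock, and power limits, so treat these as optimistic ceilings, not predictions):

```python
# Naive linear scaling from the RX 9070 XT's 64 CUs.
# Real scaling is sub-linear, so these are upper bounds, not forecasts.

BASE_CUS = 64    # RX 9070 XT
BASE_PERF = 1.0  # normalize the 9070 XT to 1.0

def linear_scale(cus, base_cus=BASE_CUS, base_perf=BASE_PERF):
    """Ideal (perfectly linear) performance relative to the baseline."""
    return base_perf * cus / base_cus

for cus in (96, 128):
    print(f"{cus} CUs -> up to ~{linear_scale(cus):.2f}x a 9070 XT (ideal scaling)")
```

That puts 96 CUs at up to ~1.5x and 128 CUs at up to ~2.0x a 9070 XT in the ideal case, before any per-CU (IPC) or clock changes in the new architecture.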
 
It's not whether you CAN, it's whether you have TIME. I can barely scrape a few hours per week to play games themselves, then there is adding necessary mods/reshades. Deep diving into things like Optiscaler leaves very little time for playing itself (and that's aside from the fact it's not always perfect).
------------------------------
I'm also very glad some people are amused by my post, expected no less. I guess to me it's equally amusing that a lot of folks still operate on the "fake frames" & "upscaling bad" argument level in 2025 ;)
It's the same crap in the Nvidia camp: you have to twiddle around to get DLSS 4 since older games don't have it. I can't be bothered doing that.

The main issue with FSR 4 is that, because of how few cards actually support it, it's not going to get traction. Now imagine UDNA launching with yet another new version of FSR that is exclusive to the new GPUs. No thanks, I'd stick with Nvidia.
 
I was thinking that you are quite optimistic about the needed CU count. And then I remembered that 5090 has 170SM :)
Still, that is halo product, above high end and stuff. Calling anything below it mid-range isn't quite right.
UDNA probably comes out in 2027, and even right now 96 CUs would be only slightly more than 50% of the top current consumer GPU's SM/CU count. That's pretty much exactly the middle. But then again, the TPU database calls the 5060 Ti a "High End" GPU... :roll:
In a non-clown world, where people don't justify getting ripped off and defend stagnation (not meaning you... just in general), we would have had a 5060 Ti with the specs of a 5080 for 450 bucks a long time ago. Instead, we get the "too good to throw away" AI accelerator binning rejects.

And UDNA is the same strategy as Nvidia's: make a single architecture specifically for AI accelerators and sell the trash to gamers for exorbitant amounts of money, because in their eyes the full die is no longer a $1,000 product but a $10,000 one.
 
Instead, we get the "too good to throw away" AI accelerator binning rejects.

And UDNA is the same strategy as Nvidia's: make a single architecture specifically for AI accelerators and sell the trash to gamers for exorbitant amounts of money, because in their eyes the full die is no longer a $1,000 product but a $10,000 one.
It's not that simple: the underlying architecture might be the same, but the silicon configuration is different. The 5090's silicon isn't used in the DGX racks built for HPC.
Here's the top-of-the-line datacenter Blackwell vs the top-of-the-line desktop Blackwell. Notice the massive difference in ROP count, FP64 units, INT32 units and FP16 units relative to the number of shading units. GB202 is not a cut-down GB200. Blackwell for the desktop is optimized for gaming.

It's not even like "gaming only" RDNA had an advantage in performance or efficiency over Nvidia's "desktop and HPC" architecture.
[Attached: datacenter vs desktop Blackwell spec comparison screenshots]
 
Hmm, so a newer gen of my 7900 XTX will possibly be available, very interesting. I think if it turns out to be a significant upgrade I might have to upgrade!

But then again if it's not a chiplet design I might not, I like the chiplets purely because it's a cool idea :) Still, make it happen AMD! Give us a monstrous card to buy and show off again!
 
Give us a monstrous card to buy and show off again!

The problem is that to show off, you actually need to be in the lead. With those specs, unless there's a historically unprecedented IPC gain per WGP, that GPU isn't gonna be a threat to an RTX 5090. It'd be impressive if it released today, but by late 2026/early 2027? That's another story entirely.
 
You make it look like someone needs a couple of hours to enable FSR 4 with optiscaler in a game.
One will need only a little bit of time to make it work, where it can.

But in most games(*) it won't work that way. Ever.
In the best case, some anti-cheat will show you the middle finger.
In the worst, your account will be banned. Forever.

(*) All online competitive multiplayer games, all gacha games and, for good measure, some single-player games.

It's the same crap in the Nvidia camp: you have to twiddle around to get DLSS 4 since older games don't have it. I can't be bothered doing that.
DLSS Swapper in most cases does not trip anti-cheat.
OptiScaler does.
 
What I would like to see is AMD pricing their cards reasonably like they used to, putting 256-bit cards back into the midrange with midrange pricing (not these €600-800 overpriced monsters). Even with inflation and TSMC price hikes, I'd like to see 256-bit cards at around $/€400-500 maximum.

And core count increases, of course. The RX 470 came in 2016 with 2048 cores and 8 GB VRAM; fast forward to 2025 and we have the 9060 XT with 2048 cores and 8 GB for €300-320, and 16 GB for €360-400. It's like AMD has been pulling a core-stagnation strategy for a decade now, like Intel used to do on the CPU front with their 4-core i5/i7s.

As for upscaling, the best-case scenario would be that a card doesn't need it at all (but seeing how stingy hardware makers are these days, I don't see that happening). Failing that, Microsoft should get their sh*t together, release DirectSR already and make its use mandatory, so game devs and NV/AMD/Intel can't pull crap on us, and the latest and greatest upscalers are available in all games from all vendors and everyone can select the best one for their current GPU brand.

 
You make it look like someone needs a couple of hours to enable FSR 4 with optiscaler in a game.
If you think that Optiscaler is some magic wand which allows you to instantly "enable" things, without learning anything, checking the wiki, compatibility, etc, then you clearly have never used it.
It's the same crap in the Nvidia camp: you have to twiddle around to get DLSS 4 since older games don't have it. I can't be bothered doing that.
To some extent this is true, but on balance I've so far had a much easier ride with my 5070 Ti than when I had a 9070 XT earlier this year (especially using DLSS Swapper).
The main issue with FSR 4 is that, because of how few cards actually support it, it's not going to get traction. Now imagine UDNA launching with yet another new version of FSR that is exclusive to the new GPUs. No thanks, I'd stick with Nvidia.
Exactly, adoption is the main issue.
RASTER is still king. Nobody cares about fake frames, distorted frames.
Folks like you are living memes :) Nobody cares? What was the AMD GPU market share again?
 