
AMD RDNA 5 a "Clean Sheet" Graphics Architecture, RDNA 4 Merely Corrects a Bug Over RDNA 3

Couldn't care less about RT; the thing that concerns me most is performance per watt with RDNA 4 at 3K (21:9 aspect). Yesterday I was playing Starfield and started a new quest from the 10 June patch update, and just talking to an NPC my PC box was drinking over 450 W of power (Ultra settings, except shadows on High) for this mundane non-action scene! The game is still poorly optimised for the PC platform, imo. Could be the way my RDNA 2 card behaves when OC'd, but I have the power tab disabled in the performance options. All this with the latest Adrenalin drivers (24.5.1), so there's that.
Ah yeah, I had a 6800XT previously and undervolted it down to around 200W to keep my gaming room a little cooler.
If you've OC'd yours I presume you've put the power limit slider all the way to the right and likely have the GPU using ~300W?
 
The DGPU roadmap has been leaked:

[Image: leaked dGPU roadmap]


RDNA 5 (GFX13), 3 nm, GDDR7, very late 2026 or early 2027. AMD abandons the ultra-high-end segment (where the RX 6950 XT and RX 7900 XTX sit) and the entry level (where the RX 6400 sits).

For me, it is a really bad roadmap.
 
The DGPU roadmap has been leaked:

[Image: leaked dGPU roadmap]

RDNA 5 (GFX13), 3 nm, GDDR7, very late 2026 or early 2027. AMD abandons the ultra-high-end segment (where the RX 6950 XT and RX 7900 XTX sit) and the entry level (where the RX 6400 sits).

For me, it is a really bad roadmap.
Rather, there simply aren't enough details yet, which is why the slots from the end of 2026 onward look like this.
 
Rather, there simply aren't enough details yet, which is why the slots from the end of 2026 onward look like this.
Considering that no chip is named (Navi 5 is a product family, not a chip), I think that's the case. How much of the vertical segment "Navi 5" covers can't be read from this slide.
 
If the trend continues (Navi 2 = 4 chips, Navi 3 = 3 chips, Navi 4 = 2 chips, Navi 5 -> ... ), one can expect Navi 5 to be one single chiplet that will serve and cover all market tiers.
Remember Raja Koduri's original promise: there was one thing called "scalability".

Navi 58 - 4 chiplets.
Navi 56 - 3 chiplets.
Navi 54 - 2 chiplets.
Navi 53 - 1 chiplet.

 
Ah, I don't remember that as a promise. At best, shared intentions. But how does this continue to matter today, since the gentleman has long been working for another company?
 
If the trend continues (Navi 2 = 4 chips, Navi 3 = 3 chips, Navi 4 = 2 chips, Navi 5 -> ... )...
That's not a trend, just a coincidence. Remember, Navi 1 was also 2 chips. Sure, it would be good for AMD to achieve full scalability with chiplets for Navi 5, but whether that happens or not, we'll see.
 
That's not a trend, just a coincidence

Definitely not a coincidence. It can't be a coincidence; rather, it's a carefully executed plan to decrease project costs and, in the future, maybe exit the market segment altogether. We will see.
Depends on whether they will be able to create an extremely fast interconnect between the chiplets.
 
Definitely not a coincidence. It can't be a coincidence; rather, it's a carefully executed plan to decrease project costs and, in the future, maybe exit the market segment altogether. We will see.
Depends on whether they will be able to create an extremely fast interconnect between the chiplets.
Decreasing project costs sounds like a sensible plan, but it's not a necessary step to leave the business.
 
Decreasing project costs sounds like a sensible plan, but it's not a necessary step to leave the business.

Today, AMD has more than enough money to afford high-quality projects in the graphics department, which makes its apparent intention not to compete with the black-leather-jacketed man really bizarre.
It is actually becoming worse by the year. AMD is virtually absent in the OEM market, including laptops.
 
Today, AMD has more than enough money to afford high-quality projects in the graphics department, which makes its apparent intention not to compete with the black-leather-jacketed man really bizarre.
If you think that not competing on the high-end is the same thing as not competing at all because only x90 halo cards matter, then I can't entertain this conversation any further.
 
If you think that not competing on the high-end is the same thing as not competing at all because only x90 halo cards matter

That is not the point. If it turns out that the chiplet designs don't work at all, then all market segments will take the hit.
We'll see whether they invent the breakthrough technology needed to make GPU chiplets work without a very significant performance loss.
 
That is not the point. If it turns out that the chiplet designs don't work at all, then all market segments will take the hit.
We'll see whether they invent the breakthrough technology needed to make GPU chiplets work without a very significant performance loss.
So far, the only problems with the chiplet design are idle / low-load (video playback) power consumption, and the fact that the main components can't be split into chiplets without adding latency that a GPU architecture can't bear. Rumour says Navi 4 will be monolithic, but we'll see if they manage to get around the above with Navi 5.
 
So far, the only problems with the chiplet design are idle / low-load (video playback) power consumption, and the fact that the main components can't be split into chiplets without adding latency that a GPU architecture can't bear.

Idle / video playback high power consumption comes from engineering incompetence, not from the chiplets. AMD has had this for generations; I actually remember upgrading from a Radeon HD 4890 1GB to a Radeon HD 6870 1GB (gaining what, a measly 30% more performance) simply because the newer card had its idle memory clocks fixed. :banghead:
 
Idle / video playback high power consumption comes from engineering incompetence, not from the chiplets. AMD has had this for generations; I actually remember upgrading from a Radeon HD 4890 1GB to a Radeon HD 6870 1GB (gaining what, a measly 30% more performance) simply because the newer card had its idle memory clocks fixed. :banghead:
I'm not so sure, considering that the idle power consumption of their CPUs has been high ever since chiplets became a thing, and it hasn't been fixed.
 
I'm not so sure, considering that the idle power consumption of their CPUs has been high ever since chiplets became a thing, and it hasn't been fixed.

For some reason they want to maintain extremely high clocks. Where an Intel CPU can downclock to 900 MHz, or even lower to 450 MHz, AMD CPUs won't fall below 3.9 GHz.
That's a nasty design choice. Have you heard why they want this to happen?
 
For some reason they want to maintain extremely high clocks. Where an Intel CPU can downclock to 900 MHz, or even lower to 450 MHz, AMD CPUs won't fall below 3.9 GHz.
That's a nasty design choice. Have you heard why they want this to happen?
CPU core clocks aren't the problem. Even at high clocks, they eat peanuts. The IO die is the problem.
 
CPU core clocks aren't the problem. Even at high clocks, they eat peanuts. The IO die is the problem.

This means they have a problem with the north bridge, which is essentially what the IO die is. They have to learn how to design one properly.
 
Most likely I will keep my 7900 XT until RDNA 5. I have plenty of backlog and don't give a damn about ray tracing, PhysX, or any of the nonsense that has come and gone before.
I'm waiting for at least 2.5-3 times the raster performance and at least 5 years before I change my 7900 XT, as I did with my Vega 64 to 7900 XT upgrade (mid '23).
 
I'm waiting for at least 2.5-3 times the raster performance and at least 5 years before I change my 7900 XT, as I did with my Vega 64 to 7900 XT upgrade (mid '23).
The way it should be. :)

I don't understand people complaining about not having massive uplifts across each generation. Are they really so desperate to spend money? :D
 
If RDNA 4 is the last RDNA, then it's better to wait for the next generation now.

RX 7900 GRE performance in the form of an RDNA 4 RX 9070 XT for $550 is DOA.
 
If RDNA 4 is the last RDNA, then it's better to wait for the next generation now.

RX 7900 GRE performance in the form of an RDNA 4 RX 9070 XT for $550 is DOA.
To you perhaps. To people coming from older/weaker GPUs, not necessarily.
 
Can't you hold on with an RX 7600? Maybe later this or next year they will release UDNA 1 cards.
Sure you can. It all depends on one's circumstances - what games you play, at what settings/resolutions, how much money you have, what GPU you have right now, etc.

Let's say you're just about to retire your RX 480. Would you get a 7900 GRE (which is already out of stock, so you'd have to settle for a 7800 XT), or a 9070 XT? :)
 
Would you get a 7900 GRE (which is already out of stock, so you'd have to settle for a 7800 XT), or a 9070 XT? :)
Second hand Navi 21 (if VRAM is a concern) / GA102 (in any other case).
 