Wednesday, April 24th 2024

AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

The latest round of leaks suggests that AMD's upcoming RDNA 4 graphics cards, expected to arrive as the "RX 8000 series," might continue to rely on GDDR6 memory modules. According to Kepler on X, AMD's next-generation GPUs are expected to feature 18 Gbps GDDR6 memory, which would make RDNA 4 the fourth consecutive RDNA architecture to employ this memory standard. While GDDR6 does not offer the same bandwidth capabilities as the newer GDDR7 standard, this does not necessarily imply that RDNA 4 GPUs will be slow performers. AMD's choice to stick with GDDR6 is likely driven by factors such as meeting specific memory bandwidth requirements and cost optimization for PCB designs. However, if the rumor of 18 Gbps GDDR6 memory proves accurate, it would represent a slight step back from the 18-20 Gbps GDDR6 memory used in AMD's current RDNA 3 offerings, such as the RX 7900 XT and RX 7900 XTX.

AMD's first-generation RDNA used GDDR6 at 12-14 Gbps, RDNA 2 came with GDDR6 at 14-18 Gbps, and the current RDNA 3 uses 18-20 Gbps GDDR6. Without a jump to a newer memory generation, speeds would stay at 18 Gbps. However, it is crucial to remember that leaks should be treated with skepticism, as AMD's final memory choices for RDNA 4 could change before the official launch. The decision to use GDDR6 versus GDDR7 could have significant implications in the upcoming battle between AMD's, NVIDIA's, and Intel's next-generation GPU architectures. If AMD indeed opts for GDDR6 while NVIDIA pivots to GDDR7 for its "Blackwell" GPUs, it could create a disparity in memory bandwidth between the competing products. All three major GPU manufacturers (AMD, NVIDIA, and Intel with its "Battlemage" architecture) are expected to unveil their next-generation offerings in the fall of this year. As we approach these highly anticipated releases, more concrete details on specifications and performance will emerge, providing a clearer picture of the competitive landscape.
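For a back-of-the-envelope check of these figures, theoretical memory bandwidth is simply the per-pin data rate multiplied by the bus width, divided by eight to convert bits to bytes. The short sketch below runs that arithmetic for a few representative configurations; the 256-bit bus listed for the rumored RDNA 4 part is an assumption drawn from the leaks discussed here, not a confirmed specification.

```python
# Theoretical GDDR bandwidth: data rate (Gbps per pin) x bus width (bits) / 8 bits-per-byte.
# The RDNA 4 entry assumes the rumored 18 Gbps / 256-bit configuration; nothing here is confirmed by AMD.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return theoretical peak bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

configs = {
    "RX 5700 XT (RDNA 1, 14 Gbps, 256-bit)": (14, 256),
    "RX 6800 XT (RDNA 2, 16 Gbps, 256-bit)": (16, 256),
    "RX 7900 XTX (RDNA 3, 20 Gbps, 384-bit)": (20, 384),
    "Rumored RDNA 4 (18 Gbps, 256-bit, assumed)": (18, 256),
}

for name, (rate, width) in configs.items():
    print(f"{name}: {bandwidth_gb_s(rate, width):.0f} GB/s")
```

Under those assumptions the rumored part would land at roughly 576 GB/s, which is the ballpark behind the "500ish GB/s" figures debated in the comments below.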
Sources: @Kepler_L2 (on X), via Tom's Hardware

114 Comments on AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

#51
Tomorrow
ARFRay-tracing.
As long as all our games are raster-RT hybrids this is less of an issue. Until we start getting pure RT games that run exclusively on RT hardware, I very much doubt RT becomes the deciding factor. Also, everyone knows Nvidia is faster in RT, but I would not call AMD's RTX 30 series-like RT performance that bad.
ARFCounter-Strike 2.
What's a GRE got to do with this?
ARFIt is a problem. Because the halo product sells all the other siblings.
With Nvidia's mindshare I very much doubt the RX 9900 XTX or whatever will be able to outsell or outperform the 5090, even if it costs $1,000 compared to Nvidia's $2,000.
People spending $1,000+ generally already go for the best. Nvidia has for years been selling their cards on mindshare and software more than hardware.
Posted on Reply
#52
ARF
AusWolfThe 8800 XT is positioned to be at roughly 7900 XT level by rumours. Where you get that it's a 7600 XT successor is beyond me.
Die size is rumoured to be around 200 mm^2. In the ballpark of RX 7600 XT.

RX 7900 XT level of performance will not be reached with 500ish GB/s memory bandwidth. Forget it.
TomorrowWhat's a GRE got to do with this?
Latest graphics card TPU review. www.techpowerup.com/review/?category=Graphics+Cards&manufacturer=&pp=25&order=date
Used because it shows current state of affairs.
Posted on Reply
#53
kapone32
TomorrowWith Nvidia's mindshare I very much doubt the RX 9900 XTX or whatever will be able to outsell or outperform the 5090, even if it costs $1,000 compared to Nvidia's $2,000.
People spending $1,000+ generally already go for the best. Nvidia has for years been selling their cards on mindshare and software more than hardware.
How does Nvidia mindshare work? When I built my first PC I used an AMD CPU and, remembering those Super Bowl commercials, got myself a GTS 450. Years later I had moved to AMD and was at the PC store. The tech told me I should check out Sapphire, and I looked at him like he had two heads. Thankfully, Nvidia did enough to me personally to make me not want to use them and to realize that Sapphire gets some of the best-binned chips for AMD. Now I enjoy gaming, and the only setting I usually change is turning off Vsync. Maybe I have been lucky, but I am enjoying my current PC more than ever.
Posted on Reply
#54
AusWolf
ARFDie size is rumoured to be around 200 mm^2. In the ballpark of RX 7600 XT.
What's die size got to do with it? Based on this comparison, the 4060 should sell for $150 and the 4080 for $500, since their chips are so tiny.
ARFRX 7900 XT level of performance will not be reached with 500ish GB/s memory bandwidth. Forget it.
I don't rate the importance of memory bandwidth that highly, but we'll see.
Posted on Reply
#55
Makaveli
ARFDie size is rumoured to be around 200 mm^2. In the ballpark of RX 7600 XT.

RX 7900 XT level of performance will not be reached with 500ish GB/s memory bandwidth. Forget it.
So are you expecting something in between 7800XT - 7900GRE performance since it will have a 256-bit memory bus?
Posted on Reply
#56
ARF
AusWolfWhat's die size got to do with it? Based on this comparison, the 4060 should sell for $150 and the 4080 for $500, since their chips are so tiny.
Yes, but nvidia is a monopoly.
MakaveliSo are you expecting something in between 7800XT - 7900GRE performance since it will have a 256-bit memory bus?
Not only because of the memory bus itself. But we shall see. Let's wait. Honestly though, it's better to expect a failure than a success, so in the end you're pleasantly surprised rather than terribly disappointed.
Posted on Reply
#57
kapone32
Do we have any specs for these cards?
Posted on Reply
#58
Tomorrow
ARFRX 7900 XT level of performance will not be reached with 500ish GB/s memory bandwidth. Forget it.
With GRE? No of course not. But with a new generation product - yes it's very possible.
kapone32How does Nvidia mindshare work? When I built my first PC I used an AMD CPU and, remembering those Super Bowl commercials, got myself a GTS 450. Years later I had moved to AMD and was at the PC store. The tech told me I should check out Sapphire, and I looked at him like he had two heads. Thankfully, Nvidia did enough to me personally to make me not want to use them and to realize that Sapphire gets some of the best-binned chips for AMD. Now I enjoy gaming, and the only setting I usually change is turning off Vsync. Maybe I have been lucky, but I am enjoying my current PC more than ever.
You're asking the wrong person. I've owned roughly the same number of Nvidia and AMD (ATI back then) GPUs.

My worst experience was with Nvidia during their bumpgate scandal, when my 8800 GTS 320 kept dying and had to be revived in an oven, albeit temporarily. It was also a second-hand EVGA product, so I had no warranty either. Currently I'm on a 2080 Ti that I managed to buy for a reasonable price before the latest crypto boom sent prices to the sky. I also made more than 1k on it by mining on the side. If I had to buy a new card now it would likely be AMD, as my modded games require more VRAM and I despise the new power connector Nvidia mandates even for the 4070 Super, a <250 W card that could easily be powered by a single 8-pin.

My fear when buying Nvidia is the next feature they lock me out of when they release their next series. I've already been locked out of ReBAR, which the 30 series introduced, and DLSS FG, which the 40 series introduced. I'm sure the 50 series will further widen the gap.
Posted on Reply
#59
kapone32
TomorrowWith GRE? No of course not. But with a new generation product - yes it's very possible.

You're asking the wrong person. I've owned roughly the same number of Nvidia and AMD (ATI back then) GPUs.

My worst experience was with Nvidia during their bumpgate scandal, when my 8800 GTS 320 kept dying and had to be revived in an oven, albeit temporarily. It was also a second-hand EVGA product, so I had no warranty either. Currently I'm on a 2080 Ti that I managed to buy for a reasonable price before the latest crypto boom sent prices to the sky. I also made more than 1k on it by mining on the side. If I had to buy a new card now it would likely be AMD, as my modded games require more VRAM and I despise the new power connector Nvidia mandates even for the 4070 Super, a <250 W card that could easily be powered by a single 8-pin.

My fear when buying Nvidia is the next feature they lock me out of when they release their next series. I've already been locked out of ReBAR, which the 30 series introduced, and DLSS FG, which the 40 series introduced. I'm sure the 50 series will further widen the gap.
It was rhetorical.

That is exactly what happened to me, but the kicker was that they did not even inform me when they disabled SLI on the GTS 450. Imagine how stupid I felt after I had sold them to a friend.
Posted on Reply
#60
arbiter
kapone32The thing is even today AMD does not have even half the money for R&D for both sectors they are in. I feel people are looking at the glass-half-empty argument without looking at the positives.

1. For the life of the PS5/PS5 Pro and Xbox 1, games will be created on the Ryzen/Radeon platform. As they age, programming on those will improve, and that will mean an advantage for console ports on AMD PCs. It is already happening.

2. AMD are making crazy money on their APUs. The Steam Deck is consistently in the top 10 in global sales on the Steam platform. The release of the MSI Claw (even if it is for future-proofing) is evidence of how far AMD has come in the APU space. This will also mean more programming for Ryzen/Radeon as games start to get ported (likely from mobile) onto these platforms. In fact, I am confident that someday soon on Amazon you will be able to buy a Ryzen-based handheld with those retro ROMs like PS/PS2/Dreamcast and arcade. I have already built one with my 5600G (desktop).
So it depends on Intel, on how fast and whether they integrate their Arc GPU tech into the CPU package. Those two segments may not be money-makers for AMD for much longer. Intel has only been serious about GPUs for what, one year? Maybe two at most, and they already have GPUs that made waves price-wise as an option, with major performance strides with each driver update. They could be a real threat to that market in a year or two if they put the effort into it.
Posted on Reply
#61
kapone32
arbiterSo it depends on Intel, on how fast and whether they integrate their Arc GPU tech into the CPU package. Those two segments may not be money-makers for AMD for much longer. Intel has only been serious about GPUs for what, one year? Maybe two at most, and they already have GPUs that made waves price-wise as an option, with major performance strides with each driver update. They could be a real threat to that market in a year or two if they put the effort into it.
Of course nothing lasts forever, but Intel are still at least two generations away in the iGPU space. AMD will be releasing new APUs as well, and Intel have to solve the power draw/performance problem too. I am not saying they can't, just not yet. As an additional thought, there is nothing preventing Sony or MS from jumping into this space with portable PS and Xbox/Game Pass handhelds. As it stands right now, the MSI Claw is the only handheld that does not come with AMD.

When we start to get laptops with just these APUs in them, I expect they will sell well too. Acer has one that they announced for $599 with an 8700G laptop chip.
Posted on Reply
#62
Panther_Seraphin
AMD will have a lead in the iGPU space for a while, I think, given their console know-how and the success of things like Strix Point.

Intel, I would hope, would focus on desktop/datacenter for Celestial, and then in the Druid/E-series parts we would see that funnel down into iGPU power/efficiency.
Look at how Alchemist has performed and developed: I am near 100% sure a massive accidental bottleneck was put into the hardware, and I would guess it is in the scheduler/load-store functions, as moving from 1080p to 1440p in most games on Arc costs only single percentage points of performance, RT on or off. Yet on nearly all other manufacturers' cards you can see a respectable drop in performance, or should I say a respectable gain, when dropping the resolution.

Get that fixed for Celestial/Druid and then they have a real contender in the iGPU space.


As for RDNA 4 and the now-cancelled top-end offering, I suspect they either went too far on the chiplet design and realised it would need a full rework (RDNA 5 / successor arch?), or they had intended HBM3/e for the top-end parts, similar to the MI300, but the AI craze has simply priced them out of that market again.
Posted on Reply
#63
Random_User
TomorrowWell, if you call losing only to the 4090 outright (in both raster and RT) struggling, then I'd say AMD is not doing so badly at 4080 performance.
Not to mention that the only 4090 users are either prosumers like content creators, broke 3D design folks who are unable to buy themselves Pro-line GPUs, the AI freaks who want to enter that nonsense bubble at all costs, or the "1337" "gamur" crowd with more money than intelligence, there just to show off their soapy upscaled 4K DLSS screenshots somewhere on forums and social networks. Most people do not even look at that segment, enjoying their "modest" gaming with a 4080/7900 XTX at most. Most people look toward the 7800 XT/4070 Super, and the sales show that.

So I don't see any trouble leaving that tiny segment to ultra-rich kids and selfish manchildren, considering that among all premium products, AMD makes more profit from enterprise anyway. There's no point for them to sell many premium GPU products if they can sell WS cards instead of top-tier "gamer" counterparts to the people who need them and will gladly pay that premium. And gamers can keep up with something akin to what is used in consoles (RX 6700) anyway.
But most of the gamer segment comes from low/mid-end GPUs anyway. There's no point investing in something that is basically a placeholder. And even if there is no successor in RDNA 5, AMD can live with just such low-cost cards until they feel they are able to release something top-end. They did it with the RX 580, Radeon VII (Vega II), and RX 5700, until they made the RX 6800 XT/RX 6900, which sold like hotcakes and were basically on par with their Nvidia rivals.
Thus, there's absolutely no point in putting expensive VRAM into such a temporary, low-cost product. It's more reasonable to save the newest GDDR7 for the breakthrough solutions that may or may not be RDNA 5.

As for Intel, I can't say they are absolutely hopeless, though it's not guaranteed that they will achieve great things with Battlemage. However, as much as I don't like Intel, I must admit they have already made significant progress in the GPU division. I dare say even bigger progress than AMD made in a decade, but within a couple of years. Of course, they have a miles-bigger R&D budget, but still. From what I've seen and read, Intel's encoders/decoders are miles better than AMD's, even on the lowest-end cards. Their ray tracing is also better. And this while Intel is in a huge decline and selling assets left and right. AMD, on the other hand, is blooming, but still reluctant to invest in consumer areas like Radeon, because they went all-in on enterprise, which doesn't rely as heavily on AMD's drivers and doesn't need the streaming/decoding capabilities anyway. So AMD can invest less while having more. At this point AMD seems even greedier than Nvidia. They're lacking in every area, but still have the hubris to ask a premium for absent features/options.
Posted on Reply
#64
gffermari
AMD GPUs' performance is not bad. Actually, it is great.
The package is bad.
Intel made FSR look like a joke in their first attempt.

I would sacrifice the performance crown for a better package overall.
It took years to change the mindset from Intel Core to AMD Ryzen, but it did happen.
That's why most of us have Ryzens now.
It may happen on the GPU side if nVidia continues asking $1,000+ for midrange cards.
Posted on Reply
#65
Makaveli
gffermariIt may happen on the GPU side if nVidia continues asking $1,000+ for midrange cards.
Not sure about this last part; Nvidia has such a huge mindshare and following that their users don't have a problem paying $2k for a 4090. I don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.

I totally agree that we need to get midrange back in the $500 range and high-end to $1,000 and under, but that cat seems to be out of the bag now and may never go back.

And with both companies shifting their focus more toward AI and putting more resources into it, we may continue to see a squeeze and prices going up on discrete GPUs.
Posted on Reply
#66
DavidC1
The difference is that Nvidia is actually a much more formidable competitor than Intel ever was, and still is.

You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as its CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.

Nvidia's engineering IS really good. They consistently push out reticle-sized dies. Sure, they make mistakes, but over a long period of time, nowhere near as badly as both AMD and Intel. Nvidia made a handful of mistakes while AMD and Intel stumbled like they were peg-legged. Remember, too, that Nvidia has one of the highest, if not the highest, employee satisfaction ratings. No wonder they are successful!

That's part of why AMD's GPU division is struggling, and CPU is not.

Battlemage should in theory be a lot better even if it places itself in the same relative position to its competitors as Alchemist. They can fix the idle/low-load power consumption issue, the ReBAR issue, and hardware quirks born of inexperience, such as abnormal resolution/detail scaling. ReBAR for one is a big thing, as it automatically rules out/discourages most older systems, which is counterintuitive considering how affordable Arc cards are. ReBAR doesn't just affect older systems: recently there was a bug where some systems had half-working ReBAR with the Vulkan API. So random-ass low performance in modern games might be down to missing ReBAR, which has a great impact on Arc (where it's negligible on competitors).

I know from tracking Intel GPUs for a long time that what was thought to be software/driver problems turned out to be hardware problems. No doubt such problems exist on Alchemist. In fact, even where driver bugs do exist, they might be easier to fix on Battlemage and its successors.
Posted on Reply
#67
rv8000
MakaveliNot sure about this last part; Nvidia has such a huge mindshare and following that their users don't have a problem paying $2k for a 4090. I don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.

I totally agree that we need to get midrange back in the $500 range and high-end to $1,000 and under, but that cat seems to be out of the bag now and may never go back.

And with both companies shifting their focus more toward AI and putting more resources into it, we may continue to see a squeeze and prices going up on discrete GPUs.
I highly doubt that, unless the caveat is all sales including prosumers, small businesses, and those on a budget when it comes to commercial use. In terms of consumers such as gamers or mixed use, the 4090 probably gets crushed in sales by the 4070/4070 Super.

Prices will never return to what used to be the relative norm; people just finance everything from what I've seen, and are probably drowning in debt if everyone and their mother is buying a 7900/4080/4090. I've said it before, but we're continuously moving towards GPUs of any kind being a luxury and gaming on PC being largely unaffordable for the average person.
Posted on Reply
#68
kapone32
DavidC1The difference is that Nvidia is actually a much more formidable competitor than Intel ever was, and still is.

You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as its CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.

Nvidia's engineering IS really good. They consistently push out reticle-sized dies. Sure, they make mistakes, but over a long period of time, nowhere near as badly as both AMD and Intel. Nvidia made a handful of mistakes while AMD and Intel stumbled like they were peg-legged. Remember, too, that Nvidia has one of the highest, if not the highest, employee satisfaction ratings. No wonder they are successful!

That's part of why AMD's GPU division is struggling, and CPU is not.

Battlemage should in theory be a lot better even if it places itself in the same relative position to its competitors as Alchemist. They can fix the idle/low-load power consumption issue, the ReBAR issue, and hardware quirks born of inexperience, such as abnormal resolution/detail scaling. ReBAR for one is a big thing, as it automatically rules out/discourages most older systems, which is counterintuitive considering how affordable Arc cards are. ReBAR doesn't just affect older systems: recently there was a bug where some systems had half-working ReBAR with the Vulkan API. So random-ass low performance in modern games might be down to missing ReBAR, which has a great impact on Arc (where it's negligible on competitors).

I know from tracking Intel GPUs for a long time that what was thought to be software/driver problems turned out to be hardware problems. No doubt such problems exist on Alchemist. In fact, even where driver bugs do exist, they might be easier to fix on Battlemage and its successors.
AMD was pretty much in the same boat as Intel is today when they bought ATI. The first product that was truly AMD's was Polaris, and that was a success. Indeed, AMD did not have enough engineers (or resources) to improve, but they took a gamble on Vega with HBM (what a joy to water cool), and then mining took over the scene and threw everything awry. Today Radeon is very competitive, and Wizzard himself references AMD software as a reason to buy Radeon.

What Nvidia is good at is making something and making people want it, even though it might be in 1% of games. The narrative then picks it up and it becomes a feature. I look at how Frame Gen was received and how that morphed into a good thing. The key, though, is that AMD's responses are real but considered snake oil by the community. I remember how people used to say G-Sync was much better than FreeSync because it was a hardware module. Sound familiar?
Posted on Reply
#69
Beginner Macro Device
1. Yes, GPUs of 7900 XT performance and below don't need GDDR7. Not a big deal.
2. Yes, AMD aren't trying to compete. If we don't count Germany and a couple of other countries where AMD products sell, we're in a 99:1 NVIDIA-win situation. Only because AMD GPUs at the same price match or barely exceed the raster performance and lose in everything else.
3. Prices will stabilise; the bubble isn't going to grow forever. The most recent example is the real estate crisis of the late 00s.
4. I disagree with "Intel made FSR look like a joke." FSR looks like a joke by itself; it required no help from the competition. I bought my GPU more than a year ago and FSR is still in the same shape as it was when I bought this card, give or take two games where things became ever so slightly better after the FSR 2.2 introduction. FSR 3.1 would've been late to the party even on the first day of 2023; yet it's almost mid-2024 and 3.1 is absolutely nowhere to be seen.
5. "6900 XT is on par with Ampere" is a deranged statement. It barely outperforms the 3080 at 4K, sometimes even loses to it, while lacking any DLSS and lagging in RT performance, whilst being far more expensive. More VRAM doesn't mean anything if the framerate is still lower.
6. "4060 for $150 and 4080 for $500." I mean, these are exactly as cut down as it gets. Halving their MSRPs would've represented reasonable pricing; $220 and $620 respectively would be completely fine.
7. We don't need to beat an NV halo GPU, but we do need a price war. The 7900 XTX is a great GPU by itself; it's just that $1,000 is beyond schizophrenic for it. $570ish would've struck hard, leading to a much more pleasant market. Never happened though.
Posted on Reply
#70
THU31
AusWolfI disagree. High-end RDNA 4 (Navi 48) is rumoured to be around 7900 XT level in performance at best. GDDR6 is more than enough there. GDDR7 would only increase manufacturing costs with probably no performance advantage.
I think it's safe to assume that GPU would have a 256-bit bus, which would be a pretty significant bottleneck with regular GDDR6. You can see how limited the 7900 GRE is by its stock memory speed.

Initially GDDR7 was supposed to launch at 32 Gbps and go up to 36, but they're actually going to start with 28 Gbps, probably to reduce costs.
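To put rough numbers on that concern, here is the same data-rate-times-bus-width arithmetic applied to a hypothetical 256-bit card at the GDDR6 and GDDR7 speeds mentioned above; these are theoretical peak figures for illustration only, not confirmed specifications of any upcoming product.

```python
# Theoretical peak bandwidth on an assumed 256-bit bus: data rate (Gbps) x 256 bits / 8.
# Back-of-the-envelope numbers only; no RDNA 4 or Blackwell memory configuration is confirmed.
BUS_WIDTH_BITS = 256

for label, gbps in [("18 Gbps GDDR6", 18), ("20 Gbps GDDR6", 20),
                    ("28 Gbps GDDR7", 28), ("32 Gbps GDDR7", 32)]:
    print(f"{label}: {gbps * BUS_WIDTH_BITS / 8:.0f} GB/s")
```

On paper, 28 Gbps GDDR7 would offer roughly 55% more bandwidth than 18 Gbps GDDR6 on the same 256-bit bus (896 GB/s versus 576 GB/s), which is the gap the bottleneck argument hinges on; a larger on-die cache can offset some of it.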
Posted on Reply
#71
dj-electric
GPUs: choking to death on video memory bandwidth limitations.
Some users: let 'em have cake, I'm fine with this situation.
Posted on Reply
#72
Nordic
MakaveliI don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.
It is clearly not, going by the Steam data.
Posted on Reply
#73
rhaoul
I bet on a new "cache" system.
Posted on Reply
#74
AusWolf
DavidC1You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as its CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.
Guess what, AMD also has an engineer CEO. ;)

Congrats on the rest of your post; you couldn't have written a better Nvidia advert if you tried.
THU31I think it's safe to assume that GPU would have a 256-bit bus, which would be a pretty significant bottleneck with regular GDDR6. You can see how limited the 7900 GRE is by its stock memory speed.

Initially GDDR7 was supposed to launch at 32 Gbps and go up to 36, but they're actually going to start with 28 Gbps, probably to reduce costs.
Altogether, the GRE is not a bad product. I tend to look at a GPU as the sum of its parts, not at the bottleneck that one of its parts may or may not present. It's not like you can upgrade your VRAM, after all.
Posted on Reply
#75
starfals
Who knows, it's all rumors at this time. Even Nvidia might stick with this as well. People keep thinking they will use the next-gen stuff... what if they don't for one more generation? What if they don't even add DisplayPort 2.1? We don't know until we know. So far, I see a lot of bashing on AMD, but literally zero on Nvidia. We are all talking about rumors here; that's the key thing.
Posted on Reply