
AMD Radeon RX 9070 XT Could Get a 32 GB GDDR6 Upgrade

The idea's cute, but honestly I don't see a reason to put more than 24GB on a card for my usual use cases. 12GB is the passing grade for me as-is, running most of the tasks I want without too much trouble. If they can offer me a card with the same memory and average performance as a 7800 XT but shave $80 off the price tag, I'm sold. Nothing else is as appealing as that.
 
:love: :love: :love: :love: :love: :love:

 
Not necessarily; remember, nGreedia moved to GDDR7, so there are going to be a lot of GDDR6/X chips that still need to be sold.
Wouldn't bet on GDDR6X, that was developed by Micron together with NVIDIA, so I guess that stays exclusive.

I have yet to see Samsung's GDDR6W, though. Has anything ever used it? It's been two years since Sammy announced it and still nothing?
 
Wouldn't bet on GDDR6X, that was developed by Micron together with NVIDIA, so I guess that stays exclusive.

I have yet to see Samsung's GDDR6W, though. Has anything ever used it? It's been two years since Sammy announced it and still nothing?
Samsung hasn't launched anything the last couple of years sooooo....
 
YESSSS!!! I've been running a bunch of baby Skynets on my 7800XT and it has been AWESOME! AMD has gotten much better with LLMs, ROCm support works out of the box, and I can't complain about performance.
I specifically suggested a 32GB 9070XT to AMD on LinkedIn, so I'm super glad this is closer to reality! A 32GB GPU can outperform 24GB models like the 7900XTX and 4090 simply because bigger models will not fit in 24GB. Smaller model speed is determined by VRAM bandwidth, and there the 4090 and 7900XTX still have an advantage over the 9070XT. Since nVidia is using GDDR7 on all new GPUs, it is very difficult for them to make a relatively affordable 32GB GPU.
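Rough back-of-the-envelope numbers on why capacity and bandwidth both matter here (the function and the figures below are my own illustrative assumptions, not AMD or nVidia specs):

```python
# Sizing sketch: does a quantized LLM fit in VRAM, and what decode
# speed does memory bandwidth allow? All numbers are illustrative.

def model_vram_gb(params_b: float, bits_per_weight: float, overhead_gb: float = 2.0) -> float:
    """Approximate VRAM needed: weights plus a flat allowance for KV cache/activations."""
    weights_gb = params_b * bits_per_weight / 8  # params in billions -> GB of weights
    return weights_gb + overhead_gb

def max_tokens_per_s(bandwidth_gb_s: float, weights_gb: float) -> float:
    """Memory-bound decode ceiling: each generated token streams all weights once."""
    return bandwidth_gb_s / weights_gb

# A 32B model at ~4.5 bits/weight vs a 24 GB and a 32 GB card:
need = model_vram_gb(32, 4.5)  # 18 GB of weights + 2 GB overhead = 20 GB
print(f"32B @ 4.5bpw needs ~{need:.0f} GB -> fits 24 GB? {need <= 24}, fits 32 GB? {need <= 32}")

# Bandwidth-bound ceiling on a hypothetical ~640 GB/s card:
print(f"~{max_tokens_per_s(640, 18.0):.0f} tok/s upper bound for those 18 GB of weights")
```

The second function is why a 4090 or 7900XTX still wins on small models: once everything fits, tokens per second scale with how fast the weights can be streamed, not with how much room is left over.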
 
A 32GB 9070XT doesn't excite me as much as something like an RDNA4 version of a 7900XTX.
 
A 32GB 9070XT doesn't excite me as much as something like an RDNA4 version of a 7900XTX.
Same. If the 9070XT is, in fact, competitive with the 5070 Ti, it will fuel speculation about what a true RDNA4 4090 killer would have been.
 
Try to run a 30GB model on a fancy 24 or 16GB RTX with CUDA and compare the experience against a Radeon with 32GB VRAM.
VRAM is valuable real-estate!
At $500-1k a day for developer time your idea probably won't work out
 
At $500-1k a day for developer time your idea probably won't work out
If money and firehazard is not an issue then the 5090 is a fantastic card :D
 
Well, at least AMD is not stingy with their VRAM.
But they are stingy with their features compared to NVIDIA, though. And that's where AMD's problem is, or most of it.
 
As usual, the GPU is just rumored to exist and people are already calling it trash just because it's from AMD.

And as usual, Ngreedia "features" are not called what they really are, proprietary lock-in crap which exist to limit your options and keep you locked into their hardware.

It's sad how today's consumers don't demand platform-agnostic tech.
 
These are obviously aimed at the AI audience. No games will utilize these gigabytes, due to the comparably weak GPU; only LLM/compute will.
Both nVidia and AMD keep falsely advertising compute SKUs to compute folks as "gaming" products. This washes away the boundary between gaming and enterprise products. That surely is welcome to both GPU makers and the AI gang, but it is horrible for the regular buyer and user, at whom these "gaming" cards should be aimed.
Those who use GPUs for work should buy the corresponding workstation/enterprise products. This inflates the bottom end of the GPU stack, which gaming GPUs indeed are, thus putting them even further beyond the reach not only of "regular" folks, but of the prosumers themselves. People really like to shoot themselves in the foot.
 
Utter waste of VRAM. No game uses that much, and any VRAM not being used just sits idle doing nothing. All this does is artificially inflate the cost of the card with zero performance improvement. Better to offer faster memory than more of it.
 
Clamshell usually. Pro and datacenter cards have this often enough. Rarer in consumer space. 3090 and 4060Ti 16GB come to mind from recent times.
Either way, a 32GB 9070XT would be immediately bought up for AI. Given relative lack of performance improvement for gaming the SKU probably does not make much sense.
Exactly what I thought. The reason probably is that there will be no more development on GDDR6, so there will be no 3GB or even 4GB modules. The only way to go over 16GB on a 256-bit bus is clamshell, with eight 2GB modules on each side.
I don't see the point. Sure, Nvidia's 12 GB lineup is anaemic as f*ck, but this is unnecessary in this class of GPU (unless it's targeted at purely AI purposes, I guess).
The point is to compete with the 5070Ti(S)/5080(S) with 24GB via the 3GB modules that will be offered later this year, or to offer something that meets the VRAM demands of AI.
Gaming-wise, there is little point moving above 12~16GB VRAM in 2025Q1.
See above, even 16GB can't be called future-proof anymore, but doubling down is the only way for GDDR6.
What's the bloody point?
See above.
Fantastic news i hope they do a 24gb model.
See above, that would only be possible with 3GB modules, which are unlikely to ever come to GDDR6.
If they did that with 3 GB memory modules while keeping the Navi 48 core intact, that would be awesome (still not necessary for gaming, but makes a bit more sense than 32 GB).
See above, I very much doubt that there will be any further development on GDDR6, so 2GB modules at 20-24Gbps are all we'll get for RDNA4.

Now I'm curious again about 9070XT. If it beats 5070Ti in raster and isn't much slower in RT, I would consider buying team red instead of green. I wanted to wait for 5070Ti 24GB, which now will compete with 9070XT 32GB.
 
What's the bloody point?
Useful for LLM's.

2 x 9070 XT 32GB gives me 64GB of usable VRAM.

Which means I can run 70B or larger models without having to hit system ram.

If the price is $750, then two of them ($1,500) will be cheaper than two 3090s/4090s or two 7900XTs/7900XTXs for the same use case, with more VRAM.
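The 70B claim checks out on paper, at least at the quantization levels people typically run locally (the ~4.5 bits/weight figure below is my assumption, roughly a Q4-class quant):

```python
# Sketch: does a 70B model at ~4.5 bits/weight fit across two 32 GB
# cards without spilling to system RAM? Quant level is an assumption.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """GB of VRAM consumed by the weights alone (params in billions)."""
    return params_b * bits_per_weight / 8

need = weights_gb(70, 4.5)  # ~39 GB of weights before KV cache
pool = 2 * 32               # two clamshell 32 GB cards

print(f"70B needs ~{need:.0f} GB of weights; 2x32 GB pool = {pool} GB -> fits: {need < pool}")
print(f"fits a single 24 GB card: {need < 24}")
```

So the weights land comfortably in 64GB with headroom for KV cache, while a single 24GB card can't even hold the weights, which is the whole pitch for the dual-card setup.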

At $500-1k a day for developer time your idea probably won't work out

Valid point but I also think people are just throwing around the AI term without talking about the details and use case which is very important.

Are we talking AI image generation? LLMs and inference speed, etc.? For some things, even with a ton of VRAM you may not want to leave the CUDA ecosystem, and in other cases it's fine.

The details matter.
 
If this comes to fruition, then I know what my next card will be. As a heavy VRChat user, VRAM is very important when you're in a lobby of 80 people.

My old 12GB 3060 would at times outperform 3070s, and in one case a friend's 10GB 3080, because they would run out of VRAM before I would.
 
Lots of VRAM, not enough horsepower.. cool!
 
If the reviews are reasonable, this could be an affordable upgrade to my 6950XT. Assuming they can be bought at that price....
 
Amazing how sentiments vary: RT is necessary even though it tanks FPS, but higher resolution and texture detail and quality apparently don't matter; VRAM capacity is useless, and more FPS at lower quality with RT is the way it's meant to be played.
 
Ok. And what about non-game applications ? Is that also an utter waste for them ?

That's what CUDA apps running on Nvidia are for. I don't think I have to take off my shoes to count the non-gaming apps that support AMD consumer GPUs. Hell, AMD themselves only support 7900-series desktop GPUs.

CUDA runs on freaking everything.
 
Amazing how sentiments vary: RT is necessary even though it tanks FPS, but higher resolution and texture detail and quality apparently don't matter; VRAM capacity is useless, and more FPS at lower quality with RT is the way it's meant to be played.
Whatever our glorious leader Jensen says. :respect:
(That is, whatever bullshit he comes up with to sell cards with inadequate VRAM to hold out any longer than the next greatest thing being released.)

That's what CUDA apps running on Nvidia are for. I don't think I have to take off my shoes to count the non-gaming apps that support AMD consumer GPUs. Hell, AMD themselves only support 7900-series desktop GPUs.

CUDA runs on freaking everything.
That's what they're planning to change with UDNA.
 