
[TT] AMD rumored Radeon RX 9080 XT: up to 32GB of faster GDDR7, up to 4GHz GPU clocks, 450W power


TL;DR: AMD is reportedly developing the Radeon RX 9080 XT, an RDNA 4-based enthusiast graphics card featuring up to 32GB of GDDR7 memory and GPU clocks reaching 4.0GHz. It promises 15-40% better gaming performance than the RX 9070 XT, potentially outperforming NVIDIA’s RTX 5080 and rivaling the RTX 5080 SUPER.

If I can get this on launch day at MSRP, I probably will. Microcenter, don't fail me now!

AMD, THE WAY WE PLAY :rockout: :rockout: :rockout:
 

That's cool. Really cool. I dislike the 32 GB of VRAM though; 24 would be enough. Lots of RAM is one of the issues with the 5090 - power draw goes way up.
 

Agreed, I'd rather bring costs down and have 24 GB. To be clear, this is just a rumor; it might amount to nothing.
 
More like a 9070XTXXXTX version selling for 1000 USD that performs 10% better than the normal 9070 XT, LOL.
 
this is the AMD thread, why you here bruh, we all know you Nvidia boyo
 
32 GB is a fine amount. Better to have and not need than to need and not have.

Going close to 4 GHz sounds risky; we might see a lot of dies frying themselves if this rumour comes true.
 
Monster Hunter: Wilds already uses over 16 GB of VRAM at 1440p at ultra settings on a mobile RTX 5090 (not the same as a desktop RTX 5090!).

24 GB is starting to look like it has slim margins. 32 GB does not look excessive in this light.
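For anyone who wants to sanity-check numbers like that on their own machine, here's a minimal polling sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package; device index 0 is assumed for illustration). Keep in mind it reports allocated VRAM, which is not the same as what a game strictly needs:

```python
# Minimal VRAM polling sketch via NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Note: this reports *allocated* memory, which engines inflate by caching aggressively.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust the index if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```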
 
It looks like a PS4 game, so does it matter if it uses 16 or 50 GB of VRAM? The point of extra VRAM is to have better-looking games, not worse-looking games that just use more VRAM for no reason.
 
Not to detract from his merits... But I think this is one of those cases where MLID is making things up out of thin air.
For AMD to adopt GDDR7, it would require modifications to the design, particularly the memory interface.

While the space on the die might not be a significant cost factor, other considerations, such as mask expenses and the validation process, could add up to tens of millions of dollars. Given that this would target a high-end segment (over $1000) with relatively low sales volume, it could conflict with AMD's professional lineup (the 9700 32GB). LLMs are mainly limited by bandwidth; a 9080 with 32 Gbps GDDR7 would beat the 9700 in pretty much everything.

If this move is merely a stopgap until UDNA, it seems questionable. In my opinion, the only scenario where it makes sense is if AMD refreshes its entire lineup with this chip, considering that the N48 struggles with bandwidth deficiencies in certain situations.
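To put rough numbers on the bandwidth point: peak memory bandwidth is just bus width times per-pin data rate. A back-of-the-envelope sketch; the 256-bit bus for the rumoured 9080 XT is my assumption (carried over from Navi 48), as is the 384-bit "big die" case:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "9070 XT / 9700 (256-bit GDDR6 @ 20 Gbps)":       (256, 20.0),
    "rumoured 9080 XT (256-bit GDDR7 @ 32 Gbps)":     (256, 32.0),
    "hypothetical big die (384-bit GDDR6 @ 20 Gbps)": (384, 20.0),
}

for name, (bus, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")

# 640 GB/s vs 1024 GB/s: a ~60% jump, which is why a GDDR7-equipped 9080
# would leave the 9700 behind in bandwidth-bound LLM inference.
```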
 
24 GB of VRAM would be fine... I hope AMD sticks with options for the 8-pin connector vs. the 12+4 pin on most AIB cards; I don't want a 12+4 pin connector on my card.
 
A lot of rumors about AMD are used to get clicks. AMD has its marketing mob and acolytes. That's all I'll say about that.

But my opinion about this supposed leak is that if it's real, it's likely a pro variant or something for AI use. Maybe it could be used for gaming, but it won't be cheap.
AMD could already have sold the 9070 XT for $599 but chose to go $100 higher. And everywhere I have looked, you can't even get the 9070 XT near MSRP. The AMD fans who wanted it cheaper are only enabling higher prices by buying AMD GPUs at $200-$400 over MSRP. It makes their argument really empty.

I still don't think this GPU will come close to a 5090 in performance. And at 450 watts, it again kills the AMD fan argument of "NVIDIA bad, AMD good."

My 4090 has a 450-watt rating but rarely ever comes close, even in demanding games. You really need to bump the resolution well above 4K to push it into the 400-watt range.
 
I mean ... what we would historically call "appropriately priced" GPUs haven't been a thing for a while now. There are exceptions, but this argument is true of most things. I don't want to get into any AMD vs Nvidia shit (because it is shit), but here the 9070 XT is about €60 more than the 5070 and €150 less than the 5070 Ti. The 5060 is the only card today that I would almost call sensibly priced, but even that should have been €250, and that is within the context of here and now.
 
Sure, MLID. And I'm the King of England. Enjoy the ad revenue from the droves of idealistic dreamers who genuinely think AMD can conjure such a card out of thin air. I guess if you ask the genie in the bottle really nicely, it's probably gonna happen. I'll eat my words if they ship anything with Navi 40.

Yup, my thoughts exactly. It's a good thing that dreaming is free, because the chance that you will run GDDR7, especially the higher 32 Gbps bin, off a GDDR6 PHY is about as high as me actually being the King of England.

There are two things to account for here: one is actual VRAM usage vs. allocation, and the other is that Monster Hunter Wilds has the most absolute dogshit port in recent memory, with grotesque RAM and VRAM requirements.

With all the commotion over VRAM recently, I've actually decided to do some research to gauge the viability of low-memory dGPUs, and I happened to have the perfect test subject on hand. If I manage to come up with a balanced suite of games that makes sense, I will make a thread about it sometime - but what I can tell you is that I've found the claims that 8 GB GPUs are no longer operable to be somewhere between greatly exaggerated and utter tripe. Most games will run on 4 GB with low enough settings, but here's a taste:

With a healthy dose of DLSS (25% scale) and low settings, I was able to run Black Myth Wukong, at 4K, on a GPU that's basically a mere step above what you'll find on a Nintendo Switch 2:

[Screenshot: Black Myth Wukong at 4K, low settings - b1-Win64-Shipping_2025_05_30_04_04_33_952.png]


Here, I'm providing the CapFrameX profiling data of this benchmark run as well (JSON included in the zip attached to this post, if you want to load it in the software yourself):

[Screenshot: CapFrameX results - CapFrame Black Myth.png]


MH Wilds, on the other hand, I didn't even bother with. It was a total write-off from the start: the game is simply not functional, its memory management is terrible, and it actively malfunctions with all sorts of performance, shading and texturing issues in a VRAM-limited scenario. It also crashes extremely frequently - one run completed out of the five I tried.

[Screenshot: Monster Hunter Wilds - MonsterHunterWilds_2025_05_30_03_33_02_813.png]


This indicates more of a problem with MH Wilds than with low-VRAM hardware in itself - this game is a very bad port, and we, as gamers, must demand better.
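If anyone would rather crunch the attached JSON in a script than load it into CapFrameX, here's a rough sketch. The schema is an assumption based on the captures I've seen (per-frame MsBetweenPresents under Runs -> CaptureData), and the filename is illustrative; adjust both to match your export:

```python
# Rough sketch for crunching a CapFrameX capture outside the app.
# Assumes frame times live under Runs -> CaptureData -> MsBetweenPresents;
# "capframex_run.zip" is an illustrative path for the zip attached above.
import json
import zipfile

with zipfile.ZipFile("capframex_run.zip") as zf:
    with zf.open(zf.namelist()[0]) as f:  # first file in the archive
        capture = json.load(f)

frametimes = capture["Runs"][0]["CaptureData"]["MsBetweenPresents"]  # ms per frame

avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
slowest = sorted(frametimes, reverse=True)
p1_low_fps = 1000.0 / slowest[len(slowest) // 100]  # fps at ~99th-percentile frame time

print(f"frames: {len(frametimes)}, avg: {avg_fps:.1f} fps, ~1% low: {p1_low_fps:.1f} fps")
```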
 


It was never really a question of whether 8 GB VRAM GPUs can run games. Of course they can - well, there might be some very vocal people hallucinating otherwise, but we can ignore those. The question is whether 8 GB cards can play without major visual quality compromises. The answer is yes in the majority of cases.
 
I'd say it's a question of cost. If I'm buying a budget card, sure, 8 GB is fine, but if I'm spending close to €500 (5060 Ti)? Nah. Another valid, but not quite as important, question is how, say, UE6 games or whatever else is coming in 2027/2028 will perform on such cards. This of course ties in with the cost. Let's say the RTX 5060 is the baseline, a €300+ card. I'd be pissed if I bought that and it turned out in a few years that it's no longer good enough.
How does Black Myth Wukong actually play at those settings, though?

Yes, I agree, gamers should definitely become more vocal. (That was sarcasm, btw.) And you know as well as anyone this is not going to happen. Everything is ports (this way or that).
 
I don't own the game, sadly. But the benchmark scene runs rather acceptably for what it's worth. At 1080p, I've no doubt you can get a palatable experience.

Agreed though :)
 
Well, the thing is, we already have incredible-looking games that play great with 8 GB of VRAM, and some of them have really high-quality textures (Plague Tale - no RT - for example, or TLOU at the High preset). So if a UE6 game only works at the low preset on an 8 GB card, shouldn't that mean it will look as good as or even better than Plague Tale maxed out? It should. The problem is, it probably won't. It will need 16 GB of VRAM and look like a 2015 game (Monster Hunter is a good example). But I ask you, is that the point of getting more VRAM? Just so games use more and more of it without actually looking better?

There is absolutely zero reason for games to start looking worse and worse on 8 GB VRAM GPUs besides horribly optimized games. Btw, have you tried Hogwarts recently? A megatuned 9800X3D drops below 50 fps average with 24 fps 0.1% lows just running through a small village (Hogsmeade)!!! Are we really hardware limited here? I think not; I think the software is absolute horsecrap.
 
I'm basically assuming this is what will happen. Personally I don't think games should look "better" than they do now anyway.

This has been a thing for a very long time. I'm pretty sure parts of Yavin in Jedi Outcast will play poorly no matter the hardware.
 
You only need 32 GB for 8K; it's nothing but a silly gimmick putting that on a 9070 XT.
That's just for AI appeal. Even though NVIDIA dominates AI with CUDA, a lot of VRAM is such a hard requirement for good models that people will buy an AMD card just for that.
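Rough weight-only math on why the jump from 24 GB to 32 GB matters for local models (illustrative parameter counts and quantization levels; real usage adds KV cache and runtime overhead on top):

```python
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate size of the model weights alone, in GB (1e9 params * bits / 8 bytes / 1e9)."""
    return params_billion * bits_per_weight / 8

for params, bits in [(8, 16), (32, 4), (27, 8), (70, 4)]:
    size = weights_gb(params, bits)
    verdict = "fits 24 GB" if size < 24 else ("fits 32 GB only" if size < 32 else "fits neither")
    print(f"{params}B @ {bits}-bit: ~{size:.0f} GB of weights -> {verdict}")

# A ~27B model at 8-bit is exactly the kind of load where 32 GB works and
# 24 GB doesn't, even before counting the KV cache.
```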
 
If you believe that, I've got a bridge to sell you.
 
Technically speaking, there exists a scenario where AMD develops a larger RDNA4 core with a 384-bit bus, connects it to 24 GB of GDDR6, and sells a product capable of surpassing an RTX 5080 on paper.

The question here is whether they should, and considering we more often than not see the fruits of such developments reach the enterprise space first, where there's smoke there should be fire. I'm not aware of any current smoke.

I know I would love it to exist, but I don't speak for AMD's decision makers, just for people who want to see competition at all price segments.
 
Looks like a Pro card to me, not a consumer card.
 