
[TT] AMD rumored Radeon RX 9080 XT: up to 32GB of faster GDDR7, up to 4GHz GPU clocks, 450W power

I have Spider-Man 2 and don't need to use FSR for high frame rates.
 
I have Spider-Man 2 and don't need to use FSR for high frame rates.
If you drop the other settings, sure. But maxed out you'd be getting 30-35 FPS on your 7900 XT at 4K.

There is no point making enemies just because we like different things anyway :roll:.

BTW, AMD fans had better pony up for the 32 GB VRAM version; looks like 16 GB on the 9070 XT might not cut it for 4K RT with FSR.

It's not about how big your VRAM buffer is, it's about how you use it.
 
Lots of rumours floating around...
Nvidia is rumoured to be cutting back on RTX 50 series production to increase production in other segments.
Also, Nvidia is releasing a 50 series Super with minimal upgrades to performance...
If the 40 series Super is anything to go by, I don't think AMD will release a higher-end RDNA4 to counter the 50 series Super...
The 9070 XT is plenty fast...
Current prices are already high... I don't want to think how much the 9080 XT will cost...
 
Well, the thing is, we already have incredible-looking games that play great with 8 GB of VRAM, and some of them have really high-quality textures (A Plague Tale with no RT, for example, or TLOU at the High preset). So if a UE6 game only works at the Low preset on an 8 GB card, shouldn't that mean it will look as good as or even better than Plague Tale maxed out? It should. The problem is, it probably won't. It will need 16 GB of VRAM and look like a 2015 game (Monster Hunter is a good example). But I ask you, is that the point of getting more VRAM? Just so games use more and more of it without actually looking better?

There is absolutely zero reason for games to start looking worse and worse on 8 GB VRAM GPUs, other than horrible optimization. BTW, have you tried Hogwarts Legacy recently? A mega-tuned 9800X3D drops below 50 FPS average with 24 FPS 0.1% lows just running through a small village (Hogsmeade)! Are we really hardware limited here? I think not; I think the software is absolute horsecrap.
You know, poorly optimised games are a thing. They have been forever. I suspect they will keep appearing in the future. 8 GB cards are not fit for purpose if they can't handle poorly optimised games. Period.

On topic, AMD will need to counter the foreshadowed Super series...
 
You know, poorly optimised games are a thing. They have been forever. I suspect they will keep appearing in the future. 8 GB cards are not fit for purpose if they can't handle poorly optimised games. Period.
There are poorly optimized games that don't care about VRAM; they just require an ungodly amount of raster performance to run properly. So we should also delete the whole xx60 and xx70 tiers, because they can't play those poorly optimized games. XX90 or go home.
 
Efficiency is going in the trash. Also, 32 GB of VRAM will make this GPU not so good in terms of price/performance; the current RX 9070 XT is already power hungry enough, and it needs a good case for cooling. Not a fan of high TDPs...
 
This sounds like a workstation card, or a VRAM upgrade getting the XTX moniker. There is not much frequency headroom left, and unless AMD has been harvesting all the golden GPUs from day 0, I can't see it.

The 9070 XT is already well past the knee of its efficiency curve; it also scales worse than the 9070 (due to cache?), and neither of them gains much from VRAM overclocking. With that in mind, squeezing more performance out of that die while keeping power in check sounds iffy; but who knows, maybe there are process tweaks and better harvesting... I doubt it, though.

A higher-VRAM edition for workstations and deep pockets sounds more reasonable.
 
I really wish there were a way for a GPU to dynamically turn individual VRAM dies on and off. That way, we could get the efficiency benefits of low VRAM when neither much capacity nor much throughput is being used, and turn all the dies on when an app starts using a lot of VRAM capacity and/or bandwidth. More VRAM dies running in parallel means more throughput. A sketch of what that policy could look like is below.
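No GPU or driver I know of exposes per-die VRAM power gating like that, so purely as a thought experiment, here is a hypothetical C++ sketch of the policy such a driver could run. The telemetry fields, die sizes, and thresholds are all made up for illustration: keep only as many dies powered as capacity and bandwidth demand.

```cpp
// Hypothetical only: illustrates the wished-for policy, not any real driver API.
#include <algorithm>
#include <cstdint>
#include <cstdio>

struct VramTelemetry {
    uint64_t bytesInUse;    // current VRAM allocations (assumed counter)
    double   bandwidthUtil; // 0.0-1.0, averaged over a sampling window
};

constexpr uint64_t kBytesPerDie = 2ull << 30; // assume 2 GB per GDDR die
constexpr int      kTotalDies   = 16;         // 16 x 2 GB = 32 GB card

// Decide how many dies must stay powered this interval.
int DiesNeeded(const VramTelemetry& t) {
    // Capacity: enough dies to hold allocations plus ~25% headroom.
    uint64_t want = t.bytesInUse + t.bytesInUse / 4;
    int forCapacity = static_cast<int>((want + kBytesPerDie - 1) / kBytesPerDie);

    // Bandwidth: if the active channels are near saturation, widen the bus.
    int active = std::max(forCapacity, 1);
    if (t.bandwidthUtil > 0.85)
        active = std::min(active * 2, kTotalDies);

    return std::clamp(active, 1, kTotalDies);
}

int main() {
    VramTelemetry idle{512ull << 20, 0.05}; // desktop idle: 512 MB, 5% bw
    VramTelemetry game{ 14ull << 30, 0.90}; // heavy game: 14 GB, 90% bw
    printf("idle: %d of %d dies powered\n", DiesNeeded(idle), kTotalDies); // -> 1
    printf("game: %d of %d dies powered\n", DiesNeeded(game), kTotalDies); // -> 16
    return 0;
}
```

The hard part a real implementation would face: VRAM is interleaved across all channels for bandwidth, so powering a die down means migrating its contents first and giving up bus width, which is probably why nobody ships this.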
 
I want a dual-GPU card with two RX 9070 XTs on one PCB: the AMD 32 GB 750 W RX Kraken. :D
 
Nvidia destroyed what makes GeForce a GeForce with Blackwell.
Average Joes disagree. They buy 5060s and 5070s and happily play their video games, absolutely oblivious to the fact that their vidya could run even better.
However, the problem is also the fact that they can't afford an RDNA4 GPU. A $600 floor is way too high.
 
More cards no one can get at MSRP, especially with that VRAM size. Nice, said no one! Well, except the LLM crowd.
 
If you drop the other settings, sure. But maxed out you'd be getting 30-35 FPS on your 7900 XT at 4K.


It's not about how big your VRAM buffer is, it's about how you use it.
Too bad it was made on Ryzen and Radeon hardware. I don't have to drop settings. Have you ever used a 7900 XT?
 
Sure, show me. Post a clip at 4K native, maxed out.
Screenshot 2025-06-02 080145.png


Unfortunately, TPU tells me the file is too large. At any rate, I was getting 120 FPS with everything turned off and 4K native. I use the AMD Software overlay.
 


Unfortunately, TPU tells me the file is too large. At any rate, I was getting 120 FPS with everything turned off and 4K native. I use the AMD Software overlay.
Bud, there is less than zero chance you are getting anything over 40 at 4K maxed out. W1z has your card at 26 FPS average, and you are trying to convince me you are getting 164...
 
You know, poorly optimised games are a thing. They have been forever. I suspect they will keep appearing in the future. 8 GB cards are not fit for purpose if they can't handle poorly optimised games. Period.

On topic, AMD will need to counter the foreshadowed Super series...
Has AMD ever launched a card to compete with a Super refresh?
My opinion is that the Super cards are what the non-Super cards should have been in the first place; it's egregious double dipping on consumers. But I doubt a 9080 XT exists, and if it does, it's a Radeon Pro or workstation card.
There are poorly optimized games that don't care about VRAM; they just require an ungodly amount of raster performance to run properly. So we should also delete the whole xx60 and xx70 tiers, because they can't play those poorly optimized games. XX90 or go home.
No such thing as an "ungodly amount of raster performance": Nvidia should be improving raster performance, not just slapping more RT cores on the same basic architecture it has been reusing since Turing.
And it would make sense to eliminate everything x80 and below. Nvidia needs to stop milking its customers on the x60 and x70 cards with small dies and meager amounts of VRAM, or only make high-end cards. IMO, the market would be better off with Intel and AMD providing some competition in the low and mid range.
Average Joes disagree. They buy 5060s and 5070s and happily play their video games, absolutely oblivious to the fact that their vidya could run even better.
However, the problem is also the fact that they can't afford an RDNA4 GPU. A $600 floor is way too high.
The RTX 5070 is also above the $600 floor; the fact is, the x70 tier shouldn't be above $500 for either Nvidia or AMD.
 
No such thing as an "ungodly amount of raster performance": Nvidia should be improving raster performance, not just slapping more RT cores on the same basic architecture it has been reusing since Turing.
And it would make sense to eliminate everything x80 and below. Nvidia needs to stop milking its customers on the x60 and x70 cards with small dies and meager amounts of VRAM, or only make high-end cards. IMO, the market would be better off with Intel and AMD providing some competition in the low and mid range.
I think AMD needs to stop milking their customers; their fastest card barely matches a 4070 Ti Super, a three-year-old architecture. But since there are people who will buy AMD regardless, why would they bother, right?
 
Bud, there is less than zero chance you are getting anything over 40 at 4K maxed out. W1z has your card at 26 FPS average, and you are trying to convince me you are getting 164...
Let me ask you: do you think AMD is faking the numbers? Does W1z use AMD software to record his numbers?
 
It's using less because a bunch is spilling over into system RAM. Something with AMD's memory management looks broken in that game, from what I can see.
I actually think this is one of the few cases where we see GDDR7 actually doing something. Those settings border on 16 GB, and SM2 is taking advantage of GDDR7 on the 5070 Ti.

Though, this is also technically 1080p, not 4K. You'd need a 4090 or 5090 to actually render this game at 4K max.
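For what it's worth, the spillover itself is measurable: on Windows, the OS reports a per-process split between dedicated VRAM and GPU-visible system RAM through DXGI's IDXGIAdapter3::QueryVideoMemoryInfo (presumably where overlays get these numbers). A minimal sketch; it has to run inside the process you care about, since the usage counters are per-application:

```cpp
// Windows-only sketch: report this process's dedicated vs. shared GPU memory
// usage per adapter. Link with dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // LOCAL = dedicated VRAM; NON_LOCAL = GPU-visible system RAM,
        // i.e. where allocations land once they spill out of VRAM.
        DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, nonLocal = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

        printf("%ls\n", desc.Description);
        printf("  in VRAM:        %llu MB (budget %llu MB)\n",
               (unsigned long long)(local.CurrentUsage >> 20),
               (unsigned long long)(local.Budget >> 20));
        printf("  spilled to RAM: %llu MB\n",
               (unsigned long long)(nonLocal.CurrentUsage >> 20));
    }
    return 0;
}
```

If the non-local usage climbs while the local number plateaus below the card's capacity, that matches the "using less VRAM because it's spilling" behaviour described above.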
 
I think AMD needs to stop milking their customers; their fastest card barely matches a 4070 Ti Super, a three-year-old architecture. But since there are people who will buy AMD regardless, why would they bother, right?
Sure, you want them to launch an x60-tier card that competes with a 5090, lol. The 9070 XT competes with the 5070 Ti, which only matches the 4080, but that's more an issue of the market leader not caring about the gaming market.
A theoretical 9080 XT could be faster than a 5080, but no one with that kind of budget would buy it anyway; the issue is that people with $1000+ to spend on a GPU alone are loyal Nvidia customers.
 
Let me ask you: do you think AMD is faking the numbers? Does W1z use AMD software to record his numbers?
No, I think you are not using the same settings.

A theoretical 9080 XT could be faster than a 5080, but no one with that kind of budget would buy it anyway; the issue is that people with $1000+ to spend on a GPU alone are loyal Nvidia customers.
Of course; if you have $1k+, why would you want to deal with AMD drivers and old, outdated 8-pin cables? So maybe AMD should focus on the midrange and release products faster, instead of two months after Nvidia.
 
No, I think you are not using the same settings.


Of course; if you have $1k+, why would you want to deal with AMD drivers and old, outdated 8-pin cables? So maybe AMD should focus on the midrange and release products faster, instead of two months after Nvidia.
Screenshot 2025-06-02 105709.png
 
AMD already said they will only concentrate on mid-level GPUs.
 
Upload a video to YouTube playing the game at 160 FPS, and make sure to show us the settings in said video. It's not that hard. What you posted doesn't prove anything; it doesn't even show the settings, lol.
It is painfully obvious that you have never used a 7900 XT. Aren't those the settings that you apply in game? Let me list what is in what I posted.

Monitor: Check
Window Mode: Check
Resolution: Check
Aspect Ratio: Off
HDR: Off
Upscaling: Off
AA: TAA
Nvidia Reflex: No
Frame Gen: No
Resolution scaling: 144 Hz
Refresh rate: 144 Hz
VSync: Off

What else were you looking for? Display settings? Those are on the next tab.

Screenshot 2025-06-02 111605.png
 