
[TT] AMD rumored Radeon RX 9080 XT: up to 32GB of faster GDDR7, up to 4GHz GPU clocks, 450W power

It is painfully obvious that you have never actually used a 7900 XT. Are those not the settings that you apply in-game? Let me go over what I posted.

Monitor: Check
Window Mode: Check
Resolution: Check
Aspect Ratio: Off
HDR: Off
Upscaling: Off
AA: TAA
Nvidia Reflex: No
Frame Gen: No
Resolution Scaling: 144 Hz
Refresh Rate: 144 Hz
VSync: Off

What else were you looking for? Display settings? Those are on the next tab.

View attachment 402196
Nope, very high preset + maxed-out RT with 10 object range. Go try it at native; you'll be getting 25-30 fps. And the settings you are using, 4x anisotropic? What is this, a Nintendo 64?
 
I don't think this would happen; otherwise, AMD would have already launched it.
Unless the competition's less-than-stellar product stack this time around made AMD reconsider. No one saw that coming, at least not to that extent.
States rumor...
Yup, all these forum members sharing their internet wisdom by kicking in open doors:

MLID's claimed rumors are rumors.

Who would have thought.
 
The double standards from Nvidia fans are amazing.
Nvidia fans: running max settings is dumb, you'll never need more than 8 GB of VRAM, and any title using more than 8 GB is the console ports' fault.
Also Nvidia fans: run ultra with max RT, bro, or else your GPU is shit.
Nobody said anything remotely similar. A video was posted with the 9070 XT dropping to 40 fps; he claimed his 7900 XT gets 160 fps at those settings, and I'm claiming it doesn't. The rest is you inserting your poisonous anti-Nvidia hatred and making stuff up, as usual.
 
Well, you were right, I do get lower FPS, but nowhere near what you mentioned. It went to a 158 average instead.

Screenshot 2025-06-02 112123.png
 

TL;DR: AMD is reportedly developing the Radeon RX 9080 XT, an RDNA 4-based enthusiast graphics card featuring up to 32GB of GDDR7 memory and GPU clocks reaching 4.0GHz. It promises 15-40% better gaming performance than the RX 9070 XT, potentially outperforming NVIDIA’s RTX 5080 and rivaling the RTX 5080 SUPER.

If I can get this on launch day at MSRP, I probably will. Micro Center, don't fail me now!

AMD, THE WAY WE PLAY :rockout: :rockout: :rockout:
To be clear, I am all for it (though I am with y'all on less VRAM; 24 GB is plenty and would save on power and cost). But I really doubt they are going to do something like that, given the way they handled this generation (I mean releasing a 9080-series card). Of course, I am not an insider, just an outsider looking in, LOL.

To be honest though, it would not surprise me if they did an RX 9070 XTX with a bigger chip just to get into the $750-ish range this generation (which I am all for!).
 
I think AMD needs to stop milking their customers; their fastest card barely matches a 4070 Ti SUPER, a three-year-old architecture. But since there are people who will buy AMD regardless, why would they bother, right?
Lol, wtf are you on about? People barely buy AMD GPUs; it's why Nvidia has over 85% of the market.

AMD's fans aren't the company's problem ;)
 
Nobody said anything remotely similar. A video was posted with the 9070 XT dropping to 40 fps; he claimed his 7900 XT gets 160 fps at those settings, and I'm claiming it doesn't. The rest is you inserting your poisonous anti-Nvidia hatred and making stuff up, as usual.
Anyone who steps back can see how hypocritical this is. You just spent the morning telling me that I am not getting the FPS that I get, then made me go into my settings and change things I never change, and the numbers were still nowhere near the 40s you claim. It is interesting that the 9070 XT is suddenly better because it uses the new upscaling, which people say is what makes it worth it. I guess AMD were lying when they said this gen would not replace last gen's high end. Once again I have to come back to this: did W1z use AMD software when he was recording his numbers? Don't worry though, if the 9080 XT is real, I will be getting one. That 32 GB frame buffer sounds great, as does 4 GHz. I have a 1000 W PSU, so 450 W is nothing; I want to see you touch that one. I saw you complaining about the power draw of the 6800 XT as if you did not know that the default power limit is 255 W; I set my power limit to 66%. Now we live in a world where all the problems Nvidia lovers complained about with AMD are blowing up in their faces, but it's okay; I have even seen some of you say "I am waiting for the drivers to mature." And now the best part of it all: the tech tubers who helped create this are now AMD shills, just like how MSI Afterburner reading differently from AMD software means AMD is cheating. Yes, people have said that to me on other forums.

Especially when we are discussing a game that represents the elephant in the room. Spider-Man 2 was made and optimized for AMD hardware, and that is true for all current-gen console games. I can already hear the Switch argument, but F-Zero 9 is not innovation. It could be Hogwarts, Starfield, Horizon, Outriders, or Avatar; you know what they all are? Yes, console games made on AMD hardware. Just like how those handhelds have led to SteamOS. Guess who can't run SteamOS well? And what is SteamOS? An OS made specifically for gaming on AMD hardware. You can have the extra math that is Team Green RT and the other math that creates frame gen, but 25 fps at 4K in Spider-Man 2 is not what I see, and it is not why X3D chips are some of the hardest to find on the used market.

So your 7900 XT is three times faster than my 4090. Aged like fine wine, I guess. :toast:
I am telling you what I get. I couldn't care less about your 4090. I never claimed you were not getting your FPS. You did.
 
Anyone who steps back can see how hypocritical this is. You just spent the morning telling me that I am not getting the FPS that I get, then made me go into my settings and change things I never change, and the numbers were still nowhere near the 40s you claim. It is interesting that the 9070 XT is suddenly better because it uses the new upscaling, which people say is what makes it worth it. I guess AMD were lying when they said this gen would not replace last gen's high end. Once again I have to come back to this: did W1z use AMD software when he was recording his numbers? Don't worry though, if the 9080 XT is real, I will be getting one. That 32 GB frame buffer sounds great, as does 4 GHz. I have a 1000 W PSU, so 450 W is nothing; I want to see you touch that one. I saw you complaining about the power draw of the 6800 XT as if you did not know that the default power limit is 255 W; I set my power limit to 66%. Now we live in a world where all the problems Nvidia lovers complained about with AMD are blowing up in their faces, but it's okay; I have even seen some of you say "I am waiting for the drivers to mature." And now the best part of it all: the tech tubers who helped create this are now AMD shills, just like how MSI Afterburner reading differently from AMD software means AMD is cheating. Yes, people have said that to me on other forums.

Especially when we are discussing a game that represents the elephant in the room. Spider-Man 2 was made and optimized for AMD hardware, and that is true for all current-gen console games. I can already hear the Switch argument, but F-Zero 9 is not innovation. It could be Hogwarts, Starfield, Horizon, Outriders, or Avatar; you know what they all are? Yes, console games made on AMD hardware. Just like how those handhelds have led to SteamOS. Guess who can't run SteamOS well? And what is SteamOS? An OS made specifically for gaming on AMD hardware. You can have the extra math that is Team Green RT and the other math that creates frame gen, but 25 fps at 4K in Spider-Man 2 is not what I see, and it is not why X3D chips are some of the hardest to find on the used market.


I am telling you what I get. I couldn't care less about your 4090. I never claimed you were not getting your FPS. You did.
You are not getting anywhere near 150 fps with the settings he was using in that video. W1z in his review has your card at 24 fps. Post a video showing the settings or stop lying.


Here, another one that verifies it, 26 fps...

 
Nobody said anything remotely similar. A video was posted with the 9070 XT dropping to 40 fps; he claimed his 7900 XT gets 160 fps at those settings, and I'm claiming it doesn't.
He showed you what he got; I trust someone showing their actual settings more than a video biased towards Nvidia. IMO it's pointless anyway, as comparing a 9070 XT to a 4090 isn't even a comparison.
The rest is you inserting your poisonous anti-Nvidia hatred and making stuff up, as usual.
I guess calling out a greedy company for being greedy is "poisonous Nvidia hatred". Nvidia are the ones being poisonous to the gaming market, and waving around a 4090, comparing it to midrange cards, is just sad; who really cares? You're cherry-picking a single game to boast about how great your favorite brand is in an AMD thread.
And I'm not making anything up; you've been saying those things in other threads, lol.
 
He showed you what he got; IMO it's pointless anyway, as comparing a 9070 XT to a 4090 isn't even a comparison.
The video is comparing the 9070 XT to a 5070 Ti; it's on page 2. I don't know what video you are talking about.

I guess calling out a greedy company for being greedy is "poisonous Nvidia hatred". Nvidia are the ones being poisonous to the gaming market, and waving around a 4090, comparing it to midrange cards, is just sad; who really cares? You're cherry-picking a single game to boast about how great your favorite brand is in an AMD thread.
And I'm not making anything up; you've been saying those things in other threads, lol.
I'm not cherry-picking any game, as I wasn't the one who posted the video. But when someone claims he is getting 160 fps in a game where his card barely gets 30, yeah, I'm gonna call him out on that. People are reading. Imagine someone buying a 7900 XT because it can get 160 fps in Spider-Man 2 maxed out, only to find out he gets 23. This is a technical forum; let's not spread misinformation, please.
 
You are not getting anywhere near 150 fps with the settings he was using in that video. W1z in his review has your card at 24 fps. Post a video showing the settings or stop lying.


Here, another one that verifies it, 26 fps...

MSI Afterburner. Now you are calling me a liar. Well, I guess you can call AMD liars then.
 
I guess we ran out of MLID rumors to yak about!

Doesn't dynamic resolution scaling mean the game will drop resolution to hit the target framerate?

I can see that helping just about any card play the game at 4K at the cost of image quality:

View attachment 402242

Yeah, it's turned into "my GPU gets 2000 fps in that game at maxed settings and only uses 50 watts"... Every benchmark shows it getting 30-50 fps... "No, they are all wrong, they don't know how to use the control panel to unlock next-gen performance today, everyone is just stupid"...

#goodtimes
 
Yeah, it's turned into "my GPU gets 2000 fps in that game at maxed settings and only uses 50 watts"... Every benchmark shows it getting 30-50 fps... "No, they are all wrong, they don't know how to use the control panel to unlock next-gen performance today, everyone is just stupid"...

#goodtimes
20 to 30 fps in this case. But whatever, it's pointless.
 
20 to 30 fps in this case. But whatever, it's pointless.

It's a story as old as time

With some people it's cars, with others it's their pe#!$ size...

Humans like to embellish whatever they have...

For me it's my wife: she's better than everyone else's, she gets all the frames, and ain't nobody telling me any different :laugh: :laugh: :laugh:
 
I guess we ran out of MLID rumors to yak about!

Doesn't dynamic resolution scaling mean the game will drop resolution to hit the target framerate?

I can see that helping just about any card play the game at 4K at the cost of image quality:

View attachment 402242
I assume that was a rhetorical question, but yes, it will lower the render resolution and scale the image up to hit the desired performance target.

It's a useful feature if you have big swings in performance and need to reach a target fps. Not sure I'd use it for 144 fps, though.

This is why you stick to reviews with well-documented "standard" benchmarks; go ape in the game or driver configs and there is no logic to the settings, just numbers that have no relation to image quality or the actual gaming experience.
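For anyone wondering what DRS actually does under the hood, here's a toy sketch of the idea (my own illustration, not the game's actual code): measure each frame's cost, nudge the internal render resolution so the frame time stays inside the budget for the target fps, and upscale the result to the output resolution.

```python
# Toy dynamic-resolution-scaling controller (illustrative only).
TARGET_FPS = 144
BUDGET_MS = 1000.0 / TARGET_FPS      # ~6.94 ms per frame at 144 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0      # clamp to 50-100% of native per axis

scale = 1.0                          # current per-axis render scale

def update_scale(last_frame_ms: float) -> float:
    """Nudge the render scale toward the frame-time budget."""
    global scale
    error = BUDGET_MS / last_frame_ms        # <1.0 means we blew the budget
    # GPU cost grows roughly with pixel count (scale squared), so correct
    # the per-axis scale by sqrt(error); damp it so the image doesn't pump.
    target = scale * error ** 0.5
    scale = min(MAX_SCALE, max(MIN_SCALE, 0.9 * scale + 0.1 * target))
    return scale

# Example: output is 4K, but the last frame took 10 ms (~100 fps), so the
# next frame renders below native resolution and gets upscaled.
s = update_scale(10.0)
print(f"render {int(3840 * s)}x{int(2160 * s)}, output 3840x2160")
```

Which is exactly why an fps counter can read 144 while the pixels actually being rendered are well below native 4K.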
 
Upload a video on YouTube playing the game at 160 fps and make sure to show the settings in said video. It's not that hard. What you posted doesn't prove anything; it doesn't even show the settings, lol.
He's got DRS turned on and thinks he's running at 4K, lollololll
 
I just hope AMD can accomplish what they want with this monster.
 
I just hope AMD can accomplish what they want with this monster.
There is no monster. Anyone who thinks AMD would replace the memory controller on an existing GPU needs their head examined.

4 GHz? Come on; if anyone thinks that's possible with RDNA, they are clueless.
 
He's got DRS turned on and thinks he's running at 4K, lollololll
Bro really said
I don't have to drop settings ....... At any rate I was getting 120 FPS with everything turned off and 4K native
And they were using the high preset, no RT, and dynamic resolution scaling, lolol :roll:

But surely we've got it all wrong and it's an Nvidia-fan double standard, right? Some people either need to work on reading comprehension, or take a good hard look at how their own bias contaminates their judgement, or both.
 
There is no monster. Anyone who thinks AMD would replace the memory controller on an existing GPU needs their head examined.

4 GHz? Come on; if anyone thinks that's possible with RDNA, they are clueless.
Wasn't there a rumor that they were developing a big Navi 4X with separate MCDs that was canceled? It seems like they could bring that back to go up against a SUPER refresh. I believe the rumor was that it was too expensive to make, so it was canceled. Assuming that's actually true, and given they can't keep Navi 48 in stock at $100 over MSRP, perhaps that could be plausible in these market conditions. But yeah, a 4 GHz Navi 48 with 32 GB of GDDR7 sounds like total nonsense.
 
Wasn't there a rumor that they were developing a big Navi 4X with separate MCDs that was canceled? It seems like they could bring that back to go up against a SUPER refresh. I believe the rumor was that it was too expensive to make, so it was canceled. Assuming that's actually true, and given they can't keep Navi 48 in stock, perhaps that could be plausible in these market conditions. But yeah, a 4 GHz Navi 48 with 32 GB of GDDR7 sounds like total nonsense.
They could “do” anything.

Let's say the Navi 4X was fully designed and was cancelled just before first silicon was ordered. You are now 18 months away from having a product on the shelves: 3 months to get first silicon back, 12 months of validation, and 3 more months for production silicon, assuming you are willing to spend the money on hot lots.

That makes the 9080 a 2027 product. Even AMD isn’t dumb enough to spend the money and effort to release a “high end” RDNA 4 product in 2027.
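Back-of-the-envelope version of that schedule (the stage lengths are the ones above; the restart date is made up for illustration):

```python
# Rough bring-up timeline using the stage lengths quoted above
# (hypothetical start date; assumes hot lots throughout).
from datetime import date, timedelta

start = date(2025, 6, 1)              # made-up "un-cancel it today" date
stages = [("first silicon back", 3),  # months
          ("validation complete", 12),
          ("production silicon", 3)]

t = start
for name, months in stages:
    t += timedelta(days=30 * months)  # ~30 days per month is close enough
    print(f"{name}: ~{t:%B %Y}")
# Total: ~18 months, i.e. shelves around the turn of 2026/2027.
```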
 