
[TT] AMD rumored Radeon RX 9080 XT: up to 32GB of faster GDDR7, up to 4GHz GPU clocks, 450W power

Technically speaking, there exists a scenario where AMD develops a larger RDNA4 core with a 384-bit bus, connects it to 24 GB of GDDR6, and sells a product capable of surpassing an RTX 5080 on paper.

The question here is whether they should. Considering we more often than not see the fruits of such developments reach the enterprise space first, where there's smoke there should be fire, and I'm not aware of any current smoke.

I know I would love it to exist, but I don't speak for AMD's decision makers, just for people who want to see competition at all price segments.

Such a scenario was scrapped, if the same "leakers" are to be believed, back in 2023.

If you believe that, I've got a bridge to sell you.

Yup. It's like... "source: I made it tf up."

A lot of rumors about AMD are used to get clicks. AMD has its marketing mob and acolytes. That's all I'll say about that.

But my opinion about this supposed leak is that if it is real, then it is likely a pro variant or something for use with AI stuff. Maybe it could be used for gaming, but it won't be cheap.
AMD could already have sold the 9070 XT for $599 but chose to go $100 higher, and everywhere I have looked you can't even get the 9070 XT near MSRP. The AMD fans who wanted cheaper cards are only enabling higher prices by buying AMD GPUs at $200-$400 over MSRP. It just makes their argument really empty.

I still don't think the GPU will come close to a 5090 in performance. And at 450 watts, it again kills the AMD fan argument of "NVIDIA bad, AMD good".

My 4090 has a 450-watt rating but rarely ever comes close, even in games with high demand. You really need to bump the resolution well above 4K to push it into the 400-watt range.

Radeon AI Pro R9700: 32 GB, the exact same core used in the 9070 XT. RTX 4080-like inferencing performance with double the VRAM, meaning it will load larger models that 16 GB cards won't. But don't expect it to run Mistral or DeepSeek at 200 tokens a second, because it won't.

Looks like a Pro card to me, not a consumer card.

No such card exists or will exist, regardless of the segment. Instinct won't use RDNA IP.
 
Isn't the Pro R9700 an RDNA4 card? That is what I mean by Pro; I should have qualified it as prosumer, not 100% enterprise like Instinct.
 

Yes, it is exactly the same product as the RX 9070 XT, down to the last transistor. The only difference is clamshell memory, so double the capacity. Throughput and AI training performance are the same, although with 32 GB you can load larger LLMs. That's where it ends, really. If they had finalized Navi 42, they'd certainly have made that known.
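Rough back-of-the-envelope math on what 32 GB buys you for local LLMs (the ~4.5 bits/weight figure and 20% overhead are assumptions for a typical Q4-style quantization; real overhead varies a lot with context length):

```python
# Rough VRAM estimate for running an LLM locally: weights plus a
# fudge factor for KV cache and runtime buffers. All numbers are
# approximations, not measurements.

def estimate_vram_gib(params_billion: float, bits_per_weight: float,
                      overhead_fraction: float = 0.2) -> float:
    """Approximate VRAM needed to hold the weights plus overhead, in GiB."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead_fraction) / 2**30

for name, params in [("7B", 7), ("14B", 14), ("32B", 32), ("70B", 70)]:
    print(f"{name} model @ ~4.5 bits/weight: ~{estimate_vram_gib(params, 4.5):.0f} GiB")

# 7B  -> ~4 GiB,  14B -> ~9 GiB : fine on a 16 GB card
# 32B -> ~20 GiB : needs the 32 GB card
# 70B -> ~44 GiB : doesn't fit on either without offloading
```

Which lines up with the point above: the extra VRAM is about fitting bigger models, not running them faster.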

I'm just hoping the pricing on the R9700 is sensible.
 
Yeah, please stop with the copium, folks. AMD is done with this gen after N44.

We're more likely to see RDNA5/UDNA pulled forward to mid-2026 than any more N4x chips.

If AMD could juice N48 to 4 GHz, they would have done it already.
The way I interpreted the rumor:
AMD designed a GDDR7 N48, but it wasn't viable
- until GDDR7 production increased and supply opened up.

The part about a mid-gen node change, though? I'm not so sure about that. Some nodes don't require reworking the uArch, others do. Not sure there.
 
Sure, MLID. And I'm the King of England. Enjoy the ad revenue from the droves of idealistic dreamers who genuinely think AMD can conjure such a card out of thin air. I guess if you ask the genie in the bottle really nicely, it's probably gonna happen. I'll eat my words if they ship anything with Navi 40.



Yup, my thoughts exactly. It's a good thing that dreaming is free, because the chance of running GDDR7, especially the higher 32 Gbps bin, off a GDDR6 PHY is about as high as me actually being the King of England.



There are two things to account for here: one is actual VRAM usage vs. allocation, and the other is that Monster Hunter Wilds has the most absolute dogshit port in recent memory, with grotesque RAM and VRAM requirements.

With all the commotion over VRAM recently, I've actually decided to do some research to gauge the viability of low-memory dGPUs, and I happened to have the perfect test subject on hand. If I manage to come up with a balanced suite of games that makes sense, I will make a thread about it sometime, but what I can tell you is that I've found the claims that 8 GB GPUs are no longer operable to be somewhere between greatly exaggerated and utter tripe. Most games will run on 4 GB with low enough settings, but here's a taste:

With a healthy dose of DLSS (25% scale) and low settings, I was able to run Black Myth Wukong, at 4K, on a GPU that's basically a mere step above what you'll find on a Nintendo Switch 2:

View attachment 401934
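For context on how aggressive that upscaling is, assuming the 25% is a per-axis render scale (like DLSS Ultra Performance's 33%), the internal resolution works out to:

```python
# Internal render resolution at a given per-axis upscale ratio.
output_w, output_h = 3840, 2160   # 4K output
scale = 0.25                      # 25% per-axis render scale (assumed)

render_w, render_h = int(output_w * scale), int(output_h * scale)
pixel_fraction = (render_w * render_h) / (output_w * output_h)

print(f"Internal resolution: {render_w}x{render_h}")           # 960x540
print(f"Fraction of 4K pixels rendered: {pixel_fraction:.1%}")  # 6.2%
```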

Here, I'm providing the CapFrameX profiling data of this benchmark run as well (JSON included in the zip attached to this post, if you want to load it on the software yourself):

View attachment 401936
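If you'd rather crunch the numbers yourself instead of loading the JSON into CapFrameX, something like this works. The key names ("Runs", "CaptureData", "MsBetweenPresents") are assumptions based on a typical PresentMon-style CapFrameX export, and the filename is just a placeholder; adjust both to match the file in the zip:

```python
# Quick average FPS / 1% low from a CapFrameX JSON export.
# Key names below are assumed; adjust them to your file if they differ.
import json
import statistics

with open("capframex_export.json") as f:   # placeholder filename
    capture = json.load(f)

frame_times_ms = []
for run in capture.get("Runs", []):
    frame_times_ms.extend(run["CaptureData"]["MsBetweenPresents"])

fps_sorted = sorted(1000.0 / ms for ms in frame_times_ms if ms > 0)
avg_fps = 1000.0 / statistics.mean(frame_times_ms)
one_percent_low = fps_sorted[int(len(fps_sorted) * 0.01)]  # 1st-percentile FPS

print(f"Frames captured: {len(frame_times_ms)}")
print(f"Average FPS:     {avg_fps:.1f}")
print(f"1% low FPS:      {one_percent_low:.1f}")
```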

MH Wilds, on the other hand, I didn't even bother with. It was a total write-off from the start: the game is simply not functional, its memory management is terrible, and it actively malfunctions with all sorts of performance, shading, and texturing issues in a VRAM-limited scenario. It also crashes extremely frequently; only one run completed out of the five I tried.

View attachment 401938

This indicates a problem with MH Wilds more than a problem with low-VRAM hardware in itself. This game has a very bad port, and we, as gamers, must demand better.
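For anyone wanting to do similar testing, logging VRAM in use over a run is easy on NVIDIA hardware with the nvidia-ml-py (pynvml) bindings. Note that this reports device-wide memory in use, i.e. allocations, which is not the same as what a game strictly needs to run well (the usage-vs-allocation distinction above). A minimal sketch:

```python
# Log total VRAM in use once per second while a game or benchmark runs.
# Requires the nvidia-ml-py package (pip install nvidia-ml-py).
# Reports device-wide "used" memory, which includes allocations a game
# may never actually touch.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{time.strftime('%H:%M:%S')}  "
              f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```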

It was never really a question of whether or not 8 GB VRAM GPUs can run games. Of course they can - well, there might be some very vocal people hallucinating otherwise, but we can ignore those. The question is whether or not 8 GB cards can play without major visual quality compromises. The answer is yes in the majority of cases.
Sure, you can cut down some games to fit within 8 GB of VRAM, but you can lose visual quality by doing so.

See these recent YouTube videos about how 8 GB cards are no longer viable in recent games.



The Hardware Unboxed benchmarks show that 8 GB cards are no longer viable in plenty of modern games, and 12 GB cards are pushing it on viability. That video also shows the difference between 8 GB and 16 GB video cards that use the same GPU.

Sure, JayzTwoCents was later contradicted when AMD and its board partners introduced the RX 9060 XT with both 8 GB and 16 GB options after the video was recorded; at the time, we believed that AMD had abandoned 8 GB video cards. However, his failed gameplay demonstrations with 8 GB video cards show that 8 GB cards are no longer viable.

As for the Monster Hunter Wilds VRAM situation, I am not happy with it. I along with many other people who enjoy Street Fighter 6 are speculating on why its upcoming Elena and Season 3 DLC bundle is taking so long to release. It is very possible that Capcom had to pull developers from Street Fighter 6 to work on Monster Hunter Wilds. Both Street Fighter 6 and Monster Hunter Wilds run on Capcom's RE Engine, and Street Fighter 6 is a very well optimized game while Monster Hunter Wilds is a mess.
 
No, HUB has shown no such thing. He has shown that it's no longer viable for ultra settings in all games. Neither is my 4090 at 4K, but that doesn't mean it's not a viable 4K GPU.
 
Meanwhile, the developers, after you already own an RTX 5090 and a 9800X3D, release this statement in an interview about the game:
"You may need to upgrade your rig to play this game."
(To what?)
 
So long as "Jensen's Law" is circular in nature and factors in e-waste...
 
No, HUB has shown no such thing. He has shown that it's no longer viable for ultra settings in all games. Neither is my 4090 at 4K, but that doesn't mean it's not a viable 4K GPU.
So 8 GB video cards are viable for PC gaming at low resolutions where lots of things are turned down in some of today's games. I misinterpreted the video because it is hard to remember everything when I have to listen and watch at the same time, but the video does show times when low VRAM causes the game to grind into a slideshow, which is something that a written article cannot convey very well. When I found the article version of the Hardware Unboxed video at https://www.techspot.com/review/2856-how-much-vram-pc-gaming/, I was able to interpret it a lot better and get your point.

However, I find that kind of gaming hardware pointless if I can buy a PlayStation 5 Pro for cheaper than a PC with an 8 GB video card and get a better experience than with such an imbalanced PC. I also want to consider the longevity of the hardware when I buy it; buying stuff that is viable today but becomes obsolete in a short amount of time is a waste of time and money. With the PlayStation 6 likely coming soon, developers are likely to target it as the main design goal. The PS5 already has 16 GB of unified CPU and GPU RAM (with the PS5 Pro adding 2 GB of CPU-only RAM), so I would expect at least 16 GB of memory that the PlayStation 6's GPU will be able to use.

I also use 4K not only for gaming but also for home computing productivity, so gaming would take up more VRAM due to my monitor's 4K resolution. A dedicated high-speed esports computer might use a 1080p or 1440p monitor, but home computing and work productivity (some people who work at home use their PCs for work) will suffer because reading web pages and documents is worse at the lower resolution. Also, some positions in esports, like a team's sniper, would greatly benefit from a high-resolution monitor in order to spot enemies.

On another note, I would not recommend buying PlayStation 5 Slims after several of them failed at recent fighting game tournaments such as Evo Japan and Combo Breaker due to overheating from being used for extremely long times. The overheating incidents caused several console hardware malfunctions, interrupting and fouling up matches at these tournaments. I have read that PlayStation 5 Pros have better cooling hardware, so they should be able to withstand being left on for long periods without overheating, unlike PlayStation 5 Slims.
 
Yup. It's like... "source: I made it tf up."
I agree. He was, like, infinitely close to 100% sure that Intel Battlemage was dead, and now the B580 Steel Legend is consistently in stock for $300 where I live.
Afterwards he kept going on about how it was impossible for Intel to deliver another GPU and how G31 died long ago, but now we have news stating that G31 is shipping to the AIB manufacturers.
He has made a clown of himself, and he hates Intel like how UserBenchmark hates AMD.
Graphically Challenged makes stuff up as well (mostly assumptions, and he did highlight that), but at least he never hates on things the way MLID does.
 
Who believes that? I do not.

Short: hoax. Does not exist. Will never be sold.
 
States rumor...

Anywho, with the AI 9070 being available, I don't believe there will be a 9080. I believe the next cards will be UDNA.
 
The way I interpreted the rumor:
AMD designed a GDDR7 N48, but it wasn't viable
- until GDDR7 production increased and supply opened up.

The part about a mid-gen node change, though? I'm not so sure about that. Some nodes don't require reworking the uArch, others do. Not sure there.
There are other reasons that a high end AMD card is much more viable than expected.

Nvidia destroyed what makes GeForce a GeForce with Blackwell. The GeForce brand used to stand for things that other cards did not have: great drivers, great backwards compatibility with older games, CUDA, PhysX, good OpenCL 1.x and 3.x support, and cards that just work. Driver quality is greatly declining, and Blackwell has gotten reports of poor performance in Direct3D 9 and older APIs; Nvidia used to be the champion of old-API compatibility and performance. No more.

32-bit CUDA support was removed from Blackwell; only 64-bit CUDA remains. Since Nvidia used shims to translate OpenCL and PhysX into CUDA, removing 32-bit CUDA also removed 32-bit OpenCL and 32-bit PhysX. Now AMD is the champion of OpenCL: it runs not only 32-bit OpenCL but also 64-bit OpenCL, and it supports OpenCL 2.2, while Nvidia never fully supported any OpenCL 2.x version. Those who depend on 32-bit CUDA are stuck porting their 32-bit CUDA and/or 32-bit OpenCL code to OpenCL and using a competitor's GPU, porting it to DirectCompute (who knows how slowly 32-bit DirectCompute will run on Blackwell?), or porting it to 64-bit CUDA or Vulkan (will 32-bit Vulkan even run on Blackwell? If so, how slowly?). Nvidia has also quit making Ada Lovelace GPUs, so buying a new GeForce that has almost everything that makes a GeForce a GeForce is nearly impossible. And having cards that melt their power connectors is giving the GeForce brand bad press about being a literal fire hazard.

With the GeForce brand destroyed by Blackwell, everything that used to lock GeForce users to the brand is gone, and people are now willing to look at competitors since the vendor lock-in has mostly evaporated. AMD had no reason to expect Nvidia to commit brand suicide with Blackwell; otherwise it would have designed, readied, and released high-end Radeon RX 9000 series GPUs to capitalize on and punish Nvidia's Blackwell blunder.
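If you want to check what your own driver actually advertises, enumerating OpenCL platforms and their reported versions takes a few lines with pyopencl (this only shows what the ICD reports, not how well 32-bit workloads actually behave):

```python
# List OpenCL platforms/devices and the versions their drivers report.
# Requires the pyopencl package and a working OpenCL ICD.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"    Device OpenCL version: {device.version}")
        print(f"    OpenCL C version:      {device.opencl_c_version}")
```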
 
this is the AMD thread, why you here bruh, we all know you Nvidia boyo
The incursion of AMD stalwarts into Nvidia threads happens every time, without fail. So on that basis I have to assume what you said was an attempt at humour/irony, surely.

As for this 'rumor', it's MLID, so here I have to assume based on his track record that it's effectively worthless beyond entertainment (for those who see it as such, I also find his voice to be nails on a chalkboard) as far as consumers are concerned.
 

Yes, it was humor. @nguyen and I are friends, at least I consider us friends.
 
I see no point in this card if all it can take on is the RTX 5080.
It would only be noteworthy if it could take on the RTX 5090;
otherwise it's a complete waste of time.
It's better to move on to UDNA1/RDNA5 and get a big design out from that.
 

4090-level performance with a smaller die and a more reasonable (but still high) price would be good; that's sort of a no man's land in the current market that would give AMD plenty of room for margins and still let them claim the "for actual gamers" brand credit.

Not going to happen but I can dream.
 
The incursion of AMD stalwarts into Nvidia threads happens every time, without fail. So on that basis I have to assume what you said was an attempt at humour/irony, surely.

As for this 'rumor', it's MLID, so here I have to assume based on his track record that it's effectively worthless beyond entertainment (for those who see it as such, I also find his voice to be nails on a chalkboard) as far as consumers are concerned.

I try to blow up threads on both sides because honestly fanboys on both sides are ridiculous:laugh: :laugh: :laugh: :toast:


Me rolling into every Nvidia/AMD thread.....
Lord Of The Rings Burn GIF by Amazon Prime Video



As far as this card goes, I would ask what the point is unless AMD thinks they can charge $1,200-1,500 for it; otherwise it would be seen as a waste of time and resources for them. Could they be working on a refresh of RDNA4? Sure, we all know both AMD and Nvidia love their refreshes, but otherwise #Doubt.
 
Yes, it was humor. @nguyen and I are friends, at least I consider us friends.

There is no point in making enemies just because we like different things anyway :roll:.

BTW, AMD fans had better pony up for the 32 GB VRAM version; looks like 16 GB on the 9070 XT might not cut it for 4K RT with FSR.

4k RT.jpg
 
I don't think this would happen, otherwise AMD would have already launched it.
 
The 9070 XT is using less VRAM, but neither of those cards provides what I would consider acceptable for 4K.

It's using less because a bunch is spilling over into system RAM. Something with AMD's memory management is broken in that game, is what it looks like to me.
 
Interesting, I had no idea Spider-Man 2 was broken on AMD cards; more VRAM wouldn't help, then. I don't watch much of their channel, so I didn't know what the video was, but I see it's a re-review.
The 9070 XT and 5070 Ti are very close at 1440p.
 