
AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

btarunr

Editor & Senior Moderator
Apparently, AMD's next-generation gaming graphics card is closer to launch than anyone in the media expected, with mass-production of the so-called Radeon RX 8800 XT poised to begin later this month, if sources on ChipHell are to be believed. The RX 8800 XT will be the fastest product in AMD's next-generation lineup, positioned in the performance segment as the successor to the current RX 7800 XT. There will not be an enthusiast-segment product in this generation, as AMD looks to consolidate in the key market segments with the most sales. The RX 8800 XT will be powered by AMD's next-generation RDNA 4 graphics architecture.

There are some spicy claims being made about the RX 8800 XT. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance, which would mean a massive 45% increase in RT performance over even the current flagship RX 7900 XTX. Meanwhile, the power and thermal footprint of the GPU are expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX. Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.
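To put the rumored percentages in perspective, here is a quick back-of-the-envelope check, written as a minimal Python sketch. Only the 355 W figure is a hard number (the RX 7900 XTX's reference board power); the 25% power cut and 45% RT uplift come straight from the rumor, so the outputs are rumor-derived estimates, not specifications.

# Rumor-derived estimates only; none of these outputs are confirmed specs.
XTX_BOARD_POWER_W = 355    # RX 7900 XTX reference board power (known spec)
RUMORED_POWER_CUT = 0.25   # "25% lower board power than the RX 7900 XTX"
RUMORED_RT_UPLIFT = 0.45   # "45% increase in RT performance over the RX 7900 XTX"

rx8800xt_power_w = XTX_BOARD_POWER_W * (1 - RUMORED_POWER_CUT)
rt_vs_xtx = 1 + RUMORED_RT_UPLIFT

print(f"Implied RX 8800 XT board power: ~{rx8800xt_power_w:.0f} W")  # ~266 W
print(f"Implied RT performance vs. RX 7900 XTX: {rt_vs_xtx:.2f}x")   # 1.45x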



View at TechPowerUp Main Site | Source
 
When RX 7000 came out I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
A few years later, and probably with Sony pushing AMD in that direction, the rumors talk about a new RX 8000 series that mostly increases performance in ray tracing.
Better late than never...
 
This is all cool and dandy, but if it's priced beyond $750, then what's the point?

I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
Low? You were being too generous. RT performance on NVIDIA is extremely low; on AMD, it doesn't even exist.
 
Man, if I based my GPU purchase decision on RT performance I wouldn't buy any GPU from any company.
 
It honestly isn't worth the performance hit for a few shiny puddles.
 
When RX 7000 came out I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
A few years later, and probably with Sony pushing AMD in that direction, the rumors talk about a new RX 8000 series that mostly increases performance in ray tracing.
Better late than never...
Low compared to what? Was 3090 Ti-level RT performance not a good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance over the 30 series.

This is all cool and dandy, but if it's priced beyond $750, then what's the point?
Why would a 256-bit, 16 GB GDDR6 card with 7900 XT raster and 4080 RT performance be priced above $750?
It would make zero sense. I can already buy a 4070 Ti Super for less, which matches the 7900 XT in raster and beats it in RT.

For the 8800 XT to have a chance, it must not be more expensive than the 7800 XT was new. Meaning around $500. $550 at most.
The less it costs, the better deal it will be. $450 would be good. $400 would be amazing. I doubt it will be less than $400.
 
Meanwhile, the power and thermal footprint of the GPU are expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX.

355 W for the RX 7900 XTX minus 25% works out to roughly 270 W for the RX 8800 XT.
Still high; the cards will beg for undervolting and underclocking to keep the nasty power draw in check.




Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.

Unfortunately, the TSMC N4P node cannot be labelled "new", since it's a 2022 thingie.


 
Low compared to what? Was 3090 Ti-level RT performance not a good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance over the 30 series.


Why would a 256-bit, 16 GB GDDR6 card with 7900 XT raster and 4080 RT performance be priced above $750?
It would make zero sense. I can already buy a 4070 Ti Super for less, which matches the 7900 XT in raster and beats it in RT.

For the 8800 XT to have a chance, it must not be more expensive than the 7800 XT was new. Meaning around $500. $550 at most.
The less it costs, the better deal it will be. $450 would be good. $400 would be amazing. I doubt it will be less than $400.
3090 Ti RT was not enough, no.

4090 RT clearly isn't enough either.

This whole thing is a farce.
 
I can't wait to not get my hands on it because it's literal unobtanium.
Edit: I thought this was about a mobile GPU. I'm dumb. Well, it applies to that lol.

If the 8800 XT is truly a 7900 XT-tier card in performance and consumes 25% less power than a 7900 XTX, it means we will get single-digit performance/W improvements this gen.
 
This whole thing is a farce.
That's what happens when pro-oriented features are repurposed for consumer-segment usage. RT cores themselves are useful. RT and PT rendering is also obviously a useful thing. But real-time RT as it exists in GAMES today is little more than a gimmick.
I am not saying that RT-based engines aren't the way forward - they inevitably are, as essentially THE holy grail of real-time rendering. But the push started way, waaaaay too early. Then again, it could be argued that said push was necessary for hardware evolution, but NV at that point was already on the RT train for their architectures, so... eh. Kind of a chicken-and-egg situation.
 
Why would a 256-bit, 16 GB GDDR6 card with 7900 XT raster and 4080 RT performance be priced above $750?
Because AMD are AMD. Can't remember any of their 2020s products priced right.
 
That's what happens when pro-oriented features are repurposed for consumer-segment usage. RT cores themselves are useful. RT and PT rendering is also obviously a useful thing. But real-time RT as it exists in GAMES today is little more than a gimmick.
I am not saying that RT-based engines aren't the way forward - they inevitably are, as essentially THE holy grail of real-time rendering. But the push started way, waaaaay too early. Then again, it could be argued that said push was necessary for hardware evolution, but NV at that point was already on the RT train for their architectures, so... eh. Kind of a chicken-and-egg situation.

Right, they should have first prioritised a move to ultra-high-resolution screens with high PPI/DPI. Because obviously 1080p on 24", 27", 28", 32" screens looks extremely low quality.

Because AMD are AMD. Can't remember any of their 2020s products priced right.

And they never learn - meanwhile, their market share is spiraling downward.
 
355 W for the RX 7900 XTX minus 25% works out to roughly 270 W for the RX 8800 XT.
Still high; the cards will beg for undervolting and underclocking to keep the nasty power draw in check.
How is 270 W still high and nasty?

Nvidia used to produce 250 W flagships for generations, and they were called efficient. Now that we have 450 W flagships, 270 W is suddenly high?
270 W is a very respectable number, easily cooled by a two-slot, two-fan cooling solution with low noise levels.
Unfortunately, the TSMC N4P node cannot be labelled "new", since it's a 2022 thingie.
Care to elaborate on what high-performance CPUs or GPUs were produced in 2022 on this node?
Newer nodes are not yet ready. 3 nm will be there next year; 2 nm likely in 2026.
Because AMD are AMD. Can't remember any of their 2020s products priced right.
The only example I can think of is the 7900 XT, which cost $900 when it launched. That was too high. $999 for the 7900 XTX (the same price as the 6900 XT) was OK, considering the 4080 cost $200 more. 6000-series prices were even better, but they did not look as good next to Nvidia's, who used a cheap Samsung 8 nm node.
However, you have to be totally delusional to think that the 8800 XT will cost anywhere near $750.
 
I'm glad I semi-retired from the hobby with my 7900 XT, I just don't give a damn about ray tracing, and Physx before that. Give me simple raster, a cup of unsweetened green tea, and off to game I go lads.
 
Low compared to what? Was 3090 Ti-level RT performance not a good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance over the 30 series.
Sales, and the fact that AMD chose to retreat from the high end, prove that yesterday's good RT performance was not enough for those who were willing to pay over $800 for a graphics card. It also gave Nvidia the opportunity to use path tracing as a technology demo in which Nvidia cards looked two generations ahead of even an RX 7900 XTX.

Why would a 256-bit, 16 GB GDDR6 card with 7900 XT raster and 4080 RT performance be priced above $750?
It would make zero sense. I can already buy a 4070 Ti Super for less, which matches the 7900 XT in raster and beats it in RT.

For the 8800 XT to have a chance, it must not be more expensive than the 7800 XT was new. Meaning around $500. $550 at most.
The less it costs, the better deal it will be. $450 would be good. $400 would be amazing. I doubt it will be less than $400.
Nvidia dictates the market. It seems that some people will never understand this. If AMD comes out with an RX 8800 XT at $500 with performance close to an RTX 4080, they must be absolutely certain that, firstly, they will make a profit, and secondly, that Nvidia can't respond. If Nvidia is able to respond (and logic says they can, because everything favors them), then we will just see the RTX 5070 at $550, offering probably less VRAM, somewhat less raster performance, and probably the same RT performance, outselling the RX 8800 XT 10 to 1. We have seen it multiple times in the past. Excuses of DLSS, better RT, better features, drivers, and whatever else has been said and written against AMD over the last 15 years will be used to promote the Nvidia model over the AMD model. Seeing also how Intel lost money after pricing their GPUs too aggressively, only to go down to 0% market share, I don't expect AMD to be too aggressive with their pricing. They probably can't sustain a huge success in the gaming GPU market either, considering they are using most of their wafer capacity for EPYC, Instinct, and probably mobile Ryzen chips.
I am expecting the RX 8800 at $600 and the RTX 5070 at $700. Where AMD might try to make a difference is the sub-$350 market, because Nvidia doesn't seem to have any real interest in that market.
 
The only example I can think of is the 7900 XT, which cost $900 when it launched.

I feel like everyone has to price things high these days, because sales happen so fast now. I got my 7900 XT brand new for $705 with almost no tax, plus a free game I wanted, so $70 off that, in July 2022 I believe, or maybe it was 2023; I don't bloody remember at this point.
 
AMD finally at least trying to move out of the stone age.
 
Sales, and the fact that AMD chose to retreat from the high end, prove that yesterday's good RT performance was not enough for those who were willing to pay over $800 for a graphics card. It also gave Nvidia the opportunity to use path tracing as a technology demo in which Nvidia cards looked two generations ahead of even an RX 7900 XTX.
If you can call "achieving" barely playable 60 FPS on a card that has cost near $2,000 for most of its shelf life "looking two generations ahead", then I don't know what to say. More like two generations behind. I remember the days when I bought a flagship card (that cost less than half as much), cranked every setting to maximum, and enjoyed a buttery-smooth experience.
If AMD comes out with an RX 8800 XT at $500 with performance close to an RTX 4080, they must be absolutely certain that, firstly, they will make a profit, and secondly, that Nvidia can't respond. If Nvidia is able to respond (and logic says they can, because everything favors them), then we will just see the RTX 5070 at $550, offering probably less VRAM, somewhat less raster performance, and probably the same RT performance, outselling the RX 8800 XT 10 to 1.
The 4070 already came out at $600. You really think Nvidia would bother to lower the 5070 to $550 when AMD is not a threat to them?
Nvidia will outsell AMD regardless of whether AMD prices their card at $500 or whatever.
Seeing also how Intel lost money after pricing their GPUs too aggressively, only to go down to 0% market share, I don't expect AMD to be too aggressive with their pricing. They probably can't sustain a huge success in the gaming GPU market either, considering they are using most of their wafer capacity for EPYC, Instinct, and probably mobile Ryzen chips.
Intel's problems were not due to price. They stemmed from it being a first-gen (retail) product with major driver problems.
 
As a next-gen replacement for the 7800 XT, we should expect around a 20-30% generational increase in rasterization. That would place the chip around the 7900 XT performance level. If priced around $399 as past rumors suggested, then we might finally have a killer perf/$ and perf/W graphics product.
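As a rough illustration of that perf/$ argument, here is a minimal Python sketch. The only hard number is the 7800 XT's $499 launch MSRP; the $399 price point and the 25% uplift (midpoint of the rumored 20-30%) are unconfirmed rumor values.

# Hypothetical perf/$ comparison under the rumored numbers; illustrative only.
MSRP_7800XT_USD = 499         # RX 7800 XT launch MSRP (known)
RUMORED_MSRP_USD = 399        # rumored RX 8800 XT price, not confirmed
RUMORED_RASTER_UPLIFT = 1.25  # midpoint of the rumored 20-30% generational gain

perf_per_dollar_gain = RUMORED_RASTER_UPLIFT * (MSRP_7800XT_USD / RUMORED_MSRP_USD)
print(f"Implied perf/$ vs. RX 7800 XT at launch: {perf_per_dollar_gain:.2f}x")  # ~1.56x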
 
The only example I can think of is the 7900 XT, which cost $900 when it launched. That was too high. $999 for the 7900 XTX (the same price as the 6900 XT) was OK, considering the 4080 cost $200 more. 6000-series prices were even better, but they did not look as good next to Nvidia's, who used a cheap Samsung 8 nm node.
The whole RDNA 2 lineup failed to outperform its MSRP counterparts from NVIDIA by any significant (15+%) margin. On top of that: no DLSS, no CUDA, next-to-non-existent RT performance, plus insane power spikes.
Also, the 6500 XT. Not as bad as the GT 1630, but still ridiculous.

RDNA 3 is trickier:
The 7900 XTX looked somewhat attractive in comparison with the 4080; however, at this price point, a gamer expects more than just raw raster performance. They want to enable everything. You can't do that on a 7900 XTX. Thus, it had to launch significantly below $1,200. $850 tops.
The 7900 GRE is just an abomination and a half. At $600, it's just a meager 10 to 20% boost over the 4070, at the cost of being worse in power draw and in scenarios that aren't pure-raster gaming titles.
The 7800 XT is the same story as the 7900 XTX: NOT CHEAP ENOUGH TO CONVINCE. The 4070 is more feature-rich, and the performance difference is only visible with FPS graphs enabled. Its $100 premium is low enough.
The 7700 XT is also an abomination.
The 7600... Don't even get me started; it's awful.

When we have a market where the leading party offers products objectively better than yours (more features, more power efficiency at give-or-take the same speed), you should start a price war. That never happened; AMD just priced their products even higher than the highest they could've gotten away with. Hence the revenue fiasco.
However, you have to be totally delusional to think that the 8800 XT will cost anywhere near $750.
That's why I'm not delusional. I'm just strongly pessimistic, because AMD seem to live in a fairy tale where nothing from NVIDIA better than the 2080 Ti exists.
 
There goes Intel's opportunity to sell a theoretical B780 for $500.
 
Because AMD are AMD. Can't remember any of their 2020s products priced right.
The Radeon 5000 and GeForce GTX 1000 series were priced just fine. The GeForce RTX 2000 series introduced us to ray tracing, and that's where pricing started to get out of hand. Pricing went insane with the GeForce RTX 3000, GeForce RTX 4000, Radeon 6000, and Radeon 7000 series.
 
You had me at 4080 performance.

Hopefully it's priced right; this could be my first AMD card since the 4890.

Not really concerned about the power, but the ability to run F@H on it with good production would be nice too.
 
I'm glad I semi-retired from the hobby with my 7900 XT, I just don't give a damn about ray tracing, and Physx before that. Give me simple raster, a cup of unsweetened green tea, and off to game I go lads.
This post eerily describes me as well right down to the 7900 XT, my current GPU.
 