Friday, June 23rd 2023

Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

AMD Radeon RX 7800 XT will be a much-needed performance-segment addition to the company's Radeon RX 7000-series, which has a massive performance gap between the enthusiast-class RX 7900 series and the mainstream RX 7600. A report by "Moore's Law is Dead" makes a sensational claim that it is based on a whole new ASIC that's neither the "Navi 31" powering the RX 7900 series, nor the "Navi 32" designed for lower performance tiers, but something in between. This GPU will be AMD's answer to the "AD103." Apparently, the GPU features the exact same 350 mm² graphics compute die (GCD) as the "Navi 31," but on a smaller package resembling that of the "Navi 32." This large GCD is surrounded by four MCDs (memory cache dies), which add up to a 256-bit wide GDDR6 memory interface and 64 MB of 2nd Gen Infinity Cache memory.

The GCD physically features 96 RDNA3 compute units, but AMD's product managers now have the ability to give the RX 7800 XT a much higher CU count than that of the "Navi 32," while keeping it lower than that of the RX 7900 XT (which is configured with 84). It's rumored that the smaller "Navi 32" GCD tops out at 60 CU (3,840 stream processors), so the new ASIC would enable the RX 7800 XT to have a CU count anywhere between 60 and 84. The resulting RX 7800 XT could have an ASIC with a lower manufacturing cost than that of a theoretical Navi 31 with two disabled MCDs (>60 mm² of wasted 6 nm dies), and even if it ends up performing within 10% of the RX 7900 XT (and matching the GeForce RTX 4070 Ti in the process), it would do so with better pricing headroom. The same ASIC could even power the mobile RX 7900 series, where the smaller package and narrower memory bus would conserve precious PCB footprint.
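For readers who want to sanity-check the rumored numbers, it's simple arithmetic over per-unit figures. A quick Python sketch follows; the per-CU and per-MCD constants below are derived from the counts in this article (60 CU = 3,840 stream processors; 4 MCDs = 256-bit bus and 64 MB cache), not from official AMD documentation:

```python
# Per-unit figures implied by the article's numbers:
SP_PER_CU = 64       # RDNA 3 stream processors per compute unit (3840 / 60)
BUS_PER_MCD = 64     # bits of GDDR6 interface per memory cache die (256 / 4)
CACHE_PER_MCD = 16   # MB of 2nd Gen Infinity Cache per MCD (64 / 4)

def headline_specs(cus: int, mcds: int) -> dict:
    """Headline specs for a given compute-unit and MCD count."""
    return {
        "stream_processors": cus * SP_PER_CU,
        "bus_width_bits": mcds * BUS_PER_MCD,
        "infinity_cache_mb": mcds * CACHE_PER_MCD,
    }

# Rumored Navi 32 top configuration: 60 CU with 4 MCDs
print(headline_specs(60, 4))   # 3840 SPs, 256-bit, 64 MB

# RX 7900 XT for comparison: 84 CU with 5 active MCDs
print(headline_specs(84, 5))   # 5376 SPs, 320-bit, 80 MB
```

The new ASIC in the rumor would pair the first row's 4-MCD package with anywhere from 60 to 84 of the GCD's 96 physical CUs.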
Source: Moore's Law is Dead (YouTube)

169 Comments on Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

#151
fevgatos
AusWolfI haven't got Dota 2, but I also haven't had a single issue with any DX11 game with my RDNA 2 cards.


That's another thing I've never had with RDNA 2. It sounds like a problem with your laptop, not a problem with RDNA 2.


More like 20, but I get what you mean.
It's definitely not a problem with my laptop. The advice I got from various forums regarding the issue was to disconnect. That happened 2 weeks ago. On restarting, Windows downloads its own drivers for AMD cards. It's still happening, man.
Posted on Reply
#152
AusWolf
fevgatosIt's definitely not a problem with my laptop. The advice I got from various forums regarding the issue was to disconnect. That happened 2 weeks ago. On restarting, Windows downloads its own drivers for AMD cards. It's still happening, man.
It's the first (and so far, only) time I've ever heard about it.

Edit: In my experience, RDNA 2 is the most stable thing on AMD since the good old GCN days.
Posted on Reply
#153
fevgatos
AusWolfIt's the first (and so far, only) time I've ever heard about it.

Edit: In my experience, RDNA 2 is the most stable thing on AMD since the good old GCN days.
Really? From what I'm hearing, it's very common.
Posted on Reply
#154
AusWolf
fevgatosReally? From what I'm hearing, it's very common.
In the last year or so, I've had a 6400, a 6500 XT and a 6750 XT in 3 different systems (one AM4, one AM5 and one Intel 1200), and never had this, or any other issue with any of them.
Posted on Reply
#155
john_
fevgatosSo because you can run dx12 it's not an issue that it doesn't work with dx 11?
Don't distort what I posted.
I said I can't find people talking about it, maybe you can provide some info?
I also said if the game can run under DX12, why run it under DX11?
Again, have some info? Post it. I found nothing. Until you post info, you lie.
Exactly. I'm talking about specific cases where it's the AMD GPU at fault; I know because I've tested with the iGPU instead and it works fine.
So, RTX 4070 and RTX 3080 are AMD GPUs now?
Re-read what I posted, don't distort it with nonsense. Can you do it? Probably not.
And explain why the GPUs in the link I gave you are ALL RTX cards.
It IS an ongoing issue. Happened to me literally 2 weeks ago that I got my laptop. How is it fixed? You can doubt all you want, I don't have 1 amd laptop, I have 3 (6900hs, 6800u, 5600h).
Never happened to me. But I did tell you what the issue WAS. You keep ignoring what people reply to you, even when it is important information to you - if we are to believe that you do have AMD hardware. Unfortunately, you just want to make the usual anti-AMD posts and never admit that something got fixed.
Those AMD CPUs and APUs are the mobile parts that don't have the IO die. Those are great; that's why I buy AMD laptops. But the desktop parts with the IO die, those draw 30-40 W sitting there idle, doing nothing.
Every single Alder Lake and Raptor Lake CPU idles at 2 W and browses the web at 5-10 W on the balanced Windows power plan.

AMD Ryzen 7 7800X3D Review - The Best Gaming CPU - Power Consumption & Efficiency | TechPowerUp
Would you write ONE line that is not a lie? The above measurements are under single-thread load, not idle. Idle will obviously be lower.
Posted on Reply
#156
londiste
PumperWhat Intel CPU does that exactly? Unless you are comparing laptop i3 vs 16 core desktop AMD, then this is pure nonsense as there are exactly 0 desktop CPUs that eat only 5W on idle, let alone during video playback.
i5-8400 I had before moving to AM4 consumed 7-8W at idle.
Every chiplet AM4 CPU by default uses 12-13 W just from being a chiplet design, plus some more from the CCDs, generally ending up at 27-28 W. The 3600X, 5600X, and 5800X3D have all been doing exactly that.
5600G I had for a while, however, went nicely down to ~10W.
Posted on Reply
#157
fevgatos
londistei5-8400 I had before moving to AM4 consumed 7-8W at idle.
Every chiplet AM4 CPU by default uses 12-13W - from being chiplet - plus some more from CCDs, generally ending up at 27-28W. 3600X, 5600X, 5800X3D all have been doing exactly that.
5600G I had for a while, however, went nicely down to ~10W.
Exactly that.

Sent from my AMD hardware that I do not have because I'm lying :P
Posted on Reply
#158
john_
fevgatosExactly that.

Sent from my amd hardware that I do not have cause im lying :p
You are so excited, feeling that the above post gives you an easy way out of answering for all those lies you've thrown by now: about DOTA 2, about Diablo IV, the fact that you avoided commenting on the link I showed you where a bunch of people with RTX cards are having problems with that game, the lies about still having problems with driver installation.
So much excitement, it's almost pathetic.
Have a nice day.
londistei5-8400 I had before moving to AM4 consumed 7-8W at idle.
Every chiplet AM4 CPU by default uses 12-13W - from being chiplet - plus some more from CCDs, generally ending up at 27-28W. 3600X, 5600X, 5800X3D all have been doing exactly that.
5600G I had for a while, however, went nicely down to ~10W.
Review numbers from Wiz put the 5800X3D at 20 W under single-thread load, with some AM4 and AM5 models going even lower than that.

Based on HWMonitor, my R5 5500 even goes under 7 W while idle, hovering around 8-9 W, while my R5 4600G plays around 10 W (under 8 W min), which makes sense, considering its iGPU is also in use.
Posted on Reply
#159
fevgatos
12900K on a Discord call with a browser and 2 videos streaming: 5 W
john_You are so excited feeling that the above post gives you an easy way out to avoid answering about all those lies you threw by now. About DOTA2, about Diablo IV, the fact that you avoided commenting on the link I showed you where a bunch of people with RTX cards are having problems with that game, the lies about still having problems with driver installation.
So much excitement it's almost pathetic.
Have a nice day.


Review numbers from Wiz put the 5800X3D at 20W under single thread load and then some of AM4 and AM5 models moving lower even than that.

Based on HW Monitor my R5 5500 goes even under 7W while idle, hovering around 8-9W, while my R5 4600G plays around 10W(under 8W min), which makes sense, considering it's iGPU is also in use.
If you weren't so aggressive I might have, but it's obvious you are arguing in bad faith, so I have no interest.
Posted on Reply
#160
john_
fevgatos12900k on a discord call with a browser and 2 videos streaming, 5w


If you weren't so aggressive I might have, but it's obvious you are arguing in bad faith, so I have no interest.
You could reply just to prove me wrong and arguing in bad faith, as you say.
Still, here I am telling you that you lie, and still you can't prove me wrong. You threw out half a dozen BS claims and proved none of them. What is the above screenshot? Maybe that DOTA 2 issue you mentioned? Maybe that Diablo 4 one? Maybe that 30-40 W power consumption? Maybe those things about the driver installation?

Keep hiding behind your finger. No one will ever notice. Don't worry.
Done wasting time with you.
Posted on Reply
#161
the54thvoid
Intoxicated Moderator
For the love of Henry Winkler...

The thread title is this:

Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

Stop derailing it with off-topic arguments about other cards, other generations, and game-specific scenarios. No further warnings.
Posted on Reply
#162
sethmatrix7
Well now that the power consumption cope n' seethe is over...

I'm hoping the 7700 XT and 7800 XT end up good. The 6xxx cards are nearly gone, so they should come out soon. I'd go for a 4070, but $600 is pretty steep for a card that loses to a 6900 XT.
Posted on Reply
#163
forman313
londistei5-8400 I had before moving to AM4 consumed 7-8W at idle.
Every chiplet AM4 CPU by default uses 12-13W - from being chiplet - plus some more from CCDs, generally ending up at 27-28W. 3600X, 5600X, 5800X3D all have been doing exactly that.
5600G I had for a while, however, went nicely down to ~10W.
R7 5800H, Power Saver mode, lowest brightness: system power consumption is < 3 W. In Balanced mode, ~6 W. The problem is implementation. Manufacturers have no idea how to make use of all the power-saving features in modern AMD chips. Or maybe they just don't care.
Posted on Reply
#164
fevgatos
forman313R7 5800H, Power Saver mode, lowest brightness: system power consumption is < 3 W. In Balanced mode, ~6 W. The problem is implementation. Manufacturers have no idea how to make use of all the power-saving features in modern AMD chips. Or maybe they just don't care.
That's a mobile chip; it doesn't have the IO die that desktops do. My 6900HS also drops to 2 W.
Posted on Reply
#165
forman313
fevgatosThat's a mobile chip, doesn't have the IO die that desktops do. My 6900hs also drops to 2w.
Neither does the 5600G mentioned in the post I quoted. It's the same piece of silicon.
Posted on Reply
#166
Shelterdeck
I need a new card to replace my old and noisy 1070.
I'm hanging on (with an itchy trigger finger for a 6900 XT) for the 7800 XT.
Who amongst you would have a guess at the MSRP of that new card?
Posted on Reply
#167
AusWolf
ShelterdeckI need a new card to replace my old and noisy 1070.
I'm hanging on (with an itchy trigger finger for a 6900 XT) for the 7800 XT.
Who amongst you would have a guess at the MSRP of that new card?
There's no point in guessing. If you need a new GPU right now, just buy one. Don't wait for something that may or may not be released soon and may or may not be suited to your needs and price range. ;)
Posted on Reply
#168
Shelterdeck
AusWolfThere's no point in guessing. If you need a new GPU right now, just buy one. Don't wait for something that may or may not be released soon and may or may not be suited to your needs and price range. ;)
Well, the truth is I don't need one just now, but with the price of the 6000 series dropping daily, it's tempting to get tooled up before the Bethesda game arrives in September, and I have a sneaky feeling that the 7000 cards may, or may not, be half decent if the price is right.
Posted on Reply
#169
AusWolf
ShelterdeckWell, the truth is I don't need one just now, but with the price of the 6000 series dropping daily, it's tempting to get tooled up before the Bethesda game arrives in September, and I have a sneaky feeling that the 7000 cards may, or may not, be half decent if the price is right.
The 6000 cards are dropping in price to clear inventory for the 7000 series. No one knows what the 7700 and 7800 series will be like in price and performance, or if they'll have any issues, or when exactly they're coming, so I'd just rather get that 6900 XT if the price is right for you. Early adoption isn't really worth it these days unless you like experimenting with new and potentially unstable hardware.
Posted on Reply