Friday, June 23rd 2023

Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

AMD Radeon RX 7800 XT will be a much-needed performance-segment addition to the company's Radeon RX 7000-series, which has a massive performance gap between the enthusiast-class RX 7900 series and the mainstream RX 7600. A report by "Moore's Law is Dead" makes a sensational claim that it is based on a whole new ASIC that's neither the "Navi 31" powering the RX 7900 series, nor the "Navi 32" designed for lower performance tiers, but something in between. This GPU will be AMD's answer to the "AD103." Apparently, the GPU features the exact same 350 mm² graphics compute die (GCD) as the "Navi 31," but on a smaller package resembling that of the "Navi 32." This large GCD is surrounded by four MCDs (memory cache dies), which amount to a 256-bit wide GDDR6 memory interface and 64 MB of 2nd Gen Infinity Cache memory.

The GCD physically features 96 RDNA3 compute units, but AMD's product managers now have the ability to give the RX 7800 XT a much higher CU count than that of the "Navi 32," while keeping it lower than that of the RX 7900 XT (which is configured with 84). It's rumored that the smaller "Navi 32" GCD tops out at 60 CU (3,840 stream processors), so the new ASIC will enable the RX 7800 XT to have a CU count anywhere between 60 and 84. The resulting RX 7800 XT could have an ASIC with a lower manufacturing cost than that of a theoretical Navi 31 with two disabled MCDs (>60 mm² of wasted 6 nm dies), and even if it ends up performing within 10% of the RX 7900 XT (and matching the GeForce RTX 4070 Ti in the process), it would do so with better pricing headroom. The same ASIC could even power the mobile RX 7900 series, where the smaller package and narrower memory bus would conserve precious PCB footprint.
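To put the rumored configurations in perspective, here is a rough back-of-the-envelope sketch in Python. It assumes the conventional 64 stream processors per RDNA3 CU (consistent with the 60 CU = 3,840 SP figure above); the intermediate 70 and 80 CU entries are purely illustrative, not leaked specs.

```python
# Back-of-the-envelope: stream-processor counts for possible RX 7800 XT configs.
# Assumes 64 stream processors per RDNA3 CU, matching the 60 CU = 3,840 SP
# figure in the article; the 70/80 CU entries are illustrative guesses only.

SP_PER_CU = 64

configs = {
    "Navi 32 full (rumored)": 60,
    "Hypothetical RX 7800 XT (low)": 70,
    "Hypothetical RX 7800 XT (high)": 80,
    "RX 7900 XT": 84,
    "Navi 31 full (RX 7900 XTX)": 96,
}

for name, cus in configs.items():
    print(f"{name}: {cus} CU -> {cus * SP_PER_CU} stream processors")
```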
Source: Moore's Law is Dead (YouTube)

169 Comments on Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

#76
kapone32
Dr. DroIt's an invalid comparison because they aren't the same architecture and don't work in a similar way. Remember back in the Fermi vs. TeraScale days, the GF100/GTX 480 GPU had 480 shaders (512 really, but that config never shipped) while a Cypress XT/HD 5870 had 1600... nor can you go by the transistor count estimate, because the Nvidia chip has several features that consume die area, such as tensor cores and an integrated memory controller and on-die cache, that the Navi 31 design does not (with L3 and IMCs being offloaded onto the MCDs and the GCD focusing strictly on graphics and the other SIPP blocks). It's a radically different approach in GPU design that each company has taken this time around, so I don't think it's "excusable" that the Radeon has fewer compute units, because that's an arbitrary number (to some extent).

If you ask me, I would make a case for the N31 GCD being technically a more complex design than the portion responsible for graphics in AD102. And of course, the 7900 XTX can never get close unless you pump double the wattage into it.
Thing is my card is just as fast as yours at RT and I bet I paid less for my card, too.
Posted on Reply
#77
nguyen
kapone32Yep this is definitely an AMD titled GPU post. You would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000 card can: both cards can run 4K 144Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor usage or video playback issues are so small that people forget that the message before the cards were launched was to get a nice beefy PSU to avoid any issues with power, and I hope none of the people whining about power draw own an OLED panel. There is no foreknowledge of this card, so saying it will be X, Y or Z is just conjecture. I will say the last time AMD showed confidence the Internet barbecued them, and some of the negative hype is directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at anything at 1440p anyway. Let's keep in mind that the 6700XT is not far away from a 6800 at 1440p.
4K 144hz LMAO
Take another guess maybe, barely 60FPS in 2023 games :roll:



Without relying on Upscaling, 7900XT/X won't have the grunt going forward @ 4K60FPS (much less 4K144FPS), and that's where DLSS vs FSR will come into the equation

Edit: forgot Jedi
Posted on Reply
#78
Dr. Dro
kapone32Thing is my card is just as fast as yours at RT and I bet I paid less for my card, too.
Yes, you paid less for your card, but I bought mine before even the 6900 XT released (as it was the last model to come out). That was Sept. 24 2020, almost 3 years ago. See the issue at hand? ;)
Posted on Reply
#79
kapone32
nguyen4K 144hz LMAO
Take another guess maybe, barely 60FPS in 2023 games :roll:



Without relying on Upscaling, 7900XT/X won't have the grunt going forward @ 4K60FPS (much less 4K144FPS), and that's where DLSS vs FSR will come into the equation
Yep well I guess there have been improvements since launch. It never ceases to amaze me how people will argue with owners on how their cards are supposed to perform.
Dr. DroYes, you paid less for your card, but I bought mine before even the 6900 XT released (as it was the last model to come out). That was Sept. 24 2020, almost 3 years ago. See the issue at hand? ;)
It is all relative. I am happy with my purchase and that is all that should matter. Are you not happy with yours?
Posted on Reply
#80
Dr. Dro
kapone32Yep well I guess there have been improvements since launch. It never ceases to amaze me how people will argue with owners on how their cards are supposed to perform.


It is all relative. I am happy with my purchase and that is all that should matter. Are you not happy with yours?
I sure am! But I resent that there's no upgrade path other than the 4090, three years later :(
Posted on Reply
#81
Bomby569
I totally disagree with this argument that only people who have card X can talk about card X; that is absurd. As if this is some ultra-secret thing and we can't all know everything about any card.
Posted on Reply
#82
the54thvoid
Intoxicated Moderator
People can definitely talk about things they don't own - but folks can stop trolling. If your input isn't constructive to the OP, don't waste your keycaps.
Posted on Reply
#83
Beginner Macro Device
Yet I also disagree with the concept that "if you don't own X, you can't discuss X"; it's as invalid as it gets.
Posted on Reply
#84
Assimilator
I love how people are using this image as evidence that AMD isn't rubbish at low-load power consumption. Look at the bottom 5 worst GPUs. LOOK AT THEM. Who makes them?

Then look at the 3090 and 3090 Ti. They're near the bottom, yet their successors are in the middle of the pack - almost halving power consumption. It's like one company is concerned with making sure their product is consistently improving in all areas generation-to-generation, while the other is sitting in the corner with their thumb up their a**.

The fact that AMD has managed to bring down low-load power consumption since the 7000-series launch IS NOT something to praise them for, because if they hadn't completely BROKEN power consumption with the launch of that series (AND THEIR TWO PREVIOUS GPU SERIES), they wouldn't have had to FIX it.

Now all of the chumps are going to whine "bUt It DoESn'T mattER Y u MaKIng a fUSS?" IT DOES MATTER because it shows that one of these companies cares about delivering a product where every aspect has been considered and worked on, and the other just throws their product over the fence when they think they're done with it and "oh well it's the users' problem now". If I'm laying down hundreds or thousands of dollars on something, I expect it to be POLISHED - and right now only one of these companies does that.
Bomby569I totally disagree with this argument that only people who have card X can talk about card X; that is absurd. As if this is some ultra-secret thing and we can't all know everything about any card.
It's an incredibly lazy non-argument used by those who are intellectually bankrupt. The best course of action is to ignore said people.
Posted on Reply
#85
kapone32
I am not against discussing other products, but making desultory comments with no context is my issue. You will never hear me complain about the performance of Nvidia cards, and I can talk about my opinion on ray tracing, DLSS and FSR, but I can't say one is better, as I have not used them all.
Posted on Reply
#86
EatingDirt
john_Another person pointed to the newer numbers, but you probably didn't read the later posts. That being said, the over-40 W power consumption in TPU's review is still pretty high. Please read the rest of the posts before replying. I'm not going to repeat what is already written.

As for the 7600, you are missing the point about performance. More efficient? What do you base this on?
Do you not actually read any of the reviews on this site? You seem to run off of old information, or just plain wrong information. The efficiency of the 7600 over the 6650XT is clear as day in the review on TPU. Sure, the 6650XT isn't in the 'efficiency' chart, but the more power-efficient 6600XT is, which itself is around 15% more efficient than the 6650XT was.

I was being kind when I said the 7600 is 20% more efficient. It's probably closer to 30%.
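Just to illustrate how those relative-efficiency percentages chain together, here is a rough sketch. The 15% figure is the one quoted above; the 7600-vs-6600 XT margin is a hypothetical input used only for illustration, not a measured number.

```python
# Illustration only: relative efficiency gains combine multiplicatively.
# 15% (6600 XT over 6650 XT) is the figure quoted in this post; the
# 7600-over-6600 XT margin below is hypothetical, not measured data.

def combined_gain(gain_a: float, gain_b: float) -> float:
    """Chain two relative gains expressed as fractions (0.15 = +15%)."""
    return (1 + gain_a) * (1 + gain_b) - 1

gain_6600xt_over_6650xt = 0.15   # quoted above
gain_7600_over_6600xt = 0.13     # hypothetical example value

total = combined_gain(gain_7600_over_6600xt, gain_6600xt_over_6650xt)
print(f"Implied 7600 vs 6650 XT efficiency gain: ~{total:.0%}")  # ~30%
```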
Posted on Reply
#87
oxrufiioxo
Dr. DroI sure am! But I resent that there's no upgrade path other than the 4090, three years later :(
Yeah, that does suck. I expected the 4080 to be 800 USD and a decent upgrade, but at its price it really sucks. I also expected the 7900 XTX to be better after how good the 6900 XT/6950 XT were, and was disappointed, leaving me with the 4090 as my only decent upgrade option. These are honestly, in my opinion, the most meh high-end products from both companies due to pricing.

The 7800 XT is shaping up to be a really meh product: slightly faster than a 4070 in raster but worse at every other metric, likely for around the same price. I guess we already have the real 7800 XT in the 7900 XT, which is just priced stupidly but starting to make more sense in the low 700s.
Posted on Reply
#88
Blitzkuchen
No need for this garbage; both Nvidia and AMD did a level up.

7600 300€ vs 6650 249€
RTX 4060 6GB 350€ vs RTX 3060 12GB 300€


Arc A750 8GB 229€
Arc A380 6GB 134€: it's still cheaper with its 6GB than garbage and old cards like the RX 6400 139€, GTX 1630 148€, GTX 1650 172€
Posted on Reply
#89
Vya Domus
Assimilatorbecause if they hadn't completely BROKEN power consumption with the launch of that series
I have never had this problem, and I have had the card since launch.
Posted on Reply
#90
john_
EatingDirtDo you not actually read any of the reviews on this site? You seem to run off of old information, or just plain wrong information. The efficiency of the 7600 over the 6650XT is clear as day in the review on TPU. Sure, the 6650XT isn't in the 'efficiency' chart, but the more power-efficient 6600XT is, which itself is around 15% more efficient than the 6650XT was.

I was being kind when I said the 7600 is 20% more efficient. It's probably closer to 30%.
Just go one page before the one you point to in that review and explain those numbers to me.
AMD Radeon RX 7600 Review - For 1080p Gamers - Power Consumption | TechPowerUp

Before answering, remember this:
The 7600 is a new arch on a new node; both should bring better efficiency for the 7600. But what do we see? In some cases the 6600 XT has lower power consumption, in some cases the 7600 does.

Unfortunately I have to agree with @Assimilator in his above comment. And I am mentioning Assimilator here because he knows my opinion about him. Simply put, he is on my ignore list!
But his comment here is correct.
The fact that AMD has managed to bring down low-load power consumption since the 7000-series launch IS NOT something to praise them for, because if they hadn't completely BROKEN power consumption with the launch of that series (AND THEIR TWO PREVIOUS GPU SERIES), they wouldn't have had to FIX it.
...
it shows that one of these companies cares about delivering a product where every aspect has been considered and worked on, and the other just throws their product over the fence when they think they're done with it and "oh well it's the users' problem now".
When AMD was on a tight budget, think the Polaris period, this was almost their standard way of doing business: never power efficiency and performance improvements in the same generation, only one of those. Now that they DO have money they should have been able to do that. Work the final product in all aspects and come up with a new product that is better in everything compared to the previous one. But unfortunately they focused so much on CPUs that they lost the boat with GPUs. And NOW, with the AI explosion, they are realizing that GPUs are becoming more important. But are they going to throw money at all GPU architectures or only at Instinct? If they focus only on Instinct they will lose the next generation of consoles. At least Microsoft, who I bet has realized by now that to beat PlayStation they need to do something drastic. I hope AMD fears the possibility of MS or Sony or both going with Intel and Nvidia next gen. Those would be more expensive consoles to make, but gaming cards in PCs are also much more expensive, so higher prices on consoles aren't something we should rule out as a possibility.
Vya DomusI have never had this problem, and I have had the card since launch.
You have the card; we are looking at the numbers from the TPU review. Either your card works as it should, which would be great, and something went wrong with the review numbers, or something else is happening.
Posted on Reply
#91
rv8000
john_Probably referencing AMD's slide where it says that Navi 31 is designed to go over 3.0 GHz, and the higher power consumption in some scenarios (if I am not mistaken: idle, multi-monitor, video playback, something like that).


I do read many tea leaves here; you, on the other hand, don't read what I wrote correctly. :p I never compared the 6800 XT with the 7900 XT.

Also, the 7600 should have been as fast as the 6700, just as a generational rule where the x600 should be as fast as or faster than the previous x700, in both Nvidia's and AMD's case.
But the problem is not 7600 vs 6700. It's 7600 vs 6650 XT. Same specs, same performance, meaning RDNA2 to RDNA3 = minimal gains.
Do you bring this bad take into every AMD thread? There are PROVEN performance gains, and PROVEN efficiency gains. Off the top of my head, 6900 XT vs 7900 XT in Cyberpunk 2077: RDNA3 is roughly 25% more efficient in that specific scenario per TPU's own benchmarks. Please stop with the misinformation.
Posted on Reply
#92
Bomby569
john_Now that they DO have money they should have been able to do that.
money doesn't solve those problems, ask Intel
Posted on Reply
#93
john_
rv8000Do you bring this bad take into every AMD thread? There are PROVEN performance gains, and PROVEN efficiency gains. Off the top of my head, 6900 XT vs 7900 XT in Cyberpunk 2077: RDNA3 is roughly 25% more efficient in that specific scenario per TPU's own benchmarks. Please stop with the misinformation.
It's funny. I have been an AMD fan for over 20 years, always choosing AMD hardware when given the option; an RX 6600 at 190 euros does look like a nice little upgrade for my RX 580, and I really regret that the 7600 in Greece is still overpriced (over 280 euros). All my posts are usually AMD-friendly. But expressing STRONG CONCERN about the company, its choices, where it is going, and whether it will be able to remain a true competitor to Intel and Nvidia for years to come is enough of an excuse for you, who obviously have no idea about my posts, to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People were expecting the 7900 series to be killers, in raster at least and at lower power consumption. Based on TPU numbers, which probably everyone here disputes, the 7900 XT is on average 14% faster than the 6950 XT with more shading units, more ROPs, more RT cores, more VRAM, and more memory bandwidth. It's RDNA3 and it still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
Posted on Reply
#94
Vayra86
john_Look, we see things differently. And it's not about 50W of power in gaming, for example, or in Furmark. It's in video playback. I mean, why waste an extra 30W of power while watching a movie? I am not 15 years old anymore, playing games all day. Today I will spend more time watching YouTube videos and movies than gaming. So power consumption in video playback is important for me. Does AMD have the luxury to tell me "If you want low power consumption in video playback, go and buy a competing card"? As I said, AMD should start looking for advantages that it CAN achieve, not trying to play catch-up with Nvidia with Nvidia dictating the rules. Where is FreeSync 3.0 with Frame Generation? I am throwing that out as an example. AMD should be looking at improving its hardware in various ways, not just trying to follow where Nvidia wants to drive the market.
Well, if you go about life wondering if you might waste 30W somewhere, I think you're on a good path to suck all the fun out of it :D

I don't think AMD is trying to play catch up; I think we have lots of examples of them really doing their own thing to carve out a unique competitive position for the company.
- When Nvidia had RTX in play, AMD response was 'that's cool, but it won't go places until it gets mainstream market reach - midrange GPUs need to carry it comfortably'.
- RDNA2 had RT support but barely sacrifices hardware / die size for it.
- RDNA3 much the same even with what it does have.

Meanwhile, they rolled out a path for RDNA that was focused on chiplets. We're too early in the race here to say whether they made the right bet. But here are the long-term trends we are looking at:
- Chiplets let AMD catch up to Intel where monolithic designs could not.
- Chiplets now enable them to remain close to Nvidia's top end with, frankly, not a whole lot of trouble.
- Raster performance clearly remains a high-performance affair; looking at newer engines, you'll keep a strong hardware requirement on it, with or without RT.
- RT performance seems easy to lift through gen-to-gen general performance gains. It just takes time; at the same time, there is only a handful of titles that you can only really play well on Nvidia. Even today, 3.5 gens past RT's release, AMD's laid-back approach to RT implementation isn't really breaking them.
- They still own the consoles and therefore have a final say in widespread market adoption wrt RT.

I'm really not seeing a badly positioned company or GPU line-up here overall. That idea just applies to a minority that chases the latest and greatest in graphics, the early-adopter crowd, because that's really still where RT is at. AMD isn't playing catch-up. They're doing a minimal-effort thing on the stuff they don't care much about, which seems like the proper rationale to me.
john_It's funny. I have been an AMD fan for over 20 years, always choosing AMD hardware when given the option; an RX 6600 at 190 euros does look like a nice little upgrade for my RX 580, and I really regret that the 7600 in Greece is still overpriced (over 280 euros). All my posts are usually AMD-friendly. But expressing STRONG CONCERN about the company, its choices, where it is going, and whether it will be able to remain a true competitor to Intel and Nvidia for years to come is enough of an excuse for you, who obviously have no idea about my posts, to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People were expecting the 7900 series to be killers, in raster at least and at lower power consumption. Based on TPU numbers, which probably everyone here disputes, the 7900 XT is on average 14% faster than the 6950 XT with more shading units, more ROPs, more RT cores, more VRAM, and more memory bandwidth. It's RDNA3 and it still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
You say barely anything happened between RDNA2 and 3, but...
the energy efficiency is virtually on par, both last and current gen. And the only GPU they can't match is the 4090 in raw perf.

Posted on Reply
#95
john_
Bomby569money doesn't solve those problems, ask Intel
Intel is stuck on an old manufacturing node. Thanks to having the money they have, they are competitive in CPUs, of course at 300W power consumption, and they managed to hire some people to bring out some GPUs that mostly work, even being first generation. So money does work in tech. If they fix their manufacturing, what do you expect to happen? In fact, that's what I fear. AMD concentrated so much on CPUs that, if tomorrow Intel manages to present a working 5nm node (nanometers, not Intel marketing numbers), we could have a repeat of how things progressed after the Intel Core Duo introduction.

AMD obviously does have the engineers to build great CPUs and GPUs. Now it needs to start investing that money where it should be invested. They threw 40 billion on Xilinx. It's time to start throwing more money at GPUs and software.

@Vayra86 I agree with most of your post. I was screaming about RT performance and was called an Nvidia shill back then. With the 7900 XTX offering 3090/Ti RT performance, how could someone be negative about that? How can 3090/Ti performance be bad today when it was a dream yesterday?
The answer is marketing. I believe people buy RTX 3050 cards over the RX 6600 because of all that RT fuss. RT is useless on an RTX 3050, but people still buy it, I believe for that reason. "RTX from Nvidia". Marketing. The same marketing won market share for AMD with the first Ryzen series. Much lower IPC compared to Intel CPUs, but "8 CORES". The same marketing has won market share for Intel the last few years. "16 Cores CPU", and half of them are E-cores. But people love to read higher numbers.
AMD should have doubled the RT cores in RDNA3 even if that meant more die area, more expensive chips, and lower profit margins.

PS
AMD Radeon RX 7600 Review - For 1080p Gamers - Power Consumption | TechPowerUp
Efficiency might be high in those charts, but the charts on the previous page say otherwise, or at least show that the efficiency of the chip isn't the same in every task. It seems that in some cases, like video playback, there is a problem.
Posted on Reply
#96
Vayra86
Bomby569AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and it has kept it ever since. It's not a new move or a miscalculation of RDNA3.
It materialized for Polaris 8GB too.

It will materialize for RDNA3. But that could still make it a miscalculation. 16GB would have been fine on the 7900XT's raw perf as it is now too... I think @londiste made a fine point there.
Beginner Micro DeviceI'm talking about the aforementioned absurdly high wattage at video playback and multi-monitor usage. It doesn't matter if it's fixed; the only thing that matters is they launched a product unable to perform efficiently. You can afford to take money from customers for a pre-alpha testing privilege when you're miles ahead of your competition, not when you're stone-age behind and you yourself are not competition at all.

7900 XTX was launched marginally cheaper than 4080 and has nothing to brag about. +8 GB VRAM does nothing since 4080 can turn DLSS3 on and yecgaa away from the 7900 XTX. "Sooper dooper mega chiplet arch" does also do nothing when 4080 can preserve 60ish framerates RT On whilst 7900 XTX goes for shambly 30ish FPS with permanent stuttering inbound. More raw power per buck? WHO CARES?
Since the overwhelming majority of performance is still in raster and certainly not always with DLSS3 support, yeah, who cares huh.

Consider also the fact that playing most of the new stuff at launch right now, as it has been for the last year(s), is an absolute horror show. Nothing releases without breaking issues. To each their own. I've always gamed extremely well following the early-adopter wave of each gen loosely. In GPUs it's the same thing as with games. We get all sorts of pre-alpha shit pushed on us. The entire RTX brand has been a pre-alpha release since Turing, and we could say they're in open beta now with Ada. Until you turn on path tracing. You're paying every cent twice over for Nvidia's live beta here, make no mistake. We didn't see them discount Ampere heavily like AMD did RDNA2, for example ;) In the meantime, I can honestly not always tell if RT is really adding much, if anything, beneficial to the scene. In Darktide, it's an absolute mess. In Cyberpunk, the whole city becomes a glass house (sunsets are nice, though), but when you turn RT off, not a single drop of atmosphere is really lost while the FPS doubles.

Anyone not blindly sheeping after the mainstream headlines, cares.

Also, do consider the fact that Ampere 'can't get DLSS3' for whatever reason, the main one being an obvious Nvidia-milking-you-hard situation. The power-per-buck advantage isn't just AMD's pricing; it's Nvidia's approach to your upgrade path much more so.
Posted on Reply
#97
rv8000
john_It's funny. I have been an AMD fan for over 20 years, always choosing AMD hardware when given the option; an RX 6600 at 190 euros does look like a nice little upgrade for my RX 580, and I really regret that the 7600 in Greece is still overpriced (over 280 euros). All my posts are usually AMD-friendly. But expressing STRONG CONCERN about the company, its choices, where it is going, and whether it will be able to remain a true competitor to Intel and Nvidia for years to come is enough of an excuse for you, who obviously have no idea about my posts, to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People were expecting the 7900 series to be killers, in raster at least and at lower power consumption. Based on TPU numbers, which probably everyone here disputes, the 7900 XT is on average 14% faster than the 6950 XT with more shading units, more ROPs, more RT cores, more VRAM, and more memory bandwidth. It's RDNA3 and it still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
It doesn’t matter what you’re a fan of. The opinion you keep reiterating is flat out wrong.
Posted on Reply
#98
tajoh111
AMD is far less competitive than it has been in years.

It reminds me of the initial pascal launch.

Navi 31 is basically Vega without the HBM2, but it has worse power consumption, had to shift down a tier to look competitive with its competition, and likely costs more to make than the tier it is now competing against, because it is slower than anticipated. Nvidia's flagship is also far and away faster. If the RTX 4090 were less cut down, we would be looking at a generational difference between the two flagships.

If Nvidia wanted to, they could likely price the RTX 4080 at $699 and the RTX 4070 Ti (which should be an RTX 4070) at $500 and destroy AMD's lineup while still making a decent margin. However, there seems to be some price fixing going on, or NV does not want to depreciate their last gen so much, since there is so much of it still on the market.

What's making this fight appear closer than it is, is Nvidia spending silicon on tensor cores, which enabled AMD to get a bit closer in performance. However, this silicon cost on tensor cores is well spent, since it enables Nvidia to tackle the AI market simultaneously with the same lineup. If AMD is losing ground on raster while going all out on raster, the division has a poor future. The market share and revenue from the gaming market simply won't justify the R&D expense, on top of the large dies/low profit compared to the other sectors AMD is competing in.
Posted on Reply
#99
Bomby569
john_Intel is stuck on an old manufacturing node. Thanks to having the money they have, they are competitive in CPUs, of course at 300W power consumption, and they managed to hire some people to bring out some GPUs that mostly work, even being first generation. So money does work in tech. If they fix their manufacturing, what do you expect to happen? In fact, that's what I fear. AMD concentrated so much on CPUs that, if tomorrow Intel manages to present a working 5nm node (nanometers, not Intel marketing numbers), we could have a repeat of how things progressed after the Intel Core Duo introduction.

AMD obviously does have the engineers to build great CPUs and GPUs. Now it needs to start investing that money where it should be invested. They threw 40 billion on Xilinx. It's time to start throwing more money at GPUs and software.
What you call the manufacturing node problem is just an engineering problem, like the one you are talking about with AMD. Money doesn't solve that; the past is full of richer, dead companies that couldn't solve those kinds of problems with money.
Posted on Reply
#100
Tek-Check
So, is Navi 41 going to be a chiplet GCD, inspired by MI300?

MI300 has GCD dies with up to 38 CUs each.

Navi 41 could have three or even four of those dies.
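For rough scale, here is some napkin math on that idea; the die counts and the 38 CU per die figure are just the speculation above, nothing confirmed.

```python
# Napkin math for the chiplet speculation above: total CUs if Navi 41 used
# several MI300-style GCD dies. All inputs are speculative, not confirmed specs.

CU_PER_DIE = 38    # per-die CU count quoted in the post above
NAVI31_CUS = 96    # monolithic Navi 31 GCD, for comparison

for dies in (3, 4):
    total = dies * CU_PER_DIE
    print(f"{dies} dies: {total} CUs ({total / NAVI31_CUS:.2f}x Navi 31)")
```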
Posted on Reply