
AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design

AleksandarK

News Editor
AMD is about to enter the world of chiplets with its upcoming GPUs, just as it has done with its Zen generation of processors. Having launched the Radeon RX 6000 series lineup based on Navi 21 and Navi 22, the company is seemingly not stopping there. To remain competitive, it needs to innovate and develop constantly, which is reportedly happening once again. According to current rumors, AMD is working on an RDNA 3 GPU design based on chiplets. The chiplet design is supposed to feature two 80 Compute Unit (CU) dies, just like the one found inside the Radeon RX 6900 XT graphics card.

Having two 80 CU dies would bring the total core count to exactly 10,240 cores (two times the 5,120 cores on a Navi 21 die). Combined with the RDNA 3 architecture, which brings better performance-per-watt than the last-generation microarchitecture, the Navi 31 GPU is going to be a compute monster. It isn't exactly clear when we are supposed to get this graphics card; however, it may arrive at the end of this year or the beginning of 2022.
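The rumored totals can be sanity-checked with RDNA's 64 stream processors per CU (as on Navi 21); this is just illustrative arithmetic, and the function name is made up for the sketch:

```python
# Rough math behind the rumored Navi 31 configuration.
# Assumption: RDNA-style CUs with 64 stream processors each, as on Navi 21.
SP_PER_CU = 64

def total_shaders(cus_per_die: int, dies: int) -> int:
    """Stream processors across all chiplets."""
    return cus_per_die * SP_PER_CU * dies

navi21 = total_shaders(80, 1)  # Radeon RX 6900 XT (single die): 5120
navi31 = total_shaders(80, 2)  # rumored dual-chiplet Navi 31: 10240
print(navi21, navi31)
```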


 
So, $1,999 MSRP, $3k+ on the market? :laugh:
 
The Voodoo that you do so welllll!

"paper launch q1 2023!" :rolleyes:
 
So, $1,999 MSRP, $3k+ on the market? :laugh:
and a grand total of 5 units manufactured throughout the entire year, only to be sold by scalpers
 
and a grand total of 5 units manufactured throughout the entire year, only to be sold by scalpers
Never mind, even if you could afford it and actually buy it, there's no power supply on this planet that can feed this beast.
 
Considering it mentions RDNA 3 as the base for the two chiplets, I doubt it would arrive this year; late next year seems more likely.
 
This supply/MSRP thing is really becoming a fucking meme right now.
 
"We heard you liked stock shortages, so we're going to take the constrained supply you're waiting on and use it to make half as many graphics cards"
 
Meanwhile in the Czech Republic, for example:
[attached image: local GPU pricing screenshot]
 
5nm could easily make that happen and consume close to 350W at over 2GHz. Especially if combined with HBM(3?) for low latency which might be needed more when using chiplets.
 
The Voodoo that you do so welllll!

"paper launch q1 2023!" :rolleyes:

Why do people keep mumbling "paper launch" when cards have been available in online stores (e.g. Mindfactory in Germany) for WEEKS?
Now, yeah, pricing is way above MSRP, but so was the 2080 Ti's price up until Ampere came.
 
This supply/MSRP thing is really becoming a fucking meme right now.
I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.

It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP it actually offered lower performance/$. Street pricing on the Vega 56 by that point was way lower than the 5700 series ever reached.

The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.
 
Remember when you could get excited about a future product without wondering if you could get it, but rather when, and how it would perform? I miss those simpler times.
 
I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.

It's understandable, too, because Turing was a complete rip-off for the first year on the market, and then when Navi arrived on the scene a year late to the party it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP it actually offered lower performance/$.

The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one in that window you're SOL right now.
The other thing about Navi is that it was still available when Turing was already out of stock and Ampere wasn't available yet (as it still isn't). That's how and why I managed to pick up my 5700 XT that will hopefully keep me happy for at least a good year or two.
 
The other thing about Navi is that it was still available when Turing was already out of stock and Ampere wasn't available yet (as it still isn't). That's how and why I managed to pick up my 5700 XT that will hopefully keep me happy for at least a good year or two.
My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.

CP2077's DLSS is the first time I thought that the 2070S I also own showed any noteworthy advantage over the 5700 XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.

Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....
 
My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.

CP2077's DLSS is the first time I thought that the 2070S I also own showed any noteworthy advantage over the 5700 XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.

Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....
To be honest, I still don't give a **** about DLSS, and I don't think I ever will. I'd much rather run everything at native resolution without any AA. Ray tracing is a little tempting, but it's still in its early stages even with Ampere. Not worth the premium just yet imo.
 
I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.
My 2070 died in December 2020 and I had a panic attack because this is the worst time for a GPU to die. No iGPU meant no using PC until I got a new GPU. So glad I found a 3070 Gaming X Trio in stock close to MSRP.
 
My 2070 died in December 2020 and I had a panic attack because this is the worst time for a GPU to die. No iGPU meant no using PC until I got a new GPU. So glad I found a 3070 Gaming X Trio in stock close to MSRP.
Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose. :D
 
My 5700XT is doing fine work, I bought it at launch and given what a damp squib RTX has been, it's turned out to be an excellent investment.

CP2077's DLSS is the first time I thought that the 2070S I also own showed any noteworthy advantage over the 5700 XT. Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.

Unfortunately the miners have ruined it all again, I'm seeing 5700-series cards at almost 50% premium over MSRP, which was never that competitive in the first place....

My brother picked up a 5700 XT two Decembers back. He spent about $400 on it. I told him if he wanted to nearly double his money he could sell it on eBay - they're fetching upwards of $700. But he doesn't have a spare GPU, so he couldn't play any games...

He doesn't follow hardware much, so I explained how his card would normally only bring around $200 or so on the second-hand market at this point, and that GPUs were hard to find. He started looking around and noticed no one has any new GPUs in stock and that eBay scalping prices were astronomical.

I have no parts lying around other than the ones in my PC, I suppose I should buy some cheapo gpu just for this lol


Doubt it's anything above 256MB.

Keep old hardware. I've got two spare PSUs, spare RAM (well, DDR3 so it wouldn't help if my new build needed RAM to test with) and two GPUs. When I upgrade I try to keep old working hardware for spare parts in testing just to have for a backup so I can figure out any possible problems that come up in the future if I need spare parts to test with.
 
10240 cores (two times 5120 cores on Navi 21 die). Combined with the RDNA 3 architecture, which brings better perf-per-watt compared to the last generation uArch, Navi 31 GPU is going to be a compute monster.

Great I'm sure miners will love it.
 
This supply/MSRP thing is really becoming a fucking meme right now.

Joke*** The correct word is joke.

To all the nerds who are crying about GPU shortages, price hikes, and out-of-stock items: Grow up. Go outside, find a new hobby, actually talk to people in real life (I know, it's a wild concept), and discover there is more to the world than just Fall Guys and Among Us.
 