
AMD is Allegedly Preparing Navi 31 GPU with Dual 80 CU Chiplet Design

Remember when you could get excited about a future product without wondering if you could get it, but rather when, and how it would perform? I miss those simpler times.
Well, you can still wonder how it will perform; there are YouTubers for that who will get one for review.
I was never in the market for $1,000 GPUs and never will be, so it doesn't change much for me whether they're available or not.
 
Maybe I'm weird, but I have a 1050 Ti on the shelf just for this purpose. :D

Always have backup parts, preferably even a complete backup PC, so you can chuck your main storage drive in there if something goes wrong with your main rig.
 
No, you are very clever. I have a GTX 1060 on the shelf for that purpose as well.
You are smart. I have a GTX 670 on the shelf for that purpose.
 
Again, DLSS has been a complete non-feature for over two years, because until CP2077 showed up, none of the implementations of it were worth writing home about.

That’s because CP uses some really heavy post-processing and has very soft image quality anyway.
 
To be honest, I still don't give a **** about DLSS, and I don't think I ever will. I'd much rather run everything at native resolution without any AA. Ray tracing is a little tempting, but it's still in its early stages even with Ampere. Not worth the premium just yet imo.
Personally, I'm all about DLSS. The performance gap between an AAA game like CP2077 and the "old but still hella fun" or competitive online games is getting ridiculous.
Having $500 (or more) GPUs can feel like a waste sometimes, especially when those demanding games you bought the card for turn out to be not that fun.
My RX 480 has spent probably 85% of its lifetime playing games that are far below what it can handle.
 
"We heard you liked stock shortages, so we're going to take the constrained supply you're waiting on and use it to make half as many graphics cards"
I mean, margins are much higher on high-end GPUs, so why use two dies to sell two $1,000 GPUs when you can use the same silicon to sell a single $3,000 GPU? Makes perfect sense to me :rolleyes:

In all seriousness, I really, really hope the current silicon wafer, packaging and fabrication shortages start sorting themselves out soon. I can't really wrap my head around how we got here - did 2020 sales volumes for ... silicon, in general? take off by that much? Really? Or has this been years in the making through the consolidation of the silicon lithography industry, and nobody has wanted to talk about it?


I mean, it's great if AMD is working on MCM GPUs for RDNA3. Hopefully that isn't just for the high end, but lower-end too - if they could use a single ~150mm2 GPU die across 1, 2 and 3-die configurations that could make for a high volume, low cost product stack. But I need to see it work before I believe it, and then see actual products on actual/virtual shelves before I really believe it.
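To put rough numbers on that idea (purely hypothetical: one ~150 mm² chiplet with 40 CUs, scaled across one to three dies - none of this is confirmed by AMD), a quick sketch:

```python
# Hypothetical chiplet product stack. The 40 CU / ~150 mm^2
# per-die figures are illustrative assumptions, not anything
# AMD has announced.
CUS_PER_DIE = 40
DIE_AREA_MM2 = 150

for dies in (1, 2, 3):
    print(f"{dies} die(s): {dies * CUS_PER_DIE} CUs, "
          f"~{dies * DIE_AREA_MM2} mm^2 of silicon total")
```

The appeal is that one small, high-yield die could cover everything from entry level to flagship, instead of taping out three separate chips.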
 
Personally, I'm all about DLSS. The performance gap between an AAA game like CP2077 and the "old but still hella fun" or competitive online games is getting ridiculous.
Having $500 (or more) GPUs can feel like a waste sometimes, especially when those demanding games you bought the card for turn out to be not that fun.
My RX 480 has spent probably 85% of its lifetime playing games that are far below what it can handle.
Maybe that's the point. I'm not all about performance or competitiveness. If I can run my games at frame rates that are comfortable for my eyes (what that means varies between 30 and 60 fps depending on game), I'm fine.
 
Ouch, when I look at the incoming NVIDIA and AMD GPUs for this year and next, I only wonder one thing:
where do we need those monsters?

You'd need an 8K monitor and top-of-the-line components for them to be worth it.


Well, sure, some people have the cash, but that's well under 5 percent.

Well, competition is always good.
 
Very surprising rumor. About two years ago, AMD representatives said it's not really trivial to do MCM for gaming workloads, so one would expect to see this type of design in CDNA first and only much later in RDNA. It's rumored that even NVIDIA postponed their MCM GPU architecture. Don't get your hopes up ...
 
Ouch, when I look at the incoming NVIDIA and AMD GPUs for this year and next, I only wonder one thing:
where do we need those monsters?

You'd need an 8K monitor and top-of-the-line components for them to be worth it.


Well, sure, some people have the cash, but that's well under 5 percent.

Well, competition is always good.
What I want is a good sub-$200 GPU again. What happened? That's where most of the money comes from, AMD and NVIDIA.
 
Joke*** The correct word is joke.

To all the nerds who are crying about GPU shortages, price hikes and out-of-stock items: Grow up. Go outside, find a new hobby, actually talk to people in real life (I know, it's a wild concept), discover there is more to the world than just Fall Guys and Among Us.

There's a pandemic going on. Also, Among Us is quite fun; it was originally released as a mobile game, and it can run on basically any PC - the listed minimum specs are a Pentium 4 at 2 GHz and a GeForce 510.

 
I just feel sorry for people who didn't pick up a modern GPU last generation. They're completely boned right now.

It's understandable, too, because Turing was a complete rip-off for its first year on the market, and when Navi arrived on the scene a year late to the party, it was underwhelming AF; the 5700 wasn't even 15% faster than the Vega 56, and even at MSRP it actually offered lower performance/$. Street pricing on the Vega 56 by that point was way lower than the 5700 series ever reached.

The only "good" thing that happened that entire generation is that AMD rejoining the GPU market after a year of total nothingness brought TU104 down to under $500 in the 2070 Super. That didn't make it good value, but at least it restored price/performance to somewhere that wasn't obscene. Unfortunately Nvidia stopped manufacturing Turing around 9 months after the Super series launched, so if you didn't grab one during that short window you're SOL right now.

Still using my RX480, and will continue to until a card comes out at around 2x performance at around the same price I paid for it.
 
Still using my RX480, and will continue to until a card comes out at around 2x performance at around the same price I paid for it.
That's my way of thinking too. I've never understood those people who get excited about 10% more performance which is barely noticeable to the human eye.

Actually, I remember upgrading my Radeon X800 XT to a GeForce 7800 GS. One of the worst investments of my life.
 
big number good
It's not about big numbers. I highly doubt anyone can see the difference between 30 and 33 fps, or 100 and 110 fps. That's 10%.
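To put that in frame-time terms (simple arithmetic, not something from the thread itself): a 10% fps bump shrinks fast in absolute milliseconds as the baseline rises.

```python
# Convert a 10% fps improvement into milliseconds saved per frame.
for base_fps in (30, 100):
    boosted = base_fps * 1.10
    saved_ms = 1000 / base_fps - 1000 / boosted
    print(f"{base_fps} -> {boosted:.0f} fps saves {saved_ms:.2f} ms per frame")
```

That works out to about 3 ms per frame at 30 fps and under 1 ms at 100 fps - small enough that few people would notice either.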
 
Meanwhile at AMD HQ....

Mining.gif
 
Let's all get back to discussing GPU releases and let the nationalistic off-topic chat go away, please.
 
Guess what Navi 31's performance number will be.
2x the cores + the new RDNA3 architecture, which is rumored to bring up to +50% combined IPC/power efficiency per gigaflop thanks to new R&D and new lithography?
What will the sum of all parts of that equation be? 3x an RX 6800/6900 XT? Is that even possible with GDDR6/X? Because GDDR7 won't be ready for use before RDNA4 (or whatever next-gen architecture comes after RDNA, if RDNA3 is the last of its kind).
Edit: added "equation" :D!
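Taking those rumors at face value (pure back-of-the-envelope; the 2x CU count and +50% efficiency figures are just the speculation above, nothing confirmed):

```python
# Naive Navi 31 scaling estimate built only from the rumored numbers.
baseline = 1.0        # RX 6900 XT performance = 1.0x
cu_scaling = 2.0      # dual 80 CU chiplets vs. a single 80 CU die
efficiency = 1.5      # rumored RDNA3 perf-per-watt uplift, same power budget

print(f"Naive upper bound: {baseline * cu_scaling * efficiency:.1f}x a 6900 XT")
# Chiplet overhead, memory bandwidth and imperfect CU scaling
# would pull the real figure well below this 3.0x ceiling.
```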
 
That’s because CP uses some really heavy post-processing and has very soft image quality anyway.

*knock knock* It's the FBI, we have some questions we'd like you to answer.
 
Remember when you could get excited about a future product without wondering if you could get it, but rather when, and how it would perform? I miss those simpler times.

Yeah, like the PIII-1000 paper launch, the Radeon X800 paper launch, the GeForce 6800 Ultra, X1900, 8800 GTX, GTX 480... We've never had paper launches before.
 
High latency on CPUs was already terrible, and now they want high latency on GPUs? Get out.
 