Tuesday, September 22nd 2020

AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory

AMD uses offbeat codenames such as "Great Horned Owl," "Sienna Cichlid" and "Navy Flounder" to identify sources of leaks internally. One such upcoming product, codenamed "Navy Flounder," is shaping up to be a possible successor to the RX 5500 XT, the company's segment-leading 1080p product. According to ROCm compute code fished out by stblr on Reddit, this GPU is configured with 40 compute units, a step up from the 22 on the RX 5500 XT, and is paired with a 192-bit wide GDDR6 memory interface, up from that card's 128-bit bus.

Assuming the RDNA2 compute unit on next-gen Radeon RX graphics processors has the same number of stream processors per CU, we're looking at 2,560 stream processors for "Navy Flounder," compared to 5,120 on the 80-CU "Sienna Cichlid." The 192-bit wide memory interface gives AMD's product managers a high degree of segmentation flexibility for graphics cards under the $250 mark.
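As a quick sanity check on those numbers, here is a minimal sketch of the arithmetic. It assumes RDNA2 keeps RDNA's 64 stream processors per CU and, purely for illustration, a 14 Gbps GDDR6 data rate; neither figure is confirmed for "Navy Flounder."

```python
# Back-of-the-envelope "Navy Flounder" math.
# Assumptions (unconfirmed): 64 stream processors per CU, 14 Gbps GDDR6.

compute_units = 40
sp_per_cu = 64                                # RDNA value, assumed carried over
stream_processors = compute_units * sp_per_cu
print(f"Stream processors: {stream_processors}")          # 2560

bus_width_bits = 192
data_rate_gbps = 14                           # hypothetical GDDR6 speed
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")     # 336 GB/s
```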
Sources: VideoCardz, stblr (Reddit)

135 Comments on AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory

#26
Anymal
Pretty good means 50% better efficiency.
Posted on Reply
#27
Lionheart
AssimilatorWhy on Earth would you think that AMD is going to launch 16GB cards, when NVIDIA is sticking with 8GB? The answer is, they aren't. And as such this card, as a replacement for 5600, will have 6GB.



No, it's a new version of RDNA.
If AMD launches 16GB cards, which they most likely will, I'm gonna come back here & laugh at you lol
Posted on Reply
#28
Vya Domus
Vayra86Correct, AMD might come out with a totally unoptimized product because it's new; it might be negative IPC for all we know.
That's pretty much impossible; this is the exact hardware inside the consoles, so it has to be optimized. And RDNA2 is already known to be faster from the information released, again, from the consoles.
Posted on Reply
#29
InVasMani
They'll probably have 3 tiers of RDNA2 chips, with RAM densities of 8GB/12GB/16GB and memory bus widths of 192-bit/256-bit/384-bit. I doubt AMD will use less than 8GB. In fact it's safe to say it's pretty much a guarantee that the lowest-end card will match the RAM density of the new PS5/Xbox consoles; it would be asinine to go weaker than that on PC now that the entry-level bar has been raised.
LionheartIf AMD launches 16GB cards, which they most likely will, I'm gonna come back here & laugh at you lol
Considering the Radeon VII has 16GB, of course there will be an RDNA2 card with 16GB, with faster and more capable cores and memory as well.
Posted on Reply
#30
Minus Infinity
This makes sense, unlike the BS about the 5700 XT successor being 192-bit.
Posted on Reply
#31
kodorr95
I really hope this also comes with a performance improvement for a 5500 XT successor, because as it is, the RX 580 is a better buy than the 5500 XT: it can be had for as low as $130ish and the two perform the same. I want AMD to stop releasing an RX 480 every single year.
Posted on Reply
#32
Mussels
Freshwater Moderator
FLOUNDERS EDITION


i haven't read anything else, still laughing
Posted on Reply
#33
Xzibit
MusselsFLOUNDERS EDITION


i haven't read anything else, still laughing
I hope it has this color scheme

Posted on Reply
#34
Camm
AMD's cache designs must be fan fucking tastic to think a non-x, GDDR6, limited width bus card is going to compete against the 3000 stack well.
Posted on Reply
#35
BoboOOZ
CammAMD's cache designs must be fan fucking tastic to think a non-x, GDDR6, limited width bus card is going to compete against the 3000 stack well.
AMD are only telling us what they want; I'm half of a mind that in a month we'll learn the top dies get a 512-bit bus while the second tier is on 384, and all this cache fairy tale is just smoke for Nvidia.
The only thing we're reasonably sure of is that AMD will compete in all segments of the market this time; the details are as yet unknown.
Posted on Reply
#36
laszlo
MusselsFLOUNDERS EDITION


i haven't read anything else, still laughing
same here, you know the other thread with the eBay listing of the RTX 3080 Flounder Edition ... whahahahaha :roll:
Posted on Reply
#37
Mussels
Freshwater Moderator
laszlosame here, you know the other thread with the eBay listing of the RTX 3080 Flounder Edition ... whahahahaha :roll:
that's all i could think of
Posted on Reply
#38
Vayra86
Vya DomusThat's pretty much impossible; this is the exact hardware inside the consoles, so it has to be optimized. And RDNA2 is already known to be faster from the information released, again, from the consoles.
Pretty much, indeed... All we have from the consoles are paper realities.
Posted on Reply
#39
Vya Domus
Vayra86paper realities.
Paper realities or not, AMD is taping out hundreds of thousands of chips as we speak. That seems pretty real to me.

On that note, don't expect much stock from them even if RDNA2 is amazing.
Posted on Reply
#40
ratirt
AMD is launching 16GB graphics cards this time around. You can tell it's true since NVIDIA is now boosting the memory capacity of the 3070 and 3080 up to 20GB. So yeah, 16GB for AMD's cards is happening.
Posted on Reply
#41
-The_Mask-
If it really is true, this is a Navi 10 replacement. It has the same number of CUs, but RDNA2 should have some IPC improvements and a higher GPU frequency, so you're looking at RX 5700 XT + maybe 15% extra performance. It doesn't make sense to give such a GPU only 6GB, so 12GB is the only logical choice here.

The reason the RTX 3080 only has 10GB is quite simple. Micron currently only has the 8Gb chips in mass production; nVidia has to wait for Micron to mass-produce the 16Gb chips to launch a 20GB RTX 3080. You can look this up at Micron.com, it's open data. If you're thinking "what about the RTX 3090 with 24GB?": it uses a really expensive option with double the memory chips, mounted on both sides of the PCB. Power consumption of the memory is also doubled on the RTX 3090. It's simply not a viable option for an RTX 3080. 16Gb GDDR6X goes into mass production in a couple of months, btw.
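-The_Mask-'s capacity logic follows from how GDDR6 attaches to the bus: each chip sits on a 32-bit channel, so a 192-bit bus takes six chips, and total VRAM comes in steps set by chip density (8Gb = 1GB, 16Gb = 2GB), doubled again in clamshell mode as on the RTX 3090. A small sketch of that arithmetic, for illustration only:

```python
# VRAM capacities possible for a given bus width.
# Each GDDR6/GDDR6X chip occupies a 32-bit channel; "clamshell" mounts
# chips on both sides of the PCB (as on the RTX 3090), doubling the count.

def vram_options(bus_width_bits):
    chips = bus_width_bits // 32
    for density_gbit in (8, 16):              # 8Gb in mass production; 16Gb upcoming
        for clamshell in (False, True):
            total_gb = chips * (density_gbit // 8) * (2 if clamshell else 1)
            mode = "clamshell" if clamshell else "single-sided"
            print(f"{bus_width_bits}-bit, {density_gbit}Gb chips, {mode}: {total_gb} GB")

vram_options(192)   # -> 6, 12, 12, 24 GB
vram_options(320)   # RTX 3080's bus -> 10, 20, 20, 40 GB
```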
Posted on Reply
#42
Valantar
OGocSo much speculation, but ballpark it we can: assume 7nm+ has 10% better performance per watt than 7nm and RDNA2 has a 5% IPC improvement.

75 Watt equivalent to 1650 (PCIe)
150 Watt equivalent to 5600XT (6-pin + PCIe)
225 Watt equivalent to 2080 (8-pin + PCIe)
300 Watt equivalent to 3070 (6-pin + 8-pin + PCIe)
375 Watt equivalent to 3080 (2x 8-pin + PCIe)
Where are you getting your baseline perf/W number from? The 5700 XT, 5700, 5600 XT or 5500 XT? Depending on which you pick, the results of your calculation will vary wildly - the 5700 XT is around 1070 Ti levels, with the 2080 and 2080 Ti ~17% ahead, while the 5600 XT is 3-4% ahead of all of those. "The efficiency of RDNA" is, in other words, highly dependent on its implementation.
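To make the baseline problem concrete, here is a toy version of OGoc's projection. The perf/W figures are illustrative placeholders only (5700 XT normalized to 1.0, 5600 XT roughly 20% better), and linear performance-with-power scaling is itself a big simplification:

```python
# Toy projection: how the RDNA1 baseline choice swings OGoc's estimate.
# perf/W values are illustrative placeholders, not measurements.

NODE_GAIN = 1.10   # assumed 7nm+ perf/W gain (from OGoc's post)
IPC_GAIN = 1.05    # assumed RDNA2 IPC gain (from OGoc's post)

baselines = {"RX 5700 XT": 1.00, "RX 5600 XT": 1.20}   # relative perf/W

power_tiers = {  # board power implied by connectors (75W slot, 75W 6-pin, 150W 8-pin)
    "PCIe only": 75, "6-pin + PCIe": 150, "8-pin + PCIe": 225, "6+8-pin + PCIe": 300,
}

for name, ppw in baselines.items():
    print(f"Baseline {name}:")
    for tier, watts in power_tiers.items():
        # assumes performance scales linearly with board power, which real GPUs don't
        print(f"  {watts}W ({tier}): relative performance {watts * ppw * NODE_GAIN * IPC_GAIN:.0f}")
```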
sergionographyPS5 has 36 CUs, not 44. And since you mentioned the PS5, think of this: its TDP is 175W while also including 8 Zen 2 cores, with the GPU clocked at up to 2250MHz. I think RDNA2 is looking pretty good in terms of efficiency.
Where did you get that TDP number from? Has it been published or confirmed anywhere? The only number I've seen is the whole-system power rating from the official spec sheet, which is 350/340W (BD/discless). Even assuming those are peak draw numbers including PSU losses, having the SoC account for only 50% of that seems low.
-The_Mask-If it really is true, this is a Navi 10 replacement. It has the same number of CUs, but RDNA2 should have some IPC improvements and a higher GPU frequency, so you're looking at RX 5700 XT + maybe 15% extra performance. It doesn't make sense to give such a GPU only 6GB, so 12GB is the only logical choice here.
That depends on pricing as well as performance. If this is indeed a chip for upper entry-level and lower midrange GPUs, 12GB might simply be too expensive. Even with cheaper, lower-clocked RAM chips, another 6GB is an additional $50-80 in BOM cost. That eats a lot of margin for a <$300 GPU. So in the end, the total amount of memory is also dependent on pricing. It might be that the highest-end SKU based on this chip will have 12GB, but 6GB for lower-end versions is going to represent significant savings - and likely not hold the card back at all at the types of resolutions it will mostly be used for. Of course there could also be cut-down SKUs with 4GB and 8GB lower down the stack.
Posted on Reply
#43
BoboOOZ
ValantarThat depends on pricing as well as performance. If this is indeed a chip for upper entry-level and lower midrange GPUs, 12GB might simply be too expensive. Even with cheaper, lower-clocked RAM chips, another 6GB is an additional $50-80 in BOM cost. That eats a lot of margin for a <$300 GPU. So in the end, the total amount of memory is also dependent on pricing. It might be that the highest-end SKU based on this chip will have 12GB, but 6GB for lower-end versions is going to represent significant savings - and likely not hold the card back at all at the types of resolutions it will mostly be used for. Of course there could also be cut-down SKUs with 4GB and 8GB lower down the stack.
There's absolutely no way that AMD will pair an improved version of the 5700 XT with 6GB of memory after having announced three months ago that 8GB is the bare minimum. That would make no sense at all. Also, 40 CUs is not entry level; this will be a midrange card, above 10 TFLOPS. Also, your price for 6 gigs is way overinflated; the current price for high-volume purchases of GDDR6 should be around 6 bucks per gig.
Posted on Reply
#44
Valantar
BoboOOZThere's absolutely no way that AMD will pair an improved version of the 5700 XT with 6GB of memory after having announced three months ago that 8GB is the bare minimum. That would make no sense at all. Also, 40 CUs is not entry level; this will be a midrange card, above 10 TFLOPS. Also, your price for 6 gigs is way overinflated; the current price for high-volume purchases of GDDR6 should be around 6 bucks per gig.
I never said 40 CUs would be entry level, I said this chip (as the news post also speculates) might be used in upper entry level SKUs as well as lower midrange ones. Even a basic understanding of chip binning and GPU product segmentation should lead one to conclude that such a SKU would then be a cut-down version, thus with less than 40 CUs. Besides, we know nothing about clock speeds or power consumption - it might well be that they'll launch a <75W low-clocked ~30CU GPU from this silicon, with no PCIe power connector to allow for overclocking. Who knows? We certainly don't. If they are positioning this as a significantly lower cost product than the 5700 and 5700 XT, less memory would be an entirely reasonable tweak to reach a lower price. The 5700/XT have been advertised for 1440p gaming, so if they instead target this as a "high end 1080p" GPU (like the 5600 XT), 6GB would be entirely reasonable for the foreseeable future.

RAM pricing is also highly dependent on clock rates. $6/GB might be possible for the slowest chips, but anything delivering current-gen speeds will cost more than that.
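The disagreement over memory cost is easy to put in numbers. A quick sketch using both posters' figures (all of them unverified ballpark estimates) against a hypothetical $299 SKU:

```python
# BOM impact of an extra 6GB of GDDR6 under each poster's estimate.
extra_gb = 6
card_price = 299                                    # hypothetical <$300 SKU
estimates_per_gb = {
    "BoboOOZ (slow chips, volume pricing)": 6.00,   # $/GB, unverified
    "Valantar low": 50 / extra_gb,                  # implied by the $50-80 range
    "Valantar high": 80 / extra_gb,
}

for source, per_gb in estimates_per_gb.items():
    cost = extra_gb * per_gb
    print(f"{source}: ${cost:.0f} extra, {cost / card_price:.0%} of a ${card_price} card")
# ~$36 vs. $50-80: the spread between the estimates is the whole disagreement.
```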
Posted on Reply
#45
Vya Domus
CammAMD's cache designs must be fan fucking tastic to think a non-x, GDDR6, limited width bus card is going to compete against the 3000 stack well.
As it happens, there are some odd rumors about AMD using colossal caches on RDNA2-based GPUs. And there is also this:

www.freepatentsonline.com/y2020/0293445.html
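For anyone wondering how a big on-die cache could offset a narrow bus: every cache hit is a DRAM access avoided, so DRAM only has to service the misses. A toy model follows; the hit rates are made up for illustration, not leaked figures.

```python
# Toy model of effective bandwidth with a large on-die cache.
# DRAM only services misses, so effective_bw ~ raw_bw / (1 - hit_rate).
# Ignores cache bandwidth/latency details; purely illustrative.

def effective_bandwidth(raw_gb_s, hit_rate):
    return raw_gb_s / (1.0 - hit_rate)

raw = 336.0   # hypothetical 192-bit bus at 14 Gbps, in GB/s
for hit_rate in (0.0, 0.3, 0.5, 0.7):
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(raw, hit_rate):.0f} GB/s effective")
# A 50% hit rate already doubles effective bandwidth -- the rumored rationale
# for pairing large caches with narrower GDDR6 buses.
```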
Posted on Reply
#46
Vayra86
Vya DomusPaper realities or not, AMD is taping out hundreds of thousands of chips as we speak. That seems pretty real to me.

On that note, don't expect much stock from them even if RDNA2 is amazing.
Yeah, I just have my usual bag of salt at the ready, that is all. Did the same, and still do the same in fact, with Ampere. So far it's a rewarding exercise, with all those little developments lately.

As for the paper reality, I was alluding to the specs relative to the actual performance of the new consoles. We've seen it every time: it's not quite up there with a similar PC setup. Limitations, stuff lacking or sharing resources... and it's not looking like this gen is any different. Awesome tricks, but they do come at some cost, and while the systems are likely very efficient, it's not like they can simply transplant that efficiency to a PC dGPU.

So it would be a capital mistake to consider the console specs as 'the PC spec' or its performance level as 'the PC perf level'. It's not going to be that at all - and it most certainly isn't going to surpass the console's efficiency.
Posted on Reply
#47
ratirt
Vayra86Yeah, I just have my usual bag of salt at the ready, that is all. Did the same, and still do the same in fact, with Ampere. So far it's a rewarding exercise, with all those little developments lately.
I hope you won't get a stroke or heart disease with all the bags of salt you've been having. Although, considering all the news and leaks, it's hard not to gobble up buckets of salt each day reading the rumors. Best bet is to stay on the side rails instead of getting on board the rumor express.
Me, on the other hand, I'm just gonna wait for the release and see what AMD has prepared for the consumer market. All those rumors, fake news and opinions make me kinda dizzy. Besides, even the rumors that are supposedly true may turn into crap if AMD decides to change something at the last minute.
What I'm really hoping for from AMD is that the cards are actually available at launch, and if you decide to go with one, it will be there waiting for you to buy it.
Posted on Reply
#48
Vayra86
ratirtI hope you won't get a stroke or heart disease with all the bags of salt you've been having. Although, considering all the news and leaks, it's hard not to gobble up buckets of salt each day reading the rumors. Best bet is to stay on the side rails instead of getting on board the rumor express.
Me, on the other hand, I'm just gonna wait for the release and see what AMD has prepared for the consumer market. All those rumors, fake news and opinions make me kinda dizzy. Besides, even the rumors that are supposedly true may turn into crap if AMD decides to change something at the last minute.
What I'm really hoping for from AMD is that the cards are actually available at launch, and if you decide to go with one, it will be there waiting for you to buy it.
Nobody said I ate them, in fact the salt generally goes out not in! Right?! :D
Posted on Reply
#49
ratirt
Vayra86Nobody said I ate them, in fact the salt generally goes out not in! Right?! :D
Right. I know you don't, but I just couldn't resist joking about it :P
Posted on Reply