Wednesday, September 9th 2020

Alleged AMD Radeon "Big Navi" Prototype Pictured

Following Wednesday's announcement that the Radeon RX 6000 series will be unveiled on October 28, the rumor mill started rolling full steam ahead. The RX 6000 series GPUs by AMD will be based on the RDNA2 graphics architecture, the same architecture powering the Xbox Series X and PlayStation 5, and will feature DirectX 12 Ultimate support, including hardware-accelerated ray tracing. A PC enthusiast on the Chinese social media site Bilibili posted a picture of an alleged "Big Navi" prototype. Ever since the RX 5700 series debuted in July 2019, there have been rumors of AMD working on a new high-end GPU to take on NVIDIA's upper segment, given that the RX 5700 series offered performance competitive with NVIDIA's breadwinning products, such as the RTX 2070 series and RTX 2060 series.

The picture reveals the reverse side of the alleged "Big Navi" prototype's PCB, showing a larger cluster of GPU ancillaries than those behind a "Navi 10," along with eight memory pads. Paper labels stuck to the card reference a "Typical XT ASIC" and "16 Gb Samsung GDDR6 memory." Eight 16 Gb (2 GB) chips across a 256-bit wide memory interface would give the card 16 GB of GDDR6 memory. Since this is a prototype, several headers stick out of the PCB for design and prototyping work. A tower-type CPU cooler has been MacGyvered onto the GPU (which isn't uncommon for VGA prototypes). We'll hear a lot more about this product in the run-up to its October 28 unveiling.
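For those keeping score, the 16 GB figure is simple arithmetic from the visible pads and the labeled chip density. A minimal sketch, assuming eight 16 Gb chips as the labels suggest (the configuration itself is still rumored, not confirmed):

    # Capacity from the rumored configuration: eight pads, 16 Gb (2 GB) GDDR6 chips
    chip_count = 8          # memory pads visible on the PCB
    chip_density_gbit = 16  # "16 Gb Samsung GDDR6" per the paper label

    capacity_gb = chip_count * chip_density_gbit / 8  # gigabits to gigabytes
    print(f"Total VRAM: {capacity_gb:.0f} GB")        # 16 GB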
Sources: 搞机猛男 (bilibili), ChipHell Forums

98 Comments on Alleged AMD Radeon "Big Navi" Prototype Pictured

#51
ExcuseMeWtf
The other is the memory frequency, and we know what the limits of GDDR6 are. We also know GDDR6X is not available to them just yet.
1. We know current GDDR6 limits as used in consumer grade products. We don't know for sure they will apply to this specific final product.
2. We don't know whether data compression will be applied, and how significant it will be.
3. We may not know all the other nooks and crannies AMD engineers might choose to implement, which we fail to come up with on the spot.
Posted on Reply
#52
DeathtoGnomes
Now if that card were designed right side up, that heatsink would work more efficiently!
Posted on Reply
#53
midnightoil
btarunr: GDDR6X is not a JEDEC standard. It's an NVIDIA+Micron exclusive. The only way AMD can match up is the expensive HBM2 MCM route.
HBM2e is unlikely to be more expensive than GDDR6X, from what I've heard, especially when you account for more expensive, complicated power delivery as the latter is so power hungry.

HBM2, sure. 2e ... highly unlikely.

But I doubt AMD are using 2e. I suspect they're waiting for RDNA3 to do a full stack HBM product line.
BoboOOZ: and Nvidia has a better memory compression algorithm.
Doesn't particularly look like it from new XB's info ... in the past, yes.
Posted on Reply
#54
Vya Domus
Here's something intriguing: I can only see three memory pads on the frontmost side of the GPU, so how exactly it would all add up to 16GB I don't know.
Posted on Reply
#55
BoboOOZ
midnightoil: Doesn't particularly look like it from new XB's info ... in the past, yes.
Care to elaborate on that a bit? Did they change their compression algorithm? Do you have any news that AMD have improved theirs?
Posted on Reply
#56
iO
Vya Domus: Here's something intriguing: I can only see three memory pads on the frontmost side of the GPU, so how exactly it would all add up to 16GB I don't know.

8 x 16Gbit = 16GB
Posted on Reply
#57
dragontamer5788
M2B: If this is indeed the top-end RDNA2 GPU, I can understand why AMD wouldn't want to enter the high-end segment. THEY JUST DON'T SELL. AMD's midrange 5700 series ended up selling way less than Nvidia's high-end, so why would AMD waste their R&D budget on something they can't sell?
Not to mention the limited production capacity that could go towards the CPU portion and be way more profitable.
I hope this isn't the top-end chip and AMD can compete with even the 3090, but you can't really expect a 256-bit GPU to be that much faster than a 2080 Ti.
AMD did win the Frontier supercomputer, which needs to be primarily GPU-based and reach Exascale computing.

Rumor is that Frontier is Arcturus, but whatever they use for the Frontier Supercomputer could probably be the base for a very high-end gaming card. AMD is splitting the marketing between "CDNA" (Compute Cards) and RDNA (Radeon / Gaming cards), but some R&D effort would cross over between the two.
Posted on Reply
#58
tfdsaf
It could be any GPU, I don't buy that this is Big Navi. I could see Big Navi maxing out at 16GB, but I think the lower models would also come with a 16GB option.

That said, I don't see AMD competing with the RTX 3090. I think they are going to have a card that is a bit slower than the RTX 3080, probably about 15% slower, but costs $100 less, and then an RTX 3070 competitor that is about 10% faster at $500.

So a RX 6900 at $600 with 15% less performance than RTX 3080
RX 6800 at $500 with 10% more performance than RTX 3070
RX 6700 at $400 that is about 20% faster than RX 5700XT
RX 6600 at $300 at about 20% faster than RX 5700
RX 5700xt is likely to be discontinued
RX 5700 likely to be sold at $250 price
RX 5600xt at $200
RX 5600 at $170
RX 5500xt at $140
Posted on Reply
#59
Caring1
iO: 8 x 16Gbit = 16GB
8 x 2GB = 16GB
Posted on Reply
#60
cueman
Hmm, I can't see RDNA2 beating the RTX 3000 series, it just can't.

Can it beat the ASUS ROG GeForce RTX 2080 Ti Matrix Platinum GPU?
Posted on Reply
#62
mtcn77
thesmokingman: Yea, I was like that is some good shit. They should sell that bracket.
It is the lab501 kit. I bet they assigned massman to lead the project forward.
Posted on Reply
#63
Fluffmeister
TheDeeGee: Not the cleanest test lab...
I wish I was wearing a hazmat suit just looking at the picture.
Posted on Reply
#64
mtcn77
Fluffmeister: I wish I was wearing a hazmat suit just looking at the picture.
Lab501 trade secret. They used to stick it using rubber bands.
Posted on Reply
#65
iO
This could also be Navi 22 and they just use the same prototype cooler for different SKUs, which would explain the small 70 mm fan and 256-bit interface.

Or Big Navi simply isn't as big as we thought...
Caring1: 8 x 2GB = 16GB
Same thing.
Posted on Reply
#66
GhostRyder
Hmm, will be interesting if it at least competes with the RTX 3080. Highly doubt it will compete with the 3090. But I am interested to see a new high end AMD GPU finally.
Posted on Reply
#67
Houd.ini
TheDeeGee: Not the cleanest test lab...
That's how you do field testing!
Posted on Reply
#68
mtcn77
You know, all the dust particles only point to that fan's potential cooling capacity, right?
If it ain't pulling, it ain't collecting.
Posted on Reply
#69
Octopuss
This looks like a joke or something. Who would stick papers saying "typical xyz" on a prototype card, as in something only the company has access to?
Posted on Reply
#70
mechtech
Looks like a random pic of some random overseas crap you'd buy on Wish.
Posted on Reply
#71
Makaveli
BoboOOZ: Guys, stop dreaming. Without GDDR6X, HBM2, or a 512-bit memory bus, there's no way AMD can catch up to the 3090, or even the 3080; the memory bandwidth can't be big enough, and Nvidia has a better memory compression algorithm.
I think you are right, you need memory bandwidth to feed a beast (a rough comparison is sketched below).

While I want Big Navi to equal 3080 performance, I think it will come in around 15% faster than the 3070 and with 16 GB of RAM. And that is why the rumored launch price of $599 was lowered to $549 when they saw the performance of the 3080.

If it comes out and beats expectations, I'll be happy either way, as I'm due for a GPU upgrade after building this rig in Dec 2019 and carrying over my current GPU.
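For rough context on the bandwidth argument, here is a back-of-the-envelope comparison. The 16 Gbps GDDR6 data rate for the rumored 256-bit card is an assumption, while the RTX 3080 figures (320-bit, 19 Gbps GDDR6X) are its published specs:

    # Back-of-the-envelope peak memory bandwidth comparison
    def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
        """Theoretical peak bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbps) / 8."""
        return bus_width_bits * data_rate_gbps / 8

    big_navi = peak_bandwidth_gbs(256, 16)  # rumored 256-bit bus, assumed 16 Gbps GDDR6
    rtx_3080 = peak_bandwidth_gbs(320, 19)  # 320-bit bus, 19 Gbps GDDR6X

    print(f"Rumored Big Navi: {big_navi:.0f} GB/s")  # 512 GB/s
    print(f"RTX 3080:         {rtx_3080:.0f} GB/s")  # 760 GB/s

On raw numbers alone, the rumored card would need compression, caching, or clock-speed advantages to close that gap, which is exactly the open question in this thread.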
Posted on Reply
#72
DeathtoGnomes
tfdsaf: It could be any GPU, I don't buy that this is Big Navi. I could see Big Navi maxing out at 16GB, but I think the lower models would also come with a 16GB option.

That said, I don't see AMD competing with the RTX 3090. I think they are going to have a card that is a bit slower than the RTX 3080, probably about 15% slower, but costs $100 less, and then an RTX 3070 competitor that is about 10% faster at $500.

So a RX 6900 at $600 with 15% less performance than RTX 3080
RX 6800 at $500 with 10% more performance than RTX 3070
RX 6700 at $400 that is about 20% faster than RX 5700XT
RX 6600 at $300 at about 20% faster than RX 5700
RX 5700xt is likely to be discontinued
RX 5700 likely to be sold at $250 price
RX 5600xt at $200
RX 5600 at $170
RX 5500xt at $140
The hype I see suggests single-digit separation. Ignore the price speculation, it's pointless.
Posted on Reply
#73
Anymal
It is a rumor that the rumored price of $599 was lowered to $549.
Posted on Reply
#74
medi01
cueman: I can't see
Thanks for sharing it.
Posted on Reply
#75
BoboOOZ
I usually don't find Paul's videos overly interesting, but this one is reeeally interesting; I'm sure some of you will find it useful.

Posted on Reply