Wednesday, October 21st 2020

AMD Radeon RX 6000 Series Specs Leak: RX 6900 XT, RX 6800 XT, RX 6700 Series

AMD's Radeon RX 6000 series graphics cards, based on the RDNA2 graphics architecture, mark the company's first DirectX 12 Ultimate graphics cards (with features such as real-time raytracing). A VideoCardz report sheds light on the specifications. The 7 nm "Navi 21" and "Navi 22" chips will power the top end of the lineup. The flagship part is the Radeon RX 6900 XT, followed by the RX 6800 XT and RX 6800, all based on "Navi 21." These are followed by the RX 6700 XT and RX 6700, which are based on the "Navi 22" silicon.

The "Navi 21" silicon physically features 80 RDNA2 compute units, working out to 5,120 stream processors. The RX 6900 XT maxes the chip out, enabling all 80 CUs, and is internally referred to as "Navi 21 XTX." It pairs this with 16 GB of GDDR6 memory across a 256-bit wide memory interface and engine clocks boosting beyond 2.30 GHz. The next SKU in AMD's product stack is the RX 6800 XT ("Navi 21 XT"), featuring 72 of the 80 CUs (4,608 stream processors) and the same 16 GB 256-bit GDDR6 memory configuration as the flagship, while its engine clocks go up to 2.25 GHz.
A notch below the RX 6800 XT is the RX 6800 ("Navi 21 XL"), which cuts the "Navi 21" down further to 64 compute units (4,096 stream processors), keeps the same 16 GB 256-bit GDDR6 memory configuration, and clocks up to 2.15 GHz. The RX 6900 XT, along with the RX 6800 series, will be announced in the October 28 presser.

The next chip in AMD's lineup is the 7 nm "Navi 22" silicon, which features 40 compute units. On paper, this count matches that of "Navi 10," and it remains to be seen whether this is a re-badge or new silicon based on RDNA2. The RX 6700 XT maxes this chip out with 40 CUs (2,560 stream processors), while the RX 6700 features fewer CUs (possibly 36). The interesting thing about these two is their memory configuration: 12 GB of 192-bit GDDR6.
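The CU-to-stream-processor arithmetic in the leak is easy to verify: each RDNA2 compute unit contains 64 stream processors. A minimal sketch of the leaked lineup (the SKU table below simply restates the figures above; the 64-SP-per-CU ratio is standard for RDNA):

```python
# Sketch of the leaked RDNA2 lineup arithmetic (specs from the VideoCardz leak).
# Each RDNA2 compute unit (CU) contains 64 stream processors (SPs),
# so SP count = CU count * 64.

SP_PER_CU = 64

skus = {
    # name: (CUs, memory GB, bus width bits)
    "RX 6900 XT": (80, 16, 256),
    "RX 6800 XT": (72, 16, 256),
    "RX 6800":    (64, 16, 256),
    "RX 6700 XT": (40, 12, 192),
}

for name, (cus, mem, bus) in skus.items():
    print(f"{name}: {cus} CUs -> {cus * SP_PER_CU} SPs, {mem} GB @ {bus}-bit")

# The CU counts also explain the "11.111% faster" quip in the comments:
# the 6900 XT has 80/72 - 1 = 11.1% more CUs than the 6800 XT.
print(f"CU advantage: {80 / 72 - 1:.1%}")
```

This matches the 5,120 / 4,608 / 4,096 / 2,560 stream-processor counts quoted in the report.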
Source: VideoCardz

191 Comments on AMD Radeon RX 6000 Series Specs Leak: RX 6900 XT, RX 6800 XT, RX 6700 Series

#51
okbuddy
6900xt only 11.111% faster than 6800xt
Posted on Reply
#52
bug
okbuddy: 6900xt only 11.111% faster than 6800xt
It really doesn't matter. I've always said $1000 cards can be safely disregarded. What piques my interest is the level of performance pushed down into the sub-$300 segment. I expect on an enthusiast forum not everybody feels the same, but the vast majority of buyers do.
Posted on Reply
#53
Vya Domus
AsRock: Yeah, you're talking 500W just to play a game lmao.
500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.
Posted on Reply
#54
Nater
ebivan: At the end of the day it's the price, not the model number, that's important. Huang promised 4K 60 fps for 700 bucks. That's what I expect. Nothing more, nothing less. I really hope Su can give me what Huang didn't!
It's not just price right now. Availability is HUGE at the moment. Who cares how much the 3080 is listed at when you cannot get one at that MSRP/ESRP.

$700/slower/READY TO SHIP > $700/faster/OUT OF STOCK
Posted on Reply
#55
Chrispy_
I hope the 64CU model is $400 and sub-250W.

Depending on how well the RDNA2 architecture scales and how much IPC improvement there is, that model should be about twice as fast as a 5700XT which means it is potentially a 3070 competitor, only it'll have a more appropriate amount of GDDR6 than the Nvidia card.

I'm sure the 320W variants will be nice and fast but I'm not interested in trying to deal with that much heat from a card.
Posted on Reply
#56
Metroid
These are all gaming GPUs, and the 256-bit memory is the important factor here. If people want compute, go Nvidia; if they want gaming, choose AMD. Interesting, the tables have turned: Nvidia used to be for gaming and AMD for compute, and this time it's the opposite.
Posted on Reply
#57
FinneousPJ
Metroid: These are all gaming gpus and the 256bit memory is the important factor here. If people want computing then go nvidia, if they want gaming then choose amd. Interesting, tables have been turned, nvidia used to be for gaming and amd for computing, this time is the opposite.
True, and I imagine that fewer cores at a higher frequency will also work in AMD's advantage for gaming.
Posted on Reply
#58
bug
Vya Domus: 500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.
When I got into PCs, the choice was between a 250 and a 350W PSU. Granted, this was before 2D acceleration was upon us...
Posted on Reply
#59
phanbuey
bug: When I got into PCs, the choice was between a 250 and a 350W PSU. Granted, this was before 2D acceleration was upon us...
Also the PC was a completely closed box with tiny vents somewhere in the front lol.
Posted on Reply
#60
cueman
keep your rtx 3080...no worry

rx 6900 xt will lose by 15-20% and TDP is near or over 350W.
Posted on Reply
#61
Punkenjoy
A big chunk of the performance improvement came from increased power usage, although the main performance improvement factor still comes from process improvement.
Posted on Reply
#62
phanbuey
Looks like I'm going to be holding on to the 2080 Ti for this round -- getting a 16K GPU score in Time Spy and 7.5K in Time Spy Extreme at 290W. I put a silent cooler on it, and more DLSS support will keep it alive with a +30% boost in Cyberpunk.

3080 doesn't even seem worth it unless I can pick up a used one at some point with a 5800x/5600x.
Posted on Reply
#63
AsRock
TPU addict
Vya Domus: 500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.
Not 100% true, as I know RDR2 can look better than it does with my 390X, and that's up to 420W just for 40-60 fps.
Posted on Reply
#64
Unregistered
Nvidia messed up with Ampere: high prices (MSRP of $1000 here for no logical reason), only 10 GB of VRAM, and worse, the overpriced cards don't even exist in real life. I really hope AMD will have a real launch with reasonable prices.
Posted on Reply
#65
tomc100
I just hope the power draw is really low when I'm not gaming, since I don't game that often but use the computer mostly for surfing the net, YouTube, and work.
Posted on Reply
#66
purecain
Please AMD, release your cards so that I may cancel my 3090 order.
Posted on Reply
#67
Dante Uchiha
bug: Except that it doesn't work like that. Cache is much, much smaller than main memory. If you happen to frequently use only, say 16MB of the main memory, those MB will be retained in cache and used from there, resulting in dramatically improved performance. But if the video card wants to access say 8GB, the cache doesn't help much anymore.
Also, cache memory is so fast in part because it's very power hungry.

Look at the Infinity Cache patent details:

"We propose shared L1 caches in GPUs. To the best of our knowledge, this is the first paper that performs a thorough characterization of shared L1 caches in GPUs and shows that they can significantly improve the collective L1 hit rates and reduce the bandwidth pressure to the lower levels of the memory hierarchy."

• "We develop GPU-specific optimizations to reduce inter-core communication overheads. These optimizations are vital for maximizing the benefits of the shared L1 cache organization."

• "We develop a GPU-specific lightweight dynamic scheme that classifies application phases and reconfigures the L1 cache organization (shared or private) based on the phase behavior."

• "We extensively evaluate our proposal across 28 GPGPU applications. Our dynamic scheme boosts performance by 22% (up to 52%) and energy efficiency by 49% for the applications that exhibit high data replication and cache sensitivity without degrading the performance of the other applications. This is achieved at a modest area overhead of 0.09 mm2 /core."

www.freepatentsonline.com/y2020/0293445.html
adwaitjog.github.io/docs/pdf/sharedl1-pact20.pdf
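bug's point above, that a cache only helps while the working set fits in it, can be sketched with a toy average-access-time model. All sizes and latencies here are illustrative assumptions, not AMD figures:

```python
# Toy model of why a cache helps only while the working set fits in it.
# The latencies (1 ns cache, 100 ns memory) and the 128 MB cache size
# are illustrative assumptions, not AMD specifications.

def avg_access_time(working_set_mb, cache_mb, cache_ns=1.0, mem_ns=100.0):
    """Average access time assuming uniform random access to the working set."""
    if working_set_mb <= cache_mb:
        hit_rate = 1.0  # everything fits: effectively all hits
    else:
        hit_rate = cache_mb / working_set_mb  # only the resident fraction hits
    return hit_rate * cache_ns + (1 - hit_rate) * mem_ns

# 16 MB working set in a 128 MB cache: effectively cache speed.
print(avg_access_time(16, 128))    # -> 1.0 ns

# 8 GB working set: the cache barely dents the average.
print(avg_access_time(8192, 128))  # -> ~98.5 ns
```

Real GPU workloads are far from uniform-random (which is what the shared-L1 paper's hit-rate improvements exploit), but the model shows why both sides of the argument can be right depending on the working set.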
Posted on Reply
#68
MxPhenom 216
ASIC Engineer
No GDDR6X? :wtf:
purecain: Please AMD release your cards so that i may cancel my 3090 order.
I just want them to release cards that are competitive with Nvidia's high end again. That's all I ask. Then my Ryzen 3 build I'm doing next year will be more fun.
Posted on Reply
#69
Crustybeaver
Meanwhile I expect Nvidia are sat there with the Ti/Super variants ready for AMD's next move. No way they're letting AMD have the fastest cards come Christmas.
Posted on Reply
#70
Chrispy_
I hope the 6700-series (Navi 22) are still RDNA2 with DXR capabilities. It would be confusing as heck to have a mix of DXR capable and DXR incapable cards all under the same naming scheme.

At least Nvidia use RTX and GTX to distinguish their two lineups.
Posted on Reply
#71
Icon Charlie
With this information, they left out... WATTAGE... Not good... Not good at all.
I'm not going to heat up my damn house with a computer.

It really looks like I'm going to stick with my current Video card and maybe upgrade my CPU later.
Posted on Reply
#72
efikkan
ebivan: That would be great, but I don't think that's realistic. Since there never actually were any 3080s for $700 (or has anyone actually gotten an FE card?), AMD will probably not sell a 3080 equivalent for that price.
That's untrue. All AiBs have models at the same MSRP as the FE card.
Vya Domus: It's not "could", it's "does". That's why caches exist: to mitigate the limitations caused by slow main memory. Caches work very well when the access patterns are sequential, and guess what, the sort of things you need to compute for graphics are always like that.
Caches only help for sequential access patterns if the average bandwidth is high enough.
Posted on Reply
#73
Chrispy_
efikkan: That's untrue. All AiBs have models at the same MSRP as the FE card.
That's usually true, but unlike the FE, which is a high-quality design with binned silicon and an expensive cooler, the MSRP cards use the standard, cheap reference design and likely the cheapest-to-produce cooler they can get away with. So, compared to the FE, they're terrible value for money.
bug: It really doesn't matter. I've always said $1000 cards can be safely disregarded. What piques my interest is the level of performance pushed down into the sub-$300 segment. I expect on an enthusiast forum not everybody feels the same, but the vast majority of buyers do.
The overwhelmingly popular price point is always the xx60 series, which for about a decade got stuck at $199 because that was such an appealing price point to target. Thanks to inflation, $200 is no longer the mid-range sweet spot; it works out to about $275 in today's money.

The 2060 screwed the pooch as a pricing anomaly, and the only other time Nvidia deviated from this pricing was the GTX 260 back in 2008, also considered a complete rip-off. I very much doubt Nvidia is going to target $275 for anything decent unless AMD has something that's cleaning house at that price point.
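The inflation claim above is easy to sanity-check with compound growth: at roughly 3.3% average annual inflation over a decade (my assumed rate, not an official CPI figure), $199 lands very close to $275:

```python
# Compound-inflation sanity check for the "$199 then ~ $275 now" claim.
# The 3.3% average annual rate is an assumption, not an official CPI figure.
price_then = 199.0
rate = 0.033   # assumed average annual inflation
years = 10

price_today = price_then * (1 + rate) ** years
print(f"${price_today:.0f}")  # -> $275
```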
Posted on Reply
#74
kapone32
bug: Except that it doesn't work like that. Cache is much, much smaller than main memory. If you happen to frequently use only, say 16MB of the main memory, those MB will be retained in cache and used from there, resulting in dramatically improved performance. But if the video card wants to access say 8GB, the cache doesn't help much anymore.
Also, cache memory is so fast in part because it's very power hungry.
This time next week, we will all know what's real and what's not.
Posted on Reply
#75
Vya Domus
AsRock: Not 100% true, as i know RDR2 can look better than it does with my 390X and that's up to a 420w just for 40-60 fps.
Sure but that's a really old GPU. Gotta look at things from the current generation.
Posted on Reply