Tuesday, October 20th 2020

AMD Radeon RX 6000 Series "Big Navi" GPU Features 320 W TGP, 16 Gbps GDDR6 Memory

AMD is preparing to launch its Radeon RX 6000 series of graphics cards, codenamed "Big Navi", and it seems like we are getting more and more leaks about the upcoming cards. Set for an October 28th launch, the Big Navi GPU is based on the Navi 21 silicon, which comes in two variants. Thanks to his sources, Igor Wallossek of Igor's Lab has published a handful of information regarding the upcoming graphics card release. More specifically, there are more details about the Total Graphics Power (TGP) of the cards and how it is used across the board (pun intended). To clarify, TDP (Thermal Design Power) is a measurement that applies only to the chip, or die, of the GPU and how much thermal headroom it has; it doesn't measure the whole card's power draw, as there are more heat-producing components on board.

The breakdown of the Navi 21 XT graphics card goes as follows: 235 Watts for the GPU alone, 20 Watts for Samsung's 16 Gbps GDDR6 memory, 35 Watts for voltage regulation (MOSFETs, inductors, caps), 15 Watts for fans and other components, and 15 Watts lost in the PCB. This puts the combined TGP at 320 Watts, showing just how much power is used by the non-GPU elements. For custom OC AIB cards, the TGP is boosted to 355 Watts, as the GPU alone uses 270 Watts. When it comes to the Navi 21 XL variant, cards based on it use 290 Watts of TGP, as the GPU sees a reduction to 203 Watts and the GDDR6 memory uses 17 Watts; the non-GPU components on the board use the same amount of power.
When it comes to the selection of memory, AMD uses Samsung's 16 Gbps GDDR6 modules (K4ZAF325BM-HC16). The bundle AMD ships to its AIBs contains 16 GB of this memory paired with the GPU core; however, AIBs are free to use different memory if they want to, as long as it is a 16 Gbps module. You can see the tables below for the breakdown of each card's TGP.
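For reference, the leaked Navi 21 XT figures can be sanity-checked with a quick sum. This is a minimal sketch using only the component numbers reported above:

```python
# Reported per-component power draw (watts) for the Navi 21 XT board
navi21_xt = {
    "GPU core": 235,
    "GDDR6 memory (16 Gbps)": 20,
    "Voltage regulation (MOSFETs, inductors, caps)": 35,
    "Fans and other components": 15,
    "PCB losses": 15,
}

# TGP is simply the sum of all board-level consumers
tgp = sum(navi21_xt.values())
print(f"Navi 21 XT TGP: {tgp} W")  # 320 W, matching the reported figure
```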
Sources: Igor's Lab, via VideoCardz

153 Comments on AMD Radeon RX 6000 Series "Big Navi" GPU Features 320 W TGP, 16 Gbps GDDR6 Memory

#1
jesdals
What about the rumors of a 6900XTX card?
Posted on Reply
#2
SLK
Looks like this gen of GPUs are all power-hungry. Efficiency is out of the window!
Posted on Reply
#3
Rebe1
jesdals
What about the rumors of a 6900XTX card?
Same as with 5700 XT - probably the 6900XTX will be a limited edition of 6900XT.
Posted on Reply
#4
lemoncarbonate
SLK
Looks like this gen of GPUs are all power-hungry. Efficiency is out of the window!
You get more framerate with the 3080 despite the insane power draw; some say it's the most power-efficient GPU per frame out there.

But I agree with you... I wish they could have made something less power-hungry. Imagine how amazing it would be if we could get a <200 W card that can beat the 2080 Ti.
Posted on Reply
#5
Vya Domus
Rebe1
Same as with 5700 XT - probably the 6900XTX will be a limited edition of 6900XT.
I don't follow, the limited edition of the 5700XT wasn't a different product, it was still named 5700XT. "6900XTX" implies a different product.
Posted on Reply
#6
okbuddy
How real is that? Isn't it 2.4 GHz?
Posted on Reply
#7
Turmania
Does nobody care about electricity bills anymore, or do most not have the responsibility of paying them? Who would buy these cards?
Posted on Reply
#8
jesdals
Turmania
Does nobody care about electricity bills anymore, or do most not have the responsibility of paying them? Who would buy these cards?
Gaming at 7680x1440 I could do with the upgrade
Posted on Reply
#9
Raevenlord
News Editor
SLK
Looks like this gen of GPUs are all power-hungry. Efficiency is out of the window!
Efficiency is one thing, power consumption is another.

NVIDIA's RTX 3080 is a much more power-efficient design than anything that came before (at 1440p and 4K), as our review clearly demonstrates.



One other metric for discussion, however, is power envelope. So, for anyone who has concerns regarding energy consumption, and wants to reduce overall power consumption for environmental or other concerns, one can always just drop a few rungs in the product stack for an RTX 3070, or the (virtual) RTX 3060 or AMD equivalents, which will certainly deliver even higher power efficiency, within a smaller envelope.

We'll have to wait for reviews on other cards in NVIDIA's product stack (not to mention all of AMD's offerings, such as this one leaked card), but it seems clear that this generation will deliver higher performance at the same power level as older generations. You may have to drop in the product stack, yes - but if performance is higher on the same envelope, you will have a better-performing RTX 2080 in a 3070 at the same power envelope, or a better performing RTX 2070 in a RTX 3060 at the same power envelope, and so on.

These are two different concepts, and I can't agree with anyone talking about inefficiency. The numbers, in a frame/watt basis, don't lie.
Posted on Reply
#10
repman244
Turmania
Does nobody care about electricity bills anymore, or do most not have the responsibility of paying them? Who would buy these cards?
Electricity is cheap in my country and I don't play games every day for 12 hours, so it will hardly show.
Posted on Reply
#11
Mussels
Moderprator
Well, looks like I'll just wire up my PC and turn the AC on this summer
Posted on Reply
#12
FinneousPJ
Turmania
Does nobody care about electricity bills anymore, or do most not have the responsibility of paying them? Who would buy these cards?
How much more do you think a 320 W board will cost to use over a 250 W board?
Posted on Reply
#13
Turmania
At this rate, even a successor to the GTX 1650, which is a sub-75 W GPU, will consume around 125 W.
Posted on Reply
#14
RedelZaVedno
If TGP is 320 W, then peak power draw must be north of 400 W, just like the 3080 and 3090. That's really, really bad. A single GPU should not peak over 300 W; that's the datacenter rule of thumb, and it's being trampled by Ampere and RDNA2. How long will an air-cooled 400 W GPU last? I'm having a hard time believing there will be many fully functioning air-cooled Big Navis/3080s/3090s around in 3-5 years' time. Maybe that's the intent; 1080 Tis are still killing new sales.
Posted on Reply
#15
kayjay010101
FinneousPJ
How much more do you think a 320 W board will cost to use over a 250 W board?
28% more.

For me it doesn't matter as electricity is quite cheap. 1 kWh (so 4x 250W per hour) is equivalent to one or two cents over in the States, so 70W? We're talking maybe one or two bucks more over the course of a year...
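The back-of-the-envelope math works out roughly like this; the daily gaming hours are a hypothetical assumption, and the ~$0.02/kWh rate is the cheap-electricity figure from the comment above:

```python
delta_w = 320 - 250      # extra draw of the 320 W board vs. the 250 W board
hours_per_day = 3        # assumed daily gaming time
price_per_kwh = 0.02     # assumed electricity price in USD (cheap-market figure)

extra_kwh = delta_w / 1000 * hours_per_day * 365
extra_cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.1f} kWh/year extra, about ${extra_cost:.2f}/year")
# At 76.65 kWh/year, roughly $1.53/year - "one or two bucks" as stated
```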
Posted on Reply
#16
Turmania
FinneousPJ
How much more do you think a 320 W board will cost to use over a 250 W board?
Reverse your thinking: compare a newer board that consumes 200 W to a 250 W board. The trend of improving performance by increasing power consumption is, for me, not an ideal technological improvement. But that is me.
Posted on Reply
#17
RedelZaVedno
kayjay010101
28% more.

For me it doesn't matter as electricity is quite cheap. 1 kWh (so 4x 250W per hour) is equivalent to one or two cents over in the States, so 70W? We're talking maybe one or two bucks more over the course of a year...
It's not about the bill, it's about the heat. A 400 W GPU, a 150 W CPU, and 50-150 W for the rest of the system, and you get yourself a 0.6-0.7 kW room heater. That's a no-go in a 16 m² or smaller room in the late spring and summer months if you live in a moderate or warm climate.
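The heater estimate adds up directly, since essentially all power a PC draws ends up as heat in the room. A quick sketch using the figures from the comment:

```python
gpu_w = 400          # peak GPU draw
cpu_w = 150          # CPU draw
rest_w = (50, 150)   # low/high estimate for the rest of the system

# Total system draw = heat dumped into the room, in kW
low_kw = (gpu_w + cpu_w + rest_w[0]) / 1000
high_kw = (gpu_w + cpu_w + rest_w[1]) / 1000
print(f"{low_kw:.1f}-{high_kw:.1f} kW of heat output")  # 0.6-0.7 kW
```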
Posted on Reply
#18
nguyen
Raevenlord
Efficiency is one thing, power consumption is another.

NVIDIA's RTX 3080 is a much more power-efficient design than anything that came before (at 1440p and 4K), as our review clearly demonstrates.

One other metric for discussion, however, is power envelope. So, for anyone who has concerns regarding energy consumption, and wants to reduce overall power consumption for environmental or other concerns, one can always just drop a few rungs in the product stack for an RTX 3070, or the (virtual) RTX 3060 or AMD equivalents, which will certainly deliver even higher power efficiency, within a smaller envelope.

We'll have to wait for reviews on other cards in NVIDIA's product stack (not to mention all of AMD's offerings), but it seems clear that this generation will deliver higher performance at the same power level as older generations. you may have to drop in the product stack, yes - but if performance is higher on the same envelope, you will have a better-performing RTX 2080 in a 3070 at the same power envelope, or a better performing RTX 2070 in an RTX 3060 at the same power envelope, and so on.

These are two different concepts, and I can't agree with anyone talking about inefficiency. The numbers, in a frame/watt basis, don't lie.
Finally someone who can explain the difference between power consumption vs efficiency.
Really tiring to see all the posts complaining about Ampere high power consumption.
High power consumption is so easy to fix, just drag the Power Limit slider down to where you want it to be (no undervolt/overclock). From TPU numbers, one can expect the 3080 to be 20-25% faster than 2080 Ti when limited to 260W TGP.

It seems like AMD is also clocking the shit outta Big Navi to catch up with Nvidia :roll: , people can just forget about RDNA2 achieving higher efficiency than Ampere.
Posted on Reply
#19
FinneousPJ
kayjay010101
28% more.

For me it doesn't matter as electricity is quite cheap. 1 kWh (so 4x 250W per hour) is equivalent to one or two cents over in the States, so 70W? We're talking maybe one or two bucks more over the course of a year...
Exactly. If you're worried about two bucks per year, maybe don't buy a GPU... try saving up a buffer first.
Turmania
Reverse your thinking: compare a newer board that consumes 200 W to a 250 W board. The trend of improving performance by increasing power consumption is, for me, not an ideal technological improvement. But that is me.
Well, let's see whether these boards can offer better performance at the same power or not. If they can, what's the problem? You can undervolt & underclock to your desired power.
Posted on Reply
#20
RedelZaVedno
Performance per watt did go up with Ampere, but that's to be expected given that Nvidia moved from TSMC's 12 nm to Samsung's 8 nm 8LPP, a 10 nm extension node. What is not impressive is only a 10% performance-per-watt increase over Turing while being built on a 25% denser node. RDNA2, being on 7 nm+, looks to be even worse efficiency-wise given that the density of 7 nm+ is much higher, but let's wait for the actual benchmarks.
Posted on Reply
#21
Khonjel
SLK
Looks like this gen of GPUs are all power-hungry. Efficiency is out of the window!
I sincerely apologise for every build suggestion where I recommended lower-wattage PSUs because I said, and I quote, "aS tiMe pROGressEs, Pc cOmpoNenTs will coNSuMe lESs poWeR"
Posted on Reply
#22
springs113
nguyen
Finally someone who can explain the difference between power consumption vs efficiency.
Really tiring to see all the posts complaining about Ampere high power consumption.
High power consumption is so easy to fix, just drag the Power Limit slider down to where you want it to be (no undervolt/overclock). From TPU numbers, one can expect the 3080 to be 20-25% faster than 2080 Ti when limited to 260W TGP.

It seems like AMD is also clocking the shit outta Big Navi to catch up with Nvidia :roll: , people can just forget about RDNA2 achieving higher efficiency than Ampere.
Seems that in every GPU thread you need to defend the almighty Nvidia. Go be happy with your purchase and stop spewing garbo. Ik there's no yin without the yang, but please cut it out. Nvidia's numbers are known... Navi 2 numbers are all speculative.
Posted on Reply
#23
Turmania
We used to have two-slot GPUs; as of last gen that went up to 3 slots, and now we are seeing 4-slot GPUs, all to cool these power-hungry beasts. But this trend surely has to stop. Yes, we can undervolt to bring power consumption down to our desired needs, and it will most certainly be more efficient than last gen. But is that what 99% of users would do? I think not only about the power bill, but the heat output, the spinning of fans, and consequently the faster deterioration of those fans and other components in the system. The noise output and the heat that comes from the case will be uncomfortable.
Posted on Reply
#24
EarthDog
Yikes... feels again like they are punching up with clocks, moving out of the sweet spot. Since power is seemingly similar between Ampere and RDNA2 cards, it's going to come down to performance, price, and driver stability between them. Will AMD have to go below Ampere pricing due to performance, or will they take the crown and price similarly?
Posted on Reply
#25
nguyen
Turmania
We used to have two-slot GPUs; as of last gen that went up to 3 slots, and now we are seeing 4-slot GPUs, all to cool these power-hungry beasts. But this trend surely has to stop. Yes, we can undervolt to bring power consumption down to our desired needs, and it will most certainly be more efficient than last gen. But is that what 99% of users would do? I think not only about the power bill, but the heat output, the spinning of fans, and consequently the faster deterioration of those fans and other components in the system. The noise output and the heat that comes from the case will be uncomfortable.
I'm all for AIBs equipping GPUs with 3-4 slot coolers and selling them at MSRP :D
That means with a little undervolting I can get an experience similar to custom watercooling regarding thermals/noise.
Well, if only there were any 3080s/3090s available :roll: .
Posted on Reply