Friday, October 9th 2020

AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

AMD, in its October 8 online launch event for the Ryzen 5000 "Zen 3" processors, provided a teaser of the company's next flagship graphics card, slotted into the Radeon RX 6000 series. This particular SKU has been referred to by company CEO Lisa Su as "Big Navi," meaning it could be the top part of AMD's upcoming client GPU lineup. As part of the teaser, Su held up the reference-design card and provided three performance numbers for it, as tested on a machine powered by a Ryzen 9 5900X "Zen 3" processor. We compared these performance numbers, obtained at 4K UHD, with our own testing data for the same games, to see how the card compares to other current-generation cards in its class. Our testing data for one of the games comes from our latest RTX 30-series reviews; find details of our test bed here. We obviously use a different CPU, since the 5900X is unreleased, but we do use the highest presets in our testing.

With "Borderlands 3" at 4K, using the "badass" performance preset and the DirectX 12 renderer, AMD claims a frame rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test-bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, at 70.3 FPS. It's important to note that AMD may be using a different (lighter) test scene than we do, since we don't use games' built-in benchmark tools and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review, and use the DirectX 11 renderer in our regular VGA reviews.
With Gears 5, AMD claims 73 FPS at 4K, with the DirectX 12 renderer and the "Ultra" preset. This number ends up 16.24 percent faster than the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080 is 15.61 percent faster than the AMD card, at 84.4 FPS.

Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test-bed information here). In that testing, we found the RTX 2080 Ti to score 77.9 FPS at Ultra settings with RTX off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.

We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are best optimized for its GPUs, but it lends plausibility to the theory that Big Navi may end up comparable to the RTX 2080 Ti and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of roughly 24% higher frame rates than the RTX 2080 Ti in these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs of the RX 5700 XT, so the claims roughly line up.
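The per-game percentages can be checked with a few lines of Python; the FPS figures are the ones quoted above:

```python
from math import prod

# (game, AMD's claimed "Big Navi" FPS, TPU-measured RTX 2080 Ti FPS)
results = [
    ("Borderlands 3", 61.0, 41.8),
    ("Gears 5", 73.0, 62.8),
    ("CoD: Modern Warfare", 88.0, 77.9),
]

ratios = [claimed / measured for _, claimed, measured in results]
for (game, _, _), r in zip(results, ratios):
    print(f"{game}: +{(r - 1) * 100:.1f}% vs RTX 2080 Ti")

# Geometric mean of the per-game ratios (the appropriate average for ratios)
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Geometric mean: +{(geomean - 1) * 100:.1f}%")
```

This prints per-game gains of +45.9%, +16.2%, and +13.0%, with a geometric mean of about +24%.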

262 Comments on AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

#176
INSTG8R
Vanguard Beta Tester
cuemanJust my own research, calculations, and comparisons...

I can't see the RX 6900 XT getting even close to RTX 3080 performance; I'd say the RTX 3080 10/20 GB will be 20-25% faster.

AMD used a 5900X 12-core CPU in its teaser video; if the RTX 3080 were tested on it, that would mostly help the RTX 3080/3090, because one of the things bottlenecking those GPUs is the CPU, and a 12-core 5900X helps.

Also, the RX 6900 XT has 16 GB of memory, which only helps in high-resolution gaming, like the 4K tests in AMD's teaser video.

Still, the difference is quite clear, even in AMD's own cherry-picked games, which is an odd choice on their part, because almost no test site uses those titles... hmm. Impossible to compare, which is perhaps what AMD wants. Come on, AMD.

In summary: an RTX 3080 20 GB paired with an AMD 5900X or an Intel Rocket Lake 11th-gen CPU (coming in Q1 2021, by the way) would, I dare say, gain another 5-10%. So the difference between the RX 6900 XT and the RTX 3080 10 GB/20 GB is 20-25%; in practical terms, RX 6900 XT at 50 FPS versus RTX 3080 at 60 FPS. I feel I'm doing Big Navi a favor with that omen... we'll see.

Finally, I can't believe AMD's 300 W TDP for the RX 6900 XT. Sure, AMD can say the TDP is 300 W, but that's impossible in, for example, an actual gaming session. Even a single RX 5700 XT in its fastest variant, the Sapphire Nitro+ SE AIB card, has an official TDP of 285 W and measured 285 W in "average gaming" (yes, in TechPowerUp's test), with a 296 W peak. That is, by the way, more than some RTX 2080 Ti AIB versions: the MSI RTX 2080 Ti Lightning averages 279 W in gaming, 6 W less than the Sapphire Nitro+ SE, while being 44% faster in 4K gaming (TechPowerUp's tests again, thanks).

So I don't think the RX 6900 XT, which is practically two 5700 XTs in CrossFire, can manage the same figures... no way. We must remember there is no major difference between RDNA 2 and RDNA 1 in the most important respects: both still use a 7 nm process and both still use GDDR6 memory. Ray-tracing support will also draw power, maybe 25 W?

I dare say 360 W average gaming for the RX 6900 XT, with peaks near 400 W, if the RX 6900 XT gets even within 20% of the RTX 3080's speed. There are several tests on the internet of RX 5000 GPUs in CrossFire mode; for example, an RX 5600 XT + RX 5700 CrossFire setup measured 464 W. Yes, 464 W... Google it and you'll find it. If the RX 6900 XT loses by about 35%, then I'd believe a 300 W TDP. That's how much power today's top GPUs need.

We must also remember that the RX 5700 XT is already a 7 nm GPU, so AMD can't save much power there, whereas the RTX 3080 is an 8 nm GPU and its RTX 2000-series predecessors were made on 12 nm; that's a big difference, a 50% smaller die, which is a lot! So, knowing the RTX 3080's amazing performance, its measured 303 W average gaming draw is excellent. The RTX 3000 series really earns all its glory.

Also, when the RTX 3070 and RTX 3060 Ti come (I heard the 3070 Ti is cancelled?), I bet those GPUs will overclock very well; by my calculations, at least an RTX 3070 AIB Lightning will be a hard case for the RX 6900 XT... maybe. But one thing is sure: it's a lot cheaper, by $200, I'd guess. We'll see about that soon, too.

P.S. The RTX 3090 is its own category, made just for 8K gaming; the RTX 3080 10 GB and 20 GB are for 4K gaming; the RTX 3060 Ti and RTX 3070/Super are for FHD and WQHD gaming; an RTX 3070 Super (16 GB) would also be for 4K gaming.

Third GPU: but where is the Intel Xe GPU!? It would be great to see a three-way battle line!

RX 6900 XT price: I can't believe AMD can't sell the RX 6900 XT under the RTX 3080's price; a little more, I'd say, $699-799.

Note: AMD needs cash more than NVIDIA does. It can't keep running at an overdraft forever on its zero-to-2-3%-profit pricing politics.

A great end of the week to all!
There's so much garbage in that post that I'll just correct one piece of misinformation in that wall of misinformation: I have a Sapphire 5700 XT Nitro+, and unless you're raising the power limit it's never over 220 W. There was a "Special Edition," but it's irrelevant, since the boost clocks were adjusted, making its minor OC meaningless. Even if it were using more power, it would still never be over 235 W unless forced.
Posted on Reply
#178
Vayra86
nguyenOh don't you worry, the e-peen crowd is already gobbling up any 3090 stock they can find :D. I too would like a piece, but all the vendors in my country are price-gouging the hell out of the 3080/3090, going as high as $400-500 over MSRP for both; somehow that is legal in third-world countries.
Not that there were few 2080 Ti buyers either, with the 2080 Ti registering almost 1% in the Steam hardware survey.
There are many more rich kids out there than you might think :laugh:
That is just a whole lot of cognitive dissonance on your part. 1% is as niche as it gets.
Posted on Reply
#179
minami
It is interesting that Sony adopted primitive shaders for the PS5.
Posted on Reply
#180
INSTG8R
Vanguard Beta Tester
BlueberriesNo, I'm speculating just like yourself but for some reason you think the burden of proof is on me to disprove your theory that they're hiding a more powerful GPU beyond a curtain. You're the one throwing flagrant speculation and I'm being realistic.
Where anywhere on these forums have I made any claims, let alone the outrageous claims you've made? We've seen one short benchmark run and some infamous AMD graphs (yes, I think their graphs are always hilarious). So apparently, from just that small amount of information, you can declare it a flop without any hard proof, while accusing me, for some reason, of claiming I guess the opposite of your flop, an all-out victory? I don't see anybody making those kinds of claims but you, all while denying any burden of proof for your own theory.
Posted on Reply
#181
nguyen
Vayra86That is just a whole lot of cognitive dissonance on your part. 1% is as niche as it gets.
Again with the insults already? Do you know what size the player base of Steam is?
Even the 5700 XT has not reached 1% yet; talking about cognitive dissonance?
Posted on Reply
#182
Razbojnik
So basically a 3070, only more expensive and with worse drivers... mkay.
Posted on Reply
#183
Vayra86
nguyenAgain with the insults already? Do you know what size the player base of Steam is?
Even the 5700 XT has not reached 1% yet; talking about cognitive dissonance?
Insult? Observation. Are you saying the 2080 Ti is doing 1% on Steam as well? Source?
Posted on Reply
#184
nguyen
Vayra86Insult? Observation. Are you saying 2080ti is doing 1% on Steam as well? Source?
If you had half a brain, you would know where to find it.
So the 2080 Ti is so niche, yet it still outsold the 5700 XT; talk about cognitive dissonance :)
Posted on Reply
#185
Vayra86
nguyenIf you had half a brain, you would know where to find it.
So the 2080 Ti is so niche, yet it still outsold the 5700 XT; talk about cognitive dissonance :)
Again... source?! You're the one presenting cool facts here, don't run away now. I'm not going to search for your realities...

Here is what I see. Perspective matters: as of October, the share is 2% for just the 5700 XT and 1.29% for the 2080 Ti. So quite a gap still, and it's growing; 61% more for the 5700 XT.

gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-AMD-RX-5700-XT/4027vs4045

Then take a decent high-end card for some perspective...

The 2070S: 3.38%.
That is nearly 3x as many cards as the 2080 Ti.
gpu.userbenchmark.com/Compare/Nvidia-RTX-2070S-Super-vs-AMD-RX-5700-XT/4048vs4045

So... come again?
Posted on Reply
#186
nguyen
Vayra86again... Source?! Youre the one presenting cool facts here, dont run away now. Im not going to search for your realities
Can't read? Too poor for an education?
What part of "almost 1% of the Steam Hardware Survey" is too hard to understand?
Steam had a player base of 90 million active users as of April 2019; 1% of that means 900,000 players own a 2080 Ti. Yeah, it's niche alright.
Posted on Reply
#187
Vayra86
nguyenCan't read? Too poor for an education?
What part of "almost 1% of the Steam Hardware Survey" is too hard to understand?
Steam had a player base of 90 million active users as of April 2019; 1% of that means 900,000 players own a 2080 Ti. Yeah, it's niche alright.
Don't dig yourself a deeper hole; read the above.

Besides... 900 K out of 90 million is still a niche; it's still 1%, which means a whopping 99% are not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine. I think cognitive dissonance is the perfect term here, and again, it's NOT an insult; I'm trying to get across a point that simple stats should have already shown you.

You also completely gloss over the fact that the 5700 XT is competing with numerous other cards in its performance and price bracket; even two past generations have cards that match it. The 2080 Ti does not.
Posted on Reply
#188
INSTG8R
Vanguard Beta Tester
Vayra86Don't dig yourself a deeper hole; read the above.

Besides... 900 K out of 90 million is still a niche; it's still 1%, which means a whopping 99% are not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine. I think cognitive dissonance is the perfect term here, and again, it's NOT an insult; I'm trying to get across a point that simple stats should have already shown you.
I mean, without even checking, I know the 1060 is the most-used GPU, going by the Hardware Survey as a metric. The conclusions you can draw from that can be interpreted differently, I suppose, but it tells me the average user runs an average card.
Posted on Reply
#189
nguyen
Vayra86Don't dig yourself a deeper hole; read the above.

Besides... 900 K out of 90 million is still a niche; it's still 1%, which means a whopping 99% are not using said card. I'm not sure what education you've enjoyed, but I do think I prefer mine.
Certainly less niche than all AMD cards except the RX 570/580. Bravo for calling AMD fans out.
Being the 25th most popular out of hundreds of GPUs is niche.
Yup.
Tell me who has cognitive dissonance again?
Posted on Reply
#190
Vayra86
INSTG8RI mean without even checking I know the 1060 is the most used GPU using the Hardware Survey as a metric. The conclusions you can draw from that can be interpreted differently I suppose but it tells me the average user runs an average card.
People forget Steam is polluted with tons of dead accounts, legacy GPUs, and IGPs.

Userbenchmark is more accurate, as it reflects actual benchmark runs per time frame.
nguyenCertainly less niche than all AMD cards except the RX 570/580. Bravo for calling AMD fans out.
Being the 25th most popular out of hundreds of GPUs is niche.
Yup.
Tell me who has cognitive dissonance again?
Are you that dense, or unable to swallow your pride?! The facts are clear, and this was never an AMD pissing contest; this was about how niche a top-end GPU like the 2080 Ti, and now the 3090, is. Remember?!

Wow, man. Just wow. It's very clearly you. Just stop here; it won't end well.
Posted on Reply
#191
nguyen
Vayra86People forget Steam is polluted with tons of dead accounts, legacy GPUs, and IGPs.

Userbenchmark is more accurate, as it reflects actual benchmark runs per time frame.

Are you that dense, or unable to swallow your pride?! The facts are clear, and this was never an AMD pissing contest; this was about how niche a top-end GPU like the 2080 Ti, and now the 3090, is. Remember?!

Wow, man. Just wow. It's very clearly you. Just stop here; it won't end well.
Explain how Userbenchmark is more trustworthy than Steam. Because its users are verified? Because people can't submit duplicate benchmarks?
Talking about digging graves :)

Out of hundreds of GPUs, the 2080 Ti being the 25th most popular is considered niche. See how flawed your reasoning is?

From dictionary.cambridge.org/dictionary/english/niche

Niche:
interesting to, aimed at, or affecting only a small number of people

Somehow you can't distinguish between a "small number" and a "small percentage", can you?
Glad I wasn't any part of your education.
Posted on Reply
#192
INSTG8R
Vanguard Beta Tester
Vayra86People forget Steam is polluted with tons of dead accounts, legacy GPUs, and IGPs
While I agree; that would be back when the Intel iGP was top dog! :roll: At least the 1060 makes for a sensible majority card. I have never once used Userbenchmark; I just run 3DMark on driver changes for comparisons.
Posted on Reply
#193
Totally
I don't care either way; I don't plan to upgrade this cycle. Unless those numbers have merit and they manage to make it a sub-300 W card (sub-250 W if I were to be greedy), but that's unlikely: since NVIDIA has pushed the envelope by setting up camp at 400 W+, AMD is going to be like, "400 W? Don't mind if we do." The latter, I feel, is the likely case.
beedoo@cueman; remember, mate, Winners don't do drugs!
Unless it's tiger blood. Then for sure you know that you're "winning!".
Posted on Reply
#194
Zach_01
cueman(same post as quoted in full in #176 above)
There's so much misinformation in that post, I don't know where to begin...
I have an MSI 5700 XT Gaming X AIB card; it hits 2080 MHz core boost (VRAM at 1800 MHz) and a total peak of 240 W (GPU + VRAM). So it's slightly overclocked from base specifications and still a 240 W card. Nominal 5700 XTs are about 220-225 W.
If you simply double that, you get 440-450 W. But that is NOT the case. RDNA 2 is on TSMC's enhanced 7 nm node, with perf/watt improvements and 15-20% more density; it's not the same 7 nm that RDNA 1/Zen 2/Zen 3 use.

So, enhanced 7 nm plus whatever redesign AMD has done to the cores... we are looking at +50% performance per watt. That means a GPU with the same performance as the 5700 XT, with all these enhancements, would draw 145-150 W. Double that and you are in the 290-300 W region.

BUT... you must account for the fact that not all parts of the core are doubled exactly. The CUs are (40 >> 80), but what about the memory controller (still 256-bit?) and some other parts I won't go into? So, again, from that 290-300 W you can cut another chunk of, let's say, 20-30 W. Now you are at 260-290 W for double the raw power of a 5700 XT (220-225 W) at the same max clock speed (2000-2050 MHz).

Also, it's rumored that the top RDNA 2 GPU boosts to at least 2200 MHz. That's another increase over 2050 MHz of, let's say, +7-8% in power draw. Again: 260-290 W, +7-8%, equals 280-310 W.

So you end up with a 280-310 W card with more than double the raw performance of the 5700 XT. When gaming, is that close enough to a 3080? We will see in 17 days...
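The back-of-envelope estimate above can be written out explicitly. All the inputs below are this post's assumptions (the +50% perf/watt claim, the 20-30 W saving for non-doubled blocks, the +7-8% clock penalty), not official figures:

```python
# Back-of-envelope power estimate for a doubled-up RDNA 2 card,
# using only the assumptions from the post above (no official figures).
base_power_w = (220, 225)          # nominal RX 5700 XT board power, watts
perf_per_watt_gain = 1.50          # assumed +50% perf/watt for RDNA 2
non_doubled_saving_w = (20, 30)    # blocks that are not doubled (memory controller, etc.)
clock_bump = (1.07, 1.08)          # +7-8% draw for ~2200 MHz vs ~2050 MHz

# 5700 XT-level performance at +50% perf/watt, then doubled (~293-300 W)
lo = base_power_w[0] / perf_per_watt_gain * 2
hi = base_power_w[1] / perf_per_watt_gain * 2
# subtract the non-doubled blocks (~263-280 W)
lo, hi = lo - non_doubled_saving_w[1], hi - non_doubled_saving_w[0]
# apply the higher boost clocks
lo, hi = lo * clock_bump[0], hi * clock_bump[1]
print(f"estimated board power: {lo:.0f}-{hi:.0f} W")
```

This lands at roughly 282-302 W, consistent with the 280-310 W range estimated above.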

-----------

For the price of this card...
AMD appears to have made some drastic choices to keep costs low: the narrower memory bus (256-bit?) and the use of GDDR6. Both are cheaper to implement, and the narrower bus also significantly reduces the cost of the core by reducing its complexity.
So this top Big Navi card won't be as expensive to make as the 3080, with its premium GDDR6X chips and complex 320-bit controller.

-----------

Before anyone makes assumptions about the next AMD GPU out of thin air, without knowing what they're talking about... they should think first.

And something else...

We don't know what CPU and what card those gaming numbers AMD provided were obtained with. Remember that at 4K it will not matter much whether it was a 5900X/5950X or a 10850K/10900K.
Posted on Reply
#195
Vayra86
nguyenExplain how Userbenchmark is more trustworthy than Steam. Because its users are verified? Because people can't submit duplicate benchmarks?
Talking about digging graves :)

Out of hundreds of GPUs, the 2080 Ti being the 25th most popular is considered niche. See how flawed your reasoning is?

From dictionary.cambridge.org/dictionary/english/niche

Niche:
interesting to, aimed at, or affecting only a small number of people

Somehow you can't distinguish between a "small number" and a "small percentage", can you?
Glad I wasn't any part of your education.
A small percentage is always a small number, because it relates to a larger whole. This isn't about education anymore, is it? It never was. But keep twisting... you are defending the claim that a top SKU does not serve a niche. Good luck. Surely you won't look like a nutcase keeping that up.
INSTG8RWhile I agree that would be when Intel iGP was top dog! :roll:At least the 1060 makes for a sensible majority card. I have never once used userbench. I just run 3Dmark on driver changes for comparisons.
Userbenchmark gives the 2080 Ti even more share than Steam does, so it only supports the 1% statement ;) Nguyen is apparently so blinded by whatever that he fails to see that, even with that acknowledgement, he is still reading the statistics entirely wrong...

Oh well
Posted on Reply
#196
INSTG8R
Vanguard Beta Tester
Vayra86A small percentage is always a small number, because it relates to a larger whole. This isn't about education anymore, is it? It never was. But keep twisting... you are defending the claim that a top SKU does not serve a niche. Good luck. Surely you won't look like a nutcase keeping that up.



Userbenchmark gives the 2080 Ti even more share than Steam does, so it only supports the 1% statement ;) Nguyen is apparently so blinded by whatever that he fails to see that, even with that acknowledgement, he is still reading the statistics entirely wrong...

Oh well
Yeah, I only know of it in passing as a thing; I bought 3DMark years ago, so a Time Spy run is all I need to see if anything has changed between software, BIOS, or driver versions.
Posted on Reply
#197
sepheronx
I do not know why it's so hard for you guys to wait until reviews are out before passing judgement.
Posted on Reply
#198
ZoneDymo
sepheronxI do not know why it's so hard for you guys to wait till reviews are out to share judgement.
Humanity never fails to disappoint
Posted on Reply
#199
nguyen
Vayra86A small percentage is always a small number because it relates to a larger whole
"A billion dollars is a small amount of money, because it is less than 1% of what Bezos has made" :)
That is what your statement implies.
Does Bezos himself think $1B is a small amount of money? Definitely not, since he is of sound mind, after all.

Oh well, since I don't bully disabled kids, I'll just forgive all your insults; they gave me a good laugh, too.
Posted on Reply
#200
Maye88
Can you correct your benchmarks for the 3080? You used frame rates for the 3080 in BL3 on Ultra, not Badass...

I also think AMD purposefully chose from among its worst games. Historically, AMD has always underperformed on UE4 compared to NVIDIA.

Posted on Reply