Friday, October 9th 2020

AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

AMD, in its October 8 online launch event for the Ryzen 5000 "Zen 3" processors, provided a teaser of the company's next flagship graphics card, slotted in the Radeon RX 6000 series. This particular SKU has been referred to by company CEO Lisa Su as "Big Navi," meaning it could be the top part of AMD's upcoming client GPU lineup. As part of the teaser, Su held up the reference-design card and provided three performance numbers for it, as tested on a machine powered by a Ryzen 9 5900X "Zen 3" processor. We compared these performance numbers, obtained at 4K UHD, with our own testing data for the same games, to see how the card stacks up against other current-gen cards in its class. Our testing data for one of the games is from the latest RTX 30-series reviews; find details of our test bed here. We obviously test with a different CPU, since the 5900X is unreleased, but we use the highest presets in our testing.

With "Borderlands 3" at 4K, with "badass" performance preset and DirectX 12 renderer, AMD claims a frame-rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, with 70.3 FPS. It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use internal benchmark tools of games, and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review, and use the DirectX 11 renderer in our regular VGA reviews.
In Gears 5, AMD claims 73 FPS at 4K with the game's DirectX 12 renderer and the "Ultra" preset. This number ends up 16.24 percent higher than that of the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080, at 84.4 FPS, is 15.61 percent faster than the AMD card.

Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test bed information here). In that testing, we found the RTX 2080 Ti to score 77.9 FPS at Ultra settings with RTX off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.

We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are best optimized for its GPUs, but it lends plausibility to the theory that Big Navi may end up comparable to the RTX 2080 Ti and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of 21% higher frame rates than the RTX 2080 Ti across these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs of the RX 5700 XT, so the claims somewhat line up.
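For readers who want to check the arithmetic, the sketch below (Python, not our actual benchmarking tooling) reproduces the per-game percentages and the roughly 21% figure from the frame rates quoted above; the only assumption is that the geometric mean is taken over the three percentage gains rather than the raw FPS ratios.

```python
# Minimal sketch: reproduce the percentage figures quoted in this article
# from AMD's claimed 4K numbers and our RTX 2080 Ti results.
from statistics import geometric_mean

# (game, AMD "Big Navi" claim in FPS, RTX 2080 Ti on our test bed in FPS)
results = [
    ("Borderlands 3",                61.0, 41.8),
    ("Gears 5",                      73.0, 62.8),
    ("Call of Duty: Modern Warfare", 88.0, 77.9),
]

gains = []
for game, navi_fps, ti_fps in results:
    gain = (navi_fps / ti_fps - 1) * 100  # percent advantage over the RTX 2080 Ti
    gains.append(gain)
    print(f"{game}: +{gain:.2f}% vs RTX 2080 Ti")

# Geometric mean of the three percentage gains lands at ~21%,
# matching the figure cited above.
print(f"Geometric mean of gains: {geometric_mean(gains):.1f}%")
```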

262 Comments on AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs

#101
Vayra86
efikkanThe fine wine argument again, that's a nonsensical argument, and more than a little misleading.
Buy an inferior product today, which may be superior sometime later, but it never happens.
Nvidia's drivers improve performance just as much as AMD's over time.


Even if they did, I don't believe misleading competitors or consumers is a good thing. We need healthy competition and fair play.


If the performance lands between RTX 3070 and RTX 3080 and the price does too, then it's fair. But if it's less performant at the same price, then people shouldn't buy it.
Even if "big Navi" happens to have more OC headroom, that shouldn't be a general sales argument, most gamers want stable computers for gaming, not overclocking for setting records.


Aah, the eternal "future proofing" argument; buy inferior stuff with superior "specs" that will win over time. :rolleyes:
Our best guidance is always real world performance, not anecdotes about what will perform better years from now.
You're forgetting that even at 720 they will still undercut Nvidia on MSRP for any AIB version really. Like you say. If the price is right, as always. Thing is, I'm not liking the balance the 3080 has with core / VRAM cap. You're at liberty to think otherwise, but in a similar way, the 680 was in a remarkably similar place. I didn't pull that comparison out for nothing. We're at the eve of a new console gen here and those also carry somewhat higher VRAM capacities than Nvidia's finest, and I think it needs no discussion that the 2 GB Kepler/refresh cards were obsolete faster than many would have liked. Faster than I liked, personally, at least - I bought a 770 and replaced it with a 780 Ti just to get the 3 GB of VRAM, as 2 GB was falling short even at 1080p for quite a few titles. Only one gen later the VRAM caps doubled, and for good reason - the consoles had more too.

Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.

Ironically, the best handle on future proofing is gained by looking back. History repeats.
Posted on Reply
#102
Luminescent
You know why AMD ages ("fine wine"...) better than Nvidia? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
New games will widen the gap between old and new more and more.
Posted on Reply
#103
Vayra86
LuminescentYou know why AMD ages ("fine wine"...) better than Nvidia? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
New games will widen the gap between old and new more and more.
Wrong, both competitors keep adding support for new titles on all active GPU families.
Posted on Reply
#104
efikkan
Vayra86You're forgetting that even at 720 they will still undercut Nvidia on MSRP for any AIB version really.
There are plenty of solid AiB models at the same MSRP as Nvidia's founders edition ($700).
Vayra86Like you say. If the price is right, as always. Thing is, I'm not liking the balance the 3080 has with core / VRAM cap. You're at liberty to think otherwise, but in a similar way, the 680 was in a remarkably similar place.
And I always say the truth is in good (real world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs etc. feels right.
As I've told you before, the fact remains that increasing VRAM usage nearly always requires both more bandwidth and computational performance in order to utilize it, which is why you usually will hit other bottlenecks long before VRAM.
Vayra86We're at the eve of a new console gen here and those also carry somewhat higher VRAM capacities than Nvidia's finest
Not really, that's combined system RAM and VRAM, so it's not a fair comparison.
Vayra86Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.
No objection there. ~250W seems to be the spot where cooling becomes a hassle.
LuminescentYou know why AMD ages ("fine wine"...) better than Nvidia? Because Nvidia stops optimizing drivers for older cards and AMD doesn't.
They can afford this tactic because they rule the desktop market.
Exactly how many years do you have to wait then?
The fact is Nvidia retains legacy support, AMD drops it.
AMD can't even get their current driver support right, how can they be better at legacy support?
LuminescentNew games will widen the gap between old and new more and more.
Yeah right.
I'm sure those old Radeon 200/300 will finally be unleashed any day now. :rolleyes:
Posted on Reply
#105
Zach_01
Vayra86Another thing of note is TDP. I'm not a huge fan of anything north of 220-250W. Air will struggle with that.
Actually, I'm OK with a 300-320 W draw, as I owned an R9 390X from 2015 to 2018. But I'm also running case-less, so that helps.
Posted on Reply
#106
Luminescent
Vayra86Wrong, both competitors keep adding support for new titles on all active GPU families.
Yeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090 and was a perfect candidate for a "driver fix"; I think this is the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in older games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks: they will launch a mediocre GPU with bad drivers at first that probably consumes a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very, very few "enthusiasts" who might buy these high-end cards; that's why the first drivers will be so bad you'll be lucky if you boot into Windows. Why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not really to compete.
Posted on Reply
#107
TheoneandonlyMrK
LuminescentYeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090 and was a perfect candidate for a "driver fix"; I think this is the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in older games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks: they will launch a mediocre GPU with bad drivers at first that probably consumes a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very, very few "enthusiasts" who might buy these high-end cards; that's why the first drivers will be so bad you'll be lucky if you boot into Windows. Why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not really to compete.
Okay, so, an interesting way to spend a Friday. I disagree with some of that.
No, sorry, all of it.

Opaquifying the B's for mortals, good work.
Posted on Reply
#108
evernessince
I'm not sure this is cherry picking. Many of the games chosen favor Nvidia, not AMD. The same could be said of AMD's recent Zen 3 announcement; Far Cry Primal, for example, heavily favors Intel, yet AMD chose to use it.

Of course I am going to wait for 3rd party benchmarks but it's nice to see a company that isn't trying to insult your intelligence with "2X the Performance!!!!! *At 8K with RTX enabled vs a 2080"
Posted on Reply
#109
Space Lynx
Astronaut
I just checked on AMD's bicycles. They are sold out as well, lmao. wow. just wow.
Posted on Reply
#110
Kaleid
Some of you are really overstating the importance of having the fastest GPU. 3080 is so expensive that it won't matter to most gamers. Ever..
I'll go for bang for buck as always
Posted on Reply
#111
nguyen
KaleidSome of you are really overstating the importance of having the fastest GPU. 3080 is so expensive that it won't matter to most gamers. Ever..
I'll go for bang for buck as always
Why do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.
Posted on Reply
#112
sepheronx
nguyenWhy do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.
Because the RX 5700 was a good bang for your buck GPU as well.
Posted on Reply
#113
nguyen
sepheronxBecause the RX 5700 was a good bang for your buck GPU as well.
Saying you're only going for bang for buck means you would choose the highest bang-for-buck GPU, right?

But if you want more performance while still getting good bang for buck, then the 3080 is just as good a value as the 5700; they come out to around the same FPS per dollar (provided you can buy a 3080 right now :D).

So had @Kaleid said "Most people can't afford the 3080", I wouldn't have any issue with that, since Ampere is rolling out at every price point just like Turing anyway.
Posted on Reply
#114
sepheronx
nguyenSaying you're only going for bang for buck means you would choose the highest bang-for-buck GPU, right?

But if you want more performance while still getting good bang for buck, then the 3080 is just as good a value as the 5700; they come out to around the same FPS per dollar (provided you can buy a 3080 right now :D).

So had @Kaleid said "Most people can't afford the 3080", I wouldn't have any issue with that, since Ampere is rolling out at every price point just like Turing anyway.
Not even close to the same, especially depending on location and prices. My location doesn't have a 3080 below $1,000 CAD. And since the two cards are from different generations, it isn't worth comparing them, since the RX 5700 was released to compete around the RTX 2060.

Better to wait and see the RDNA 2 results to compare bang for your buck.
Posted on Reply
#115
Blueberries
KaleidSome of you are really overstating the importance of having the fastest GPU. 3080 is so expensive that it won't matter to most gamers. Ever..
I'll go for bang for buck as always
So where do they cut costs? Performance inherently improves price/performance. They can't just sell cards at any price they want and still earn a profit; there has to be some engineering improvement to undercut NVIDIA if the performance is similar, and even more so if it's lower.

The cards are in NVIDIA's hands until they release the 2060.
Posted on Reply
#116
GoldenX
Could it be? A decent Radeon release after so long?
Nah, they will find a way to screw it up, and it better not be disabling ray tracing due to "reasons".
Posted on Reply
#117
nguyen
sepheronxNot even close to the same, especially depending on location and prices. My location doesn't have a 3080 below $1,000 CAD. And since the two cards are from different generations, it isn't worth comparing them, since the RX 5700 was released to compete around the RTX 2060.

Better to wait and see the RDNA 2 results to compare bang for your buck.
Bang for buck is only an excuse for lower-performing products. Zen CPUs have gone from being the best bang for buck to being premium CPUs now. If AMD had the best GPU, they would price it accordingly. At the end of the day, nobody wants to get paid minimum wage :laugh:.

I reckon RDNA2 had better have 30% more perf/dollar than Ampere in rasterization; only then does it make sense to overlook its lack of finesse in other areas (RT/DLSS/HW encoding/drivers/Reflex).
Posted on Reply
#118
Kaleid
nguyenWhy do you have an RX 5700 when the RX 580/570 have way better bang for buck? Asking for some consistency in your reasoning here.
I'm also...obviously aiming for a certain performance at a certain resolution.
Besides, I bought the 5700 used, saving 100 USD in doing so, so the 580 wouldn't have any chance when it comes to bang for the buck.
And since the 3080 has been inserted into the discussion as also providing a lot for the money: sure, but since I'm at 1440p, its performance would be utterly wasted on my monitor.

AMD raised the 5000 series CPU prices but I'm quite certain that very few will expect AMD to do the same with their new GPUs.

I'll change GPU next year, but not until I get to see MS direct storage in action.
Posted on Reply
#119
Shou Miko
Dristunwhat? They didn't tout it as such - in their own slides.
That's really where the card started to shine after it was released; I tried it at both 1440p and 4K compared to my old GTX 1080 Ti.
Posted on Reply
#120
cueman
Hmm, when you get past AMD's cherry-picked games, and the help from the Ryzen 5900 CPU and 16 GB of memory (especially at 4K), it looks clear that the RTX 3080 is 20-25% faster than Big Navi.

So imagine choosing a 20 GB RTX 3080 and using the same Ryzen 5900 CPU for the test... no doubt.

But sure, it seems clear AMD can't work any miracles with what it's building: still old parts and the usual updates. I mean, the RX 6900 XT is basically 2x RX 5700 XT in CrossFire mode, so the RX 6900 XT can't deliver more performance than that, and AMD's claimed 300 W TDP for that performance is a joke. I say it will draw over 350 W, and that's if it even reaches 80% of the RTX 3080's speed.

Ray tracing speed is also interesting, and in that category AMD is in its infancy. RT support also eats more juice... 25 W?

We can imagine, or rather wait for this TechPowerUp review:

RTX 3080 FE 10GB/20GB + Ryzen 5900 CPU vs. RX 6900 XT 16GB + Ryzen 5900 CPU

I can bet on a 25% win for the RTX 3080, and add at least a 20% bigger difference when the RTX 3090 with a Ryzen 5900 is put against the RX 6900 XT.

I think the higher-end RTX 3000 GPUs need a powerful CPU as a partner: for now a Ryzen 5900 or so, later Intel's 10 nm and of course 7 nm parts.

Hmm, the RTX 3070/Ti against the RX 6900 XT... the RTX 3070 is certainly cheaper and might fight hard against it. Lose 10% and be $200 cheaper?

Let's see, waiting for the tests... only a few weeks!

It would also be nice to get Intel's Xe GPU on the battle line, then we'd have three GPUs in the soap opera!
Posted on Reply
#121
Chomiq
lynx29I just checked on AMD's bicycles. They are sold out as well, lmao. wow. just wow.
Well here's hoping AMD gpus aren't as shitty as those bikes.
Posted on Reply
#122
300BaudBob
LuminescentI think Nvidia estimated correctly what AMD will launch based on what they did with the consoles and they pushed the hell out of that silicon reaching a new record of GPU power consumption.
If you now build a computer with a 10900k and RTX 3080 you can skip buying a heater in the winter, these two can actually heat up the room :).
Stick the water cooler radiators out the window near heat-loving plants; or use a thermocouple to convert the heat to run fans; or use it to boost cold water temps for your water heater... all sorts of fun things to do with that heat :)
Posted on Reply
#123
INSTG8R
Vanguard Beta Tester
While I have nothing to say about the numbers, I can say the BL3 presentation was using the built-in benchmark; I recognized it.
Posted on Reply
#124
Nephilim666
cuemanIt would also be nice to get Intel's Xe GPU on the battle line, then we'd have three GPUs in the soap opera!
Sums up your post well. Nonsense.
Posted on Reply
#125
Vayra86
LuminescentYeah, but the new GPU gets the better treatment. Just look at Horizon Zero Dawn, which launched a bit before the RTX 3080/3090 and was a perfect candidate for a "driver fix"; I think this is the title where the RTX 3080 distances itself the most from the 2080 Ti, not so much in older games.
Driver tricks will make the RTX 3000 series more and more appealing in new games.
I agree, efikkan, AMD sucks: they will launch a mediocre GPU with bad drivers at first that probably consumes a lot of power. But they won't handicap a good card to force you to buy a new generation, because they already suck and don't care much about this market. They care about the millions of orders from Microsoft and Sony far more than the very, very few "enthusiasts" who might buy these high-end cards; that's why the first drivers will be so bad you'll be lucky if you boot into Windows. Why bother, for practically no money compared to what they sell to consoles?
They are here to piss off Nvidia a bit, but not really to compete.
No. Just... no, sorry. You can leave this drivel elsewhere.
efikkanThere are plenty of solid AiB models at the same MSRP as Nvidia's founders edition ($700).


And I always say the truth is in good (real world) benchmarking, not in anecdotes about how many cores, GBs, ROPs, TMUs etc. feels right.
As I've told you before, the fact remains that increasing VRAM usage nearly always requires both more bandwidth and computational performance in order to utilize it, which is why you usually will hit other bottlenecks long before VRAM.


Not really, that's combined system RAM and VRAM, so it's not a fair comparison.


No objection there. ~250W seems to be the spot where cooling becomes a hassle.


Exactly how many years do you have to wait then?
The fact is Nvidia retains legacy support, AMD drops it.
AMD can't even get their current driver support right, how can they be better at legacy support?


Yeah right.
I'm sure those old Radeon 200/300 will finally be unleashed any day now. :rolleyes:
You can keep repeating the blanket statements, but I'm referring specifically to the balance of this particular 3080. It's a typical Nvidia move to place it like this, and after the 670/680/770 and numerous other examples I'm pretty capable of predictions in this regard.

The GPU has shit balance, like it or not, and $700 isn't exactly a steal for such a product.
Posted on Reply