Friday, October 9th 2020

AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs
AMD, in its October 8 online launch event for the Ryzen 5000 "Zen 3" processors, provided a teaser of the company's next flagship graphics card, slotted into the Radeon RX 6000 series. This particular SKU has been referred to by company CEO Lisa Su as "Big Navi," meaning it could be the top part of AMD's upcoming client GPU lineup. As part of the teaser, Su held up the reference-design card and provided three performance numbers for it, as tested on a machine powered by a Ryzen 9 5900X "Zen 3" processor. We compared these performance numbers, obtained at 4K UHD, with our own testing data for the same games, to see how the card compares to other current-generation cards in its class. Our testing data for one of the games is from the latest RTX 30-series reviews (find details of our test bed here). We obviously have a different CPU, since the 5900X is unreleased, but we use the highest presets in our testing.
With "Borderlands 3" at 4K, with "badass" performance preset and DirectX 12 renderer, AMD claims a frame-rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, with 70.3 FPS. It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use internal benchmark tools of games, and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review, and use the DirectX 11 renderer in our regular VGA reviews.With Gears 5, AMD claims performance of 73 FPS at 4K, with its DirectX 12 renderer, with the "Ultra" preset. This number ends up 16.24 percent faster than that of the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080 is 15.61 percent faster than the AMD card, at 84.4 FPS.
Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test bed information here). In that testing, we found the RTX 2080 Ti scores 77.9 FPS at Ultra settings, with RTX off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.
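For readers who want to reproduce the per-game comparisons, here is a minimal Python sketch using only the FPS figures quoted above (AMD's claims plus our own test-bed results); minor rounding-direction differences aside, the relative deltas are simple ratios:

```python
# Relative performance from the FPS figures quoted in this article.
# AMD's claimed Big Navi numbers vs. our RTX 2080 Ti / RTX 3080 results.
results = {
    "Borderlands 3":  {"big_navi": 61.0, "rtx_2080_ti": 41.8, "rtx_3080": 70.3},
    "Gears 5":        {"big_navi": 73.0, "rtx_2080_ti": 62.8, "rtx_3080": 84.4},
    "CoD: MW (2019)": {"big_navi": 88.0, "rtx_2080_ti": 77.9, "rtx_3080": None},
}

for game, fps in results.items():
    navi, ti, ga102 = fps["big_navi"], fps["rtx_2080_ti"], fps["rtx_3080"]
    line = f"{game}: Big Navi +{(navi / ti - 1) * 100:.2f}% vs RTX 2080 Ti"
    if ga102 is not None:  # no RTX 3080 number for CoD in our launch test
        line += f", RTX 3080 +{(ga102 / navi - 1) * 100:.2f}% vs Big Navi"
    print(line)
```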
We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are best optimized for its GPUs, but it lends plausibility to the theory that Big Navi may end up comparable to the RTX 2080 Ti, and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of 21% higher frame rates than the RTX 2080 Ti across these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs of the RX 5700 XT, so the claims roughly line up.
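A note on how that 21% figure can be reproduced: it matches the geometric mean of the three per-game percentage gains (45.9, 16.24, and 12.96 percent) rather than of the raw FPS ratios; a short sketch of the arithmetic, under that assumption:

```python
from math import prod

# Per-game percentage gains of Big Navi over the RTX 2080 Ti, from above.
gains = [45.9, 16.24, 12.96]

# Geometric mean of the percentage deltas (matches the ~21% quoted above).
print(f"{prod(gains) ** (1 / len(gains)):.1f}%")  # -> 21.3%

# The other common convention -- geometric mean of the FPS ratios --
# would instead give roughly +24%:
ratios = [1 + g / 100 for g in gains]
print(f"{(prod(ratios) ** (1 / len(ratios)) - 1) * 100:.1f}%")  # -> 24.2%
```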
With "Borderlands 3" at 4K, with "badass" performance preset and DirectX 12 renderer, AMD claims a frame-rate of 61 FPS. We tested the game with its DirectX 12 renderer in our dedicated performance review (test bed details here). AMD's claimed performance ends up 45.9 percent higher than that of the GeForce RTX 2080 Ti as tested by us, which yields 41.8 FPS on our test bed. The RTX 3080 ends up 15.24 percent faster than Big Navi, with 70.3 FPS. It's important to note here that AMD may be using a different/lighter test scene than us, since we don't use internal benchmark tools of games, and design our own test scenes. It's also important to note that we tested Borderlands 3 with DirectX 12 only in the game's launch-day review, and use the DirectX 11 renderer in our regular VGA reviews.With Gears 5, AMD claims performance of 73 FPS at 4K, with its DirectX 12 renderer, with the "Ultra" preset. This number ends up 16.24 percent faster than that of the RTX 2080 Ti, which scores 62.8 FPS on our test bed. The RTX 3080 is 15.61 percent faster than the AMD card, at 84.4 FPS.
Call of Duty: Modern Warfare was never added to our VGA review test selection, but we tested the game separately at launch (find its test bed information here). In this testing, we found the RTX 2080 Ti to score 77.9 FPS at Ultra Settings, with RTX-off. In comparison, AMD claims Big Navi scores 88 FPS, making it 12.96 percent faster.
We know this is a very coarse and unscientific way to compare AMD's numbers to ours, and AMD has probably cherry-picked games that are most optimized to its GPUs, but it lends plausibility to a theory that Big Navi may end up comparable to the RTX 2080 Ti, and trade blows with the upcoming RTX 3070, which NVIDIA claims outperforms the RTX 2080 Ti. The RX 6000 "Big Navi" ends up with a geometric mean of 21% higher frame-rates than the RTX 2080 Ti in these three tests, which would imply almost double the performance of the RX 5700 XT. Big Navi is rumored to feature double the CUs as the the RX 5700 XT, so the claims somewhat line up.
262 Comments on AMD Big Navi Performance Claims Compared to TPU's Own Benchmark Numbers of Comparable GPUs
Whether we agree or disagree, Lisa Su and Co. have shown that they can deliver on their promises, and I for one think AMD would have to be morons to hold a separate event just to release a "flop" of a GPU. It's clear to me and a select few that they've got something special on their hands and are being real coy about it. Do you think the teaser at the end wasn't on purpose? They've been doing it all along. Look at the rumor mill when it came to Zen 3 vs. RDNA2. It was clear to me what was prioritized and where. While Zen 3 is a big deal, RDNA2 is where the statement needed to be made.
IIRC no one here mentioned that these cards are cheaper to make on 7 nm overall vs. Ampere. These cards will be very efficient and priced well. AMD is also doing what very few companies have done: being honest. These benchmarks clearly show AMD used games they struggle with (both CPU and GPU). Seems to me like they are showing worst-case scenarios to gauge the internet's reaction, and it's working. The GPU event was obviously last for a reason: they have the fastest CPUs in town and thus needed the GPUs to be tested with the best. This also shows off their ecosystem, and lastly it surprises all the naysayers who can't seem to understand that they can compete. RDNA1 matched Turing's IPC, and Nvidia regressed in that department with Ampere.
A "halo" product ain't a damn 3070 competitor.
Why is that so hard for fanboys to understand?
Vega - Flopped
Radeon VII - Flopped
AMD did have special events announcing all of those GPUs, mind you. The Radeon VII in particular flopped so hard it was dead on arrival.
Well, considering Navi10 is slightly faster than TU106 (the 2070), it is already a remarkable improvement if Navi21 comes out faster than GA104; I will give you that. GA102, however, is a different beast.
And Navi21 is cheaper to produce than GA102? Do you know that AMD pays 9k USD per 7 nm wafer at TSMC, while Nvidia pays only 3k USD per 8N wafer at Samsung? GA102 is dirt cheap to produce; Nvidia has a 60%+ profit margin after all (last quarter it was 66%).
The best Navi21 silicon will be reserved for prosumer cards, where AMD makes more profit. So expect a cut-down Navi21 competing with and beating GA104, which is still good.
While some people here can't get their heads around this, the majority don't see a reason to ever upgrade beyond a GTX 1060, or something equivalent from AMD, until it's dead. The lack of good games makes PC gaming so unattractive, and the rare good ones are so well made they don't need an RTX 3080 or whatever AMD makes. Just look at the recent Star Wars: Squadrons, or earlier ones like Jedi: Fallen Order; they run fine on a GTX 1060 and its AMD equivalent. Big publishers like EA or Blizzard don't care about GPU sales, and it really hurts Nvidia when they put something like that on the market. Imagine gaming on a four-year-old card with all settings turned up and the graphics are top notch :laugh:
An overclocked 3080 is exactly 200% of a 5700 XT.
The 6900 XT is double a 5700 XT, and then some.
This is because I have already seen them all, and none of them holds any water whatsoever. You can spare yourself the trouble. And again, you can either learn a thing or two here or you can live in fantasy land, but again I'd say: do it elsewhere. I'll torch any nonsense I see, and so far you've been full of it.
Consider carefully what you might post next. This subject has been debunked a hundred times already ;) I'm not going there again.
Not that there were few 2080 Ti buyers either, with the 2080 Ti registering almost 1% in the Steam hardware survey.
There are many more rich kids out there than you might think :laugh:
The Fury X did not flop... loud-mouth AIBs gave word to Nvidia, which allowed Nvidia to drop its pricing and rain on the Fury X's parade, so your argument there is a wash.
Funny how you forget the 290X vs. Titan fiasco... I won't mention who took that L.
Vega didn't flop, it was just late... and judging by AMD's predicament at the time, Zen was the more important product (now we're about to reap the reward for Vega's sacrifice). It's also funny that Vega is as good as its Nvidia counterpart now. But if you want to call Vega an L, I'll give you that.
The Radeon VII was never meant for the gaming segment; Navi had a bug and was going to be delayed, and they needed a "tide me over" card. And if the Radeon VII was trash, then so was the 2080... at least the Radeon VII could be used for compute.
RDNA1 (the 5700 XT) was a good bit faster, enough that Nvidia had to do a refresh. It did all this while being substantially smaller.
Now to the present... If RDNA2 is so bad, why did Nvidia rush a launch? (No stock till 2021.) Why did they push out those space heaters? The xx80 chip is now back on the big die. Hmm, I wonder; it seems Nvidia knows more about AMD than you or I, and they clearly jumped the gun. The 3090 is just two shaders shy of being the full die, and it is a mere 10% faster than the 3080, all while costing basically double or more. That's doom and gloom in my book. If AMD wants the crown, it's right there for the taking. If I were them, I'd look to take both the CPU and GPU crowns in the same month. Why did Nvidia delay the 3070, then give that BS excuse? They wanted to one-up AMD the day after. If you think AMD wasn't ready this go-around, you're delusional. Too many irrational decisions by Nvidia this go-around, and it says one thing to me: AMD has arrived.
Overall, IMHO, if you've got last generation's video card, you do not need this generation's video card. I'll even go further and say that if you've got any video card from 2016 onwards, you do not need a new one, because the tech advances you need to play your favorite games will not amount to much of an increase.
And contrary to what these tech talking heads want to say, the world does not revolve around 4K. It is 1080p AND LESS.
So enjoy what you have, buy only what you absolutely need, and save some money in the end.
500 W is the same as my small office eco electric heater. The heat a 500 W PC produces can't vanish into thin air, of course; it just gets soaked up by your cooler fins until it's spat out into your room.
I'm not for any price increases, but the double standard being applied is hilarious: $50 makes AMD greedy fucks, yet $500 apparently doesn't even register for Nvidia.
GA102 is not dirt cheap to produce when the chips are so damn big. Come on now, lol. You are assuming 100% yield and that Nvidia is fitting more chips on the wafer than AMD. Come on now; no matter how you cook it, AMD will make more chips on a smaller node and be cheaper to make overall (a rough dies-per-wafer sketch follows below).
Oh, by the way, Nvidia is not paying 3k per wafer; it's more like 6k. So yes, they are making a profit, but seemingly not so much on the FE models, since they are now only selling those through Best Buy in the U.S., and the rumors are coming true that they don't want to make too many of them.
itigic.com/why-did-nvidia-rtx-3000-arrive-in-8nm-with-samsung/
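To put rough numbers on the dies-per-wafer argument above, here is a minimal Python sketch using the textbook dies-per-wafer approximation and the wafer prices quoted in this thread. The GA102 die size is the published ~628 mm²; the Navi 21 size (~520 mm²) is the rumor circulating at the time; yield is deliberately ignored, so these are die candidates per wafer, not good dies, and the output is illustrative only:

```python
from math import pi, sqrt

WAFER_DIAMETER_MM = 300  # standard 300 mm wafers

def dies_per_wafer(die_area_mm2: float) -> int:
    """Textbook approximation: usable wafer area minus edge loss."""
    d = WAFER_DIAMETER_MM
    return int(pi * (d / 2) ** 2 / die_area_mm2
               - pi * d / sqrt(2 * die_area_mm2))

# Assumed inputs: die sizes plus the wafer prices quoted in this thread.
chips = {
    "GA102 (Samsung 8N, $3k wafer)": (628.4, 3000),
    "GA102 (Samsung 8N, $6k wafer)": (628.4, 6000),
    "Navi 21 (TSMC 7nm, $9k wafer)": (520.0, 9000),  # rumored die size
}

for name, (area, wafer_cost) in chips.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} candidates, ~${wafer_cost / n:.0f} each (100% yield)")
```

Under these assumptions a wafer yields roughly 85 GA102 candidates versus roughly 106 Navi 21 candidates, so the per-die cost comparison ends up hinging almost entirely on which wafer price you believe.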
Vega and the Radeon VII are based on the Vega architecture, which is clearly focused on professional and compute workloads. They have a large part of the die dedicated to the HBCC, which does nothing for gaming. The only upside to the VII cards is that they still sell for $700, because they do very well in professional workloads.
In essence "Big Navi" will be AMD's return to high end gaming graphics cards in a LONG time. I suggest you read up on Navi before saying it has no new features: www.techpowerup.com/256660/amd-e3-2019-tech-day-all-slide-decks-ryzen-3000-zen-2-radeon-rx-5000-navi-etc
Among the features is localized L1 cache sharing: shaders in a group can share content among their L1 caches. This not only increases effective bandwidth and cache size (by removing the need for duplicated data) but also reduces latency and cache misses. I would not be surprised if RDNA2 enables sharing across shader groups; anything that spares the GPU from having to access the L2 cache helps, since L2 is significantly slower to access than L1.
Given the boost in performance Navi has over prior AMD GPUs, and that it brought AMD on par with Nvidia efficiency-wise, I would certainly not call it a flop. Just because AMD only released the mid-range certainly doesn't make it bad.
Also, what is your source for marketed features that had to be disabled? That was only Vega, as far as I'm aware, and I've looked over all the marketing slides AMD has provided.
Primitive shaders are also in hardware in RDNA1, same as Vega, and were never enabled. Imagine developing a whole new stage of the graphics pipeline and never using it.
Either way, you'd have to be living under a rock to miss the mind games going on in the industry right now. As such, you can choose to believe that Lisa Su just chucked her hand in and said "this is all we've got" at an event that was about a CPU launch, or you can be open to the possibility that AMD is playing the game and leaving something in the tank for the main GPU launch event in a few weeks.
If they had nothing to challenge Nvidia, I believe they would have teased a 'wait and see' line. Of course, there is also the possibility that AMD is softening the blow.
In my job, which involves being very analytical, I would fail dismally every time if I made rash decisions without considering all possibilities. Then again, I trust my gut instinct.
I'm not saying you're wrong in terms of the eventual outcome, but I am saying you're wrong to mock and be definitive in your opinion. Time will tell.