Monday, November 2nd 2020

AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

AMD sent ripples through the industry at its late-October event launching the Radeon RX 6000 series RDNA2 "Big Navi" graphics cards, when it claimed that the top RX 6000 series parts compete with the very fastest GeForce "Ampere" RTX 30-series graphics cards, marking the company's return to the high-end graphics market. In its announcement press-deck, AMD had shown the $579 RX 6800 beating the RTX 2080 Ti (essentially the RTX 3070), the $649 RX 6800 XT trading blows with the $699 RTX 3080, and the top $999 RX 6900 XT performing in the same league as the $1,499 RTX 3090. Over the weekend, the company released even more benchmarks, with the RX 6000 series GPUs and their competition from NVIDIA tested by AMD on a platform powered by the Ryzen 9 5900X "Zen 3" 12-core processor.

AMD released its benchmark numbers as interactive bar graphs on its website. You can select from ten real-world games, two resolutions (1440p and 4K UHD), game settings presets, and even the 3D API for certain tests. Among the games are Battlefield V, Call of Duty: Modern Warfare (2019), Tom Clancy's The Division 2, Borderlands 3, DOOM Eternal, Forza Horizon 4, Gears 5, Resident Evil 3, Shadow of the Tomb Raider, and Wolfenstein: Youngblood. In several of these tests, the RX 6800 XT and RX 6900 XT are shown taking the fight to NVIDIA's high-end RTX 3080 and RTX 3090, while the RX 6800 is shown to be significantly faster than the RTX 2080 Ti (roughly RTX 3070 scores). The Ryzen 9 5900X itself is claimed to be a faster gaming processor than Intel's Core i9-10900K, and features a PCI-Express 4.0 interface for these next-gen GPUs. Find more results and the interactive graphs at the source link below.
Source: AMD Gaming Benchmarks

146 Comments on AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

#126
mtcn77
theoneandonlymrk
And GCN looked pretty effing capable until a few years after the last-gen consoles came out, around the Maxwell era, no?
To be honest, not really. I can recall the times Radeon technology presentations used to end with, "you can access me personally for the last 10% of custom hardware tuning, so I get to keep my job as well as yourself," back when it was still ATi.
They never worked perfectly out of the box.
#127
theoneandonlymrk
mtcn77
To be honest, not really. I can recall the times Radeon technology presentations used to end with, "you can access me personally for the last 10% of custom hardware tuning, so I get to keep my job as well as yourself," back when it was still ATi.
They never worked perfectly out of the box.
I think I disagree, and wtaf does ATi have to do with anything in this debate or the last console generation? It's fine to disagree, but try to stay relevant.


What the actual ffff, so I did something special over the last few AMD generations without even knowing it? Tune what?!

They didn't end like that in my country; source, please.
#128
moproblems99
theoneandonlymrk
So consider: is GPU PhysX big in games? What about CUDA, is that big in games? Because DirectCompute is, as is tessellation, and an RX 580 will meet at least the minimum specs of any game released since its birth.
Did Nvidia bring more performance at times? Yes, of course, but that doesn't preclude AMD having good support for their features.
And GCN looked pretty effing capable until a few years after the last-gen consoles came out, around the Maxwell era, no?
My point was that AMD had the last-gen consoles, and games are no more optimized for them now than they were before.

However, AMD now appears to have a winner so most of this is moot.
#129
theoneandonlymrk
moproblems99
My point was that AMD had the last-gen consoles, and games are no more optimized for them now than they were before.

However, AMD now appears to have a winner so most of this is moot.
So stop adding irrelevant bits; I got your main point and still disagree.
Every new game released worked here, and performance got optimised over a few months via drivers, and I can prove it via AMD driver release notes.
Can you provide any proof your opinion is right?
#130
mtcn77
theoneandonlymrk
They didn't end like that in my country; source, please.
:( such distrust over something so little...
#131
theoneandonlymrk
mtcn77
:( such distrust over something so little...
Distrust? I am fact-checked often, why not you?
And I have watched every GPU release since GPUs were a thing, and I don't recall AMD saying what you accuse them of saying every time, or, more importantly, even once.
#132
mtcn77
theoneandonlymrk
Distrust? I am fact-checked often, why not you?
And I have watched every GPU release since GPUs were a thing, and I don't recall AMD saying what you accuse them of saying every time, or, more importantly, even once.
It was a developer session. You are literally throwing me into the haystack with what you did there. I'm an astroturfer for the sake of the argument. It is your duty to prove anything...
#133
theoneandonlymrk
mtcn77
It was a developer session. You are literally throwing me into the haystack with what you did there. I'm an astroturfer for the sake of the argument. It is your duty to prove anything...
mtcn77
To be honest, not really. I can recall the times Radeon technology presentations used to end with, "you can access me personally for the last 10% of custom hardware tuning, so I get to keep my job as well as yourself," back when it was still ATi.
They never worked perfectly out of the box.
A developer session told you that you had to custom-tune your hardware to get 10%.
I'm laughing; I'll leave it at that.
#134
moproblems99
theoneandonlymrk
Can you provide any proof your opinion is right?
Yes, let me quote it:
theoneandonlymrk
performance got optimised over a few months via drivers, and I can prove it via AMD driver release notes
There you go. AMD did the optimizing via drivers. The devs did the same ol' shit they used to do and continue to do. I mean, shit, everybody is acting like I work for Nvidia when I have an all-AMD rig.
#136
mtcn77
theoneandonlymrk
A developer session told you that you had to custom-tune your hardware to get 10%.
I'm laughing; I'll leave it at that.
To get from 90% to 100%. It was a general session.
#137
theoneandonlymrk
moproblems99
Yes, let me quote it:
theoneandonlymrk
performance got optimised over a few months via drivers, and I can prove it via AMD driver release notes
There you go. AMD did the optimizing via drivers. The devs did the same ol' shit they used to do and continue to do. I mean, shit, everybody is acting like I work for Nvidia when I have an all-AMD rig.
It was you saying they didn't optimise; you seem to have swapped arguments.
lexluthermiester
That's because it was nonsense.
A lot of this talk is.
#138
lexluthermiester
theoneandonlymrk
A lot of this talk is.
Yeah, pretty much. Why can't people just get along? I find it refreshing that NVidia pulled out all the stops and yet AMD has caught up in just two GPU gen jumps. It's a great time to be in the tech/PC industry!
#139
mtcn77
I like it when exclusive brands pull out the openness card at the first sign of a setback.
I cannot shed any tears, sorry. This is a very hard-fought achievement. Every last bit of AMD went into making Radeon right after the RTG situation. Good thing it was stopped before more damage could be inflicted on the firm; RTG was losing money just about then.
#140
moproblems99
theoneandonlymrk
It was you saying they didn't optimise; you seem to have swapped arguments.
What are you on about? The discussion was clearly about developers not optimizing games for AMD's architecture. Did you read any of the other posts, or just assume that I am pissing on AMD?

Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.
#141
theoneandonlymrk
moproblems99
What are you on about? The discussion was clearly about developers not optimizing games for AMD's architecture. Did you read any of the other posts, or just assume that I am pissing on AMD?

Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.
I am aware, and AMD too has people it can put in place to help game and engine devs optimise their software.
I'm arguing they do.

Oh, and your insinuations, stick them you know where, yeah. I read your posts, and I still chose not to call you names or insinuate anything, yeah.
#142
mtcn77
moproblems99
Sometimes I wonder why I bother with threads that have anything to do with Intel, AMD, or nVidia.
In the past, there were shills that would attack you on the moderators' watch on OCN. It was a learning experience. I developed so many false-premise engagement vectors there. Now I stick to astroturfing. It is so elegant, people don't know what hit them. It is so good at confusing people from outside of their perspective that they don't know whether it is real, or whether to care. It literally leaves them searching for things.

Genuine or not, I cannot help but feel sympathy for the people sharing their concern and good wishes for the well-being of the red team. I'm impressed.
#143
medi01
moproblems99
@medi01, no idea what you just said.
You realize it is cruel towards team green, right?
Let me expand it:

EPIC, the company behind the major game development framework "Unreal Engine", when asked "why do Unreal Engine 4 games run so poorly on AMD," bluntly said "it was optimized for NVidia GPUs." The demo of the UE4 engine happened, as you would guess, on the fastest NVidia GPU, because that was all that mattered.

In 2020, EPIC demoed Unreal Engine 5.
The demo took place on... the PS5, a next-gen console with an RDNA2+Zen APU inside it.
Because NV is no longer as important as it was; in fact, it's AMD who is important.
Note that this was even before the RX 6000 series was rolled out into green faces.

And to make it even more embarrassing for team green, in a very impressive demo loaded with lighting effects, no Nvidia-style RT was used.


More to it: the RDNA2-based APUs in the upcoming consoles are mighty, beating nearly the entire PC market, with only 2080s/2080 Ti-level GPUs being faster than them.
#144
BorgOvermind
This perfectly explains it:

The battle will not be about the highest benchmark bar but about everything else.
The quality of the product, and the extra features that exist (or not) on one side or the other, will make the graphics cards of each side better suited for one thing or another.

Overall, I expect a general "equalization" of brute computing power now that both sides have had a lot of time to study what the other did and come up with something like it, plus a little more.

I'd like to see some mining scores in the meantime; they can be relevant for quite a few things.

@medi01

Most PC games I played between the start of 2018 and mid-2019 were UE-based.
I like that engine a lot and I hope it keeps up.
#145
BoboOOZ
BorgOvermind
The battle will not be about the highest benchmark bar but about everything else.
The quality of the product, and the extra features that exist (or not) on one side or the other, will make the graphics cards of each side better suited for one thing or another.
That's highly debatable; I personally will still buy based on price/performance. Of course, if Nvidia loses the performance crown, it's expected that they will try to move the fight to some other area where they can still win.

But I would argue that the most important feature in the next 6 months will be AVAILABILITY ;)
#146
EarthDog
BoboOOZ
But I would argue that the most important feature in the next 6 months will be AVAILABILITY ;)
GL getting an AIB AMD card before the end of the year. :)