Monday, November 2nd 2020

AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

AMD sent ripples through the industry with its late-October event launching the Radeon RX 6000 series RDNA2 "Big Navi" graphics cards, when it claimed that the top RX 6000 series parts compete with the very fastest GeForce "Ampere" RTX 30-series graphics cards, marking the company's return to the high-end graphics market. In its announcement press deck, AMD showed the $579 RX 6800 beating the RTX 2080 Ti (essentially the RTX 3070), the $649 RX 6800 XT trading blows with the $699 RTX 3080, and the top $999 RX 6900 XT performing in the same league as the $1,499 RTX 3090. Over the weekend, the company released even more benchmarks, with the RX 6000 series GPUs and their competition from NVIDIA tested by AMD on a platform powered by the Ryzen 9 5900X "Zen 3" 12-core processor.

AMD released its benchmark numbers as interactive bar graphs on its website. You can select from ten real-world games, two resolutions (1440p and 4K UHD), game settings presets, and even the 3D API for certain tests. Among the games are Battlefield V, Call of Duty Modern Warfare (2019), Tom Clancy's The Division 2, Borderlands 3, DOOM Eternal, Forza Horizon 4, Gears 5, Resident Evil 3, Shadow of the Tomb Raider, and Wolfenstein Youngblood. In several of these tests, the RX 6800 XT and RX 6900 XT are shown taking the fight to NVIDIA's high-end RTX 3080 and RTX 3090, while the RX 6800 is shown to be significantly faster than the RTX 2080 Ti (roughly RTX 3070 scores). The Ryzen 9 5900X itself is claimed to be a faster gaming processor than Intel's Core i9-10900K, and features a PCI-Express 4.0 interface for these next-gen GPUs. Find more results and the interactive graphs in the source link below.
Source: AMD Gaming Benchmarks

147 Comments on AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

#26
oldtimenoob
The Frostbite engine always seems to favour AMD GPUs, even the RX 5700s.
Posted on Reply
#27
Vya Domus
nguyen
It means that to get >60fps in a single player game with your GPU, you are mostly playing with Low settings at 4K.
Or you never actually play any games and just decided that everyone needs high FPS.
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single player games? There is absolutely no logic to that. You think everyone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single player one suddenly they no longer need more than precisely 60? How the hell does that work, truly mind boggling.
Posted on Reply
#28
nguyen
Vya Domus
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single player games? There is absolutely no logic to that. You think someone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single player one suddenly they no longer need more than precisely 60? How the hell does that work, truly mind boggling.
60FPS is enough to enjoy a single player game at the best visual fidelity you can get.
I bought the very first 144hz 1440p display, but I played The Witcher 3 at 60fps with the best visuals; no point in downgrading visuals just to get >60FPS.
Same with Metro Exodus and Control, why would I sacrifice visuals when I have ~60FPS already?

Now switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?
Posted on Reply
#29
Vya Domus
nguyen
no point in downgrading visuals just to get >60FPS.
Still can't figure out that this is a purely subjective conclusion?
Posted on Reply
#30
nguyen
Vya Domus
Still can't figure out that this is a purely subjective conclusion?
Tell me which editorial would recommend downgrading visuals to get >60fps? You?
Because I can give you many editorials that target 60FPS gaming
Posted on Reply
#31
HaKN !
What is it with people and ray tracing suddenly? Since it was announced, and until AMD showed the new cards, NOBODY was talking about ray tracing, and now?

Just be glad for the fricking competition, it's a win for all of us customers. Nvidia and AMD really don't care about us, they just wanna make as much money as possible.
Posted on Reply
#32
mystera
nguyen
How about AMD release COD Modern Warfare, BF5, SOTR with DXR benchmark numbers then? Make it easier to gauge RX6000 RT capability.
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is nVidia's proprietary implementation of Microsoft's DirectX raytracing. How (and why) do you expect AMD hardware to perform better than nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with ®RTX support alongside the console version of raytracing.

And based on how well AMD's raytracing looks and performs, nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, nvidia will have to invest more than it's worth on hardware and software (game dev partners) development, and given nvidia's new focus on enterprise, it may be a hard sell (to investors).
Posted on Reply
#33
Vya Domus
nguyen
Tell me which editorial would recommend downgrading visuals to get >60fps? You?
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60hz, right?
Posted on Reply
#34
yoyo2004
nguyen
Now switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?
nguyen
Tell me which editorial would recommend downgrading visuals to get >60fps? You?
I am honestly very confused as to what it is you want or are trying to imply.


Different players have different preferences when gaming.
Posted on Reply
#35
TumbleGeorge
There is real RT (MS DXR) and Nvidia RT. Real RT will be in 100% of games; Nvidia RT in no more than 10% of that 100%. I think there will be no games that exclusively support Nvidia RT.
Posted on Reply
#36
arbiter
mystera
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???

The RTX you refer to is nVidia's proprietary implementation of Microsoft's DirectX raytracing. How (and why) do you expect AMD hardware to perform better than nvidia hardware in ®RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of raytracing, and only "some" of them will ship with ®RTX support alongside the console version of raytracing.

And based on how well AMD's raytracing looks and performs, nvidia's ®RTX may (in time) become a niche feature for select sponsored games... To keep ®RTX relevant, nvidia will have to invest more than it's worth on hardware and software (game dev partners) development, and given nvidia's new focus on enterprise, it may be a hard sell (to investors).
This sounds like a typical AMD excuse for why their card sucks at what was/is a standard.
Posted on Reply
#37
ZoneDymo
Dristun
Good day everyone, I'm the person genuinely excited about raytracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn videocard. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man it looks amazing!
Hey, I'm also excited about 6800XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?
I don't agree, I think the reflections in Watch Dogs look pretty dang impressive. Sad it's all so heavy, so a true ray-traced future is still several gens out for sure, but look at Digital Foundry's latest vid on it.
Posted on Reply
#38
nguyen
Vya Domus
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60hz, right?
Sure, just tell me which games you play exactly? CSGO? YouTube videos?
Almost everyone recommends turning down visuals in AAA games to hit 144hz 1440p? Yeah, I really need some confirmation on that. No one would want to play AAA games with Low settings just to hit 144hz, that I'm sure of.

I didn't say anyone should play at 60FPS; if you have already maxed out all the graphical settings and are still getting >60FPS, then play at >60FPS, although capping the framerate really helps with input latency with Nvidia Low Latency and AMD Anti-Lag in certain games.
Posted on Reply
#39
ZoneDymo
arbiter
This sounds like a typical AMD excuse for why their card sucks at what was/is a standard.
I think you do not know what "standard" means or how it's applied.
Posted on Reply
#40
EarthDog
lynx29
dragons dogma
This doesn't explain it to you? The title? Monkeys with crayons can draw the scenes fast enough, lol.

From 2016: "Given its old-gen nature, Dragon’s Dogma: Dark Arisen is not really a demanding title."

www.tweaktown.com/guides/7542/dragons-dogma-dark-arisen-gaming-graphics-performance-tweak-guide/index.html

Just saying. ;)
mystera
That's like asking: How do AMD cards perform in games that make heavy use of ®HairWorks???
It's nothing like that, really. AMD, like NV, uses DXR. They're both using the same API for RT.
mystera
And based on how well AMD's raytracing looks and performs
Was anything official released on AMD RT performance?

RTX is hardware on the card. NV cards use DXR API for RT just as AMD will.
Posted on Reply
#41
Calmmo
Dristun
Good day everyone, I'm the person genuinely excited about raytracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2, it's legitimately the only eye candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn videocard. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man it looks amazing!
Hey, I'm also excited about 6800XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?
I am similarly excited about RT.
It's just fanboys shouting at fanboys at this point. These same people would have been the ones mocking the GeForce 256 back in the day about HW T&L; pay them no heed.

We're just in that awkward phase now where DXR is still an unknown for most people, and we still don't know for sure if this year's or maybe the next cycle is the one that will bring mainstream acceptance/performance to RT. I personally am not aware of any non-DXR games, though I do believe those nvidia-developed ones like Quake RTX are probably going to be nvidia HW only. I doubt games like Control aren't going to work on AMD; I suspect it's just AMD's software side of things being not ready enough yet. I would expect a lot of growing pains for the first half of 2021 and AMD DXR. Hopefully I'm wrong, but they are going into this dealing with a 2 year handicap.

New games will have cross-brand hw to work with soon, and as someone with a 2070 Super all I can say is the DXR game library is veeery small still, and it's only going to really grow now with the new consoles, since more or less all cross-platform AAA titles will be coming with some form of RT once this first cross-platform year of releases is over (and already some of those cross-plats are coming with RT anyway). So it bodes well overall for us mid to long term, regardless of hardware brand choices or... god forbid, loyalties.
Posted on Reply
#42
R0H1T
arbiter
AMD excuse for why their card sucks at what was/is a standard.
You mean the DXR standard as opposed to Nvidia's proprietary RT? Or, going back, FreeSync and Mantle-based Vulkan, just to name a few?
Posted on Reply
#43
EarthDog
R0H1T
You mean the DXR standard as opposed to Nvidia's proprietary RT
Who's proprietary RT? Nvidia uses DXR as well...
When DXR is enabled by a Game Ready Driver, targeted for April (2019), the supported GeForce GTX graphics cards will work without game updates because ray-traced games are built on DirectX 12’s DirectX Raytracing API, DXR. This industry standard API uses compute-like ray tracing workloads that are compatible with both dedicated hardware units, such as the RT Cores, and the GPU’s general purpose shader cores.
Posted on Reply
#44
RedelZaVedno
I must admit 3080XT and 3090 look impressive performance- and price-wise. But 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
Posted on Reply
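[Editor's note] The price/performance arithmetic in the post above can be checked with a quick back-of-the-envelope sketch; this assumes the $499 RTX 3070 as a 1.0x performance baseline and takes the 10-15% uplift at face value, both simplifications:

```python
# Rough price/performance comparison: RX 6800 ($579) vs RTX 3070 ($499),
# using the 10-15% rasterization uplift quoted in the comment above.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance units per dollar, on an arbitrary normalized scale."""
    return relative_perf / price

rtx3070 = perf_per_dollar(1.00, 499)      # baseline card
rx6800_low = perf_per_dollar(1.10, 579)   # +10% perf, ~16% more money
rx6800_high = perf_per_dollar(1.15, 579)  # +15% perf, ~16% more money

# A ratio below 1.0 means the RX 6800 offers less performance per dollar.
print(f"RX 6800 vs RTX 3070 (perf/$): "
      f"{rx6800_low / rtx3070:.2f}x to {rx6800_high / rtx3070:.2f}x")
```

This lands at roughly 0.95x to 0.99x, i.e. the 6800's perf/$ is slightly worse than the 3070's at MSRP, consistent with the comment's claim.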
#45
kapone32
nguyen
60FPS is enough to enjoy single player game at the best visual fidelity you can get.
I bought the very first 144hz 1440p display, but I played The Witcher 3 at 60fps with the best visuals, no point in downgrading visual just to get >60FPS.
Same with Metro Exodus, Control, why would I sacrifice visual when I have ~60FPS already ?

Now switching to competitive games like PUBG, Modern Warfare, Overwatch and I lower every setting just to get the highest FPS, why wouldn't I ?
I can say with confidence that the Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple, in the Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish though that, after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released? I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted. After about 5 years you could not find a Beta version of popular culture anything. Which brings me to my last point... You don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs though. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
Posted on Reply
#46
nguyen
kapone32
I can say with confidence that the Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple, in the Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish though that, after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released? I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted. After about 5 years you could not find a Beta version of popular culture anything. Which brings me to my last point... You don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs though. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
Do you play the single player or multiplayer version of the Division 2?
Like I said, for competitive games, like the multiplayer version of Div2, I would use Low settings to get the highest FPS I can get.

Now tell me which do you prefer with your current GPU:
RDR2 High setting ~60fps or 144fps with low settings
AC O High Setting ~60 fps or 144fps with low settings
Horizon Zero Dawn High Setting ~60fps or 144fps with low settings

Well, to be clear, when I said 60FPS, it's for the minimum FPS.

Yeah, sure, if you count auto-overclocking and a proprietary feature (SAM) that makes the 6900XT equal to the 3090, see the hypocrisy there? Also, I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
Posted on Reply
#47
WeeRab
RedelZaVedno
I must admit 3080XT and 3090 look impressive performance- and price-wise. But 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
The 3070 is, at the moment, unicorn breath, like the rest of the Ampere lineup. What you call "impressive" regarding the 3090 becomes idiotic when a $1500 card only beats a $800 card by 10%.
Oh! And CUDA is no good for gaming, whilst ray tracing kills performance without resorting to DLSS.
Raytracing is today's equivalent of Hairworks or PhysX.
The leather jacket openly lied to Nvidia's consumer base, claiming the 3090 was "Titan-like" when it clearly isn't, and promising plenty of stock for buyers. The reality is that abysmal yields are the reason the Ampere series are almost impossible to come by.
Posted on Reply
#48
mechtech
Chrispy_
The vanilla 6800 is actually looking really strong in the first few of those benchmarks.

It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.
I wonder if they will release a cheaper 8GB version of the 6800?
Posted on Reply
#49
Xaled
RedelZaVedno
I must admit 3080XT and 3090 look impressive performance- and price-wise. But 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
Except the 3070's real price is not $499. Since the RTX 2xxx series, Nvidia has been selling its cards at much higher prices than the announced ones.
This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for it, but very few do.
Posted on Reply
#50
mysterfix
RedelZaVedno
I must admit 3080XT and 3090 look impressive performance- and price-wise. But 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better-looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
You are wrong; extra VRAM costs money. A lot of people will gladly pay for the extra performance plus double the VRAM. Funny how many people were bitching about the amount of VRAM on Nvidia cards just before AMD released their new cards. That extra $80 isn't going to be a deal breaker for anyone who wasn't already set on buying Nvidia.
Posted on Reply