Monday, November 2nd 2020

AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

AMD sent ripples through the industry at its late-October event launching the Radeon RX 6000 series RDNA2 "Big Navi" graphics cards, when it claimed that the top RX 6000 series parts compete with the very fastest GeForce "Ampere" RTX 30-series graphics cards, marking the company's return to the high-end graphics market. In its announcement press deck, AMD showed the $579 RX 6800 beating the RTX 2080 Ti (essentially the RTX 3070), the $649 RX 6800 XT trading blows with the $699 RTX 3080, and the top $999 RX 6900 XT performing in the same league as the $1,499 RTX 3090. Over the weekend, the company released even more benchmarks, with the RX 6000 series GPUs and their competition from NVIDIA tested by AMD on a platform powered by the Ryzen 9 5900X "Zen 3" 12-core processor.

AMD released its benchmark numbers as interactive bar graphs on its website. You can select from ten real-world games, two resolutions (1440p and 4K UHD), and even game settings presets and 3D APIs for certain tests. Among the games are Battlefield V, Call of Duty: Modern Warfare (2019), Tom Clancy's The Division 2, Borderlands 3, DOOM Eternal, Forza Horizon 4, Gears 5, Resident Evil 3, Shadow of the Tomb Raider, and Wolfenstein: Youngblood. In several of these tests, the RX 6800 XT and RX 6900 XT are shown taking the fight to NVIDIA's high-end RTX 3080 and RTX 3090, while the RX 6800 is shown to be significantly faster than the RTX 2080 Ti (roughly RTX 3070 scores). The Ryzen 9 5900X itself is claimed to be a faster gaming processor than Intel's Core i9-10900K, and features a PCI-Express 4.0 interface for these next-gen GPUs. Find more results and the interactive graphs at the source link below.
Source: AMD Gaming Benchmarks

146 Comments on AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

#51
B-Real
nguyen
How about AMD releases COD Modern Warfare, BF5, and SOTR benchmark numbers with DXR then? It would make it easier to gauge RX 6000 RT capability.
XDDDD Cry cry cry.
Posted on Reply
#53
EarthDog
Xaled
This is fraud, and reviewer sites and channels should warn people and condemn NVIDIA for this, but very few do.
lol, wat? How delusional are you? This is not a crime.
Posted on Reply
#54
kapone32
nguyen
Do you play the single-player or multiplayer version of The Division 2?
Like I said, for a competitive game like the multiplayer version of Div2, I would use Low settings to get the highest FPS I can.

Now tell me which you prefer with your current GPU:
RDR2 High settings ~60 fps, or 144 fps with Low settings
AC O High settings ~60 fps, or 144 fps with Low settings
Horizon Zero Dawn High settings ~60 fps, or 144 fps with Low settings

Well, to be clear, when I said 60 FPS, that's the minimum FPS.

Yeah sure, if you count auto-overclocking and a proprietary feature (SAM) that make the 6900 XT look equal to the 3090; see the hypocrisy there? Also I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
The point of getting a next-generation GPU has always been high FPS and great visual quality. With my GPU I always prefer high settings, but my AMD driver is pretty good at picking settings for each game.

It's a 1440p FreeSync 2 panel, 43-165 Hz high refresh, so there is nothing to lament.

You see, you are still in denial. There is no bias because of SAM; it's the same thing NVIDIA does with CUDA. Just because SAM mitigates NVMe doesn't mean that it is somehow cheating. As far as trusting AMD goes, you can look at my history and know that I have always had confidence in the leadership and honesty of Lisa Su. No one at AMD has confirmed or leaked anything substantive (except her) since Ryzen was announced. I know that in a world amplified by social media it is sometimes hard to understand that AMD GPUs are (apparently) faster than NVIDIA GPUs, period, this generation. Announcing/leaking a 3080 Ti card while in the midst (weeks) of launch issues with the current 3000 cards is actually laughable in its desperation. Just imagine if AMD announces that 3000 series CPUs will support SAM... period, point, blank. The future is indeed bright, and X570/B550 has proven to be a great investment.
Posted on Reply
#56
mtcn77
You know, this is like the mission 'The Amerigo' where Kerrigan finds out about the ghost project to unlock her psionic abilities. SAM is the next HSA target: shared address space. The next one is 'common memory interface'.
Posted on Reply
#57
EarthDog
kapone32
You see, you are still in denial. There is no bias because of SAM; it's the same thing NVIDIA does with CUDA. Just because SAM mitigates NVMe doesn't mean that it is somehow cheating.
The difference is in games. This thread is about gaming benchmarks.

In order to get the full performance of these cards, you need to be balls deep in the AMD ecosystem. This means the latest mobos and processors. Most people aren't there, and that will take time. That said, the real curiosity to me is how this works on Intel and non-5000-series/B550/X570 setups. From the looks of the charts, that knocks things down a peg.
kapone32
Just imagine if AMD announces that 3000 series CPUs will support SAM.
Just imagine... as that's all it will ever be... :p
Posted on Reply
#58
Blueberries
So, serious question: if a 6800 will get you 80-100 FPS at 4K, is there any incentive other than "future-proofing" to purchase anything higher-specced? I know some people have 120 Hz+ 4K panels, but for the 144 Hz/1440p and 60 Hz/4K crowd (i.e., the vast majority) it doesn't seem to make a lot of sense to dump extra heat into your chassis.
Posted on Reply
#59
mtcn77
Blueberries
So, serious question: if a 6800 will get you 80-100 FPS at 4K, is there any incentive other than "future-proofing" to purchase anything higher-specced? I know some people have 120 Hz+ 4K panels, but for the 144 Hz/1440p and 60 Hz/4K crowd (i.e., the vast majority) it doesn't seem to make a lot of sense to dump extra heat into your chassis.
The 6800 is a different hardware class. RDNA2 has fewer code bubbles. If anything, it will lessen your heat impact compared to your runner-up GPU.

I don't want to be a teamster, but this is how the green team thinks. These GPUs aren't mobile. They aren't for the same kinds of workloads.

Beating yesteryear's competition to the punch is something only Intel's GPU goals could imagine, so set your goals high so as not to become a laughing stock, imo.
Posted on Reply
#60
RedelZaVedno
mysterfix
You are wrong; extra VRAM costs money. A lot of people will gladly pay for the extra performance plus double the VRAM. Funny how many people were just bitching about the amount of VRAM on NVIDIA cards before AMD released their new cards. That extra $80 isn't going to be a deal-breaker for anyone who wasn't already set on buying NVIDIA.
1 GB of GDDR6 at 3,500 MHz / 15 Gbps (MT61K256M32JE-14: A TR) costs $7.01 at the moment if you order up to a million pieces. You can negotiate much lower prices if you order more. That's 56 bucks for 8 gigs in the worst-case scenario (around 40 is more realistic). AMD is likely making a nice profit selling you additional VRAM for 80 bucks.
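For what it's worth, here is a quick back-of-the-envelope sketch of that estimate; the $7.01/GB figure is the spot price quoted above, while the ~30% volume discount is just an assumption chosen to land near the "around 40" figure:

```python
# Rough bill-of-materials estimate for 8 GB of extra GDDR6 (illustrative only).
SPOT_PRICE_PER_GB = 7.01   # quoted spot price per 1 GB module (15 Gbps), <= 1M units
EXTRA_GB = 8               # additional VRAM over an 8 GB card
VOLUME_DISCOUNT = 0.30     # assumed discount for larger orders (a guess)

worst_case = EXTRA_GB * SPOT_PRICE_PER_GB          # ~$56
realistic = worst_case * (1 - VOLUME_DISCOUNT)     # ~$39
print(f"worst case: ${worst_case:.2f}, realistic: ~${realistic:.0f}")
```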
Posted on Reply
#61
TheTechGuy1337
nguyen
Do you play the single-player or multiplayer version of The Division 2?
Like I said, for a competitive game like the multiplayer version of Div2, I would use Low settings to get the highest FPS I can.

Now tell me which you prefer with your current GPU:
RDR2 High settings ~60 fps, or 144 fps with Low settings
AC O High settings ~60 fps, or 144 fps with Low settings
Horizon Zero Dawn High settings ~60 fps, or 144 fps with Low settings

Well, to be clear, when I said 60 FPS, that's the minimum FPS.

Yeah sure, if you count auto-overclocking and a proprietary feature (SAM) that make the 6900 XT look equal to the 3090; see the hypocrisy there? Also I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
I completely agree with this guy. Refresh rate does not matter as much after 60 fps. I own a 60 Hz, a 120 Hz, and a 144 Hz monitor, one of which is a laptop with a horrid gray-to-gray response time of 45 ms. All three of them perform similarly; with vsync on they feel the same as the 60 Hz monitor. If you turn vsync off, then the difference shows, but that is it. Only when the game is producing more frames than a monitor can handle do these higher refresh rate monitors come into play, and with implementations like vsync that becomes less of a deal. In my opinion, response time, gray-to-gray performance, brightness, and color accuracy are hands down the most important aspects of monitors. I want to be lost in a new world and not reminded that I have to work in the morning.
Posted on Reply
#62
xman2007
How dare AMD price their cards in line with NVIDIA's cards on performance, how dare they turn a profit, say all the ones who were praising NVIDIA two weeks ago for selling a 2080 Ti-performance GPU for $500, which is also vapourware :roll:
Posted on Reply
#63
nguyen
kapone32
The point of getting a next-generation GPU has always been high FPS and great visual quality. With my GPU I always prefer high settings, but my AMD driver is pretty good at picking settings for each game.

It's a 1440p FreeSync 2 panel, 43-165 Hz high refresh, so there is nothing to lament.

You see, you are still in denial. There is no bias because of SAM; it's the same thing NVIDIA does with CUDA. Just because SAM mitigates NVMe doesn't mean that it is somehow cheating. As far as trusting AMD goes, you can look at my history and know that I have always had confidence in the leadership and honesty of Lisa Su. No one at AMD has confirmed or leaked anything substantive (except her) since Ryzen was announced. I know that in a world amplified by social media it is sometimes hard to understand that AMD GPUs are (apparently) faster than NVIDIA GPUs, period, this generation. Announcing/leaking a 3080 Ti card while in the midst (weeks) of launch issues with the current 3000 cards is actually laughable in its desperation. Just imagine if AMD announces that 3000 series CPUs will support SAM... period, point, blank. The future is indeed bright, and X570/B550 has proven to be a great investment.
So you agree that you are not using Low settings just to get 144 fps then? :roll:
New games are designed to push next-gen GPUs to their knees all the same; it's all a scheme to sell you GPUs, you know, or you could just play CSGO for eternity and not care about brand-new GPUs.
Well, I am just as happy as you are when AMD is competitive, because now retailers can't charge cut-throat prices for Ampere as before :D, though at this point I'm just gonna wait for the 3080 Ti.
Posted on Reply
#64
RedelZaVedno
xman2007
How dare AMD price their cards in line with NVIDIA's cards on performance, how dare they turn a profit, say all the ones who were praising NVIDIA two weeks ago for selling a 2080 Ti-performance GPU for $500, which is also vapourware :roll:
An xx70-class GPU should never cost more than 400 bucks. I blame FOMO buyers, fanboys, and AMD not being competitive in the high-end retail GPU market segment for the last 7 years, enabling Ngreedia to hike prices as they please. Let's face it, the 3090 is nothing more than a "3080 Ti" with additional VRAM for 500 bucks more than the 2080 Ti, which was 300 more ($500 in real life) than the 1080 Ti. The praised 3070 is the most expensive xx70-class GPU ever released besides the 2070(S), yet it was the best-selling RTX card of the Turing line, go figure.
What really pisses me off now is AMD deciding to go along with Ngreedia's price hikes, not even bothering to compete with them on a better price-to-performance ratio anymore. The way things stand today, GPU prices will only go up, as AMD has obviously chosen higher profit margins over trying to increase GPU market share (by offering consumers substantially more for less). DIY PC builders are getting milked by both companies and they seem not to care. That's why I decided to get out of the market. I'm keeping my 1080 Ti and will wait till RDNA3 to buy a 3080XT on the 2nd-hand market for 300 bucks.
Posted on Reply
#65
Zach_01
Personally, I do not consider the 6900 XT a 3090 equal. It's between the 3080 and 3090, and that is without any features enabled.
On the other hand, the 6800 XT is a direct 3080 competitor, even with all the extra features off.

At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2~13% depending on the game). We really don't know how much of it came from SAM alone.
Just remove the white tiles above the red lines.

Posted on Reply
#66
EarthDog
Zach_01
At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2~13% depending on the game). We really don't know how much of it came from SAM alone.
Just remove the white tiles above the red lines.
I've heard Rage Mode isn't much and that it is mostly SAM doing this(?). I recall Linus mentioning that it isn't much more than a power limit increase and a fan speed increase to get more boost.

This is great for the few who are balls deep in their ecosystem... but what about the majority of users? How many, in the current landscape, are using non-B550/X570 systems (an overwhelming majority, surely)? You need to upgrade your CPU and mobo to support this feature. Personally, for the majority, we need to see how these perform without it. From the chart above, it looks like it takes a win back to a loss and a couple of wins back to virtual ties. I'd love to see this compared to overclocked 3080s instead of whatever they have... oh, and on the same API.
Posted on Reply
#67
RedelZaVedno
Zach_01
Personally, I do not consider the 6900 XT a 3090 equal. It's between the 3080 and 3090, and that is without any features enabled.
On the other hand, the 6800 XT is a direct 3080 competitor, even with all the extra features off.

At AMD's presentation, the "Rage Mode" OC and SAM together gained around ~6.5% FPS on average (+2~13% depending on the game). We really don't know how much of it came from SAM alone.
Just remove the white tiles above the red lines.


Now, let's be honest: RDNA2 rasterization performance is very good if AMD is not lying; the pricing, not so much. 1,000 bucks is still A LOT of money to pay for a gaming GPU.
Posted on Reply
#68
kapone32
EarthDog
The difference is in games. This thread is about gaming benchmarks.

In order to get the full performance of these cards, you need to be balls deep in the AMD ecosystem. This means the latest mobos and processors. Most people aren't there, and that will take time. That said, the real curiosity to me is how this works on Intel and non-5000-series/B550/X570 setups. From the looks of the charts, that knocks things down a peg.
Just imagine... as that's all it will ever be... :p
Isn't the purpose of gaming benchmarks to gauge gaming performance, period? Yet people also included Adobe Premiere benchmarks in AMD reviews. I do not agree that AMD has not penetrated the market, and it is not like X570 or (some) B550 boards are expensive. It is not like you have to get an X670 or B650 board; X570 and now B550 are both mature platforms. If these first CPUs all support SAM, then that means the non-X parts will do the same thing, so the gap could be wider. I can understand, as I too am intrigued to see how Intel and NVIDIA cards work with either. I am going to be selfish in this, though, as I bought an X570 months ago specifically for this, as soon as I saw the PS5 technical brief.
Posted on Reply
#69
Zach_01
RedelZaVedno
Now, let's be honest: RDNA2 rasterization performance is very good if AMD is not lying; the pricing, not so much. 1,000 bucks is still A LOT of money to pay for a gaming GPU.

I'm not going to argue... but
AMD, pricing a +5% perf GPU at +54% more, has somehow followed the utter stupidity of...
NVIDIA pricing a +10% perf GPU at +114% more.

It is what it is...!

Nonetheless, AMD's cards have more perf/$ value.
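To put rough numbers on that, here is a small sketch using the launch MSRPs from the article; the +5% and +10% performance deltas are the estimates quoted above, not measured results:

```python
# Price-premium check using launch MSRPs; the perf deltas (+5%, +10%) are the
# rough estimates quoted above, not benchmark data.
def premium(base_price: float, top_price: float) -> float:
    """Percentage price increase of the top card over the base card."""
    return (top_price / base_price - 1) * 100

print(f"RX 6900 XT over RX 6800 XT: +{premium(649, 999):.0f}% price for ~+5% perf")
print(f"RTX 3090 over RTX 3080:     +{premium(699, 1499):.0f}% price for ~+10% perf")
```

Either way, on these numbers both halo cards deliver noticeably less performance per dollar than the tier below them.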
Posted on Reply
#70
ODOGG26
mysterfix
You are wrong; extra VRAM costs money. A lot of people will gladly pay for the extra performance plus double the VRAM. Funny how many people were just bitching about the amount of VRAM on NVIDIA cards before AMD released their new cards. That extra $80 isn't going to be a deal-breaker for anyone who wasn't already set on buying NVIDIA.
100 percent agree. People are only talking this way because it's AMD. I think the 6800 non-XT is really good: double the VRAM plus 15-20% faster than the 3070. lmao, and people want that for no additional cost. Delusional.
Posted on Reply
#71
EarthDog
kapone32
Isn't the purpose of gaming benchmarks to gauge gaming performance, period? Yet people also included Adobe Premiere benchmarks in AMD reviews. I do not agree that AMD has not penetrated the market, and it is not like X570 or (some) B550 boards are expensive. It is not like you have to get an X670 or B650 board; X570 and now B550 are both mature platforms. If these first CPUs all support SAM, then that means the non-X parts will do the same thing, so the gap could be wider. I can understand, as I too am intrigued to see how Intel and NVIDIA cards work with either. I am going to be selfish in this, though, as I bought an X570 months ago specifically for this, as soon as I saw the PS5 technical brief.
A few sites cover that, sure. But this is in reference to gaming and following the thread here.

They've penetrated the market... but of those, who owns B550/X570? I'd guess more own X470 and lower than X570/B550. Remember, there was a large contingent pissed about 5000 series support on those 400 series boards. With that, it feels like many were just holding out. Also, not a soul has these CPUs yet, so there is literally zero penetration on that front. So at minimum, you need to buy a new CPU; at worst, you're buying a CPU and a motherboard for this. Again, not something a majority has at this time. So IMO, it would be more than prudent to include the majority here and see how it performs on those systems until one gets the new 5000 series and a motherboard. Obviously seeing BOTH would be ideal.
Posted on Reply
#73
Punkenjoy
We can't really compare T&L on the first GeForce with ray tracing. Ray tracing adds a new feature to graphics rendering, whereas T&L used a fixed function on the GPU to speed up something that every game already had to do and was doing on the CPU.

So right now we are really adding new tricks to game engines, not offloading or accelerating an already existing process.

The thing is, we are at a point where we have to use so many tricks to emulate lights and reflections that it is getting really close to requiring as much power as real ray tracing. Using ray tracing for these things requires a lot of computational power, but way less complexity than all the tricks we currently use to emulate them.

It's one of the ways we can increase graphical fidelity. But there are still other areas where we need to improve, like more polygons, better physics and object deformation, and better textures and materials.

Ray tracing in games is clearly the future and it will clearly improve with every generation. It's like when the first shaders were added to GPUs: they were used for just a few effects and the performance hit was huge. By the time they were used widely, the first GPUs supporting them were totally outclassed anyway. It will be the same with the current generation (both NVIDIA and AMD GPUs).

So, yes, ray tracing is the future and is here to stay. But no, it shouldn't be a buying decision. No one should say, "I will get the 3080 instead of a 6800 XT to future-proof for ray tracing."

People should just buy the best graphics card for the games released right now and maybe within the next year, max.
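As a toy illustration of that "less complexity, more compute" point (not from the article, just a sketch): a traced reflection is conceptually one mirrored ray per pixel, while the rasterized stand-ins (screen-space reflections, cube maps, planar probes) avoid the ray cast at the cost of a pile of special cases.

```python
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Mirror a view ray about a surface normal: r = d - 2(d.n)n."""
    return direction - 2.0 * np.dot(direction, normal) * normal

# One traced reflection: mirror the view ray at the hit point and shade
# whatever the new ray hits. Simple per pixel, but millions of such rays
# per frame is what eats the compute budget.
view_dir = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)  # ray hitting a floor
floor_normal = np.array([0.0, 1.0, 0.0])
print(reflect(view_dir, floor_normal))                 # bounces upward

# A screen-space approximation would instead march through the depth buffer
# looking for the reflected surface; cheaper, but it fails for anything
# not visible on screen, which is exactly the complexity being traded away.
```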
Posted on Reply
#74
nguyen
Chomiq
It's funny
Next time read thoroughly.
It's funny,
I just showed that the RX 6000 can indeed run RTX, yet AMD did not share the numbers.

Next time read more thoroughly please.
Also, I get 86 fps with a 2080 Ti at 1440p with RTX ON and DLSS OFF; I have no clue where the guy who wrote the article got their numbers from.

Posted on Reply
#75
lexluthermiester
NeuralNexus
NO ONE REALLY CARES ABOUT RAYTRACING
If you really think that, I have a bridge in Brooklyn NY I want to sell you...
NeuralNexus
All future games will be optimized for AMD's raytracing solution anyway.
That's an assumption on your part, and not a very logical one, especially considering that NVIDIA has already had two years to gain a lead in both deployment and development of RTRT.
Posted on Reply