Tuesday, April 12th 2022

AMD Ryzen 7 5800X3D Gets Full Set of Gaming Benchmarks Ahead of Launch

XanxoGaming has finally released its complete set of benchmarks for the AMD Ryzen 7 5800X3D, tested against an Intel Core i9-12900KF. This time both platforms were tested using an NVIDIA GeForce RTX 3080 Ti and four 8 GB modules of 3200 MHz CL14 DDR4 memory. The only differences appear to be the OS drive, motherboard and cooling, although both systems rely on a 360 mm AIO cooler. Both systems were running Windows 10 21H2. The site has a full breakdown of the components used for those interested in the exact details.

The two platforms were tested in 11 different games at 720p and 1080p. To spoil the excitement: it's a dead heat between the two CPUs in most games at 1080p, with Intel ahead by about 1-3 FPS in the games where AMD loses out. However, in the games where AMD takes the lead, it's by a good 10 FPS or more, with titles like The Witcher 3 and Final Fantasy XV seeing an advantage of 40-50 FPS. AMD often has the edge in the one percent lows even where Intel is ahead on average FPS, but this doesn't apply to all of the games. It's worth keeping in mind that the Intel CPU should gain extra performance when paired with DDR5 memory in some of these games, but we'll have to wait for more reviews to see by how much. The benchmarks displayed are mostly the games TPU normally tests with, but they aren't the entirety of the games tested by XanxoGaming.
As for the 720p tests, AMD only loses out in Strange Brigade, although it's a loss of over 20 FPS on average and over 10 FPS in the one percent lows. In the other games it's mostly a dead heat here too, but with the 1-3 FPS advantage going to AMD instead of Intel. However, the 3D V-Cache seems to kick in for the one percent lows, where AMD edges out Intel by a large margin in more games, by at least 10 FPS and often by around 30 FPS or more. Take these benchmarks for what they are: an early, unconfirmed test of the Ryzen 7 5800X3D. We're just over a week away from the launch and we should be seeing a lot more benchmarks by then. Head over to XanxoGaming for the full set of tests and their conclusion, especially as they made an effort to write the test in English this time around.
Source: XanxoGaming

139 Comments on AMD Ryzen 7 5800X3D Gets Full Set of Gaming Benchmarks Ahead of Launch

#76
InVasMani
Punkenjoy: The thing is, the 5950X is not a 64 MB L3 cache CPU, it's a 2x32 MB CPU. Same thing with Milan-X, it's an 8x96 MB CPU. Accessing the other CCD's L3 cache is as slow as accessing main memory. That cache is only really useful for the cores inside that CCD.
Fair point, though it can still fit and access two 32 MB working sets without having to fall back to main memory. Not quite as good as a true 64 MB design that isn't split, but good regardless.
#77
DemonicRyzen666
Considering the heat problems this person claims for this 5800X3D, I feel this just piles more onto a thermal issue that was already present on their current Zen 3 CPUs. The heat density needs to be spread out more; however, there must be a percentage AMD is aiming for with yields, since a 10% larger die could mean a lot fewer chiplets for them in the end.
#78
Leshy
ACE76: This comment is getting old... they purposely used 720p to make the benchmark CPU-bound. That takes the GPU out of the equation so you can see which CPU has the bigger impact on the score. They're not benchmarking the video card, right? So this is how you get a clear picture of which CPU is better.
Yeah... so old, and you still don't get it. I know why they do this, but it's stupid :) No one will use it like this, so what's the point? All this creates is confusion, because everyone thinks you need some HEDT platform to run a 3090 Ti. Meanwhile you can use a Ryzen 3 3100 with a 2080 Ti at 1080p (80-90% of the tested Witcher 3 city run, with some minor dips to around 100 FPS). So for gaming all you need is a 12400, unless you want to play at 300 FPS on ultra-low at 720p :D
#79
QuietBob
3080Ti @ 1080p ultra
5800X3D vs. 12900K 1% lows

Assassin's Creed: Origins -3%
Borderlands 3 -1%
Control +11%
Death Stranding +9%
F1 2020 +1%
Final Fantasy XV +26%
Metro Exodus +15%
Shadow of the Tomb Raider +28%
Middle-Earth: Shadow of War +7%
Strange Brigade -1%
The Witcher 3 +1%

I'd call it a tie in five games and a win for the 5800X3D in the other six. The difference is going to be less pronounced in higher resolutions, or with a weaker GPU, but still. Based on these results alone, AMD have delivered on their promise.
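For anyone who wants to reproduce the arithmetic, here's a minimal sketch of how deltas like these are derived; the FPS figures in it are hypothetical placeholders, not XanxoGaming's actual numbers:

```python
# Hypothetical 1% low figures (placeholders, not the real test data),
# just to show how the signed percentage deltas above are computed.
one_percent_lows = {
    # game: (5800X3D, 12900K)
    "Control": (112.0, 101.0),
    "Strange Brigade": (188.0, 190.0),
}

for game, (x3d, adl) in one_percent_lows.items():
    # Positive = the 5800X3D is ahead, negative = the 12900K is ahead.
    delta = (x3d - adl) / adl * 100
    print(f"{game}: {delta:+.0f}%")
```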
#80
InVasMani
Leshy: Yeah... so old, and you still don't get it. I know why they do this, but it's stupid :) No one will use it like this, so what's the point? All this creates is confusion, because everyone thinks you need some HEDT platform to run a 3090 Ti. Meanwhile you can use a Ryzen 3 3100 with a 2080 Ti at 1080p (80-90% of the tested Witcher 3 city run, with some minor dips to around 100 FPS). So for gaming all you need is a 12400, unless you want to play at 300 FPS on ultra-low at 720p :D
In years past I'd be inclined to agree it's a bit silly from a real-world perspective, but I get why it's done: it tests how the CPU will hold up as GPU hardware matures. These days it's becoming even more relevant with upscaling, which keeps improving and is getting more widely used. You can get tons of additional performance via upscaling now, and 720p with upscaling can look pretty much as good as native 1080p, or arguably even better.

Give it another GPU architecture or two and it will be hard to argue against the image-quality trade-off given the larger performance gains to be had. It'll be a bigger deal when 1080p can be upscaled to 4K at a quality much closer to native 4K while offering a huge performance upside. I wouldn't say it's perfect yet, but it is improving.
#81
Why_Me
I'm curious to see how this $449 CPU stacks up against the $310 12700F at 1440p.
#82
eidairaman1
The Exiled Airman
DemonicRyzen666: Considering the heat problems this person claims for this 5800X3D, I feel this just piles more onto a thermal issue that was already present on their current Zen 3 CPUs. The heat density needs to be spread out more; however, there must be a percentage AMD is aiming for with yields, since a 10% larger die could mean a lot fewer chiplets for them in the end.
Just use a Scythe Ashura, Mugen, Aro m14G. Bet the 12000 series is hah.
QuietBob: 3080Ti @ 1080p ultra
5800X3D vs. 12900K 1% lows

Assassin's Creed: Origins -3%
Borderlands 3 -1%
Control +11%
Death Stranding +9%
F1 2020 +1%
Final Fantasy XV +26%
Metro Exodus +15%
Shadow of the Tomb Raider +28%
Middle-Earth: Shadow of War +7%
Strange Brigade -1%
The Witcher 3 +1%

I'd call it a tie in five games and a win for the 5800X3D in the other six. The difference is going to be less pronounced in higher resolutions, or with a weaker GPU, but still. Based on these results alone, AMD have delivered on their promise.
Considering this is not even the top CPU from AMD. And it took Intel six generations just to catch up, but they still are behind, hah...
#83
Why_Me
eidairaman1: Just use a Scythe Ashura, Mugen, Aro m14G. Bet the 12000 series is hah.
Considering this is not even the top CPU from AMD. And it took Intel six generations just to catch up, but they still are behind, hah...
So what would be the top AMD CPU for gaming, if not the one in the OP? :confused:
#84
Chrispy_
InVasMani: The 1% lows are what matter most, and also where DDR4 still holds up best against DDR5. I look at DDR5 almost like micro-stutter in SLI/CF against a single card: there are cases where a single card with a slightly lower average is so much smoother on micro-stutter that it's still generally worth the trade-off. It's much like some cases of a dual-core vs a quad-core that can both generally run a game well enough, but frame-time variance on the dual-core just craters at certain points while the quad-core hums along smoothly. I'm looking at the 3D stacked cache the same way from the results I'm seeing so far.
1% lows and stuttering are typically caused by the game engine or render pipeline waiting on data from main memory. Bandwidth typically hasn't been a bottleneck for gaming since DDR4-2666 hit the market in 2015. Sure, there's some gain to be had in a few scenarios by adding more bandwidth but pretty much every memory-bandwidth scaling review/article has the jump from 2400MHz to 2666MHz at the beginning of diminishing returns and by DDR4-3200 there's almost no point adding more bandwidth. Yes, Zen2/Zen3 CPUs benefit from faster RAM, but that's not because games are using more bandwidth, it's because the CPU's infinity fabric can be overclocked only by using faster RAM, due to the 1:1 FCLK:MCLK ratio.

So, to the problem with DDR5: the common stuff is DDR5-4800 CL40, which has roughly the same first-word latency as DDR4-3200 CL30. Given that DDR4-3200 CL16 is the cheap stuff and DDR4-3200 CL14 is quite common, that makes DDR5 laughably slow in terms of latency.

DDR5 is nearly double the (access) latency in exchange for more bandwidth, and games want low latency, not more bandwidth. The 5800X3D does so well in the 1% lows because the massive cache reduces access latency to data when it matters most.
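For reference, first-word latency in nanoseconds follows directly from the CAS latency (in cycles) and the transfer rate; a minimal sketch using the kits mentioned above (standard formula, no measured data involved):

```python
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    """First-word latency: CL cycles divided by the memory clock (MT/s / 2), in ns."""
    return cl / (mt_per_s / 2) * 1000

# Kits mentioned in the discussion above.
for label, cl, rate in [
    ("DDR4-3200 CL14", 14, 3200),
    ("DDR4-3200 CL16", 16, 3200),
    ("DDR4-3200 CL30", 30, 3200),
    ("DDR5-4800 CL40", 40, 4800),
]:
    print(f"{label}: {cas_latency_ns(cl, rate):.2f} ns")
```

That works out to roughly 8.75 ns, 10 ns, 18.75 ns and 16.67 ns respectively, which is where the "nearly double versus a good DDR4-3200 kit" comparison comes from.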
#85
ratirt
fevgatos: While AMD users have a grand time. They only had to go nuts on the Internet to force AMD to backtrack and give support for X470 (of course, six months after the launch of Zen 3), or just... wait with a six-year-old motherboard (X370) to get a BIOS update SOMETIME in the future for a soon-to-be two-year-old CPU. Absolutely amazing support, man...
You are so demeaning towards AMD and what they are doing for their customers. You are overlooking this motherboard stuff in such a bad way. There have been hundreds or even thousands of practices that paint Intel in a bad light, and yet you are still OK with it. AMD promised AM4 support for 5 years and it has been kept, with new product support extended to older hardware. I can't say the same about Intel, and your 'too late or months later' statement is just so out of place here. Intel supporting AL and RL on the current chipset for 12th gen CPUs is only because they are losing market share and people are going AMD. Intel is very conservative in that department.
THU31: The 12900K(S) is not just a gaming CPU, though.

What they should do is release an i7 without E-cores, but with maxed out clocks.
If it is not being used for gaming but for MT workloads, then prepare for toasty temperatures unless you have a beefy cooling system.
#86
fevgatos
ratirt: You are so demeaning towards AMD and what they are doing for their customers. You are overlooking this motherboard stuff in such a bad way. There have been hundreds or even thousands of practices that paint Intel in a bad light, and yet you are still OK with it. AMD promised AM4 support for 5 years and it has been kept, with new product support extended to older hardware. I can't say the same about Intel, and your 'too late or months later' statement is just so out of place here. Intel supporting AL and RL on the current chipset for 12th gen CPUs is only because they are losing market share and people are going AMD. Intel is very conservative in that department.


If it is not being used for gaming but for MT workloads, then prepare for toasty temperatures unless you have a beefy cooling system.
Yes, they promised 5 years of support and then they backtracked. You do remember they announced that X470 would not support Zen 3, right? You do remember it took a public outcry for them to change their mind, right? You do realize X370 still doesn't support Zen 3, right? That's already an almost two-year-old CPU...
#87
ratirt
fevgatos: Yes, they promised 5 years of support and then they backtracked. You do remember they announced that X470 would not support Zen 3, right? You do remember it took a public outcry for them to change their mind, right? You do realize X370 still doesn't support Zen 3, right? That's already an almost two-year-old CPU...
I don't think you understand what 5 years of AM4 support means. The socket has been used for 5 straight years with new CPUs released every year. No, it was not like you describe it.
AMD stated the CPU can work with X470, but it would be up to the board makers to support it, not AMD. So no, you are not correct on this one and you are twisting the truth. The other problem here was the small BIOS memory on the motherboards, which couldn't hold support for all the processors. Focus on that problem. SO MANY PROCESSORS FOR THE SOCKET THAT THE BIOS MEMORY RAN SHORT TO SUPPORT THEM ALL. Do you get this? That is why MSI, for instance, released the new MAX boards with larger BIOS chips to get all CPUs supported by the AGESA. Otherwise it would not FIT, SINCE IT WAS TOO DAMN BIG.
And X370 is your problem now? Dude, look at Intel's support. X370 supports the 1000, 2000 and 3000 series, is that not enough? How many generations does Intel usually support, if you can remind me?
#88
GURU7OF9
aQi: Yeah, we all know DDR4 was premature, whereas DDR3 had remarkable marks everywhere. I am not supporting DDR5 here, but it's still in its very early stages. Similarly, the 12900KS might use its potential, but it will be restricted to a limited set of games/apps (new titles would be optimised to use DDR5, just like Arc GPUs).
But one thing is for sure: the 12900KS is an overclocking bad boy and the 5800X3D is not. The above-mentioned benches are also at stock for both, where the 5800X3D admittedly beats the 12900KS, yet we all want to see the 4K fun here (I personally want to see how the 5800X3D performs under 4K circumstances).
I don't get it, the bencher could have given us 4K benchmarks but did not. Still waiting. One more thing: the value per dollar will be prominent enough for users to pick between the options.


You bet it will, that's what Intel does. Two chipsets and two generations, then jump socket :p
"The 12900KS, an overclocking bad boy".
I know Intel overclocks better, but seriously, this KS version is already binned and overclocked from the factory to its limit. It is only there for Intel to keep trying to claim the so-called "Gaming Crown"! There is no more headroom left in it! Which makes its overclocking capabilities pretty much non-existent, or minimal at best!
So as far as overclocking goes, it's not really relevant!
All the high-end CPUs from Intel and AMD are cranked pretty hard straight out of the box with minimal headroom. Only from the lower-specced ones can any reasonable gains be made!

So far, from these preliminary tests, it appears that the Ryzen 7 5800X3D with its 3D V-Cache will be very competitive with the 12900KS in gaming!
#89
fevgatos
ratirt: I don't think you understand what 5 years of AM4 support means. The socket has been used for 5 straight years with new CPUs released every year. No, it was not like you describe it.
AMD stated the CPU can work with X470, but it would be up to the board makers to support it, not AMD. So no, you are not correct on this one and you are twisting the truth. The other problem here was the small BIOS memory on the motherboards, which couldn't hold support for all the processors. Focus on that problem. SO MANY PROCESSORS FOR THE SOCKET THAT THE BIOS MEMORY RAN SHORT TO SUPPORT THEM ALL. Do you get this? That is why MSI, for instance, released the new MAX boards with larger BIOS chips to get all CPUs supported by the AGESA. Otherwise it would not FIT, SINCE IT WAS TOO DAMN BIG.
And X370 is your problem now? Dude, look at Intel's support. X370 supports the 1000, 2000 and 3000 series, is that not enough? How many generations does Intel usually support, if you can remind me?
I think you don't remember. AMD announced that X470 would not support Zen 3. Then there was a huge outcry and they backtracked. Yes, Intel ain't any better, but at least you know BEFOREHAND. With AMD you are just rolling the dice on when, where and if you are getting support.
#90
ratirt
fevgatos: I think you don't remember. AMD announced that X470 would not support Zen 3. Then there was a huge outcry and they backtracked. Yes, Intel ain't any better, but at least you know BEFOREHAND. With AMD you are just rolling the dice on when, where and if you are getting support.
There was no outcry, just people who don't understand the issues here. With the X370 boards, some of them have issues working with 3rd-gen Ryzen due to hardware constraints, and that is why AMD said no support, but these CPUs can still work on X370 if the board partners support it.
You are not rolling any dice, you just want to make it look bad. I totally disagree with you.
#91
fevgatos
ratirt: There was no outcry, just people who don't understand the issues here. With the X370 boards, some of them have issues working with 3rd-gen Ryzen due to hardware constraints, and that is why AMD said no support, but these CPUs can still work on X370 if the board partners support it.
You are not rolling any dice, you just want to make it look bad. I totally disagree with you.
I think you don't remember what happened. In April-May of 2020, AMD announced that X470 won't be getting Zen 3. Then there was an outcry by hardware media and users, and AMD backtracked. These are the facts...
#92
Leshy
fevgatos: I think you don't remember. AMD announced that X470 would not support Zen 3. Then there was a huge outcry and they backtracked. Yes, Intel ain't any better, but at least you know BEFOREHAND. With AMD you are just rolling the dice on when, where and if you are getting support.
Agree 100% :D I bought an X370 Crosshair VI Hero to be able to upgrade to a better Ryzen CPU :D and I can't (only with a crossflash :D )
ratirt: There was no outcry, just people who don't understand the issues here. With the X370 boards, some of them have issues working with 3rd-gen Ryzen due to hardware constraints, and that is why AMD said no support, but these CPUs can still work on X370 if the board partners support it.
You are not rolling any dice, you just want to make it look bad. I totally disagree with you.
I would like to know what those HW constraints are.

Anyway, why do they make promises when they can't keep them?
#93
GURU7OF9
fevgatos: I think you don't remember what happened. In April-May of 2020, AMD announced that X470 won't be getting Zen 3. Then there was an outcry by hardware media and users, and AMD backtracked. These are the facts...
One could argue the other way and say at least AMD is flexible and willing to listen to what people want and change to suit, as opposed to Intel, where you only get what they tell you you can have! No flexibility at all.
The only thing I have seen change from Intel is that they allow memory overclocking on more chipsets, not just the high-end Z series!
But they are still masters at bleeding everyone for every little extra feature!
#94
Chrispy_
Leshy: Agree 100% :D I bought an X370 Crosshair VI Hero to be able to upgrade to a better Ryzen CPU :D and I can't (only with a crossflash :D )

I would like to know what those HW constraints are.

Anyway, why do they make promises when they can't keep them?
There were two primary hardware constraints, both of them were created by the motherboard manufacturers and not AMD.
  1. Motherboard manufacturers saved a buck by using small 16MB BIOS chips that are too small to add new CPU support without stripping features or older CPU support. This caused massive complications for AMD when writing new standard AGESA packages.
  2. Early 300-series boards had, quite frankly, shitty VRM design and poor memory topology. They work, but they're not high-performance parts by today's standards. A combination of more experience with Ryzen and more effort (because AMD's market share increased enough to warrant the extra effort) means that budget X570 and B550 boards are often more capable than flagship 300-series boards. As for budget 300-series boards, some of those are woeful and weren't even up to the task of a 2700X, let alone any of the Zen2 or Zen3 chips.
This argument between @ratirt and @fevgatos is a valid argument and they're both taking polar-opposite stances in what is actually a very grey area. AMD made a broad-sweeping promise, and the motherboard manufacturers then made that promise hard to keep after the promise was made.

So, AMD did initially break that promise because they looked at existing 300-series boards and deemed the complications the motherboard manufacturers had created too messy to work around. They chose that initial stance and people rightly called them out for breaking their original promise. You can finger-point all you want but ultimately boards that don't have Zen3 support to this day are the result of the motherboard manufacturer not caring enough about it, or having insufficient staff dedicated to BIOS updates. AMD have now at least upheld their side of the promise (release the AGESA code that will let motherboard manufacturers create Zen3-compatible BIOSes for old boards) so now the anger should be directed at the motherboard manufacturer in question for dragging their feet and offering shitty/slow updates.
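To make the first point about the 16MB BIOS chips concrete, here's a toy back-of-the-envelope sketch; every size in it is a made-up placeholder rather than a real AGESA or firmware figure, it just illustrates the fixed-flash-budget squeeze:

```python
# All sizes below are hypothetical placeholders, purely to illustrate the
# "fixed flash budget vs. ever-growing CPU support" squeeze described above.
FLASH_BUDGET_MB = 16.0

components_mb = {
    "UEFI core + board features (GUI, RAID, net stack)": 9.0,
    "Zen/Zen+ microcode + AGESA modules": 3.0,
    "Zen 2 microcode + AGESA modules": 2.5,
    "Zen 3 microcode + AGESA modules": 2.5,
}

total = sum(components_mb.values())
print(f"Required: {total:.1f} MB vs budget: {FLASH_BUDGET_MB:.1f} MB")
if total > FLASH_BUDGET_MB:
    print("Doesn't fit: something (older CPU support or board features) has to go.")
```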
#95
GURU7OF9
Chrispy_: There were two primary hardware constraints, both of them were created by the motherboard manufacturers and not AMD.
  1. Motherboard manufacturers saved a buck by using small 16MB BIOS chips that are too small to add new CPU support without stripping features or older CPU support. This caused massive complications for AMD when writing new standard AGESA packages.
  2. Early 300-series boards had, quite frankly, shitty VRM design and poor memory topology. They work, but they're not high-performance parts by today's standards. A combination of more experience with Ryzen and more effort (because AMD's market share increased enough to warrant the extra effort) means that budget X570 and B550 boards are often more capable than flagship 300-series boards. As for budget 300-series boards, some of those are woeful and weren't even up to the task of a 2700X, let alone any of the Zen2 or Zen3 chips.
This argument between @ratirt and @fevgatos is a valid argument and they're both taking polar-opposite stances in what is actually a very grey area. AMD made a broad-sweeping promise, and the motherboard manufacturers then made that promise hard to keep after the promise was made.

So, AMD did initially break that promise because they looked at existing 300-series boards and deemed the complications the motherboard manufacturers had created too messy to work around. They chose that initial stance and people rightly called them out for breaking their original promise. You can finger-point all you want but ultimately boards that don't have Zen3 support to this day are the result of the motherboard manufacturer not caring enough about it, or having insufficient staff dedicated to BIOS updates. AMD have now at least upheld their side of the promise (release the AGESA code that will let motherboard manufacturers create Zen3-compatible BIOSes for old boards) so now the anger should be directed at the motherboard manufacturer in question for dragging their feet and offering shitty/slow updates.
I would agree with this. In the early days of Ryzen, manufacturers were not really sure if these CPUs were going to be any good, so investment in their products paled in comparison to the Intel stuff.
As time moved on, they realised these products were actually going to be pretty good, so they started putting in a lot more effort and quality components. So the 300 series got average components and support, whereas the 400 series started to get much better quality components and support, and so on!
#96
Gamer1882
If this is legit, then explain why my 12600K (DDR4, XMP 2 enabled) with a 3070, using the same settings and resolution, beats your 12900KF by 16 FPS and falls behind the 5800X3D by just 1 FPS in Shadow of the Tomb Raider?
QuietBob: 3080Ti @ 1080p ultra
5800X3D vs. 12900K 1% lows

Assassin's Creed: Origins -3%
Borderlands 3 -1%
Control +11%
Death Stranding +9%
F1 2020 +1%
Final Fantasy XV +26%
Metro Exodus +15%
Shadow of the Tomb Raider +28%
Middle-Earth: Shadow of War +7%
Strange Brigade -1%
The Witcher 3 +1%

I'd call it a tie in five games and a win for the 5800X3D in the other six. The difference is going to be less pronounced in higher resolutions, or with a weaker GPU, but still. Based on these results alone, AMD have delivered on their promise.
According to this, even my 12600K beats the 12900KF by 16%, so somehow I call these benchmarks utter BS.
#97
Leshy
Chrispy_: There were two primary hardware constraints, both of them were created by the motherboard manufacturers and not AMD.
  1. Motherboard manufacturers saved a buck by using small 16MB BIOS chips that are too small to add new CPU support without stripping features or older CPU support. This caused massive complications for AMD when writing new standard AGESA packages.
  2. Early 300-series boards had, quite frankly, shitty VRM design and poor memory topology. They work, but they're not high-performance parts by today's standards. A combination of more experience with Ryzen and more effort (because AMD's market share increased enough to warrant the extra effort) means that budget X570 and B550 boards are often more capable than flagship 300-series boards. As for budget 300-series boards, some of those are woeful and weren't even up to the task of a 2700X, let alone any of the Zen2 or Zen3 chips.
This argument between @ratirt and @fevgatos is a valid argument and they're both taking polar-opposite stances in what is actually a very grey area. AMD made a broad-sweeping promise, and the motherboard manufacturers then made that promise hard to keep after the promise was made.

So, AMD did initially break that promise because they looked at existing 300-series boards and deemed the complications the motherboard manufacturers had created too messy to work around. They chose that initial stance and people rightly called them out for breaking their original promise. You can finger-point all you want but ultimately boards that don't have Zen3 support to this day are the result of the motherboard manufacturer not caring enough about it, or having insufficient staff dedicated to BIOS updates. AMD have now at least upheld their side of the promise (release the AGESA code that will let motherboard manufacturers create Zen3-compatible BIOSes for old boards) so now the anger should be directed at the motherboard manufacturer in question for dragging their feet and offering shitty/slow updates.
So you're telling me that this board www.asrock.com/MB/AMD/B550M-HDV/index.fr.asp is capable of running a 5950X, but this one rog.asus.com/motherboards/rog-crosshair/rog-crosshair-vi-hero-model/ isn't :D Give me a break :D It's not supported because they don't want it to be :D I don't care if it's on AMD's or ASUS's part... :D End of story.
#98
SL2
nexxusty: Yes!

This is the best comment yet I've seen on the 5800X3D.
Are we forgetting Broadwell S all of a sudden? ;)
#99
medi01
ARF: No one will do it, but as graphics card performance increases, this CPU will bottleneck the graphics card less.
Which is NOT something we saw happen in reality. Ryzen was lagging in oldies at 720p, but on par in newer games that were supposed to "bottleneck" it.

What does it take for this weirdo myth to die?
#100
THU31
medi01: Which is NOT something we saw happen in reality. Ryzen was lagging in oldies at 720p, but on par in newer games that were supposed to "bottleneck" it.

What does it take for this weirdo myth to die?
Newer games are more GPU-bound, because they usually have better graphics, which makes them run at lower framerates.

But after several years the same games will become CPU-bound, when your new graphics card is twice as powerful or faster.

There comes a time for every game where it will have pretty much the same framerate at 720p and at 4K, if we only increase GPU performance while the CPU stays the same.

Benchmarking at 720p basically shows the maximum possible framerate you can get on a specific CPU, no matter what GPU you use.

If your CPU is limiting you to 200 FPS in 720p with a 3090 Ti, it will also limit your 4090 in exactly the same way.
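A toy model of that relationship (the numbers are hypothetical, purely to illustrate the "slower limit wins" point made above):

```python
def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Whichever of the two limits is lower decides the framerate you actually see."""
    return min(cpu_limit_fps, gpu_limit_fps)

# Hypothetical CPU ceiling of 200 FPS in some game, as measured at 720p.
cpu_limit = 200.0
for gpu, gpu_limit in [("current GPU @ 4K", 90.0),
                       ("next-gen GPU @ 4K", 180.0),
                       ("the one after that @ 4K", 320.0)]:
    # Once the GPU limit passes the CPU ceiling, the framerate stops scaling.
    print(f"{gpu}: {effective_fps(cpu_limit, gpu_limit):.0f} FPS")
```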