
First Game Test With the Ryzen 7 5800X3D Appears as Promised

TheLostSwede

News Editor
XanxoGaming has now posted its first game benchmark with the Ryzen 7 5800X3D, paired with an NVIDIA GeForce RTX 3080 Ti Founders Edition. They put it up against an Intel Core i9-12900KS and a Core i9-12900K. However, as you might have deduced from the headline of this news post, they've so far only run a single game, but are promising to deliver more results shortly. That single game is Shadow of the Tomb Raider at 720p using low settings, which means this is a far cry from a real-world scenario, but it does at least give a first taste of what's to come. For whatever reason, the Core i9 systems are using an NVIDIA GeForce RTX 3090 Ti and the CPUs are paired with DDR5 memory rated at 4800 MHz CAS 40. The Ryzen 7 5800X3D has been given another pair of 8 GB modules, so it's now using dual-rank memory, but still at 3200 MHz and CAS 14.

In their test, the Core i9-12900K averages around 190 FPS, which they place as their baseline. The Core i9-12900KS manages around 200 FPS, or a bit over a five percent improvement. These benchmark numbers are provided by CapFrameX, which claims that due to the low resolution used, the GPU doesn't really matter, and although it's not an apples-to-apples comparison, it's very close. So what about the Ryzen 7 5800X3D? Well, it gets an average of 231 FPS, which is a bit odd, since the Intel CPU results are rounded and the AMD ones are not. Regardless, that's over a 20 percent increase over the Core i9-12900K and over 15 percent over the Core i9-12900KS. XanxoGaming is promising more benchmarks, which will be delivered at 1080p at Ultra settings according to the publication. In other words, this is still not what most of us have been waiting for.
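For what it's worth, here's a minimal sketch of the arithmetic behind those percentage figures, using only the average FPS numbers quoted above (the labels and rounding in the output are mine):

```python
# Percentage uplift between the reported average FPS figures,
# treating the Core i9-12900K as the baseline, as XanxoGaming does.
results = {
    "Core i9-12900K": 190.0,   # baseline
    "Core i9-12900KS": 200.0,
    "Ryzen 7 5800X3D": 231.0,  # the only unrounded figure in the source
}

baseline = results["Core i9-12900K"]
for cpu, fps in results.items():
    uplift = (fps / baseline - 1) * 100
    print(f"{cpu}: {fps:.0f} FPS ({uplift:+.1f}% vs 12900K)")

# The 5800X3D against the 12900KS specifically:
print(f"5800X3D vs 12900KS: {(231 / 200 - 1) * 100:+.1f}%")
```

That works out to roughly +5.3% for the 12900KS, +21.6% for the 5800X3D over the 12900K, and +15.5% over the 12900KS, consistent with the "over 20 percent" and "over 15 percent" figures above.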



View at TechPowerUp Main Site | Source
 
I think certain games might get a boost with the new Ryzen and others might decrease in performance due to lower clocks, but I guess we have to wait for some reviews. The RAM configs are a bit off here and could have been more closely matched between the Intel and AMD CPUs.
 
Good to see it wasn't just hype. What kind of difference should we expect from RAM in this benchmark? I mean the AMD vs. Intel difference.
 
I like a comparison made using different hardware. Very informative.

Not.
 
Looking forward to just dropping this into my 4-5 year old system and getting gaming performance like a boss for cheaper than a whole new system (Intel), talk about awesome!
 
Looking forward to just dropping this into my 4-5 year old system and getting gaming performance like a boss for cheaper than a whole new system (Intel), talk about awesome!
That is actually a good point. I just need to see if there is any benefit for me coming from my current 5800X. I play at 4K mostly. I have doubts there will be any benefit in my case, but time will tell.
 
In the meantime, here on TechPowerUp, with an RTX 3080 non-Ti,
(screenshot: TechPowerUp benchmark results)

I don't know how the guy managed to achieve such FPS.

I guess the main reason for AMD to deliver these new CPUs this late is to sell them to the guys who have Ryzen 1000 and 2000 platforms, like me.
 
Yeah I think that this is fake too.
 
I'll wait until someone who actually knows what they're doing gets their hands on it.
 
Won't be surprised if it's faster, AMD created it on purpose for this.
But I don't think it would make any difference for gaming at reasonable resolutions such as 1440p or 4K; at Full HD, most people won't be using a 3090 or a 6900 XT, hence they are GPU limited anyway.
 
In the meantime, here on TechPowerUp, with an RTX 3080 non-Ti,
(screenshot: TechPowerUp benchmark results)
I don't know how the guy managed to achieve such FPS.

I guess the main reason for AMD to deliver these new CPUs this late is to sell them to the guys who have Ryzen 1000 and 2000 platforms, like me.
Different test scene. Not hard to achieve very different results.
 
Will definitely wait for full tests before drawing any conclusions. Lots of conflicting info about this one.

It's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs is only for a niche market anyway, people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitive, any CPU, even a 7-year-old 4790K OC, can do the job.
 
Different test scene. Not hard to achieve very different results.
Yep, but when someone wants to show something, they need to do it the right way, not with a random scene from nowhere.
So this screenshot gives us no information :)
 
If you play at 4K, non-competitive, any CPU, even a 7-year-old 4790K OC, can do the job.
Yes, but you may notice some pretty low 1% low figures even at 4K in some games with the 4790K.
 
Why wouldn't you use the in-game benchmark, which provides easily comparable results?
 
It's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs is only for a niche market anyway, people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitive, any CPU, even a 7-year-old 4790K OC, can do the job.
And we haven't yet exactly determined what those loads are...
 
Shadow of the Tomb Raider at 720p
-Why are you running games at 720p in 2022???
-Because at higher resolutions there is barely any difference between CPUs!
-Oh... OK then...
 
Why wouldn't you use the in-game benchmark, which provides easily comparable results?
You can compare things that are comparable:
Different CPU, same memory type/speed, same GPU

Here you have:
Different CPU, different memory type/speed, different GPU
 
You can compare things that are comparable:
Different CPU, same memory type/speed, same GPU

Here you have:
Different CPU, different memory type/speed, different GPU
I somehow missed that. Makes the entire "test" rather questionable, to put it mildly.
 
Yep, but when someone wants to show something, they need to do it the right way, not with a random scene from nowhere.
So this screenshot gives us no information :)
It's not random, they're using the CapFrameX software and whatever scene that uses, hence the comparison to hardware they don't have on hand.

I don't think it's a good test of this new CPU by any means, but you really can't compare it with tests done by TPU.
 
It's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs is only for a niche market anyway, people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitive, any CPU, even a 7-year-old 4790K OC, can do the job.
Until you play a properly CPU-limited game like Stellaris, CK3, Cities: Skylines, Civ 6, Old World, Factorio, etc.

Then you can be getting 2,000 FPS at 4K Ultra, but slow turn times or slow tick rates make them a pain to play on the largest maps, which totally changes the gameplay experience.
 