
RTX 4090 & 53 Games: Core i9-13900K vs Ryzen 7 5800X3D

Joined
Aug 25, 2021
Messages
955 (1.04/day)
The 7600X beats the X3D and Raptor Lake according to this video, in highs, in lows, everything, as long as you pair it with DDR5-6000 CAS 30... how come I didn't realize this when I bought my 13600K on launch day... I'm so fucking confused right now. Ugh, my head hurts.
If you game only and do nothing more, the 7600X is marginally better. Nothing to worry too much about. You will, however, have no upgrade path from the 13600K.
You made no mistake with this choice. The 13600K is a much better all-rounder, beating the 7600X by a lot in multi-core workloads while providing great performance in games.
True that.
Here you can read a review whose gaming test methodology is based on finding the most CPU-heavy scenarios (not built-in benchmarks, which so many reviewers rely on) to show the differences between CPUs, and as you will see, the 13600K performs solidly better than the 7600X in games.
Doubtful. HUB's recent tests give the edge to the 7600X in gaming.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,506 (4.53/day)
Location
Kepler-186f
If you game only and do nothing more, the 7600X is marginally better. Nothing to worry too much about. You will, however, have no upgrade path from the 13600K.

I don't want to upgrade after this. The industry is going crazy these days, this is it for me for several years. lol
 
Joined
Jan 17, 2018
Messages
359 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
In that review, the 13900K performs 13% better than the 5800X3D (100/88.6). The 13900K review is 15 days old; nothing changed.
I'd just like to know how we went from 13% with an RTX 3080 to 6.2% with an RTX 4090 in a completely CPU-bound scenario.
What do you mean nothing changed?

1. Different games were benchmarked. Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.
2. The entire GPU & architecture changed. This can change the performance even in similar games.
3. A different Nvidia driver was used. See #2.
 
Last edited:
Joined
Oct 12, 2022
Messages
15 (0.03/day)
Doubtful. HUB's recent tests give the edge to the 7600X in gaming.
This is a very reliable source in my country with a long history. HU doesn't even say where in the games they test, and for some titles they run built-in benchmarks. This is why their results are flattened and the differences between processors are small.
 

nomadka670

New Member
Joined
Nov 5, 2022
Messages
8 (0.02/day)

The 7700X with DDR5-6000 CL30 memory beats the 13700K, same with a 4090.

In games like CS:GO and Valorant, Intel doesn't stand a chance.

So when you link a Cyberpunk result where the 13600K beats all the Ryzens, it's a little misleading.

It's more about which games you play; pick the best processor for those.
 
Last edited:
Joined
Oct 26, 2022
Messages
57 (0.12/day)
Well, I kind of regret getting Raptor Lake now after watching this video... looks like the 7600X beats it easily across the board as long as you slot in some DDR5-6000 CAS 30 RAM. I think even the lows are better by 20% with the 7600X over the 5800X3D and 13600K... turns out the 7600X was the sleeper winner all along; you just have to use really high-end RAM.

Fuck. I don't know what to do now. I should have never betrayed my love for AMD after all... :cry:

The 7600X beats the X3D and Raptor Lake according to this video, in highs, in lows, everything, as long as you pair it with DDR5-6000 CAS 30... how come I didn't realize this when I bought my 13600K on launch day... I'm so fucking confused right now. Ugh, my head hurts.

I wonder: would a 7950X paired with very low latency CL30 DDR5-6000 beat a 13900K with the best DDR5-7800 as well?
 

ausmisc

New Member
Joined
Oct 19, 2022
Messages
1 (0.00/day)
Hardware Unboxed just seems to have higher Zen 4 results compared to the majority of reviews, even when comparing the same games at the same resolutions. Maybe that's in-game benchmark vs. in-game play, I'm not sure. The general launch-review consensus is that a 13600K with DDR5 competes with the 7700X.

 

nomadka670

New Member
Joined
Nov 5, 2022
Messages
8 (0.02/day)
Hardware Unboxed just seems to have higher Zen 4 results compared to the majority of reviews, even when comparing the same games at the same resolutions. Maybe that's in-game benchmark vs. in-game play, I'm not sure. The general launch-review consensus is that a 13600K with DDR5 competes with the 7700X.

That's why I'm asking for a 7700X vs. 13700K benchmark with the right RAM and a 4090.

Let's see the truth.
 
Joined
Oct 26, 2022
Messages
57 (0.12/day)
Intel and motherboard manufacturers have been pushing this whole DDR5 8000-9000+ bus-speed race, i.e. the higher, the better.

I want to know if low-latency CL30 DDR5-6000 really makes up for DDR5-9000+ speeds in real-world gaming.
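As a side note, the "low latency vs. raw speed" question can at least be put in numbers: first-word latency is the CAS latency in cycles divided by the memory clock (half the MT/s transfer rate). A quick sketch; the DDR5-9000 CL48 timing here is an assumed example for comparison, not a kit anyone in this thread tested:

```python
# First-word latency in nanoseconds: CAS latency cycles divided by the
# memory clock. DDR transfers twice per clock, so the clock in MHz is
# half the MT/s rating.
def first_word_latency_ns(mt_per_s: int, cas_latency: int) -> float:
    clock_mhz = mt_per_s / 2
    return cas_latency / clock_mhz * 1000  # cycles / (cycles per us) -> ns

print(first_word_latency_ns(6000, 30))  # DDR5-6000 CL30 -> 10.0 ns
print(first_word_latency_ns(9000, 48))  # assumed DDR5-9000 CL48 -> ~10.7 ns
```

So a loose DDR5-9000 kit can actually have *worse* absolute latency than DDR5-6000 CL30, even while offering much more bandwidth.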
 
Last edited:
Joined
Sep 19, 2015
Messages
21 (0.01/day)
Thank you for finally having all games use their proper APIs in these charts.
 
Joined
Feb 14, 2020
Messages
116 (0.08/day)
What do you mean nothing changed?

1. Different games were benchmarked. Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.
2. The entire GPU & architecture changed. This can change the performance even in similar games.
3. A different Nvidia driver was used. See #2.
Nothing changed about the platform. As I've been saying since yesterday, they went from an RTX 3080 to an RTX 4090, which is a more powerful GPU. That's what changed, and that's what should have caused a bigger gap.
Again, some of the additional 43 games show an even bigger gap (I listed 5 of them), even if the initial test suite favored the 13900K.

The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful GPU on the market. I don't know which result is incorrect, but they can't coexist.
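For what it's worth, the percentages being argued about here fall straight out of the normalized scores (fastest chip = 100%). A quick Python sketch of the arithmetic, using the 100/88.6 figure quoted above:

```python
# Gap between two CPUs expressed as "faster chip is X% ahead",
# from TPU-style normalized relative-performance scores.
def gap_percent(score_fast: float, score_slow: float) -> float:
    return (score_fast / score_slow - 1) * 100

print(round(gap_percent(100, 88.6), 1))  # -> 12.9, i.e. the ~13% quoted
```

The debate is then why the same pair of CPUs produces a ~12.9% gap in one test suite and a 6.2% gap in another, which the replies above attribute to the game selection, the GPU swap, and the driver.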
 
Joined
Aug 25, 2021
Messages
955 (1.04/day)
This is a very reliable source in my country with a long history. HU doesn't even say where in the games they test, and for some titles they run built-in benchmarks. This is why their results are flattened and the differences between processors are small.
It's more about which games you play; pick the best processor for those.
Hardware Unboxed just seems to have higher Zen 4 results compared to the majority of reviews.
At the end of the day, folks, Raptor Lake CPUs have a slight edge in gaming with current imperfect measurements (different RAM and all...), but this is nothing revolutionary to lose your head over or argue about endlessly. The difference between the top three CPUs is 1.5-9%, which is mostly negligible and depends on RAM and the selection of games.

The average difference seen below is not necessarily relevant for each individual user, depending on their system and games. Plus, AMD is now working with Microsoft and game developers to address reports that the 7900X and 7950X are a bit slower when both chiplets are used. Expect some improvements in the gaming performance of those top CPUs in the months ahead.
[Attached chart: Performance — Intel ADL/RPL vs. Zen 4 / Zen 3 3D]
 
Joined
Jan 17, 2018
Messages
359 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
Nothing changed about the platform. As I've been saying since yesterday, they went from an RTX 3080 to an RTX 4090, which is a more powerful GPU. That's what changed, and that's what should have caused a bigger gap.
Again, some of the additional 43 games show an even bigger gap (I listed 5 of them), even if the initial test suite favored the 13900K.

The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful GPU on the market. I don't know which result is incorrect, but they can't coexist.
But something did change about the platform. I listed both: 3080 to 4090, and different drivers. This can change previously tested game results, as the new card may introduce different limitations that the old one did not. Raw power isn't everything when you're limited by the CPU. Architecture and driver overhead/optimization can play large roles once you start hitting CPU limitations in a game.

The 5800X3D is also an anomaly among CPUs. It's not simply a brute-force GHz-and-IPC CPU; it relies on its cache to do most of the heavy lifting. Many games still favor brute force, and those show up when the 13900K is clearly ahead in the 1080p graph.
 
Joined
Jun 13, 2019
Messages
466 (0.27/day)
System Name Fractal
Processor Intel Core i5 13600K
Motherboard Asus ProArt Z790 Creator WiFi
Cooling Arctic Cooling Liquid Freezer II 360
Memory 16GBx2 G.SKILL Ripjaws S5 DDR5 6000 CL30-40-40-96 (F5-6000J3040F16GX2-RS5K)
Video Card(s) PNY RTX A2000 6GB
Storage SK Hynix Platinum P41 2TB
Display(s) LG 34GK950F-B (34"/IPS/1440p/21:9/144Hz/FreeSync)
Case Fractal Design R6 Gunmetal Blackout w/ USB-C
Audio Device(s) Steelseries Arctis 7 Wireless/Klipsch Pro-Media 2.1BT
Power Supply Seasonic Prime 850w 80+ Titanium
Mouse Logitech G700S
Keyboard Corsair K68
Software Windows 11 Pro
In that review, the 13900K performs 13% better than the 5800X3D (100/88.6). The 13900K review is 15 days old; nothing changed.
I'd just like to know how we went from 13% with an RTX 3080 to 6.2% with an RTX 4090 in a completely CPU-bound scenario.
It has nothing to do with the CPU. It's the 4090. It's a trend I noticed in all the reviews I've seen.
 
Joined
Oct 26, 2022
Messages
57 (0.12/day)
Nothing changed about the platform. As I've been saying since yesterday, they went from an RTX 3080 to an RTX 4090, which is a more powerful GPU. That's what changed, and that's what should have caused a bigger gap.
Again, some of the additional 43 games show an even bigger gap (I listed 5 of them), even if the initial test suite favored the 13900K.

The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful GPU on the market. I don't know which result is incorrect, but they can't coexist.
NVIDIA Game Ready driver issues with the hardware scheduler, and driver overhead, perhaps?
 
Joined
Jun 6, 2022
Messages
591 (0.93/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
What do you mean nothing changed?

1. Different games were benchmarked. Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.
2. The entire GPU & architecture changed. This can change the performance even in similar games.
3. A different Nvidia driver was used. See #2.
And Windows 10, not 11. Since Alder Lake, it's been understood that Windows 11 is a must.
The 13900K review used Windows 11; now Windows 10 is used, and I don't understand why.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
26,892 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Thank you for finally having all games use their proper APIs in these charts.
I thought of you before publishing the article and made sure Days Gone is DX11 ;)

For the review 13900K used Windows 11, now it used 10 and I don't understand why.
Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll.
 
Joined
Aug 22, 2019
Messages
160 (0.10/day)
System Name Mjolnir
Processor 9900KS bare die
Motherboard Gigabyte Z390 Aorus Master
Cooling Optimus Signature V2, Heatkiller IV 2080Ti block, dual 280mm radiators, dual 10W DDC pumps
Memory 32GB G.Skill 3200C14
Video Card(s) Nvidia 2080Ti Founders Edition
Storage 3x M.2 NVMe
Display(s) Alienware 34" Ultrawide 120Hz 3440x1440
Case Fractal R6
Audio Device(s) Outlaw RR2150 stereo receiver driving DIY kits, Sennheiser HD6XX headphones
Power Supply Seasonic Prime 1000W
It's such a great time right now, so many good options for hardware that you can't go wrong. All depends on preferences, needs and your pocket book.

Although from my perspective, if one is still gaming at 1080p, they're probably on a lower-end GPU. I wonder how scaling would work with something like a 3060-class GPU instead of the 4090 at those resolutions. Things run so fast with an xx80-class GPU or better that 1080p runs are basically for ranking, not practicality. And as we know, at 4K or close to it, everything is GPU-bound, so you can get a lot of runway out of a CPU a couple of generations old.
 
Joined
Jun 6, 2022
Messages
591 (0.93/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll.
The migration from W10 to W11 is free.
The recommendation for Alder Lake and Raptor Lake owners is to migrate to W11, and I guess they did. A fair comparison would be a review on W11, because it doesn't disadvantage either platform, whereas W10 can create problems for the Intel platform because it cannot efficiently manage the P and E cores.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,506 (4.53/day)
Location
Kepler-186f
Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll.

Do a poll! :toast:

I'm curious what people want.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
26,892 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The migration from W10 to W11 is free.
The recommendation for Alder Lake and Raptor Lake owners is to migrate to W11, and I guess they did.
A ton of people have not used the upgrade offer, or adoption rates wouldn't be as low as they are

Do a poll! :toast:

I'm curious what people want.
 
Joined
Jun 6, 2022
Messages
591 (0.93/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
A ton of people have not used the upgrade offer, or adoption rates wouldn't be as low as they are
In a review, it is extremely important not to disadvantage any platform. W11 does not disadvantage the 5800X3D, but the tests are inconclusive for the 13900K if W10 is used, because W10 cannot effectively divide loads between the P and E cores. There is no P/E awareness in W10, and it is very likely that some "P" loads get assigned to the E cores. Honestly, it's like testing modern processors with Cinebench R15 or older.
P.S. I'm using W10 on my i5-12500 system because that processor doesn't have E cores. If it did, I'd migrate to W11.
 
Joined
Nov 12, 2020
Messages
136 (0.11/day)
Processor I7-6900K @ 4.2ghz 8c/4.3ghz 2c
Motherboard GA-X99-Ultra Gaming
Cooling Liquid Freezer II 240
Memory 32GB G.Skill @ 3200 15-17-17-34
Video Card(s) RTX 3080 12GB FTW3 Ultra Hybrid
Storage 1TB P31 NVMe and 2x 1TB S31 SATA
Display(s) CU34G2X and Ea244wmi
Case 5000D Airflow
Audio Device(s) Sound Blaster X4
Power Supply SuperNOVA 850 G+
Mouse G502 HERO/G700s
Keyboard Ducky Shine
I've been greatly appreciating these comparison pieces, as they help show where things stand. I know if I had an AM4 setup here, I would buy a 5800X3D when on sale just because of how much of a swing it gives in gaming. I'm hoping we see a Zen 4 X3D next year and that Intel's MTL competes, because this has been a great time for customer choice in CPUs.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.24/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Jumping straight into the action, and starting with the highest 4K Ultra HD resolution, which is the native stomping ground for the GeForce RTX 4090, we're seeing that the Ryzen 7 5800X3D is matching the Core i9-13900K "Raptor Lake" very well. Averaged across all 53 games, the i9-13900K is a negligible 1.3% faster than the 5800X3D.

This is what I was expecting to see in the previous articles (although they didn't focus on the same things).
With the usual CPU-review performance-per-watt and power-efficiency graphs thrown in, it does make the new Intels seem like a "why the f*ck would anyone want this for gaming?"


The 0.1% lows come out ahead according to reviews that cover them, which isn't a surprise, since the Intel systems have time-based and thermal throttles to worry about: if they hit the PL1 time limit and drop to 125 W, they may start to show small FPS drops and microstutter that the X3D doesn't at its lower wattages.
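A sketch of how a single "X% faster averaged across N games" figure (like the 1.3% quoted above) is typically produced from per-game FPS: a geometric mean of per-game ratios, which keeps one outlier title from dominating the average. The FPS numbers below are made up for illustration, and I'm assuming this is roughly how TPU aggregates:

```python
import math

# Aggregate per-game FPS into one relative index via the geometric
# mean of per-game ratios. All FPS values here are hypothetical.
fps_a = [144, 210, 98, 305]   # hypothetical CPU A results
fps_b = [140, 205, 101, 290]  # hypothetical CPU B results

ratios = [a / b for a, b in zip(fps_a, fps_b)]
geo_mean = math.prod(ratios) ** (1 / len(ratios))
print(f"CPU A is {(geo_mean - 1) * 100:.1f}% faster on average")
# prints "CPU A is 1.8% faster on average"
```

Note that even a CPU that loses a game or two (the 98 vs. 101 entry) can still come out a couple of percent ahead on the aggregate, which is exactly the kind of small average gap being debated in this thread.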
 

rom64k

New Member
Joined
Oct 31, 2022
Messages
6 (0.01/day)
Thanks for the amazing review.

About the bottleneck between the 5800X3D and the RTX 4090, here's my experience.

I'll leave you two captures from the Modern Warfare II performance test at 4K.

This one was taken with my new 5800X3D and RTX 4090:

https://bit.ly/3NCqBkd

This other one was taken before changing the CPU, when I had a 5600X paired with the same RTX 4090:

https://bit.ly/3T4wbNm

Both were taken playing at 4K. You can draw your own conclusions.

I have a 5600X at 4K 120 Hz; I must not buy a 5800X3D... so tempting at $329.
Please read my post #150.

Best regards
 