
Intel Core i9-13900K

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.21/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVMe + 1TB Samsung 970 Pro NVMe + 1TB Intel 6000P NVMe USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Actually, they aren't. The 13600K is more than 50% faster in MT workloads and more than 20% faster in ST workloads. The 3D is more comparable to a 12600KF, which alongside a brand-new B660 motherboard would cost you around the same as a 5800X3D costs on its own. Which has been my point all along: you don't really gain anything from mobo upgradability.
workloads are not games

Gaming CPU? x3D


Workload CPU? Anything with more cores.

And at 480p I bet it's so fast it would grow wings and take off so no one would ever catch it. I have a 480p monitor somewhere if you'd like it!
low-res testing is how you know how the CPU will perform with future, more powerful GPUs
 
Joined
Nov 26, 2021
Messages
1,334 (1.56/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Your strategy is very much valid, but not special in any way to AMD only.
You can also sell any Intel mobo and CPU; nothing new here. Again, if you are on a budget, almost any change will be at a loss financially. The Intel second-hand market is booming just as much as AMD's, I guess.
Only for AMD can you have such a large jump within the same socket; Intel usually requires a socket change for appreciable CPU upgrades. To each their own, but for me, Intel's strategy of limited compatibility is repellent.
 
Last edited:
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
workloads are not games

Gaming CPU? x3D


Workload CPU? Anything with more cores.


low-res testing is how you know how the CPU will perform with future, more powerful GPUs
I know, but dead horse is dead. But maybe they need more horse meat lasagna wherever watts are free?
 
Joined
Jul 15, 2020
Messages
976 (0.72/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.025mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F26 with "Instant 6 GHz" on
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 85%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Only for AMD can you have such a large jump within the same socket; Intel usually requires a socket change for appreciable CPU upgrades. To each their own, but for me, Intel's strategy of limited compatibility is repellent.
You are right, but I don't find it reason enough not to choose them if the product is right for me (faster and cheaper than the competition).
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I know, but dead horse is dead. But maybe they need more horse meat lasagna wherever watts are free?
It's not; you just don't understand the point. Going by 4K results, I should buy a 3600X. It performs almost identically to a 13900K and it costs like 1/4 of the price. What happens next year when I replace my 3080 with a 5080, though? Exactly. That's why I'm looking at 720p results: to know what's going to happen next year and which CPU will last me.

workloads are not games

Gaming CPU? x3D


Workload CPU? Anything with more cores.
And I'm saying they are not comparable. It's like comparing the 3D to a 7950X. The 3D is closer to the 12600KF, both in games and in other workloads; at least that's what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600KF with a brand-new mobo costs as much as the 3D on its own.
 
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
Even if your CPU lasts at 720p, the display won't with a 5080. If you're that concerned about which CPU will last you, maybe you should show similar concern for how long the socket will last you as well.
 
Joined
Jan 18, 2021
Messages
74 (0.06/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 32GB (2x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Membrane, baby
Software Win10, Mint, Fedora
Even if your CPU lasts at 720p, the display won't with a 5080. If you're that concerned about which CPU will last you, maybe you should show similar concern for how long the socket will last you as well.

Forget the resolution. The point isn't that anyone will actually use a 720p display; the point is that low-resolution testing removes GPU bottlenecks. This is useful both for determining a CPU's useful lifespan, and as a proxy for heavily CPU-limited games that might be impractical to benchmark (e.g. certain online games). I'm usually among the first to say that CPU reviews can mislead gamers; usually GPU bottlenecks are a bigger concern in practice, but there is a purpose in low-resolution testing. This discussion is a perfect example, because it's all about CPU/socket longevity.

Good reviewers will include various resolutions, offering a composite of the CPU's performance profile and allowing the audience to make its own judgment as to how relevant different aspects of that composite are for their particular use case. "Hahaha, 720p, are you kidding?" really isn't the rhetorical kill stroke that some people seem to think it is. Why would W1zzard include 720p numbers at all, if they're so ridiculous?
 
Last edited:
Joined
Jun 20, 2022
Messages
28 (0.04/day)
Location
ACCESS DENIED
System Name Who tf is playing megalovania over the mic?
Processor Ryzen 7 5700x
Motherboard ASUS ROG STRIX X570-E GAMING
Cooling Noctua NH-U12S REDUX
Memory 32Gb Corsair vengeance LPX 3200MHz CL16
Video Card(s) MSI RTX 2070S GAMING X
Storage Samsung 980 PRO 2TB
Display(s) ASUS TUF Gaming VG27AQ
Case NZXT H710
Audio Device(s) Logitech G PRO
Power Supply Seasonic PRIME Ultra 650 platinum
Mouse Logitech G604 / G pro wireless (modded)
Keyboard Corsair K70 RGB MK.2 (cherry MX silent) (tape/foam mod)
Benchmark Scores The hell is a benchmark?
Why do you think W1zzard includes 720p numbers at all, if they're so ridiculous?
Because people wouldn't stop asking for them?
Could be one of the reasons... without being the main one...

----------------------------------------------------------------

At this point, with this much fear about your performance, I would just buy a top-tier gaming-god PC, sell it 11 months down the line, and buy the next big thing...
No need for a crystal ball and trying to play guessing games with 720p and the gaming industry; you just get guaranteed top performance...
 
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
To be objective and fair to different scenarios, primarily. No one is saying 720p is bad for testing purposes to gauge limitations. Pretty much everyone agrees, however, that 720p doesn't represent desktop gaming usage on a discrete GPU, hasn't for a very long time, and completely misrepresents it on an RTX 4090, or even on a mid-range GPU today. Therein lies the problem with CPU bottleneck arguments that don't apply, and won't apply, to the end user and their individual expectations.

The issue is when people make assertions as if they always apply to everyone and are fair, accurate, or representative. If they want to point something out, fine, but when their intention is very clearly just to diminish the appeal of a competing brand, it's a classic case of defamation of character, so to speak, but applied to hardware.

When, no matter what the argument, they spin it into something far-fetched and outlandish that tries to steer you toward agreement and toward writing off a competing product, that's a bit of a problem. If it's always "brand A is better than brand B because..." and never mentions any of the areas where brand B competes and is better than brand A, that's clearly bias. There's no question that tech brands don't compete perfectly against one another in every area at all times, yet some people try to construe it that way by painting one brand in a good light while simply disparaging the other, ignoring valid points, or spinning to another counterpoint just to repeat the same thing once again.

Worse still is when questionable data is brought into the picture that can't be verified, disputed, or compared fairly because, guess what, a key detail about the test setup isn't listed, like the actual memory involved rather than just its MT/s speed, which alone doesn't determine memory performance. No one wants to argue in circles with people like that either, only to arrive back at "yes, but at 720p, with a GPU neither you nor I would use, it's more of a bottleneck, so you shouldn't buy what you see as better value, you should get what I've been shilling instead."

To the dismay of some people, I don't care about 720p results and never will, as far as my own gaming is concerned, because I don't game on integrated graphics. If it were an APU, sure, you'd have a valid point, and I might say that's fair: at 720p that's a better APU than this iGPU, if I wanted to run crappier graphics than better-quality laptops manage these days.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
720p benches have nothing to do with gaming at 720p.

It's a CPU test.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Even if your CPU lasts at 720p, the display won't with a 5080. If you're that concerned about which CPU will last you, maybe you should show similar concern for how long the socket will last you as well.
Socket longevity is irrelevant; if a CPU lasts for 5 or 6 years, I won't need to upgrade it on the same socket.

To be objective and fair to different scenarios, primarily. No one is saying 720p is bad for testing purposes to gauge limitations. Pretty much everyone agrees, however, that 720p doesn't represent desktop gaming usage on a discrete GPU, hasn't for a very long time, and completely misrepresents it on an RTX 4090, or even on a mid-range GPU today. Therein lies the problem with CPU bottleneck arguments that don't apply, and won't apply, to the end user and their individual expectations.

The issue is when people make assertions as if they always apply to everyone and are fair, accurate, or representative. If they want to point something out, fine, but when their intention is very clearly just to diminish the appeal of a competing brand, it's a classic case of defamation of character, so to speak, but applied to hardware.

When, no matter what the argument, they spin it into something far-fetched and outlandish that tries to steer you toward agreement and toward writing off a competing product, that's a bit of a problem. If it's always "brand A is better than brand B because..." and never mentions any of the areas where brand B competes and is better than brand A, that's clearly bias. There's no question that tech brands don't compete perfectly against one another in every area at all times, yet some people try to construe it that way by painting one brand in a good light while simply disparaging the other, ignoring valid points, or spinning to another counterpoint just to repeat the same thing once again.

Worse still is when questionable data is brought into the picture that can't be verified, disputed, or compared fairly because, guess what, a key detail about the test setup isn't listed, like the actual memory involved rather than just its MT/s speed, which alone doesn't determine memory performance. No one wants to argue in circles with people like that either, only to arrive back at "yes, but at 720p, with a GPU neither you nor I would use, it's more of a bottleneck, so you shouldn't buy what you see as better value, you should get what I've been shilling instead."

To the dismay of some people, I don't care about 720p results and never will, as far as my own gaming is concerned, because I don't game on integrated graphics. If it were an APU, sure, you'd have a valid point, and I might say that's fair: at 720p that's a better APU than this iGPU, if I wanted to run crappier graphics than better-quality laptops manage these days.
And you are completely missing the point. Again. I don't see how this is even debatable. A CPU that is faster at 720p will last you longer; that's just self-evident.
 
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
So now it's irrelevant for a CPU, but with memory it's not? Is that how it works now? Apparently a socket isn't a socket. I think you're missing the point: people want something that fits their needs and lasts within their expectations without spending more than they need to. A CPU that is faster at 720p won't necessarily last longer these days purely for gaming; you can get around that with upscaling. If only it made any sense to do so, much like spending significantly more to get little relevant upside for what the individual might need and want.

Everyone's use case differs; that's just self-evident. Someone with a weaker CPU is probably not pairing it with a new top-end GPU unless they already have plans to upgrade the CPU soon after. Likewise, in the same scenario, they might settle on an upper-mid-range or high-end GPU, be perfectly fine with it, save good money in doing so, and upgrade the CPU later when they feel ready. CPU and GPU don't need to be perfectly matched in performance capabilities; sure, it's nicer when they're closer to parity, just as with memory ratios, but it isn't mandatory.

I don't think I'm missing the point at all. I'm not going to be using 720p now or in the future with my CPU, and a better CPU won't change that. I don't even have a GPU with a quarter of the performance of an RTX 4090, and if I did have one, the odds that I'd game at 720p would only get worse. I represent about the worst-case scenario and yet I still wouldn't use 720p; I'd upgrade the CPU long before that was ever the case, especially paired with an RTX 4090. It's a nonsensical argument if we're merely talking about games and expectations, which was the case in the argument at the time between you and another individual.

I'm... yeah, I'm missing the point, on a weaker overall setup than theirs, and wouldn't consider that myself, if you say so. I could probably postpone a CPU upgrade on what I have, paired with an RTX 4090, for 5 or 6 years strictly for games, short of developers demanding more than 4 threads to run, and be entirely fine with it. Outside of gaming that wouldn't be the case, but that isn't the argument, and if I really needed and demanded that much MT I might not be considering a consumer-level CPU in the first place, if I could afford and justify better.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
So now it's irrelevant for a CPU, but with memory it's not? Is that how it works now? Apparently a socket isn't a socket. I think you're missing the point: people want something that fits their needs and lasts within their expectations without spending more than they need to. A CPU that is faster at 720p won't necessarily last longer these days purely for gaming; you can get around that with upscaling. If only it made any sense to do so, much like spending significantly more to get little relevant upside for what the individual might need and want.
No, you cannot get around the simple fact that a faster CPU is going to last you longer.
Someone with a weaker CPU is probably not pairing it with a new top-end GPU unless they already have plans to upgrade the CPU soon after.
And that still doesn't change the fact that a faster CPU will last you longer. Eventually even a mid-range GPU will be held back by your CPU, even at 4K. That point arrives sooner for a CPU that is slower at 720p than for one that is faster at 720p. That's just... common sense, honestly. I don't see how that's even remotely contestable.
I don't think I'm missing the point at all I'm not going to be using 720p now or in the future with my CPU and a better CPU won't change that favorably. I don't even have a GPU like 1/4 the performance RTX 4090 at the same time.
And then you would end up buying the wrong product. Case in point: CPU A costs 300€, gets 100 fps at 4K and 150 fps at 720p. CPU B also costs 300€, also gets 100 fps at 4K, but gets 200 fps at 720p. If you buy A based on the 4K results, you made a mistake. Simple as that.
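Here's the same idea as a quick toy script. The 100/150/200 fps figures are the ones from my example above; the observed_fps helper and the 180 fps "future GPU" ceiling are just my own illustration, not measured data:

Code:
# Toy sketch: delivered frame rate is capped by the slower of CPU and GPU.
def observed_fps(cpu_cap, gpu_cap):
    # The frame rate you actually see is limited by whichever component is slower.
    return min(cpu_cap, gpu_cap)

cpus = {"CPU A": 150, "CPU B": 200}  # each CPU's own ceiling, as revealed by 720p testing

todays_gpu_4k = 100   # today's GPU can only render 100 fps at 4K
future_gpu_4k = 180   # hypothetical faster GPU a generation later

for name, cpu_cap in cpus.items():
    print(name,
          "today at 4K:", observed_fps(cpu_cap, todays_gpu_4k),
          "| future GPU at 4K:", observed_fps(cpu_cap, future_gpu_4k))
# Both CPUs show 100 fps today, but with the faster GPU
# CPU A tops out at 150 fps while CPU B reaches 180 fps.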
 
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
You don't get to decide what the right product for me is, nor how long it lasts or how I intend to use it. Which is technically faster at 720p is irrelevant to me, and no, you won't argue me out of that. So, just to confirm, DDR4 is irrelevant, yes? Or are you going to be a hypocrite and argue otherwise selectively?
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You don't get to decide what the right product for me is, nor how long it lasts or how I intend to use it. Which is technically faster at 720p is irrelevant to me, and no, you won't argue me out of that.
I don't care about the right product for you; nobody is talking about you. You can buy whatever you want. I'm saying that if someone is interested in gaming, 720p results are very relevant because they show longevity. The faster 720p CPU IS going to last longer; that's just a fact, and no matter how hard you try, you can't change facts.
So, just to confirm, DDR4 is irrelevant, yes? Or are you going to be a hypocrite and argue otherwise selectively?
What does DDR4 have to do with anything? I don't understand what you are asking.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
I'm done with this circular BS; I'll use the ignore button. He knows damn well what he said above.

Socket longevity is irrelevant; if a CPU lasts for 5 or 6 years, I won't need to upgrade it on the same socket.

I guess sockets are magical, and DDR4 sockets are much different and more special than CPU sockets: longevity is irrelevant for the latter, but not for the former. Intel still supports DDR4, so apparently that's relevant (shill stance for Intel), while the CPU socket isn't (shill stance against the competition); let's name-call here, I think you know who I put on ignore. Feel free to return the thread to the 13900K before it got derailed somewhere down the line into "don't upgrade your AM4 CPU, because 720p shows it's a quickie that won't last as long as Intel, because moar cores now matter."
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I'm done with this circular BS; I'll use the ignore button. He knows damn well what he said above.



I guess sockets are magical, and DDR4 sockets are much different and more special than CPU sockets: longevity is irrelevant for the latter, but not for the former. Intel still supports DDR4, so apparently that's relevant (shill stance for Intel), while the CPU socket isn't (shill stance against the competition); let's name-call here, I think you know who I put on ignore. Feel free to return the thread to the 13900K before it got derailed somewhere down the line into "don't upgrade your AM4 CPU, because 720p shows it's a quickie that won't last as long as Intel, because moar cores now matter."
I don't think I ever mentioned DDR4, so I'm not sure what the heck you are talking about.

I never said "don't upgrade AM4 because 720p is quicker" either; I really have no clue what the heck you are talking about.

Actually, I never brought up AMD or Intel. I'm just saying that your "720p is useless" stance is wrong. That's all. Apparently you want to turn it into an AMD vs Intel thing for fanboyism reasons; don't let me stop you, just don't involve me in it. I only care about performance and price, I'm brand agnostic. Apparently you aren't. Well... too bad for you, I guess.
 
Joined
May 31, 2016
Messages
4,323 (1.51/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
And I'm saying they are not comparable. It's like comparing the 3D to a 7950X. The 3D is closer to the 12600KF, both in games and in other workloads; at least that's what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600KF with a brand-new mobo costs as much as the 3D on its own.
Why aren't these comparable? You can compare price to gaming performance, for instance. Why not? The X3D is a very capable gaming CPU and it costs less than either the 7950X or the 13900K. It's literally trading blows with the 12900K, which is pretty good if you ask me. You can't dictate which products can be compared with each other and which can't.
I can bet you there are plenty of people who appreciated the 12900K, 7950X, 13900K and other comparisons against a 5800X3D for gaming purposes.
If I were to upgrade my system or platform now for gaming, I'd consider the 5800X3D for sure, especially due to its low power usage compared to the 13900K and 7950X. Even the 13600K uses more power than a 5800X3D in games on average, and it's slower in general. Not to mention, the 5800X3D would be the cheapest option among all the current options available, and noticeably faster than my current CPU.
 
Last edited:
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Why aren't these comparable?
Because the performance is vastly different; the 13600K is, what, over 50% faster in MT and over 20% in ST? If you're fine with the 5800X3D's performance, then you shouldn't be looking at a 13600K in the first place, since it's way, way faster. If you're purely interested in gaming CPUs, wait for the lower-end parts from Intel, the 7600X from AMD, or last gen's 12600K.


The 3D is laughably expensive: you have to pay for that mobo upgradability feature that AMD fans keep talking about. Anyway, at EU pricing, a 13600KF + a cheap B660 costs 60-70€ more than the 5800X3D on its own. Which one you consider better is up to you.
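Rough, illustrative numbers of what that trade-off looks like; the board price is an assumption, and the CPU prices just echo the EU figures floating around this thread, so treat it as a sketch rather than real quotes:

Code:
# Illustrative EU-style prices (assumptions, not quotes from any shop).
x3d_price = 449.0    # 5800X3D, CPU only, dropped into an existing AM4 board
i5_price = 359.0     # 13600KF
b660_price = 150.0   # a cheap B660 board (assumed)

drop_in_total = x3d_price              # drop-in upgrade: no new board needed
new_platform_total = i5_price + b660_price

print("5800X3D drop-in:", drop_in_total)                                     # 449.0
print("13600KF + B660:", new_platform_total)                                 # 509.0
print("Extra cost of the new platform:", new_platform_total - drop_in_total) # 60.0
print("CPU premium paid to skip a new board:", x3d_price - i5_price)         # 90.0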
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
CPU test of doing what?

I will try to explain. The 720p benches were in a CPU review, not a GPU review. The idea is to find out how many FPS a CPU could deliver if the GPU were not slowing things down rendering frames. Once you know the maximum FPS possible from the CPU, you can better judge its potential right now and its future potential as GPUs become more powerful.

You would not test a CPU at higher resolutions to find the maximum number of FPS possible, because then the GPU would be slowing things down rendering frames.
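A minimal sketch of that idea in code, with made-up caps (the cpu/gpu numbers and the observed_fps helper are purely illustrative, not from any review): lowering the resolution raises the GPU's ceiling until the number you measure is the CPU's own ceiling.

Code:
# Hypothetical caps, for illustration only.
def observed_fps(cpu_cap, gpu_cap):
    # Delivered FPS is limited by whichever component is slower.
    return min(cpu_cap, gpu_cap)

cpu_ceiling = 160  # the most frames per second this CPU can prepare
gpu_ceiling_by_resolution = {"4K": 90, "1440p": 150, "720p": 400}

for resolution, gpu_cap in gpu_ceiling_by_resolution.items():
    print(resolution, observed_fps(cpu_ceiling, gpu_cap))
# 4K and 1440p report the GPU's limit (90 and 150 fps);
# only the 720p run exposes the CPU's true ceiling of 160 fps.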
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I will try to explain. The 720p benches were in a CPU review, not a GPU review. The idea is to find out how many FPS a CPU could deliver if the GPU were not slowing things down rendering frames. Once you know the maximum FPS possible from the CPU, you can better judge its potential right now and its future potential as GPUs become more powerful.

You would not test a CPU at higher resolutions to find the maximum number of FPS possible, because then the GPU would be slowing things down rendering frames.
Case in point: someone looking only at 4K results would buy a 9100F, as it performs identically to a 12900K there. Then he upgrades to a new GPU and realizes his 9100F is severely bottlenecking his shiny 1k-2k graphics card.

 
Joined
May 31, 2016
Messages
4,323 (1.51/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Because the performance is vastly different; the 13600K is, what, over 50% faster in MT and over 20% in ST? If you're fine with the 5800X3D's performance, then you shouldn't be looking at a 13600K in the first place, since it's way, way faster. If you're purely interested in gaming CPUs, wait for the lower-end parts from Intel, the 7600X from AMD, or last gen's 12600K.
OK, MT and ST are fine, but I've been talking about gaming, which is basically what these processors are for, aren't they? I'm sure the X3D is for gaming. You need some measurable MT and ST performance, but at this point it's not a deal breaker that the 5800X3D is slower in MT if your aim is gaming.
Expensive, you say? Where? Show me.
Here is something from GN
[attached GN chart]

Yeah, the 5800X3D is pricier than a 13600K, but only in a DDR4 RAM combo, and that combo is a lot slower in games than a 5800X3D, unfortunately.
You would have to go DDR5, but that would be pricier than a 5800X3D combo. Not to mention, I already have the board, so I don't need anything except the CPU.
Here is something from HWUB about the performance of the 13600K to illustrate what I'm talking about.
[attached HWUB chart]


If you are doing some serious MT or ST workload, then sure, the 13600K would be better, but that CPU alone isn't meant for those MT and ST workloads, and it isn't a monster at them. If you really need something for MT crunching etc., you would go with something different.
Solely for gaming, the X3D is a better option. In normal day-to-day tasks, you won't see a difference between the two when you surf the net, watch YT, or whatever else. The only value and advantage the 13600K has is the possibility to upgrade later to something stronger, and then, poof, MT and gaming performance go up (a 13700K, for instance), but you need DDR5 for that anyway, and that is expensive nonetheless.
Maybe the 13600 or 13500 will turn out to be cheap gaming-only alternatives for new-platform buyers; for a Ryzen system owner, they might not be so great.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
OK, MT and ST are fine, but I've been talking about gaming, which is basically what these processors are for, aren't they? I'm sure the X3D is for gaming. You need some measurable MT and ST performance, but at this point it's not a deal breaker that the 5800X3D is slower in MT if your aim is gaming.
If your aim is only gaming, there are much better value-for-money options than the 5800X3D.

Expensive, you say? Where? Show me.
The 3D on its own, yes, it's super expensive. You don't even have to compare it to Intel's decently priced parts; even against the overpriced Zen 4 CPUs, the 3D is wildly overpriced. Think about it: a 7600X is faster in games, way faster in ST workloads, and equal in MT workloads. And yet, even though it's already overpriced, it's cheaper than the 3D. In fact, it's 100€ cheaper. You might argue that AM4 motherboards are cheaper, but then that's exactly my point: you are paying for the mobo upgradability; it's not free. Instead of paying for a new motherboard, you are overpaying for the CPU.

Yeah, the 5800X3D is pricier than a 13600K, but only in a DDR4 RAM combo, and that combo is a lot slower in games than a 5800X3D, unfortunately.
You would have to go DDR5, but that would be pricier than a 5800X3D combo. Not to mention, I already have the board, so I don't need anything except the CPU.
Here is something from HWUB about the performance of the 13600K to illustrate what I'm talking about.

What do you mean by a lot? The difference in gaming performance is 5%, yet the 13600KF is 90€ cheaper (359€ vs 449€ at a big EU retailer). With that price difference you could get faster RAM, for example, and tie it in gaming, while still getting vastly better MT and ST performance and a better upgrade path.
If you are doing some serious MT or ST workload, then sure, the 13600K would be better, but that CPU alone isn't meant for those MT and ST workloads, and it isn't a monster at them. If you really need something for MT crunching etc., you would go with something different.
I disagree. The 13600KF IS a monster at those. It's the 3rd-fastest CPU in ST performance and faster than a 5900X and a 7700X in MT performance. It's pretty freaking strong, actually.
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Forget the resolution. The point isn't that anyone will actually use a 720p display; the point is that low-resolution testing removes GPU bottlenecks. This is useful both for determining a CPU's useful lifespan, and as a proxy for heavily CPU-limited games that might be impractical to benchmark (e.g. certain online games). I'm usually among the first to say that CPU reviews can mislead gamers; usually GPU bottlenecks are a bigger concern in practice, but there is a purpose in low-resolution testing. This discussion is a perfect example, because it's all about CPU/socket longevity.

Good reviewers will include various resolutions, offering a composite of the CPU's performance profile and allowing the audience to make its own judgment as to how relevant different aspects of that composite are for their particular use case. "Hahaha, 720p, are you kidding?" really isn't the rhetorical kill stroke that some people seem to think it is. Why would W1zzard include 720p numbers at all, if they're so ridiculous?
I guess this is on reviewers, in the sense that it usually isn't explained well. So naturally, the majority of people have no idea why 720p tests exist at all, and of course they retort with "no one plays at 720p anymore, duh". Reviews should state that it's to test longevity, not actual present-day gaming performance. To be honest, even I didn't know until not too long ago, even though it's logical as heck.
 