
Intel Core i9-12900KS

AMD can do no wrong. I still remember the bashing the 7700K received (I was bashing it as well). Yet fast forward to 2022, we have the 5800X3D, which is basically a worse (and more expensive) version of the 7700K, and people are going nuts over it.
Are you seriously saying that the 5800X3D is like the 7700K? May I ask on what grounds you came to that conclusion?
 
Are you seriously saying that the 5800X3D is like the 7700K? May I ask on what grounds you came to that conclusion?
Nah, I'm not saying that. I'm saying the 7700K was better back in 2017 than the 3D is in 2022. I don't even think that's a controversial statement. They are both way, way slower in the vast majority of applications, not only compared to price competitors but also compared to way cheaper CPUs. They are both basically good at just games, and both released on basically dead-end platforms.

But what makes the 7700K better is the fact that it was much cheaper, unlocked, had an iGPU, and was at least beating the R7 1700 in lightly threaded / single-threaded / latency-dependent applications, not just games. The 3D gets embarrassed by much cheaper offerings like the 12600K and the 12700F, or even AMD's own 5900X.
 
Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700K hardly uses any power while gaming, I know that, whether it's over 2 hours or 5 minutes. It might spike up sometimes, but I would expect all CPUs to do that.

We own Intel ADL CPUs; we're flogging a dead horse on TPU trying to defend them. Sometimes I think maybe I should just switch to AM4/5800X and join in the pitchforking.
You have a flawed test setup.

You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and that your CPU is amazing because of it.

Except that with anything more powerful than your 5-year-old GPU, or in fact just by lowering your game settings... that CPU usage, power consumption, and heat will skyrocket.
 
You have a flawed test setup.

You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and that your CPU is amazing because of it.

Except that with anything more powerful than your 5-year-old GPU, or in fact just by lowering your game settings... that CPU usage, power consumption, and heat will skyrocket.
But we already know that Alder Lake is more efficient in games than all of Zen 3 except the 3D, right? I mean, I have a 3090 and a 12900K, we can test it. I'm absolutely positive a 5950X will get absolutely embarrassed when it comes to gaming efficiency.

Also, a normal gaming scenario is with the GPU pegged at 100%, otherwise you have a CPU bottleneck...
 
But we already know that Alder Lake is more efficient in games than all of Zen 3 except the 3D, right? I mean, I have a 3090 and a 12900K, we can test it. I'm absolutely positive a 5950X will get absolutely embarrassed when it comes to gaming efficiency.

Also, a normal gaming scenario is with the GPU pegged at 100%, otherwise you have a CPU bottleneck...
I'm not going to trust anyone's claims without proper evidence backing them up, because of posts like we've just had here with an absolutely flawed and biased setup.

Or can I just swap my 3090 for my 750 Ti and claim that my CPU is more power efficient too?
 
I'm not going to trust anyone's claims without proper evidence backing them up, because of posts like we've just had here with an absolutely flawed and biased setup.

Or can I just swap my 3090 for my 750 Ti and claim that my CPU is more power efficient too?
But we have der8auer and Igor's Lab testing for gaming efficiency, and the 12900K is up to 50% more efficient than the 5950X in games. You think they are biased as well? And that's nothing new, the same applied to Comet Lake, which was also pretty efficient at gaming. The only Intel CPUs that were worse than Zen 3 in gaming were Rocket Lake. I even had one, they were pretty atrocious.
 
But we have der8auer and Igor's Lab testing for gaming efficiency, and the 12900K is up to 50% more efficient than the 5950X in games. You think they are biased as well? And that's nothing new, the same applied to Comet Lake, which was also pretty efficient at gaming. The only Intel CPUs that were worse than Zen 3 in gaming were Rocket Lake. I even had one, they were pretty atrocious.

You're wasting your time. Might as well talk to a potato.

At 1440p with a much better GPU, the CPU would still not need to be so taxed, as surely the GPU is doing all the work. So the CPU would still use little power.
 
Nah, I'm not saying that. I'm saying the 7700K was better back in 2017 than the 3D is in 2022. I don't even think that's a controversial statement. They are both way, way slower in the vast majority of applications, not only compared to price competitors but also compared to way cheaper CPUs. They are both basically good at just games, and both released on basically dead-end platforms.
Actually you did say that explicitly, but anyway. The 5800X3D is slower in some apps, but it is supposed to be a gaming CPU, and if you consider that your main goal, it is a very good CPU, and its app performance doesn't suck. It is slightly slower than a 5800X. That's still decent performance. The 7700K was overwhelmed by the changing landscape, with more cores and higher utilization of resources.
But what makes the 7700K better is the fact that it was much cheaper, unlocked, had an iGPU, and was at least beating the R7 1700 in lightly threaded / single-threaded / latency-dependent applications, not just games. The 3D gets embarrassed by much cheaper offerings like the 12600K and the 12700F, or even AMD's own 5900X.
'Unlocked' means nothing here, since we are talking about AMD CPUs, and these are different; 'unlocked' literally means nothing in today's world with boosting frequencies. With the pricing, you always have to remember what happened not so long ago with Covid and the price hikes. It was a different time back then, and I'm sure you realize that. That's how quickly things have developed. Embarrassment? I literally don't know what you are talking about; 5900X owners can elaborate on how embarrassed they are with their choice. What you say is literally laughable. Intel released 3 gens of CPUs to stay competitive, and yet people choosing the AMD 5000 series should be embarrassed?
 
You have a flawed test setup.

You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and that your CPU is amazing because of it.

Except that with anything more powerful than your 5-year-old GPU, or in fact just by lowering your game settings... that CPU usage, power consumption, and heat will skyrocket.

I'll agree with you. The 1080 Ti has seen better days, but if you dare bring that up... :rolleyes:

Just Pascal not supporting multiplane overlays correctly (or rather, at all) is enough for me not to want one regardless of the circumstances. Flip model presentation makes games smooth as butter.

But we have der8auer and Igor's Lab testing for gaming efficiency, and the 12900K is up to 50% more efficient than the 5950X in games. You think they are biased as well? And that's nothing new, the same applied to Comet Lake, which was also pretty efficient at gaming. The only Intel CPUs that were worse than Zen 3 in gaming were Rocket Lake. I even had one, they were pretty atrocious.

Efficiency as in...? Because I don't see any 50% delta in performance or power consumption from the 12900K or KS vs. the 5950X. The 12700K is nice, but so is the 5900X. Alder Lake has better idle consumption because there's no IOD with fixed power draw, but that's about it.
 
Says the guy with the 3090. Still nothing wrong with a 1080 Ti, shame I don't have money to piss away on a 3090.

I mean, I hope that didn't come out the wrong way, because I'm not hitting on you for that one, brother. It's that @Mussels has a point: it's already an aging card that isn't capable of running some of the most intensive applications right now, and people will defend it because they are quite fond of it (and for good reasons, too!). I've had people scoff at me when I called the 1080 Ti an aging design before.

But when you want to test a latest-generation high-end processor, having a more modern card is surely handy. I wouldn't even say a 3090; if you want to see how much a processor weighs on your setup, I'd probably be looking at the 6900 XT, as that card is better for high-frame-rate gaming. :)
 
Actually you did say that explicitly, but anyway.
Yeah, I was wrong. The 7700K is better than the 3D.
The 5800X3D is slower in some apps, but it is supposed to be a gaming CPU, and if you consider that your main goal, it is a very good CPU, and its app performance doesn't suck.
And you can say the same thing about the 7700K ;)
That's still decent performance. The 7700K was overwhelmed by the changing landscape, with more cores and higher utilization of resources.
Overwhelmed? It's still faster in games than any other CPU from its era. What are you talking about?
'Unlocked' means nothing here, since we are talking about AMD CPUs, and these are different; 'unlocked' literally means nothing in today's world with boosting frequencies.
PBO alone increases all-core performance by 20-25%. That's not nothing.

With the pricing, you always have to remember what happened not so long ago with Covid and the price hikes. It was a different time back then, and I'm sure you realize that. That's how quickly things have developed.
The 12700F was released during Covid as well. It doesn't cost a stupid amount like the 3D does.

Embarrassment? I literally don't know what you are talking about; 5900X owners can elaborate on how embarrassed they are with their choice. What you say is literally laughable. Intel released 3 gens of CPUs to stay competitive, and yet people choosing the AMD 5000 series should be embarrassed?
You are trying to make this Intel vs. AMD; I'm not interested. I'm just saying the 3D is vastly slower than much, much cheaper SKUs. The price of that thing is ridiculous. At a 250-300 MSRP it would be okayish.

Efficiency as in...? Because I don't see any 50% delta in performance or power consumption from the 12900K or KS vs. the 5950X. The 12700K is nice, but so is the 5900X, too. Alder Lake has better idle consumption because no IOD with fixed power draw, but that's about it.
Run stock, pick a game of your choice with an in-game benchmark, run it at 720p, and post the results with FPS and power consumption. The 12900K will absolutely annihilate your 5950X when it comes to FPS per watt.

And this is the test from Igor's Lab at 720p: the 5950X consumes more than 50% more power per FPS compared to the 12900K.

[Charts: Far Cry 6 and Anno 1800, average CPU watts and FPS at 720p]
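The watts-per-FPS comparison above is simple arithmetic: average CPU package power divided by average frame rate, then compared as a ratio. A minimal sketch (the sample numbers are illustrative placeholders, not readings from the charts, and the function names are made up for this example):

```python
def watts_per_fps(avg_package_watts: float, avg_fps: float) -> float:
    """Gaming efficiency metric: average CPU package power per frame rendered."""
    return avg_package_watts / avg_fps

def efficiency_delta(a_wpf: float, b_wpf: float) -> float:
    """How much more power per frame A draws than B, as a percentage."""
    return (a_wpf / b_wpf - 1.0) * 100.0

# Illustrative placeholder numbers, not figures from the charts:
cpu_a = watts_per_fps(avg_package_watts=120.0, avg_fps=150.0)  # 0.8 W per frame
cpu_b = watts_per_fps(avg_package_watts=75.0, avg_fps=150.0)   # 0.5 W per frame
print(f"A draws {efficiency_delta(cpu_a, cpu_b):.0f}% more power per frame")
```

Note that "50% more power per FPS" is a ratio of the two watts-per-FPS figures, not a 50% difference in raw wattage, which is why it can hold even when both chips draw modest absolute power in games.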
 
Nah, I'm not saying that. I'm saying the 7700K was better back in 2017 than the 3D is in 2022. I don't even think that's a controversial statement. They are both way, way slower in the vast majority of applications, not only compared to price competitors but also compared to way cheaper CPUs. They are both basically good at just games, and both released on basically dead-end platforms.

But what makes the 7700K better is the fact that it was much cheaper, unlocked, had an iGPU, and was at least beating the R7 1700 in lightly threaded / single-threaded / latency-dependent applications, not just games. The 3D gets embarrassed by much cheaper offerings like the 12600K and the 12700F, or even AMD's own 5900X.

The 7700K was a really bad CPU/socket generation. I owned one and used it in my main rig for quite some time. I mean, sure, it was OK in 2017, but less than a year later Intel released a new socket with 6- and 8-core potential. The X3D actually beats the 12900KS in many games, as it's intended to.
 
The 7700K was a really bad CPU/socket generation. I owned one and used it in my main rig for quite some time. I mean, sure, it was OK in 2017, but less than a year later Intel released a new socket with 6- and 8-core potential. The X3D actually beats the 12900KS in many games, as it's intended to.
I'm completely in agreement, the 7700K was atrociously bad. But it also beat the 1800X in ALL games, not just some. That still doesn't make it a good CPU. So why does the fact that the 3D beats the KS in some games make it a good CPU all of a sudden? :P
 
I'm completely in agreement, the 7700K was atrociously bad. But it also beat the 1800X in ALL games, not just some. That still doesn't make it a good CPU. So why does the fact that the 3D beats the KS in some games make it a good CPU all of a sudden? :p

Yeah, at the time it beat the 1800X in all games, but I wonder if that's still the case in ALL games now.

I mean, it can beat the KS by a lot in certain games while using 1/2 to 1/3 of the power, depending on your KS overclock. It's not a CPU for everyone, for sure, but for people invested in AM4, if you wanted a substantial gaming upgrade and productivity matters little, it's a compelling option.
 
Run stock, pick a game of your choice with an in-game benchmark, run it at 720p, and post the results with FPS and power consumption. The 12900K will absolutely annihilate your 5950X when it comes to FPS per watt.

Uh, we're talking about sixteen-core high-performance desktop processors, and the kind used on machines equipped with thousand-watt power supplies and on liquid cooling (or at least high-end air). The higher idle socket power of MTS/VMR due to the IOD is frankly irrelevant in a real-world context. It looks bad on low/moderate load charts, but the fact that it completely flips around under full load tells you a thing or two. The Zen 3 cores themselves are highly power efficient. ADL is just as power efficient, provided... you don't do what the KS is designed to do, that is, completely disregard efficiency in exchange for brute performance.

I'll grant your wish, even though I'll maintain my point that it's rather irrelevant: lower-end games and benchmarks don't leverage the multicore muscle of these processors, as you can see in the average active core count measured in HWiNFO here. When Superposition was first loading, it peaked at 3.4 cores active, and that coincided with the 10-watt per-core maximums. On a system with PBO enabled, mind you...

[Screenshot: HWiNFO sensor readout]
 
Yeah, at the time it beat the 1800X in all games, but I wonder if that's still the case in ALL games now.

I mean, it can beat the KS by a lot in certain games while using 1/2 to 1/3 of the power, depending on your KS overclock. It's not a CPU for everyone, for sure, but for people invested in AM4, if you wanted a substantial gaming upgrade and productivity matters little, it's a compelling option.
It also loses to the KS by a LOT in certain other games. Personally, I think it's very unreasonable to compare it to the KS. Compare it to the 12700F, and you end up with a CPU that is 50% more expensive, gets embarrassed in most workloads, and wins by an average of 10% in 360p gaming.
 
It also loses to the KS by a LOT in certain other games. Personally, I think it's very unreasonable to compare it to the KS. Compare it to the 12700F, and you end up with a CPU that is 50% more expensive, gets embarrassed in most workloads, and wins by an average of 10% in 360p gaming.

If you are building new, it doesn't really make sense. But like I said, if you are already invested in AM4 and have good DDR4, etc., then a drop-in upgrade like the X3D looks great.

AMD is milking early adopters, yes, but all your issues go away once AMD lowers the price.
 
Uh, we're talking about sixteen-core high-performance desktop processors, and the kind used on machines equipped with thousand-watt power supplies and on liquid cooling (or at least high-end air). The higher idle socket power of MTS/VMR due to the IOD is frankly irrelevant in a real-world context. It looks bad on low/moderate load charts, but the fact that it completely flips around under full load tells you a thing or two. The Zen 3 cores themselves are highly power efficient. ADL is just as power efficient, provided... you don't do what the KS is designed to do, that is, completely disregard efficiency in exchange for brute performance.

I'll grant your wish, even though I'll maintain my point that it's rather irrelevant: lower-end games and benchmarks don't leverage the multicore muscle of these processors, as you can see in the average active core count measured in HWiNFO here. When Superposition was first loading, it peaked at 3.4 cores active, and that coincided with the 10-watt per-core maximums. On a system with PBO enabled, mind you...

View attachment 253835
You may consider it irrelevant (I do too), but gaming efficiency was what I replied to, and I said that the 12900K is way more efficient than Zen 3. Haven't run Supo for a while; I'm no. 34 on 4K Optimized with a 3090 on air :p

Uh, we're talking about sixteen-core high-performance desktop processors, and the kind used on machines equipped with thousand-watt power supplies and on liquid cooling (or at least high-end air). The higher idle socket power of MTS/VMR due to the IOD is frankly irrelevant in a real-world context. It looks bad on low/moderate load charts, but the fact that it completely flips around under full load tells you a thing or two. The Zen 3 cores themselves are highly power efficient. ADL is just as power efficient, provided... you don't do what the KS is designed to do, that is, completely disregard efficiency in exchange for brute performance.

I'll grant your wish, even though I'll maintain my point that it's rather irrelevant: lower-end games and benchmarks don't leverage the multicore muscle of these processors, as you can see in the average active core count measured in HWiNFO here. When Superposition was first loading, it peaked at 3.4 cores active, and that coincided with the 10-watt per-core maximums. On a system with PBO enabled, mind you...

View attachment 253835
OK, just ran the same test: peak wattage was 49.2 W, average was 41 W, and the score was 48k. So yeah, not only is the 12900K way faster, it also consumes way less...

HWiNFO shows 48.64 W peak wattage, but during the run Afterburner had a peak of 49.2 W. Anyway, it doesn't really make a difference.
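For a fixed-workload benchmark like this, score per average watt is the natural efficiency figure. A quick sketch using the numbers quoted above (48k points at 41 W average; `points_per_watt` is a hypothetical helper name, not a real API):

```python
def points_per_watt(score: float, avg_watts: float) -> float:
    """Benchmark efficiency: final score divided by average CPU package power."""
    return score / avg_watts

# The run quoted above: ~48,000 points at 41 W average package power.
print(f"{points_per_watt(48_000, 41.0):.0f} points per watt")  # prints "1171 points per watt"
```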
 
You may consider it irrelevant (I do too), but gaming efficiency was what I replied to, and I said that the 12900K is way more efficient than Zen 3. Haven't run Supo for a while; I'm no. 34 on 4K Optimized with a 3090 on air :p


OK, just ran the same test: peak wattage was 49.2 W, average was 41 W, and the score was 48k. So yeah, not only is the 12900K way faster, it also consumes way less...

HWiNFO shows 48.64 W peak wattage, but during the run Afterburner had a peak of 49.2 W. Anyway, it doesn't really make a difference.

No pics though? Also, see, my GPU usage is nowhere near maxed out; yours shouldn't be either.

I noticed you have a GameRock, which has a higher power limit than my card, but even then, GPU usage was not high to begin with.
 
No pics though? Also, see, my GPU usage is nowhere near maxed out; yours shouldn't be either.

I noticed you have a GameRock, which has a higher power limit than my card, but even then, GPU usage was not high to begin with.
My GPU usage was 94%. Sorry for no pic, I don't expect anyone to lie about it, so I assumed people would assume I wouldn't either. Here is the screenshot.

[Screenshot: benchmark run with sensor readings]
 
My GPU usage was 94%. Sorry for no pic, I don't expect anyone to lie about it, so I assumed people would assume I wouldn't either. Here is the screenshot.

View attachment 253846

No, I'm not accusing you of lying, just wondering why your GPU usage seems much higher. Maybe my run is bunk, or it's something to do with the newer build of Windows. Neither CPU nor GPU was under any meaningful load, and I just considered your card's higher PL, so that makes sense. Either way, since we aren't comparing stock for stock, it's not really valid. Nice, though.
 
No, I'm not accusing you of lying, just wondering why your GPU usage seems much higher. Maybe my run is bunk, or it's something to do with the newer build of Windows. Neither CPU nor GPU was under any meaningful load, and I just considered your card's higher PL, so that makes sense. Either way, since we aren't comparing stock for stock, it's not really valid. Nice, though.
I have higher usage because the 12900K pushes the 3090 harder than the 5950X does. The power limit is irrelevant at this resolution; the 3090 never pulled more than 300 watts. Yours should have been even lower, judging by the temperature.

There is no overclock in my run; with my OC it should actually score much higher, since I run 5.6 GHz on 2 cores.
 
Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700K hardly uses any power while gaming, I know that, whether it's over 2 hours or 5 minutes. It might spike up sometimes, but I would expect all CPUs to do that.

We own Intel ADL CPUs; we're flogging a dead horse on TPU trying to defend them. Sometimes I think maybe I should just switch to AM4/5800X and join in the pitchforking.
W1zzard did that Alder Lake power limit scaling article a few months back.
Single-threaded and low-thread-count stuff runs at practically full speed within PL1 (125 W).

It's only rendering or other workloads that push all threads to 100% which really drive the consumption up. Since most games aren't pushing more than 4-6 threads, ADL runs at similar power levels to an AMD equivalent, whilst also being marginally faster.
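That thread-count argument can be put into rough numbers: with the ~10 W-per-core peak observed in the HWiNFO screenshot earlier in the thread and the 125 W PL1, a handful of game threads leaves plenty of headroom. A back-of-the-envelope sketch (the per-core and uncore figures are ballpark assumptions for illustration, not specs):

```python
# PL1 for the chips discussed above.
PL1_WATTS = 125.0

def fits_in_pl1(active_cores: int, watts_per_core: float = 10.0,
                uncore_watts: float = 15.0) -> bool:
    """Estimate whether a workload stays inside PL1.

    watts_per_core is the ~10 W peak per core mentioned earlier in the
    thread; uncore_watts is an assumed allowance for ring/cache/memory
    controller power. Both are ballpark figures, not measurements.
    """
    return active_cores * watts_per_core + uncore_watts <= PL1_WATTS

print(fits_in_pl1(6))    # typical game, ~6 active cores -> ~75 W, under PL1
print(fits_in_pl1(16))   # all-core render load -> ~175 W, past the limit
```

This is deliberately crude (E-cores draw less than P-cores, and per-core power varies with the workload), but it shows why a 4-6 thread game sits comfortably inside PL1 while an all-core render does not.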
 
Also, a normal gaming scenario is with the GPU pegged at 100%, otherwise you have a CPU bottleneck...
This!

If your GPU isn't pegged at 100% usage with Vsync disabled and no FPS cap, it means it could do much more for you, but your graphics settings or your weak CPU won't allow it. Reviewers like using unrealistic settings to test for pure CPU performance, but let's be honest, no one ever plays at 720p minimum. It's just another form of media sensationalism, nothing more. Completely pointless.
 