
Intel Core i5-12600K

What security problems are you concerned with?
I know many of the early security issues (e.g. Spectre, Meltdown) have been mitigated at the hardware level, but I'm curious which others, such as the MDS attacks, have hardware mitigations and which do not. Last I checked (a year or so ago), those were only handled at the firmware and OS level, which costs performance. Hardware mitigations are usually the more efficient implementations.

Personally, I am not concerned about these security issues on personal computers; I'm more curious about the server space. For example, my company had to disable HT on our older Intel Xeon CPUs, which hurt performance significantly on our workloads. Eventually we had to replace all of them with Epyc CPUs, which had the hardware mitigations. The downside is that they were really expensive, and our vendor offered fewer configuration options compared to the Intel equivalents.
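
As an aside, if anyone wants to see which of these their own box considers fixed in silicon versus patched in microcode/OS, the Linux kernel exposes that in sysfs ("Not affected" generally means a hardware fix). A quick sketch; the exact set of files depends on the kernel version:

# Print the kernel's verdict on each known CPU vulnerability.
# "Not affected" usually means the silicon itself is fixed;
# "Mitigation: ..." means a microcode/OS-level workaround is active.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:25s} {entry.read_text().strip()}")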
 
That's what I thought you meant, but clarification never hurts. As far as I know, Alder Lake has no known hardware security vulnerabilities.
 
Most people drive petrol and diesel cars, but the future is?! Not them.
See, the issue with your comparison is that "EVs are the future" has been a motto since the 1990s, arguably even earlier; the very first time it was said was about the electric Model T, and we all saw how that went. Just because a new technology is good doesn't mean it will dominate, nor that it will do so imminently. Everyone in 2006 who said that EVs would dominate by 2020 looks pretty foolish, what with EVs making up less than 1% of consumer car sales and all.

As far as resolutions go, 4K has been around for years now, and 1440p is not a new resolution by any means. Yet 1440p and 4K combined are less than 10% of the total market, and 25% of the market is below 1080p. Claiming 4K120 is the future and that 1080p should be ignored is ridiculous; at this rate it will take decades before either 1440p or 4K reaches half of 1080p's usage numbers, let alone exceeds it.
 
Hang on, let me just re-read eight pages back for context and I'll get back to you.

But you can start with: Steam stats aren't worth much.

And end on: that's the comment you want to drag on?!

Quoting myself again:
"
I know, but is Steam universally accepted as the go-to definition of what people will be buying to game on in the future?! No, at best it's an indicator of what they have now.

Do people buy new tech just to continue with what they're doing now, or do a major proportion of new buyers buy parts expecting an improvement in gaming performance?!

I already know the answers to these too."

So again, what do people buy new stuff for, to do the same stuff?!

It will obviously take time, but that doesn't mean only 1080p counts until then. I've been on 4K here for three years now, and I'll keep my 1080p laptop until it dies, but it isn't getting swapped for another 1080p.
 
That's what I thought you meant, but clarification never hurts. As far as I know, Alder Lake has no known hardware security vulnerabilities.
Thank you. It is always good to ask for clarification and not assume (the latter gets me into more trouble than it's worth ;))
After more searching online, I finally found a security vulnerability spreadsheet covering CPU families from Sandy Bridge through Alder Lake, which is exactly what I had been looking for. I might as well post it here in case anyone else is interested:
https://www.intel.com/content/www/u...-affected-consolidated-product-cpu-model.html
 
He doesn't really need to. No game that I know of will load a CPU to its max like the other tests have done. Even if you find one that does, all it'll do is cause the CPU to exhibit the same thermal characteristics that have already been shown with the current tests. The testing that has been done tells you everything you need to know. W1zzard's testing didn't miss a single beat.

Not taking anything away from W1zzard's god-like reviews... the best material available for CPU know-how.

Only I'd fancy a quick look at power consumption whilst gaming too. Unfortunately, the data provided doesn't include these game-specific stats. It's not a problem though, as other reviewers readily have these available, but it would be nice to see all the stats on one page (and I prefer TPU's style of charts).

Update: he recently added this thread: https://www.techpowerup.com/forums/...er-consumption-testing-in-cpu-reviews.288761/ and the good news is he mentions "I do have definite plans to add "Gaming Power Consumption", using Cyberpunk 2077 at highest settings, v-sync off" (W1zzard, you're a legend).
 
This mirrors what I've seen at some other sites, including Anandtech. Peak load benchmarks have really always just been troll bait.

I'd say it's definitely helpful to know peak power consumption so that you can tailor your motherboard, PSU, and cooler to any conditions your CPUs will be under.

IMO if you are fine with 720p or 1080p benchmarks it's hypocritical to complain about these peak power tests. You can't complain about tests designed to demonstrate peak power consumption when the CPU tests are designed based on the same principle.

Alder Lake is a fantastic addition from Intel, but those power consumption numbers are all over the place. I'm not sure what to make of it.
 
And I think it hypocritical for you to say that the peak power tests are so very useful, and in the same exact post state:

"...those power consumption numbers are all over the place. I'm not sure what to make of it."

You don't know what to make of it because it is not a realistic scenario. Even your PSU argument falls apart and shows just how misled and/or ignorant you are. The power levels, both peak and average, can *and should* be controlled by a DIY builder by changing the PL1/PL2 and Tau power limits to fit the build. In point of fact, *Intel's technical documentation overtly states that the builder should adjust those settings for the platform's capabilities*. Zen can be unlocked with PBO2 (unlimited power) and consume upwards of 250 W as well; it's just that Zen gets no benefit to speak of from it, while Intel does, so no one speaks of doing it on Zen other than to say 'not worth it'. Somehow this is presented as a negative for Intel.

If you are not capable or willing to understand those power settings then don't DIY, go buy an OEM rig and rely on your warranty. They will configure it for you.
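
For anyone who hasn't looked at how those limits interact: roughly speaking, the package is allowed to draw up to PL2 until an exponentially weighted average of its power over the time constant Tau catches up with PL1, after which it is held at PL1. A crude sketch of that budget (my own simplification, not Intel's exact algorithm, and the numbers are illustrative rather than the 12600K's defaults):

# Crude first-order model of the PL1/PL2/Tau turbo budget.
# Not Intel's exact implementation -- just the usual low-pass
# approximation of the running average of package power.
import math

def turbo_seconds(pl1_w, pl2_w, tau_s, start_w):
    """Seconds the package can hold PL2 before the average reaches PL1."""
    return tau_s * math.log((pl2_w - start_w) / (pl2_w - pl1_w))

# Illustrative numbers only (not the 12600K's stock limits):
print(round(turbo_seconds(pl1_w=125, pl2_w=150, tau_s=56, start_w=10)))  # ~96 s

Raise PL1 to meet PL2, or stretch Tau far enough, and the boost window effectively never ends, which is what many enthusiast boards do out of the box and why the peak numbers look the way they do.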
 
I'm not sure if you've read my profile specs or any of my prior posts but the personal attacks are completely uncalled for. I miss the days when I could go onto a tech forum and not be personally attacked. Completely unwarranted.

You don't know what to make of it because it is not a realistic scenario.

That was the point: neither CPU testing nor peak power consumption testing is a realistic scenario. As I stated in my last comment, you have to accept that some tests are going to be designed to extract maximum performance. "Realistic" is a moving target, as I stated previously; realistic for one person is not realistic for another. For the vast majority of people 720p is not realistic, yet it's completely acceptable because it's the best way to extract maximum performance. The same applies to power consumption: while maximum power consumption (just like CPU performance at 720p) is not "realistic" for most, it's an important metric to have nonetheless.
 
I'm not sure if you've read my profile specs or any of my prior posts but the personal attacks are completely uncalled for.

Then don't make personal attacks yourself. If you can't stand the heat, stay out of the kitchen. I don't need to see your profile to know that you are either misrepresenting, or you are clueless, about PL1/PL2 power level settings. Either way my comment is on target.

I miss the days when I could go onto a tech forum and not be personally attacked. Completely unwarranted.

Speaking of hypocrisy, do I need to refer you to your very first post in this thread?

That was the point: neither CPU testing nor peak power consumption testing is a realistic scenario. As I stated in my last comment, you either have to accept the fact that they are testing in a manner that extracts maximums, or you do all testing in a "realistic" manner (which is a moving target, as I stated previously, since realistic for one person is not realistic for another). For the vast majority of people 720p is not realistic, yet it's completely acceptable because it's the best way to extract maximum performance.

A false assertion. You don't have to do anything of the sort. You can test at different power levels to very clearly illustrate performance / power scaling.
 
I didn't, and feel free to link it.
 
IMO if you are fine with 720p or 1080p benchmarks it's hypocritical to complain about these peak power tests. You can't complain about tests designed to demonstrate peak power consumption when the CPU tests are designed based on the same principle.

Alder Lake is a fantastic addition from Intel, but those power consumption numbers are all over the place. I'm not sure what to make of it.

720p tests are not peak power tests.
 
They can be, in the same way that Furmark is a power-hungry beast.

Removing bottlenecks along the way (such as lowering resolution and settings) can definitely cause the usage of another part (CPU or GPU) to skyrocket beyond typical levels.
 
I'm not aware of any games that can provide a consistent, multi-core/all-core workload, and I think it is pretty clear that the references to 720p and 1080p benchmarks are talking about games. Games simply do not lend themselves to this; yes, they'll be multi-threaded, but it's entirely unlike a render job. Rendering and encoding, for example (typically used for a 'real world' load test), can usually be divided evenly into similar-sized workloads that are all doing the same type of work. Furmark is very close to those types of jobs, but it is an OpenGL test, not a 720p game bench. That pattern simply doesn't describe a game, where you will potentially have multiple threads for AI, draw calls, sound and so on, which are all entirely dissimilar.
 
Yes, it'd be game related; otherwise why bring up resolution at all? 2D and productivity workloads don't really care what your monitor is set to.
There ends up being a minefield of thousands of variables, and W1zz has to choose just a few. As an example, think of a 720p gamer with Vsync on vs off: 60 FPS vs 400 FPS is going to use massively different amounts of power.
 
Games are usually bottlenecked by memory/cache and not IPC. So yeah, even at 720p or 640x480, most will not be able to load a CPU as much as, say, Blender could.
 
720p tests are not peak power tests.

That's not what I was trying to say. I was saying that they are designed to extract maximum performance from the CPU.

I think you may have misunderstood my comment. At no point did I attempt to insult anyone, and when I said I don't know what to make of the power consumption numbers, I meant it in the sense of approaching the platform from a buyer's perspective. If I am considering purchasing a platform, I have to buy a mobo and cooler that will work with it in all scenarios, so the question becomes whether to target closer to the max power draw or closer to the average, given the large gap between the two. You could make an argument for either.
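
For the PSU question at least, the back-of-the-envelope I end up doing is roughly this (the numbers below are placeholders, not figures from this review):

# Rough PSU sizing from peak figures. Placeholder numbers -- substitute
# the actual peak draws for your CPU and GPU.
cpu_peak_w = 150   # assumed stress-test peak for the CPU
gpu_peak_w = 320   # assumed peak for the GPU you pair with it
rest_w     = 50    # drives, fans, RAM, USB devices (rough allowance)
headroom   = 1.3   # ~30% margin for transients and ageing

print(f"Suggested PSU: {round((cpu_peak_w + gpu_peak_w + rest_w) * headroom)} W")
# -> Suggested PSU: 676 W, so a quality 750 W unit in this example

Sizing against the peak rather than the average costs very little, which is why I still want the peak number even if it rarely shows up in games.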
 
Yes, it'd be game related; otherwise why bring up resolution at all? 2D and productivity workloads don't really care what your monitor is set to.
There ends up being a minefield of thousands of variables, and W1zz has to choose just a few. As an example, think of a 720p gamer with Vsync on vs off: 60 FPS vs 400 FPS is going to use massively different amounts of power.

Depends on how well balanced those threads are. In a game you may have two threads doing draw calls at 20% usage at 60 fps at 1440p; drop to 720p and it goes to 300 fps, so those two threads are now huffing along at 60%. That is not going to change your AI or sound threads and so on. That's why I said you need it to be consistent *and* balanced; games simply don't work like that. You can quadruple the rate of draw calls, yes, but the rest remains the same.
 
The leaks overhyped Alder Lake to the moon, but in reality it's just a normal generational jump with disappointing power consumption, possible backwards compatibility issues and high platform costs ...
i5-12600K = best overall CPU & best gaming CPU for 2021


Sure, it beats the 5600X conclusively by consuming 90 W more power. Sorry, the 5600X still gets my money.
If you're worried about power then stick to laptops.

AMD Ryzen 5 5600X £287.48

vs

Intel Core i5 12600KF £259.98
 
Pretty sure your wPrime benchmark in this review is off.
I got 81.847 sec on my 12600K, which I just built, at stock settings.
Pretty sure you need to set the thread count to 20 instead of the default 4, or you'll get the score you got.
 
It's going to run at the program's default settings; otherwise you aren't benchmarking fairly.

If the processor/OS has issues detecting the number of threads/cores with older programs and behaves poorly in some of them, that's good to know, and the result is still useful and fair since every CPU is tested the same way.

If and when it gets to the point where wPrime just runs poorly all the time because it's an older program, I'm sure W1zz would just retire it.
 
Yeah, pretty old and outdated program tbh. It wouldn't even run unless run as Admin, and it doesn't populate the hardware info from CPU-Z correctly either.
 
Pretty sure your wPrime benchmark in this review is off.
I got 81.847 sec on my 12600K, which I just built, at stock settings.
Pretty sure you need to set the thread count to 20 instead of the default 4, or you'll get the score you got.
Yeah you can hand-tune things to ensure wPrime doesn't end up on the e-cores on Windows 11 and run it with fewer threads for higher perf.
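
For anyone who does want to experiment with that hand-tuning at home, something like the sketch below works. It assumes the first 12 logical CPUs are the hyper-threaded P-cores on a 12600K and that wPrime lives at the path shown, neither of which is guaranteed, so check the core mapping on your own system first. It needs the third-party psutil package.

# Launch a benchmark and pin it to the assumed P-core logical CPUs so
# Windows 11 can't schedule its worker threads onto the E-cores.
import subprocess
import psutil

P_CORE_THREADS = list(range(12))  # assumption: CPUs 0-11 = 6 P-cores x 2 threads

proc = subprocess.Popen([r"C:\bench\wPrime\wPrime.exe"])   # hypothetical install path
psutil.Process(proc.pid).cpu_affinity(P_CORE_THREADS)      # restrict to P-cores
proc.wait()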

I want a fair apples-to-apples comparison, so I run it the same way on every system, using the advertised thread count

Yeah, pretty old and outdated program tbh. It wouldn't even run unless run as Admin, and it doesn't populate the hardware info from CPU-Z correctly either.
I'll probably drop wPrime in the 2022 CPU test system setup, not sure about a replacement yet
 