Monday, March 11th 2024

Intel Core i9-14900KS Pricing Confirmed to be $749

Pricing of Intel's upcoming enthusiast-segment desktop processor, the Core i9-14900KS, has been confirmed at $749, according to a MicroCenter listing. This is identical to what the company asked for the previous-generation i9-13900KS and i9-12900KS. As a Special Edition SKU, the i9-14900KS may not be available in all markets where you'd normally find the i9-14900K, and the chip is expected to have higher cooling and power requirements. Based on the "Raptor Lake Refresh" silicon, this 8P+16E core processor is expected to come with a maximum boost frequency of 6.20 GHz and generally better overclocking headroom than the regular i9-14900K. The Core i9-14900KS is expected to go on sale this Thursday, March 14, 2024. Whether it beats the AMD Ryzen 7 7800X3D at gaming is the $749 question we'll answer soon.
Source: VideoCardz

103 Comments on Intel Core i9-14900KS Pricing Confirmed to be $749

#51
Lew Zealand
So in other words, idle power is less a factor of a CPU and more about what you surround it with.

I just tested my main gaming machine which has a Zen 3 5800X3D and 6800 XT but on a B450 with only 3200 MHz CL16 RAM and it idles at 41W. I don't see the 5800X3D's idle total system power at 41W in any of the reviews in this thread which suggests that you can't make specific claims unless you match equipment exactly to the reviewed system. Which people rarely (lol, never) do when building a system.

Looks like Zen 4 idles at higher power but how much higher if you set up your system with idle power in mind? Again comparing to setting up an Intel system similarly.
#52
dgianstefani
TPU Proofreader
Lew ZealandSo in other words, idle power is less a factor of a CPU and more about what you surround it with.

I just tested my main gaming machine which has a Zen 3 5800X3D and 6800 XT but on a B450 with only 3200 MHz CL16 RAM and it idles at 41W. I don't see the 5800X3D's idle total system power at 41W in any of the reviews in this thread which suggests that you can't make specific claims unless you match equipment exactly to the reviewed system. Which people rarely (lol, never) do when building a system.

Looks like Zen 4 idles at higher power but how much higher if you set up your system with idle power in mind? Again comparing to setting up an Intel system similarly.
To get "low" idle power on Zen 4 you have to run the RAM at JEDEC speeds. Crippling your CPU/gaming performance. It's still not that low because the IO die and the IF are always consuming a non negligible amount of power for the system to be able to run.
So in other words, idle power is less a factor of a CPU and more about what you surround it with
Both are factors. As stated, the chiplet/multi-die approach of Zen (where only the core die(s) are on a current process, 5 nm in the case of Zen 4) has inherently worse efficiency than Intel's monolithic die. This can also be seen in the non-monolithic RDNA 3 cards vs. the monolithic RX 7600, etc.
#53
Lew Zealand
dgianstefaniTo get "low" idle power on Zen 4 you have to run the RAM at JEDEC speeds, crippling your CPU/gaming performance. Even then it's not that low, because the IO die and the Infinity Fabric always consume a non-negligible amount of power for the system to be able to run.
I'd love to see some numbers attached to this, but considering that idle power is ignored in so many reviews out there, I don't hold out hope of finding a deep dive on actual numbers in different settings. It sucks, as high idle power was the biggest annoyance when I built my first AMD system, an R5 2600.

My Intel i5-8400 idled at 31W total system, which dropped to 25W when adding an AMD GPU (same effect as the Dell above). But the R5 2600 used ~45W. 41W with the 5800X3D seems decent in comparison, considering its increased capability.
#54
dgianstefani
TPU Proofreader
Lew ZealandI'd love to see some numbers attached to this, but considering that idle power is ignored in so many reviews out there, I don't hold out hope of finding a deep dive on actual numbers in different settings. It sucks, as high idle power was the biggest annoyance when I built my first AMD system, an R5 2600.

My Intel i5-8400 idled at 31W total system, which dropped to 25W when adding an AMD GPU (same effect as the Dell above). But the R5 2600 used ~45W. 41W with the 5800X3D seems decent in comparison, considering its increased capability.
Like I've said, I spoke to W1z about this and the next CPU test bench will include idle power draw.

The 5800X3D in particular was very efficient; that's not true at idle/low load for the rest of the Zen family on AM4 or AM5, including the new X3D parts.
#55
RandallFlagg
Lew ZealandI'd love to see some numbers attached to this, but considering that idle power is ignored in so many reviews out there, I don't hold out hope of finding a deep dive on actual numbers in different settings. It sucks, as high idle power was the biggest annoyance when I built my first AMD system, an R5 2600.

My Intel i5-8400 idled at 31W total system, which dropped to 25W when adding an AMD GPU (same effect as the Dell above). But the R5 2600 used ~45W. 41W with the 5800X3D seems decent in comparison, considering its increased capability.
Yeah, but even in the Vortez reviews, Zen+ (2000 series) had lower idle power draw than Intel Z390/Z490 systems, which would have been Intel 8th/9th gen. ~10% lower, which corresponds to what you're saying you've experienced.

They didn't lose in idle power draw until they went to chiplets.
#56
Lew Zealand
RandallFlaggYeah, but even in the Vortez reviews, Zen+ (2000 series) had lower idle power draw than Intel Z390/Z490 systems, which would have been Intel 8th/9th gen. ~10% lower, which corresponds to what you're saying you've experienced.

They didn't lose in idle power draw until they went to chiplets.
I saw lower idle power with the Intel i5-8400, not the AMD R5 2600, though I'm using B450/B460 ASRock Pro4 mobos. And I remember specifically choosing that Intel mobo for its own low power use based on a review. Both with 16GB memory, though 2933 for the AMD and 2666 for the Intel. 25-31W Intel vs ~45W AMD. Now I'm wondering what I did wrong with the AMD setup. I still have the CPU in an MSI mobo, I should have another look.
#57
Veseleil
Excuse me for interrupting, but what is the source of your idle power draw numbers? HWiNFO readings? And all of those charts you people posted, how exactly did they measure that power? If you're referring to HWiNFO readings, it's ~30W CPU and ~11W GPU at idle with just Firefox, and I just can't trust them.
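
For context on where software numbers like HWiNFO's come from: on Intel they are ultimately derived from the CPU's built-in RAPL energy counters. Below is a minimal sketch of reading those counters directly on Linux, assuming the standard powercap sysfs interface is present and readable (it is often root-only); this is not how any particular review measured, just an illustration of what a software reading covers.

```python
# Minimal sketch: read the Intel RAPL package-energy counter on Linux,
# which is where software monitors ultimately get their CPU power numbers.
# Assumes the standard powercap sysfs interface exists and is readable
# (often root-only). This is CPU-package power only; a wall meter like a
# Kill A Watt will always read higher (VRM losses, RAM, fans, GPU, PSU).
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

def package_watts(interval_s: float = 1.0) -> float:
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    # Counter is cumulative microjoules; wraparound is ignored in this sketch.
    return (e1 - e0) / 1e6 / interval_s

if __name__ == "__main__":
    print(f"~{package_watts():.1f} W package power right now")
```

The gap between a number like this and a wall reading is exactly the motherboard/RAM/fan/PSU territory the rest of the thread is arguing about.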
#58
RandallFlagg
Lew ZealandI saw lower idle power with the Intel i5-8400, not the AMD R5 2600, though I'm using B450/B460 ASRock Pro4 mobos. And I remember specifically choosing that Intel mobo for its own low power use based on a review. Both with 16GB memory, though 2933 for the AMD and 2666 for the Intel. 25-31W Intel vs ~45W AMD. Now I'm wondering what I did wrong with the AMD setup. I still have the CPU in an MSI mobo, I should have another look.
Yes but you have to normalize components, and I'm not sure that the i5-8400 was a true match to an R5-2600. Seems like that would have been an i5-8600. Also memory, 2933 is obv a different kit, and higher clock.
VeseleilExcuse me for interrupting, but what is the source of your idle power draw numbers? HWiNFO readings? And all of those charts you people posted, how exactly did they measure that power? If you're referring to HWiNFO readings, it's ~30W CPU and ~11W GPU at idle with just Firefox, and I just can't trust them.
My direct examples came from Vortez.net; just look under Reviews -> CPUs/Motherboards, pick a motherboard review of interest, and look under power consumption.

This is just the easiest way I've seen to directly compare the same CPU's idle consumption on different motherboards.

There are a lot of other, more anecdotal comparisons though. I haven't seen very many that don't show the same trend, just typically you don't know the details. For example (7950X review):

And from TechRadar - everyone looks at the big bar, but probably spends 95% of their time at the little bar (CPU only):

#59
Darmok N Jalad
Yeah, if you go to monolithic Ryzen, like the 8700G, you get decent system idle, 41W. The negative of IF has always been higher idle consumption.
#60
dgianstefani
TPU Proofreader
Darmok N JaladYeah, if you go to monolithic Ryzen, like the 8700G, you get decent system idle, 41W. The negative of IF has always been higher idle consumption.
True, but then you get gimped performance in other ways, due to the halved cache compared to the 7700X, slower PCIe, etc. There's always a compromise.

The 8700G and other APUs still use IF, by the way; it's just that they're on monolithic silicon.
#61
Lew Zealand
VeseleilExcuse me for interrupting, but what is the source of your idle power draw numbers? HWiNFO readings? And all of those charts you people posted, how exactly did they measure that power? If you're referring to HWiNFO readings, it's ~30W CPU and ~11W GPU at idle with just Firefox, and I just can't trust them.
I'm using a Kill a Watt from the wall as I don't trust the difference in CPU power reporting from the 2 brands. And in the end, total system idle is what I'm interested in. Generally other equipment is similar with an NVMe SSD and an AMD GPU except for those Nvidia-AMD GPU power difference tests on Intel*. Intel Power Gadget (RIP) does seem to report those power differences properly and I use it for CPU power tracking on Intel chips as I see BIOS and Throttlestop TDP changes reflected accurately to the watt once the Intel CPU exits Tau.

I do not put much trust in the idle power reporting from any of those websites, because it's very clearly dependent on a variety of factors, so which one, if any, is "real"? And when W1zz does his tests here, who's to say they will match the average user's experience? W1zz's tests will be great, as the community here can test and input their findings too, and having a large number of people testing may help pin down best practices for minimizing idle power on both platforms.

* As total system idle is what I'm interested in, Intel Power Gadget's (CPU-only) reported reduction in idle power when swapping from Nvidia to AMD GPU was greater than the observed system idle power reduction. My assumption is that the AMD GPU is making up the difference with slightly higher idle power use while the AMD driver (somehow?) puts the Intel CPU into a lower power idle state. I'll take the nice net reduction.
RandallFlaggAnd from techradar - everyone looks at the big bar, but probably spends 95% of their time at the little bar (cpu only):

Those seem to be numbers I expect, especially from Intel. Almost all Intel CPUs I've used can be coaxed into a sub-2W idle, but I don't know enough about the tech details to know how it's done. Another example of this: my i5-8400 would not go below ~13W with any changes I was willing to make in BIOS, however a Dell Optiplex with an i5-8500 at work idles at barely over 1 watt.

The CPU can do it and Dell's BIOS is set to do it, but I just don't know what changes to make on my machine to implement it. That said, I still have never used my 8400 with its iGPU; maybe only running it that way is the "answer", but I suspect there are other solutions. One bit of funny business I've observed (with Intel Power Gadget reporting) is that when the (Nvidia) GPU is being used for light work like watching a video, my 8400 will go as low as 3.5W, but after that demand is removed, in lockstep with the GPU reducing its power usage the 8400 goes right back to ~13W. It seems some power states are being managed there, but I dunno how to invoke them full time.
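
On the question of whether those deeper power states are actually being reached: on Linux, the kernel's cpuidle sysfs interface exposes per-state residency, which is one way to check. A minimal sketch follows; the paths are the standard sysfs ones, but state names and counts vary by CPU and kernel, and Windows tools expose similar data differently.

```python
# Sketch: print per-state idle (C-state) residency for core 0 on Linux via
# the standard cpuidle sysfs interface. If the deepest states show near-zero
# residency, something (a device, driver, or BIOS setting) is keeping the
# CPU out of them. State names and counts vary by CPU and kernel.
import glob
import os

for state_dir in sorted(glob.glob("/sys/devices/system/cpu/cpu0/cpuidle/state*")):
    with open(os.path.join(state_dir, "name")) as f:
        name = f.read().strip()
    with open(os.path.join(state_dir, "time")) as f:  # cumulative usec since boot
        usec = int(f.read())
    print(f"{name:>10}: {usec / 1e6:12.1f} s total residency")
```

A machine stuck at ~13W versus one idling at ~1W would likely show very different residency in the deepest package states, which is consistent with the OptiPlex observation above.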
#62
iameatingjam
Yeah, I'm happy with my 14700K... not really interested in the best-of-the-best binning. But hey, if people are into that, then it's their money I guess. And if it helps Intel with margins then that's good too. It's bad for everybody if there's only one player in this field. But Intel can't just keep upping clock speeds; I sure hope Arrow Lake amazes... it really has to, to get Intel out of the hole they're in right now.
#63
dgianstefani
TPU Proofreader
iameatingjamYeah, I'm happy with my 14700K... not really interested in the best-of-the-best binning. But hey, if people are into that, then it's their money I guess. And if it helps Intel with margins then that's good too. It's bad for everybody if there's only one player in this field. But Intel can't just keep upping clock speeds; I sure hope Arrow Lake amazes... it really has to, to get Intel out of the hole they're in right now.
Hole? They have almost twice AMD's market share in CPUs sold.
Lew ZealandI'm using a Kill a Watt from the wall as I don't trust the difference in CPU power reporting from the 2 brands. And in the end, total system idle is what I'm interested in. Generally other equipment is similar with an NVMe SSD and an AMD GPU except for those Nvidia-AMD GPU power difference tests on Intel*. Intel Power Gadget (RIP) does seem to report those power differences properly and I use it for CPU power tracking on Intel chips as I see BIOS and Throttlestop TDP changes reflected accurately to the watt once the Intel CPU exits Tau.

I do not put much trust in the idle power reporting from any of those websites, because it's very clearly dependent on a variety of factors, so which one, if any, is "real"? And when W1zz does his tests here, who's to say they will match the average user's experience? W1zz's tests will be great, as the community here can test and input their findings too, and having a large number of people testing may help pin down best practices for minimizing idle power on both platforms.

* As total system idle is what I'm interested in, Intel Power Gadget's (CPU-only) reported reduction in idle power when swapping from Nvidia to AMD GPU was greater than the observed system idle power reduction. My assumption is that the AMD GPU is making up the difference with slightly higher idle power use while the AMD driver (somehow?) puts the Intel CPU into a lower power idle state. I'll take the nice net reduction.



Those seem to be numbers I expect, especially from Intel. Almost all Intel CPUs I've used can be coaxed into a sub-2W idle, but I don't know enough about the tech details to know how it's done. Another example of this: my i5-8400 would not go below ~13W with any changes I was willing to make in BIOS, however a Dell Optiplex with an i5-8500 at work idles at barely over 1 watt.

The CPU can do it and Dell's BIOS is set to do it, but I just don't know what changes to make on my machine to implement it. That said, I still have never used my 8400 with its iGPU; maybe only running it that way is the "answer", but I suspect there are other solutions. One bit of funny business I've observed (with Intel Power Gadget reporting) is that when the (Nvidia) GPU is being used for light work like watching a video, my 8400 will go as low as 3.5W, but after that demand is removed, in lockstep with the GPU reducing its power usage the 8400 goes right back to ~13W. It seems some power states are being managed there, but I dunno how to invoke them full time.
The AMD graphics driver has slightly less CPU overhead than NVIDIA's.

There are other downsides to the approach they take to achieve this, though.
#64
Lew Zealand
dgianstefaniThe AMD graphics driver has slightly less CPU overhead than NVIDIA's.

There are other downsides to the approach they take to achieve this, though.
While true, this shouldn't produce a 24% increase in total idle power and a 225% increase in CPU-only idle over AMD's GPU driver. The CPU seems to be going into a different power state: when the Nvidia GPU gets a light load vs. idle, the CPU power is reduced by a similar amount, to almost match the CPU idle power with the AMD GPU. I'm pretty sure this CPU state could be set the same way with an Nvidia driver at idle, I'd just love to see how.
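
Those percentages are consistent with the figures reported earlier in the thread (25W vs. 31W total system idle, and roughly 4W vs. 13W CPU-only); the pairing is an inference on my part, but the arithmetic is a simple relative increase:

```python
# Back-of-envelope check of the percentages above against numbers reported
# earlier in the thread (25W vs 31W system idle; ~4W vs ~13W CPU-only).
# The pairing of figures is my assumption, not something stated outright.
def pct_increase(low: float, high: float) -> float:
    return (high - low) / low * 100

print(f"system idle: +{pct_increase(25, 31):.0f}%")  # ~24%
print(f"CPU-only:    +{pct_increase(4, 13):.0f}%")   # ~225%
```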
#65
iameatingjam
dgianstefaniHole? They have almost twice AMD's market share in CPUs sold.
Yeah, hole. They've got a bit of a hole to dig themselves out of to get back on top.

Hey man, I'm on team Intel here, but it can't just be me who's noticed that Intel's reputation in the build-it-yourself community has not been doing so great lately. I don't totally get it myself. Sure, 14th gen didn't bring much new to the table, but it did bring 13900K performance down to the i7s, which is really fantastic for people like me... who really shouldn't be buying flagship CPUs, but still get nearly that performance for $400. I'm pretty happy with that.

But that being said, there is a huge rush of consumers going straight for AMD/X3D chips as of late. You're going to tell me Intel has nothing to worry about in this market? I mean, I appreciate the optimism, but it just doesn't sound like reality to me, sorry. Maybe there's a miscommunication somewhere.
#66
dgianstefani
TPU Proofreader
iameatingjamYeah, hole. They've got a bit of a hole to dig themselves out of to get back on top.

Hey man, I'm on team Intel here, but it can't just be me who's noticed that Intel's reputation in the build-it-yourself community has not been doing so great lately. I don't totally get it myself. Sure, 14th gen didn't bring much new to the table, but it did bring 13900K performance down to the i7s, which is really fantastic for people like me... who really shouldn't be buying flagship CPUs, but still get nearly that performance for $400. I'm pretty happy with that.

But that being said, there is a huge rush of consumers going straight for AMD/X3D chips as of late. You're going to tell me Intel has nothing to worry about in this market? I mean, I appreciate the optimism, but it just doesn't sound like reality to me, sorry. Maybe there's a miscommunication somewhere.
It's very much a perception issue, not a competitive issue, in my opinion. Like with this thread, where people look at the red herring of peak synthetic power draw numbers and conclude Intel = inefficient, when the reality is that at any given moment an Intel chip is more likely to be using less power than an equivalent AMD one. There's also the fact that they're using an inferior process and what's basically a tweaked three-year-old architecture, yet still have the IPC advantage and competitive gaming performance.

Besides, while it's true what you've mentioned about the build-it-yourself community, that's very much the minority in terms of actual chips sold, e.g. laptops, business, prebuilts, etc. Intel also has what seems to be a contemporary fab business, arguably better than Samsung's anyway and maybe competitive with TSMC if things pan out. I'd argue this is a better situation than AMD's, which has to fight for second dibs on TSMC capacity behind Apple, has some mindshare for CPUs amongst YouTubers and techheads, but is a massive minority in GPUs and can't even get AI features/upscaling to work better over three generations than Intel did in one.

Intel stock is popular in the private communities I'm in for these reasons and more.

I'm also very impressed with Intel drivers and the pace of their improvement within a single generation, open-source development, their Linux flavour "Clear Linux" etc.

Not to mention a stock Intel chip running at almost 400 W under an AIO still runs cooler than a Zen 4 chip, which peaks at 95°C even at 100 W.

www.techpowerup.com/320061/unreleased-intel-core-i9-14900ks-already-de-lidded-10-c-temperature-drop-on-offer

(He tested it before and after delidding).

With both backside power delivery and RibbonFET/gate-all-around transistors coming with 15th gen, besides a more advanced version of the chiplets AMD has (Foveros with tiles), I think Intel is going to come out of this perception swinging, with the momentum a company of their heritage can muster.
#67
iameatingjam
dgianstefaniIt's very much a perception issue, not a competitive issue, in my opinion. Like with this thread, where people look at the red herring of peak synthetic power draw numbers and conclude Intel = inefficient, when the reality is that at any given moment an Intel chip is more likely to be using less power than an equivalent AMD one. There's also the fact that they're using an inferior process and what's basically a tweaked three-year-old architecture, yet still have the IPC advantage and competitive gaming performance.

Besides, while it's true what you've mentioned about the build-it-yourself community, that's very much the minority in terms of actual chips sold, e.g. laptops, business, prebuilts, etc. Intel also has what seems to be a contemporary fab business, arguably better than Samsung's anyway and maybe competitive with TSMC if things pan out. I'd argue this is a better situation than AMD's, which has to fight for second dibs on TSMC capacity behind Apple, has some mindshare for CPUs amongst YouTubers and techheads, but is a massive minority in GPUs and can't even get AI features/upscaling to work better over three generations than Intel did in one.

Intel stock is popular in the private communities I'm in for these reasons and more.

I'm also very impressed with Intel drivers and the pace of their improvement within a single generation, open-source development, their Linux flavour "Clear Linux" etc.

Not to mention a stock Intel chip running at almost 400 W under an AIO still runs cooler than a Zen 4 chip, which peaks at 95°C even at 100 W.

www.techpowerup.com/320061/unreleased-intel-core-i9-14900ks-already-de-lidded-10-c-temperature-drop-on-offer

(He tested it before and after delidding).
I don't necessarily disagree with anything you're saying. I grew up on Intel and have barely known anything else (except some of the early Motorolas and PowerPCs, story for another time) and I want them to succeed. The OEM market is bigger than the build-it-yourself market, true. But the build-it-yourself market, I think, has more influence on people's buying decisions, due to influencers and whatnot. Intel's got a chance here to be one of the first to get ASML's new machine... that should at least give them the opportunity to pull out ahead, assuming they don't mess something up (which is a pretty big risk when we're talking designs so complicated it's hard to get your head around them).

Anyway, please don't see me as an enemy. I am a friend who wants to see Intel succeed. I've just seen some... questionable moves in the past, so it's all up in the air at the moment. But with Intel, from what I understand, being one of the first to get ASML's new machine, if everything goes right, they really could push ahead.

I'm wondering if Intel is going to go the AMD route with productivity CPUs and gaming CPUs. I mean, it does kind of make sense from where AMD is sitting. Why have 16 cores when games only need 8? Let's use that extra space for cache that actually matters in games.

At the same time, it's nice to be able to pick up a 14700K or 14900K or even 14900KS and be like, "Yeah, this can do everything."

Idk, we'll have to wait and see.
#68
dgianstefani
TPU Proofreader
From what I understand, Foveros doesn't have the inefficiency or latency issues the Zen chiplet + IO die design has, and should actually be more efficient than Intel's current monolithic architecture, while also being easier to iterate with, as they can simply replace a tile with an updated one.
iameatingjamI don't necessarily disagree with anything you're saying. I grew up on Intel and have barely known anything else (except some of the early Motorolas and PowerPCs, story for another time) and I want them to succeed. The OEM market is bigger than the build-it-yourself market, true. But the build-it-yourself market, I think, has more influence on people's buying decisions, due to influencers and whatnot. Intel's got a chance here to be one of the first to get ASML's new machine... that should at least give them the opportunity to pull out ahead, assuming they don't mess something up (which is a pretty big risk when we're talking designs so complicated it's hard to get your head around them).

Anyway, please don't see me as an enemy. I am a friend who wants to see Intel succeed. I've just seen some... questionable moves in the past, so it's all up in the air at the moment. But with Intel, from what I understand, being one of the first to get ASML's new machine, if everything goes right, they really could push ahead.

I'm wondering if Intel is going to go the AMD route with productivity CPUs and gaming CPUs. I mean, it does kind of make sense from where AMD is sitting. Why have 16 cores when games only need 8? Let's use that extra space for cache that actually matters in games.

At the same time, it's nice to be able to pick up a 14700K or 14900K or even 14900KS and be like, "Yeah, this can do everything."

Idk, we'll have to wait and see.
I don't consider anyone on these forums an enemy, but I also don't hold back in discussion, so people confuse that with emotion.
#69
fevgatos
Lew ZealandClaimed Ryzen 7000 power draw is all over the place in reviews. Some say they are similar, with Ryzen overall a bit higher than 13th and 14th Gen Core:

CPU only:

Whole system:

What I've found is most/all non-K Intel CPUs can idle lower than 5W, but different circumstances in the OS can change that to 15-20W at idle.

For example: Dell Optiplex 9020 i7-4790.

Intel iGPU drivers: 15W idle
Nvidia drivers: 15W idle
AMD drivers: 4W idle

These are GPU drivers so WTF?
Even the K ones idle at 2-3 watts on the balanced power plan. The guy in that graph might be using high performance.
Lew ZealandSo in other words, idle power is less a factor of a CPU and more about what you surround it with.

I just tested my main gaming machine which has a Zen 3 5800X3D and 6800 XT but on a B450 with only 3200 MHz CL16 RAM and it idles at 41W. I don't see the 5800X3D's idle total system power at 41W in any of the reviews in this thread which suggests that you can't make specific claims unless you match equipment exactly to the reviewed system. Which people rarely (lol, never) do when building a system.

Looks like Zen 4 idles at higher power but how much higher if you set up your system with idle power in mind? Again comparing to setting up an Intel system similarly.
A 5800X at JEDEC at true idle was drawing 22W measured from the cables. With high-speed RAM it hit 28W. Balanced power plan, btw. Keep in mind that dual-CCD CPUs draw more than that.
#70
iameatingjam
dgianstefaniI don't consider anyone on these forums an enemy, but I also don't hold back in discussion, so people confuse that with emotion.
Not saying you did. Just making sure you know where I'm coming from (a place of love for Intel and technology in general), listening to where you're coming from, finding common ground, and not letting things escalate out of control like they all too often do in this space... Sigh... :p

Miscommunications turn into tension, tension turns into arguments, arguments turn into flame wars, and before you know it, it's just another red vs. green thread. That's always the end of a productive conversation lol. (Kind of kidding, kind of not.) It used to be when mustache man made an appearance; now it's when red team/green team voice their favorites for the upcoming chariot races, erm, I mean computer components... yeah, that.


Anyway, I never meant in the slightest to put Intel down, aside from mentioning some hard truths that the company (and the fans, for that matter) will have to get over, and I have a feeling they will. Maybe it's just marketing hype, but I feel like Intel's about to go on an upward swing... (oh god, please don't let this age too badly).
#71
fevgatos
RandallFlaggAnd from TechRadar - everyone looks at the big bar, but probably spends 95% of their time at the little bar (CPU only):

And more importantly, people ignore that the big bar can be fixed with 3 clicks in the BIOS. The small bar cannot.
#72
bug
fevgatosIn what way? If they are just 5% better binned than the average K then these are useless. How will you know about the bin quality without an actual review?
In the way where these are built for people who buy the top of the line no matter what. If you care about efficiency, power draw and such even a bit, these aren't for you.
Even "better bin" is a subjective term when it comes to these. The extra 100-200 MHz these would see can easily be negated by luck of the draw, or simply variance in case airflow.
#73
Veseleil
RandallFlaggYes but you have to normalize components, and I'm not sure that the i5-8400 was a true match to an R5-2600. Seems like that would have been an i5-8600. Also memory, 2933 is obv a different kit, and higher clock.



My direct examples came from Vortez.net; just look under Reviews -> CPUs/Motherboards, pick a motherboard review of interest, and look under power consumption.

This is just the easiest way I've seen to directly compare the same CPU's idle consumption on different motherboards.

There are a lot of other, more anecdotal comparisons though. I haven't seen very many that don't show the same trend, just typically you don't know the details. For example (7950X review):

And from TechRadar - everyone looks at the big bar, but probably spends 95% of their time at the little bar (CPU only):

Lew ZealandI'm using a Kill a Watt from the wall as I don't trust the difference in CPU power reporting from the 2 brands. And in the end, total system idle is what I'm interested in. Generally other equipment is similar with an NVMe SSD and an AMD GPU except for those Nvidia-AMD GPU power difference tests on Intel*. Intel Power Gadget (RIP) does seem to report those power differences properly and I use it for CPU power tracking on Intel chips as I see BIOS and Throttlestop TDP changes reflected accurately to the watt once the Intel CPU exits Tau.

I do not put much trust in the idle power reporting from any of those websites because it's very clearly dependent on a variety of factors, so which one if any is "real"? And when W1zz does his tests here, who's to say they will match the average user's experience? W1zz's tests will be great as the community here can test and input their findings too and having a large number of people testing may help pin down best practices for minimizing idle power on both platforms.

* As total system idle is what I'm interested in, Intel Power Gadget's (CPU-only) reported reduction in idle power when swapping from Nvidia to AMD GPU was greater than the observed system idle power reduction. My assumption is that the AMD GPU is making up the difference with slightly higher idle power use while the AMD driver (somehow?) puts the Intel CPU into a lower power idle state. I'll take the nice net reduction.



Those seem to be numbers I expect, especially from Intel. Almost all Intel CPUs I've used can be coaxed into a sub-2W idle, but I don't know enough about the tech details to know how it's done. Another example of this: my i5-8400 would not go below ~13W with any changes I was willing to make in BIOS, however a Dell Optiplex with an i5-8500 at work idles at barely over 1 watt.

The CPU can do it and Dell's BIOS is set to do it, but I just don't know what changes to make on my machine to implement it. That said, I still have never used my 8400 with its iGPU; maybe only running it that way is the "answer", but I suspect there are other solutions. One bit of funny business I've observed (with Intel Power Gadget reporting) is that when the (Nvidia) GPU is being used for light work like watching a video, my 8400 will go as low as 3.5W, but after that demand is removed, in lockstep with the GPU reducing its power usage the 8400 goes right back to ~13W. It seems some power states are being managed there, but I dunno how to invoke them full time.
My watt meter shows 80-120W idle consumption, depending on the AMD GPU driver's mood and the number of monitors used, plugged in, etc. And I can feel that heat coming from the exhaust, so those values might be close to accurate. Let's count 30W for the CPU and 11W for the GPU (trust HWiNFO for now); that's 41W. If the total power draw is 80W, is it possible that 8 case fans, 2 CPU fans, a ~150cm RGB LED (5050 diodes) strip, 2x3TB HDDs, and 1 NVMe SSD use only ~40W at idle?
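
As a rough plausibility check, ballpark per-component figures (all assumptions, not measurements) plus PSU efficiency losses at low load do land in about that range:

```python
# Ballpark sanity check. Every figure below is an assumed typical draw,
# not a measurement, just to see whether ~40W of "everything else" is
# plausible for this build.
components_w = {
    "8 case fans (~1.5W each)":          8 * 1.5,
    "2 CPU fans (~1.5W each)":           2 * 1.5,
    "150cm RGB strip (~6W/m when lit)":  1.5 * 6,
    "2x 3TB HDDs (~5W idle each)":       2 * 5,
    "NVMe SSD at idle":                  0.5,
}
subtotal = sum(components_w.values())
# A typical ATX PSU might be only ~80% efficient at very low load,
# which inflates the reading at the wall.
print(f"DC subtotal: {subtotal:.1f}W -> at the wall: ~{subtotal / 0.8:.0f}W")
```

That works out to roughly 34.5W of DC load, or around 43W at the wall, so ~40W for the rest of the system is entirely plausible with the strip lit and the HDDs spinning.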
#74
fevgatos
bugIn the way where these are built for people who buy the top of the line no matter what. If you care about efficiency, power draw and such even a bit, these aren't for you.
Even "better bin" is a subjective term when it comes to these. The extra 100-200 MHz these would see can easily be negated by luck of the draw, or simply variance in case airflow.
These are the perfect CPUs for those who care about efficiency, actually. Better bins = less power at the same clock speeds. This will be significantly more efficient than e.g. a 13900K.
#75
bug
fevgatosThese are the perfect CPUs for those who care about efficiency, actually. Better bins = less power at the same clock speeds. This will be significantly more efficient than e.g. a 13900K.
Even if a significant proportion of the prospective buyers cared about that, I don't expect efficiency to be 10% better than the non-KS model's. But we'll never have the benches to prove it either way, so let's just stop here.