
Intel Core i9-14900KS Draws as much as 409W at Stock Speeds with Power Limits Unlocked

TjMax also isn't just a spec'd performance or efficiency parameter; it's a safety parameter.
Exactly! I remember this argument during the Zen 4 launch, when AMD said that running it at 95 °C is fine, which some people misinterpreted as "the CPU boosts to 95 °C by design". No, it freaking does not. It only does if your cooling is inadequate for its power limit or voltage-frequency curve... said by someone who owns a 7800X3D that never goes above 82 °C under full load with PBO enabled.

Simple. Defaults. Default Clocks/Boosts, default power limits, default everything.
Default by Intel, or default by the motherboard?
 
Not sure if it's a troll or a kid (absoluteUnit), but I doubt anyone interested will be running this at stock settings, unless they have no idea how to tweak a system.
Sounds like the kind of person who buys the most expensive MB/CPU/RAM, loads the XMP profile, and then goes into forums to show how "good" he is, unaware of what it actually looks like.

On a serious note, this is a product that exists for two reasons:
  1. performance crown
  2. overclockers
You can easily run the 14900KS at whatever power you want. It might actually be a high-leakage chip that's better suited to high clocks and high voltages, with worse power efficiency at low power.
We are flooded with BIOS options you can adjust to your use case, be it high 1T boosts, 24/7 efficiency, or anything else you want to target.
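
For what it's worth, you don't even need the BIOS for the basic power-limit part. Here's a minimal sketch, assuming a Linux box with the intel_rapl driver loaded (the sysfs path and constraint numbering below are typical, but check your own system), of capping PL1/PL2 from the OS; run it as root:

```python
# Minimal sketch: capping package power limits (PL1/PL2) via the Linux
# intel_rapl powercap sysfs interface instead of the BIOS. Assumes the
# intel-rapl:0 package domain exists and that constraint 0 = long-term (PL1)
# and constraint 1 = short-term (PL2), which is typical on Intel systems.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package-0 power domain

def set_power_limit(constraint: int, watts: float) -> None:
    """Write a limit for the given constraint, converting watts to microwatts."""
    path = RAPL / f"constraint_{constraint}_power_limit_uw"
    path.write_text(str(int(watts * 1_000_000)))

def get_power_limit(constraint: int) -> float:
    """Read a limit back, in watts."""
    path = RAPL / f"constraint_{constraint}_power_limit_uw"
    return int(path.read_text()) / 1_000_000

if __name__ == "__main__":
    set_power_limit(0, 125)   # PL1: sustained limit, e.g. 125 W
    set_power_limit(1, 188)   # PL2: burst limit, e.g. 188 W
    print(f"PL1 = {get_power_limit(0):.0f} W, PL2 = {get_power_limit(1):.0f} W")
```

The same chip behaves like a completely different product depending on the two numbers you write there, which is the whole point: the unlocked defaults are just one spot on the curve.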

So if you are getting frame drops on a modern Intel or AMD platform in games that reach >200 fps, you clearly have a lack of knowledge and should read some guides, or buy a locked CPU; it will save you money.

The kid/troll does have a point, though: AMD's turbo implementation requires a few extra steps to make it "compatible" with older game engines.

It's too depressing that every YouTuber who calls themselves an "expert" these days just enables a setting in the BIOS, and nobody bothers to ask some pro overclockers for help with tweaking settings (not talking about RAM timings) like power savings, C-states, Windows tweaks, or even BCLK overclocks, which would be even more interesting to see.
Some settings can give you serious responsiveness improvements in games.

At least the popular YouTubers should put in some effort, since motherboard vendors would definitely sponsor showcases of their expensive external-clock-generator motherboards.

LTT and GN specifically could run a parallel project where they do this; they have the money, time and personnel to spare.
What's the point of showing us an i3 or Ryzen 3 on a $500 motherboard? At least der8auer and buildzoid release interesting findings to a degree, even for older platforms.
 
I paid about $660 plus taxes for a 14700K, 4x16GB SR 7000 MT/s CL34 system memory, and an Asus ROG Strix Z790-H motherboard. I'm not feeling bad at all about that when the Threadripper 1950X alone cost $999 in 2017, and that doesn't even take inflation into account. On top of that, the performance absolutely embarrasses the Threadripper, which didn't exactly sip power itself at a 180 W TDP. Populating its PCIe lanes with GPUs that chug more power than a CPU wasn't cheap either. Anyone remember triple/quad SLI and Titans? Yeah, exactly, no one wants to remember it. I wonder how many of the people complaining ran SLI/CF at one point or another on a mid-range or high-end GPU, for a less-than-stellar uplift filled with micro-stutter.

Even if you aren't entirely fond of the 14700K's stock power draw with power limits lifted, it's hard not to concede that I got amazing value for the dollar. At the same time, it's easy enough to tame its power pretty significantly without sacrificing a huge amount of performance, especially depending on intended usage. It's pretty much a breeze to favor MT or ST, whichever you need more of, and bring down general power draw a good bit by focusing on one or the other, once you understand how to go about it.

I capitalized on a handful of great deals, much like 1080 Ti owners did; those were pretty killer value for a high-end GPU at that point in time. As the saying goes, seize the day. These were the parts I most needed to update, and at bargain pricing, so why would I have chosen differently, and worse value for the dollar, just to appease brand loyalists or a negligible difference in power draw in a scenario that isn't typical of daily usage in the first place? That would've been a really dumb decision, quite honestly. It can sip power with some easy enough BIOS tuning, and with what I saved I can readily afford to sacrifice a bit of performance to do so. It can consume a lot in the worst case while performing a lot of work, or it can easily sip power; plus its ST performance in particular is outstanding, and can remain so while sipping power.

I'd drop in a 14900KS in a heartbeat if I didn't have to pay for it, even though my 240 AIO would struggle more to keep it cool and not throttling. It wouldn't be hard to tune 4 to 8 of the E-cores to keep power consumption fairly in line with a 14700K. I could easily do that and stretch ST/MT higher than a 14700K at the same power limits, because it's better silicon quality and has 4 additional cores/threads with more cache available. It doesn't need to consume that much at peak to still outperform a 14700K. It's not a whole lot different from a 14900K in reality, just a continuation: higher power draw in exchange for a bit more performance. With the same power limits I'm running on the 14700K, a 14900KS could probably beat the 7950X in both ST and MT, I'm presuming, but might struggle to do so on a 240 AIO, to be fair. Package temps with another 4 E-cores have to be a little more dicey at load for a 240 AIO, but it does have more cache and threads, so it's got upside in those areas and more to work with on tuning that a savvy tech user can leverage.

It's not a chip aimed at users who aren't at least a bit tech-savvy in the first place. It's a high-end chip, which is generally aimed at more tech-savvy users who can afford such technology and are willing to pay a bit more for it. You don't need to be tech-savvy to buy one, but I don't think an overabundance of not-so-tech-savvy people are going to buy a chip this high-end. Some will, naturally, but the majority won't.
 
Such a chip would destroy itself at these frequencies and power draws in minutes.
Leakage doesn't depend on frequency. Temperature, on the other hand, does have an influence on leakage.
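To put numbers behind that, the standard first-order CMOS power model (with $\alpha$ the activity factor and $C$ the switched capacitance) separates the two contributions:

```latex
P_{\text{total}} \approx \underbrace{\alpha\, C\, V^{2} f}_{\text{dynamic}}
\;+\; \underbrace{V \cdot I_{\text{leak}}(V,\, T)}_{\text{static (leakage)}}
```

Frequency only appears in the dynamic term; the leakage current has no direct $f$ dependence, but its subthreshold component grows roughly exponentially with temperature, so higher clocks raise leakage only indirectly, through the extra voltage and heat they bring.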
 
Hahahaha Intel and Nvidia battling for the crown of most wasteful chip?

Mother of god
The 4090 is a pretty efficient chip when tested watt for watt against a 3090: lock both cards to 250 W/350 W and the 4090 is still a lot faster.

Why the hell anyone would buy a 400 W CPU over AMD's current offerings I'll never know, as AMD is still damn fast but sips power, so there's less heat to deal with and it's far easier to cool.
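
Locking both cards to the same wattage for that kind of watt-for-watt test is easy to script, by the way. A rough sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package; needs admin rights, and the target has to sit inside the range the VBIOS allows):

```python
# Rough sketch: cap a GPU's power limit via NVML before benchmarking,
# so two cards can be compared at the same wattage (e.g. 250 W).
import pynvml

def lock_power_limit(gpu_index: int, watts: int) -> None:
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    # NVML reports limits in milliwatts; clamp to the VBIOS-allowed range.
    lo_mw, hi_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target_mw = max(lo_mw, min(hi_mw, watts * 1000))
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"GPU {gpu_index} capped at {target_mw / 1000:.0f} W")
    pynvml.nvmlShutdown()

if __name__ == "__main__":
    lock_power_limit(0, 250)  # then run the same benchmark on each card
```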
 
I'll probably be buying one, sadly (I'm a sucker for good Intel silicon). On the upside, it's going to be a good test of the new CPU loop I'll be building :)
Hi,
Don't need much more than a chiller hehe
 
400 W+ is basically what the 14900K does with power limits lifted.

I think it's quite impressive that Intel's manufacturing process is capable of handling this level of power. I like seeing things pushed to the max as a window shopper, but I'm not equally enthused to get my hands wet with it (or burnt). A nice collector's piece at best.
 
The 4090 is a pretty efficient chip when tested watt for watt against a 3090: lock both cards to 250 W/350 W and the 4090 is still a lot faster.

Why the hell anyone would buy a 400 W CPU over AMD's current offerings I'll never know, as AMD is still damn fast but sips power, so there's less heat to deal with and it's far easier to cool.

I don't really see the difference; the 4090 is pretty much equally out of hand on excessive TDP at default. Likewise, at the same performance levels relative to earlier hardware, things look much better in general with power limits. The reasons people want it are the same as with an RTX 4090: to get more work done, more quickly. I don't see how that's complicated to figure out. Not everything is about gaming alone or one particular workload. Hell, even in gaming there are instances where Intel Raptor Lake pulls ahead of AMD X3D with all that cache, because ST and MT performance are still relevant, along with system memory performance; and its IMC's supported speeds are higher, and it's known to OC well above those speeds.

I'd be shocked if most 14700K chips can't do at least 6600 MT/s on average with a decent memory kit and board, and that's across 4 DIMMs, let alone 2 DIMMs. Who knows, maybe I struck the silicon lottery on my 14700K's IMC with 7000 MT/s CL30 on 4 DIMMs, but perhaps not; that might be typical or even below average, provided you know how to tune the memory timings and voltages to run 7000 MT/s in the first place and tighten things up to CL30.

It is a bit excessive at 400 W, and I'd probably just get a 14900K myself if anything, because it's modestly cheaper and isn't as extreme in power draw and cooling requirements, but still has the additional 4 E-cores and cache upside over a 14700K. I don't plan to upgrade my 14700K to a 14900K, mind you, because that would be foolish, but if offered a 14900K to replace the 14700K I'd 100% do so: it's the better CPU in silicon bin quality between the two, power limits exist, CPU ratios are adjustable, and I know how to adjust them just fine. I could maintain the same power draw and get higher performance, and if I wanted performance at higher power draw, it's fully capable of providing it.

I mean, more work performed tends to consume more power in general, though node advancements and such can offset that, so the future will always tend to look brighter with technology advancements, until we really slam our heads against the wall on low-hanging fruit. We're getting closer and closer to that point, unless we get a breakthrough that changes things pretty significantly.

It's a lot of wattage, but that's at peak power and performance, and if you omit cache-driven scenarios or AVX-512, name a better-performing CPU for consumer desktops. Sure, there are the much more expensive Threadrippers and Epycs, but those aren't in the same price tier, so that's not saying a lot. Raptor Lake crushes ST, and a 14700K's MT really isn't far behind a 7950X, let alone a 14900K or 14900KS with an additional 4 E-cores, more cache, and better binning.

What it offers for what it costs relative to the competition, and not simply in one scenario like gaming or, more appropriately, rendering, is an important factor. Also, that scenario is GPU-driven anyway, and I can easily see stacked cache on GPUs, along with faster VRAM and more capacity, negating much of the need for stacked cache on CPUs. They're amazing chips for DAWs: larger projects at lower ASIO buffer settings, higher mixer playback quality, and quicker render times are all good perks.

It's also not as if Nvidia and AMD aren't literally chugging power at the high end of GPUs, more so than with CPUs, and haven't been known to. Hell, we used to have quad SLI/CF systems, FFS, so it's a bit ironic given the overall level of criticism pointed at Intel over power draw. A CPU tends to be more multipurpose than a GPU overall, but all some individuals care about is gaming, and that isn't the case for everyone. Get the technology that best suits you and worry a bit less about others.

It's fine enough to say 400 W is pushing it a bit and excessive, but that applies just as much to GPUs. These chips generally won't be running at that wattage for extended periods anyway for most people buying them. If anything, a GPU is more liable to run at full power draw more of the time than a CPU, because gamers have a bit of an unhealthy obsession with having the highest levels of performance in general, which is why quad SLI/CF existed for such a long period of time.
 
The 4090 is a pretty efficient chip when tested watt for watt against a 3090: lock both cards to 250 W/350 W and the 4090 is still a lot faster.

Why the hell anyone would buy a 400 W CPU over AMD's current offerings I'll never know, as AMD is still damn fast but sips power, so there's less heat to deal with and it's far easier to cool.
If it's so efficient, they should've just left it at 300 W peak power. They didn't. They chose to go full Intel KS on that shit.
 
I love reading all the excuses for the existence of this waste of sand. It proves that with a bit of bullshit marketing magic (performance crown), and a pinch of fanboyism (it's not that bad, you can undervolt it), everything can be sold. :laugh:
 
... this waste of sand.
You have people tweaking their CPUs, worrying about tiny fractions of a volt or tens of megahertz of frequency; for these tweakers the CPU brings some real, measurable benefit. Other people simply need to have the best thing available. The 14900KS will sell to these people, and as long as they buy it, the sand was not wasted on them.
 
I love reading all the excuses for the existence of this waste of sand. It proves that with a bit of bullshit marketing magic (performance crown), and a pinch of fanboyism (it's not that bad, you can undervolt it), everything can be sold. :laugh:
I like to think of it this way... the over-the-top, well-binned and expensive CPUs for ultra enthusiasts are good for profit margins and, in a way, subsidize the more down-to-earth offerings for us plebs. From what I understand, a lot of Intel's mid-tier chips are barely profitable, if at all; the high-end chips balance the books. So if people want 6.2 GHz... then sell it to them! It only helps the rest of us. Nobody wants a monopoly here (I hope).
 
I love reading all the excuses for the existence of this waste of sand. It proves that with a bit of bullshit marketing magic (performance crown), and a pinch of fanboyism (it's not that bad, you can undervolt it), everything can be sold. :laugh:

Products like this were never meant to offer value relative to their cost.
This CPU has the same value to its specific target group as a... 11900KS (which doesn't exist).
It's not about performance. It's about the best performance within a specific gen/SKU, for the records, charts and benchmarks.

So, you would pay the ridiculous amount of money they ask if you were looking to break, let's say, an 8.5 GHz frequency record.
 
You have people tweaking their CPUs, worrying about tiny fractions of a volt or tens of megahertz of frequency; for these tweakers the CPU brings some real, measurable benefit. Other people simply need to have the best thing available. The 14900KS will sell to these people, and as long as they buy it, the sand was not wasted on them.
If you're looking for millivolts or tens of megahertz of improvement, or want to have "the best of the best" (bragging rights), and you're willing to pay serious cash to get it, then you're an idiot. Simple as that.

I like to think of it this way... the over-the-top, well-binned and expensive CPUs for ultra enthusiasts are good for profit margins and, in a way, subsidize the more down-to-earth offerings for us plebs. From what I understand, a lot of Intel's mid-tier chips are barely profitable, if at all; the high-end chips balance the books. So if people want 6.2 GHz... then sell it to them! It only helps the rest of us. Nobody wants a monopoly here (I hope).
I don't think it works that way. Mid-range is the best seller in any product line. Why would Intel sell it at a loss only to recoup the losses with low-volume sales at the ultra high-end? It doesn't make sense.
 
If you're looking for ... or want to have "the best of the best" (bragging rights), and you're willing to pay serious cash to get it, then you're an idiot.
The Federation of the Swiss Watch Industry categorically disagrees with you!
 
I don't think it works that way. Mid-range is the best seller in any product line. Why would Intel sell it at a loss only to recoup the losses with low-volume sales at the ultra high-end? It doesn't make sense.
Best seller... in volume. I mean, look... what would you do with defective chips? Throw them away? Or recoup some of the cost by binning them down, maintaining market share, and perhaps making a small margin?

Either way, unit margins are much stronger on the 14900K and 14900KS, especially when you consider that a lot of the lower-end SKUs go to system integrators, who get an additional discount. The 14900KS doesn't have to move in big numbers, nor does the 14600K have to be sold at a loss, for the 14900KS to be good for those buying a 14500 or something like it. It gives room for pricing flexibility that wouldn't necessarily be there without the higher-end, more profitable SKUs.

If people want to spend ridiculous amounts of money on the best-of-the-best binning, I really don't see the problem with that, as long as bottom feeders like me can continue to buy the middle-of-the-road stuff and still get most of the performance.
 
I love reading all the excuses for the existence of this waste of sand.
Then this CPU clearly isn't for you. I know people who will happily say: "Please get me one of these and take my money!"

The reality is: the best and fastest is the best and fastest. That's what this chip is. It ISN'T for everyone.
 
The arguments over this CPU are silly. It’s not meant to make sense from a practical standpoint. It exists for a very specific subset of people who want the nebulous “best” even if it comes at a great monetary cost and is not efficient in any way. Someone mentioned expensive watches and yeah, that kinda tracks. A better analogy would be buying a supercar with tons of power that can never be reasonably used on any road and daily driving it. Is it practical? No. Will it guzzle fuel like nobody’s business? Oh yeah. Can you get the same experience of pure driving pleasure from something like an MX-5 or, hell, the cheapest Caterham if you don’t care about practicality at all? Sure. But it won’t be “the best” and for some people that’s all that matters.
 
Imagine "The Best in 10,000", with a pure silver heatspreader, cap and pin.

Or "The Best in 1,000,000", with a heavy gold-plated silver heatspreader, wooden presentation box, wall plaque, cap and pin.

How much could they cost, and do you think Intel would have problems selling them?
 
The arguments over this CPU are silly. It’s not meant to make sense from a practical standpoint. It exists for a very specific subset of people who want the nebulous “best” even if it comes at a great monetary cost and is not efficient in any way. Someone mentioned expensive watches and yeah, that kinda tracks. A better analogy would be buying a supercar with tons of power that can never be reasonably used on any road and daily driving it. Is it practical? No. Will it guzzle fuel like nobody’s business? Oh yeah. Can you get the same experience of pure driving pleasure from something like an MX-5 or, hell, the cheapest Caterham if you don’t care about practicality at all? Sure. But it won’t be “the best” and for some people that’s all that matters.

I'm struggling with the supercar analogy. A 200 MHz bump vs a supercar... one pulls chicks all night long and the other pulls 2-3 more FPS and burns your house down, lol
 
I'm struggling with the supercar analogy. A 200 MHz bump vs a supercar... one pulls chicks all night long and the other pulls 2-3 more FPS and burns your house down, lol
At least with the KS you don’t have to worry about any STDs from all the chicks you pulled, so in that way it’s infinitely better than the supercar.
 