
Intel Core i9-13900K

I honestly feel like, given the backlash, this trend isn't going to continue but instead finally pop. We have yet to see AMD's new GPUs, but I almost feel like this is NetBurst and Fermi all over again, and that with the next platform jump we will start to see reductions in temps.

I personally think this will come in the form of IPC and architectural gains, not physical nodes. It's my belief anyway that the voltage reduction from node shrinking has passed the point of diminishing returns and we are simply too thermally dense now.

To correct this, imo, they will increase overall die size to spread this out, and/or they will (probably) work on E-core efficiency or increase their workload capability so as to keep the P-cores more inactive.

Just conjecture of course; I haven't finished my first coffee and I'm about to be late for a meeting, so I'm shooting in the dark tbh.
 
Amazing performance, no doubt, but at huge power draw and very high temperatures.
I think all AMD needs to do is cut their CPU prices to competitive points and have users enable Eco mode in the motherboard UEFI.
 
Der8auer posting interesting numbers.

 
I honestly feel like, given the backlash, this trend isn't going to continue but instead finally pop. We have yet to see AMD's new GPUs, but I almost feel like this is NetBurst and Fermi all over again, and that with the next platform jump we will start to see reductions in temps.

I personally think this will come in the form of IPC and architectural gains, not physical nodes. It's my belief anyway that the voltage reduction from node shrinking has passed the point of diminishing returns and we are simply too thermally dense now.

To correct this, imo, they will increase overall die size to spread this out, and/or they will (probably) work on E-core efficiency or increase their workload capability so as to keep the P-cores more inactive.

Just conjecture of course; I haven't finished my first coffee and I'm about to be late for a meeting, so I'm shooting in the dark tbh.
Probably for Intel but not AMD; that's because chiplets (or tiles) will necessitate lower clocks and probably result in lower temps for Intel, at least till they master the tech and get close to or above AMD in overall clock speeds. Once AMD takes the IPC crown, likely with Zen 5, they would lower the insane power draw at the top end as well, but then there's physics ~ all of the major chipmakers will have to lower clocks going forward. Aside from IPC and more cores, there's no way they can sell these chips to the retail (non-extreme) DIY market.

Der8auer posting interesting numbers.

View attachment 266353
The 7950X is slightly more efficient at "65 W" Eco mode.
 
What's interesting about this? Every review site that's tested the game has 13th gen leading in Far Cry 6.
The game doesn't matter; the point is that, in gaming, the 13900K even at stock uses similar amounts of power to the 7950X while giving much better performance.



The only issue I see with 13th gen is that mobo manufacturers remove power limits, and the resulting tune is way outside the efficiency curve, but then you could say the same about AMD.

Per-core OC with power limits enforced seems to be the ideal way to tune these chips.
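For anyone who'd rather enforce Intel's stock limits than trust board defaults, here's a minimal sketch using the Linux `intel_rapl` powercap sysfs interface. The domain path and constraint indices are typical but platform-dependent (check the `constraint_*_name` files first), and the 125 W / 253 W figures are Intel's published PL1/PL2 for the 13900K; writes need root.

```python
# Sketch: pinning PL1/PL2 back to Intel's stock values via Linux powercap.
# Assumption: intel-rapl:0 is the package domain on this system; verify with
# `cat /sys/class/powercap/intel-rapl:0/name` (should read "package-0").
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")

def to_uw(watts: float) -> int:
    """powercap expects limits in microwatts."""
    return int(watts * 1_000_000)

def set_limit(constraint: int, watts: float) -> None:
    # constraint_0 is normally the long-term limit (PL1),
    # constraint_1 the short-term limit (PL2)
    path = RAPL / f"constraint_{constraint}_power_limit_uw"
    path.write_text(str(to_uw(watts)))

# usage (as root):
#   set_limit(0, 125)  # PL1: the advertised 125 W processor base power
#   set_limit(1, 253)  # PL2: the 13900K's rated maximum turbo power
```

This does the same thing as setting the limits in the BIOS, just without a reboot; it doesn't survive power cycles unless reapplied.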
 
The game doesn't matter; the point is that, in gaming, the 13900K even at stock uses similar amounts of power to the 7950X while giving much better performance.

View attachment 266354

The only issue I see with 13th gen is that mobo manufacturers remove power limits, and the resulting tune is way outside the efficiency curve, but then you could say the same about AMD.

Per-core OC with power limits enforced seems to be the ideal way to tune these chips.

Again, every review has pointed this out? The 13900K leads, the 5800X3D is close by, and the 7000 series is a few more percent off.

In MT, AMD leads in efficiency; in ST, AMD and Intel are relatively close in efficiency.

Pretty much the story in every review I’ve read.
 
I'll keep my li'l old 5900X and 6800XT... I just upgraded my furnace and don't need another.
 
Again, every review has pointed this out? The 13900K leads, the 5800X3D is close by, and the 7000 series is a few more percent off.

In MT, AMD leads in efficiency; in ST, AMD and Intel are relatively close in efficiency.

Pretty much the story in every review I've read.
And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?
 
Would be interesting to see if my MSI Z690 Tomahawk DDR motherboard can cope with a 13900K, even at stock.

Currently got a 12600KF that maxes out at 199 W with no issues; should be OK with a mild OC on the new i9, say at 300 W?

Looking to buy one in 1-2 years' time when prices drop (potentially on the used market).
 
And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?
Since when was gaming the most important or only load? Yet it's all you're going on about.
My PC does 95% work and at best 5% gaming.
Commercial and enterprise PC use eclipses gaming use.
 
So basically, if you want peak performance, your man cave is a furnace and you still only gained, what, 10% over any other sane CPU?

Utterly pointless product. Efficiency is nice, but its limits for sure aren't; then again, it's a K model, so whatever. But still. The writing's on the wall, as it has been: to get more perf, you need more power, so basically that's all the development on big.LITTLE, end of story.

It's clear though that 125 W is a complete joke.
 
Since when was gaming the most important or only load? Yet it's all you're going on about.
My PC does 95% work and at best 5% gaming.
Commercial and enterprise PC use eclipses gaming use.
So use the Intel power limits that motherboard manufacturers disable by default? Profit.

So basically, if you want peak performance, your man cave is a furnace and you still only gained, what, 10% over any other sane CPU?

Utterly pointless product. Efficiency is nice, but its limits for sure aren't; then again, it's a K model, so whatever. But still. The writing's on the wall, as it has been: to get more perf, you need more power, so basically that's all the development on big.LITTLE, end of story.
Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?
 
So basically, if you want peak performance, your man cave is a furnace and you still only gained, what, 10% over any other sane CPU?

Utterly pointless product. Efficiency is nice, but its limits for sure aren't.
It depends on the main use. I believe this CPU isn't only for gaming, but for productivity and gaming. For gaming only, there are better options.

For example, I have a 9900K and I'm switching to a 13900K; in gaming the improvement is minimal, but in productivity... big improvements.
 
And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?

Because it uses more power than the 7950X (7000 series), and thus is also hotter. Those are facts, and anyone is free to express their opinion or concerns about power draw and temps.

You seem to be intentionally ignoring those facts to claim the opposite, as are other people here. Both Zen 4 and RPL are hot and power hungry in stock configs, and RPL is worse in those regards. They're all great CPUs performance-wise; that doesn't excuse the fact or change the objective truth.
 
It depends on the main use. I believe this CPU isn't only for gaming, but for productivity and gaming. For gaming only, there are better options.

For example, I have a 9900K and I'm switching to a 13900K; in gaming the improvement is minimal, but in productivity... big improvements.
I mean, it's not minimal. You're going from around 1,300-1,400 ST to more than 2,200 ST. That's a huge improvement for gaming workloads.

Because it uses more power than the 7950X (7000 series), and thus is also hotter. Those are facts, and anyone is free to express their opinion or concerns about power draw and temps.

You seem to be intentionally ignoring those facts to claim the opposite, as are other people here. Both Zen 4 and RPL are hot and power hungry in stock configs, and RPL is worse in those regards. They're all great CPUs performance-wise; that doesn't excuse the fact or change the objective truth.
It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".

And the same way that AMD has "eco mode" where the CPU is limited to 65 or 95 W, you can do this with Intel if you're so concerned with power draw or can't cool the CPU, without losing much performance (especially in lighter threaded loads), and generally still being faster than the previous generation.
 
Hi,
Gutsy using the EOL MX-5 :eek:
 
Yikes, that power consumption is too much. I don't understand why AMD and Intel think using brute force to take the performance lead, while suffering high temperatures and power usage, is a good idea. I'm now curious to see how the 13700K will perform, and I hope it's not as inefficient as this monstrosity.
Intel's locked CPUs are going to be a hit. Proof is in this vid.

 
Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?
What? That isn't the point. The point is how it gets its spot as top dog in Intel's stack: by adding ridiculous amounts of wattage.

Every half-wit understands you can tweak a K CPU. I even pointed it out in my post. Why is it the way it is out of the box? To provide the illusion of having a purpose at the top, at an absurd price point. Let's just be real about it, man. This isn't personal, even if you bought one or want to buy a whole box. It's about the company, the product, and the marketing. It is actually possible (I know this might be surprising to some, evident from your post) to feel a certain way about a product and then apply different considerations when it's about your own purchase/usage decisions. It's a stance called realism. Try it.

I feel the exact same way wrt AMD's latest temp target. Utterly ridiculous nonsense. Oh, I can undervolt it to work better? Fantastic, but why isn't it like that to begin with? Or is the definition of "better" simply no longer matching what the product presents to us? At least AMD still couples it with real efficiency gains gen to gen, but I wonder if that lasts.

The reality is, this is how we reap the fruits of a new gen now. We add some silicon/die size to make bigger what got smaller, and we yank even more volts through it. Such progress. There is also a reality where efficiency is key, because realistically nobody needs this performance in the consumer segment, and we could do with lower global power usage. A very unlikely trend, I know. But still, that, to me, seems more like progress in CPUs now that we're reaching a pinnacle in performance and an end to shrinks.

GPUs are much the same way. There is a good reason I'm still on this 1080, and it's not because there's no money to upgrade. I literally don't see why I should, even after a monitor upgrade.
 
I honestly feel like, given the backlash, this trend isn't going to continue but instead finally pop. We have yet to see AMD's new GPUs, but I almost feel like this is NetBurst and Fermi all over again, and that with the next platform jump we will start to see reductions in temps.

I personally think this will come in the form of IPC and architectural gains, not physical nodes. It's my belief anyway that the voltage reduction from node shrinking has passed the point of diminishing returns and we are simply too thermally dense now.

To correct this, imo, they will increase overall die size to spread this out, and/or they will (probably) work on E-core efficiency or increase their workload capability so as to keep the P-cores more inactive.

Just conjecture of course; I haven't finished my first coffee and I'm about to be late for a meeting, so I'm shooting in the dark tbh.
I don't think it will be that complicated. Both the Ryzen 7000 series and the Nvidia 4090 have shown that a tiny bit of underclocking makes a world of difference.

The 7950X loses a whopping 1-2 percent of performance, worst case maybe 5, and in exchange you limit power from 170 W to 65 W.

All these companies have to do is not clock things to the friggin' moon. One bin back, that's it, and you save hugely on power consumption.
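The "one bin back" argument can be sketched numerically: dynamic power scales roughly with V²·f, and voltage climbs steeply at the top of the V/f curve, so a small clock drop buys an outsized power saving. The V/f points and the scaling constant below are purely illustrative, not measured from any real chip:

```python
# Toy dynamic-power model: P ~ k * V^2 * f. All numbers are made up to
# illustrate the shape of the curve, not taken from a real 13900K/7950X.
def power_w(freq_ghz: float, volts: float, k: float = 30.0) -> float:
    return k * volts**2 * freq_ghz

p_top  = power_w(5.8, 1.40)  # chasing the last bin at high voltage
p_back = power_w(5.5, 1.25)  # one "bin" back at a much friendlier voltage

clock_loss = 1 - 5.5 / 5.8       # ~5% lower clock...
power_save = 1 - p_back / p_top  # ...for ~24% less power
print(f"{clock_loss:.1%} clock for {power_save:.1%} power")
```

Because the voltage term is squared, shaving the last few hundred MHz (and the voltage needed to hold them) saves far more power than the clocks it costs.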
 
So... what kind of temps did you get with the AIO cooler, or did I miss it?
 
I mean, it's not minimal. You're going from around 1,300-1,400 ST to more than 2,200 ST. That's a huge improvement for gaming workloads.


It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".

And the same way that AMD has "eco mode" where the CPU is limited to 65 or 95 W, you can do this with Intel if you're so concerned with power draw or can't cool the CPU, without losing much performance (especially in lighter threaded loads), and generally still being faster than the previous generation.

The objective truth is that your reading comprehension is lacking. Go take another look at the power consumption summary page in this review.

The only explicit data point that shows RPL, specifically the 13900K, using less power in a ST load is a graph of MP3 encoding.

Otherwise, across 12 games, the 13900K on average uses 31 W more than the 7950X. And across 45 recorded applications, it uses 45 W more than the 7950X.

You are objectively wrong. Cherry-picking one or two results from other reviews doesn't change the average across the board, or the summation of all reviews. The 13900K is hotter, uses more power, has a lead in gaming performance, and trades blows across MT apps while using more power on average (I need to restate this, as I feel it's needed).
 
What? That isn't the point. The point is how it gets its spot as top dog in Intel's stack: by adding ridiculous amounts of wattage.

Every half-wit understands you can tweak a K CPU. I even pointed it out in my post. Why is it the way it is out of the box? To provide the illusion of having a purpose at the top, at an absurd price point. Let's just be real about it, man. This isn't personal, even if you bought one or want to buy a whole box. It's about the company, the product, and the marketing. It is actually possible (I know this might be surprising to some, evident from your post) to feel a certain way about a product and then apply different considerations when it's about your own purchase/usage decisions. It's a stance called realism. Try it.

I feel the exact same way wrt AMD's latest temp target. Utterly ridiculous nonsense. Oh, I can undervolt it to work better? Fantastic, but why isn't it like that to begin with? Or is the definition of "better" simply no longer matching what the product presents to us? At least AMD still couples it with real efficiency gains gen to gen, but I wonder if that lasts.

The reality is, this is how we reap the fruits of a new gen now. We add some silicon/die size to make bigger what got smaller, and we yank even more volts through it. Such progress. There is also a reality where efficiency is key, because realistically nobody needs this performance in the consumer segment, and we could do with lower global power usage. A very unlikely trend, I know. But still, that, to me, seems more like progress in CPUs now that we're reaching a pinnacle in performance and an end to shrinks.

GPUs are much the same way. There is a good reason I'm still on this 1080, and it's not because there's no money to upgrade. I literally don't see why I should, even after a monitor upgrade.
If you don't like the K series and don't plan on tuning (why buy a K then?), then get the non-K versions, which will have very similar boost clocks but more reasonable power targets.

The objective truth is that your reading comprehension is lacking. Go take another look at the power consumption summary page in this review.

The only explicit data point that shows RPL, specifically the 13900K, using less power in a ST load is a graph of MP3 encoding.

Otherwise, across 12 games, the 13900K on average uses 31 W more than the 7950X. And across 45 recorded applications, it uses 45 W more than the 7950X.

You are objectively wrong. Cherry-picking one or two results from other reviews doesn't change the average across the board, or the summation of all reviews. The 13900K is hotter, uses more power, has a lead in gaming performance, and trades blows across MT apps while using more power on average (I need to restate this, as I feel it's needed).
Do you understand the concept of performance per watt? (Here come the MT synthetic load comparisons...)

Using more power is irrelevant if it also performs faster?

Have another look at the graphs, from the review and from Der8auer, who I consider to be the foremost expert on tuning. Then maybe think about criticising the reading comprehension of the person who proofed the article you're quoting.

It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".

And the same way that AMD has "eco mode" where the CPU is limited to 65 or 95 W, you can do this with Intel if you're so concerned with power draw or can't cool the CPU, without losing much performance (especially in lighter threaded loads), and generally still being faster than the previous generation.
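To make the perf/watt point concrete: efficiency is performance divided by power, so a chip can draw more watts and still be the more (or less) efficient one. The scores and wattages below are invented for illustration, not taken from any review:

```python
# Hypothetical numbers: chip A is faster but hungrier; chip B slower but leaner.
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

chip_a = perf_per_watt(40_000, 280)  # ~143 points/W
chip_b = perf_per_watt(38_000, 230)  # ~165 points/W
# chip B wins on efficiency despite losing the raw benchmark chart
```

This is why both sides of the argument can be "right" at once: the winner of the bar chart and the winner of the efficiency chart need not be the same part.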
 
If you don't like the K series and don't plan on tuning (why buy a K then?), then get the non-K versions, which will have very similar boost clocks but more reasonable power targets.


Do you understand the concept of performance per watt? (Here come the MT synthetic load comparisons...)

Using more power is irrelevant if it also performs faster?

Have another look at the graphs, from the review and from Der8auer, who I consider to be the foremost expert on tuning. Then maybe think about criticising the reading comprehension of the person who proofed the article you're quoting.
It's still strange that this K CPU is marketed as a 125 W CPU, which it really isn't.

Also, the perf/watt isn't better, it's worse: objectively worse at MT, and in ST-focused gaming the 5800X3D is still better, as is even a 7600X.

And note: this is not architecture. It's pure and plain stock power/clock settings. If you go above an Intel xx600K, you're in silly land in that regard.

 