
Intel Core i9-13900K

Hey, I hear winter is coming (we'll ignore the last two seasons of GoT for now), so this is a nice investment in a relatively cheap heater? Though I guess your electricity bills will probably bankrupt you as well :nutkick:

No, I'm pretty sure someone said 40% more at similar wattage.

I'd like to go full Greenpeace on you, but I'll just add that any sane person would; those who don't care for the environment don't deserve respect IMO!
I'm guessing you live in Europe, where electricity is scarce. Locked at 90 W, it still mopped the floor with AMD's new lineup when it comes to gaming. Intel's upcoming locked CPUs are going to be the dagger in AMD's heart.

 
Need to stop testing with a 3080 when the industry tests with the best GPU in the world, the 4090; the gap will be bigger in favour of Intel.
 
What, less than 5% at 1080p for significantly higher power draw and temperatures?! You're not playing games at this point, you're playing yourself.
It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.

I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.

[attached: screenshots of der8auer's 4090 results]
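To put those two figures together, here's a rough gaming perf-per-watt sketch using only the numbers quoted above; the 7950X's absolute gaming power draw is a placeholder (take the real value from the review's power charts), not a figure stated in this thread.

```python
# Rough relative gaming perf/W from the figures above: +10.5% performance for +31 W.
# ASSUMPTION: base_power_w is a placeholder for the 7950X's average gaming draw;
# substitute the actual value from the review's power-consumption chart.

def relative_perf_per_watt(perf_gain: float, extra_watts: float, base_power_w: float) -> float:
    """13900K perf/W divided by 7950X perf/W (1.0 = identical efficiency)."""
    perf_ratio = 1.0 + perf_gain
    power_ratio = (base_power_w + extra_watts) / base_power_w
    return perf_ratio / power_ratio

for base in (80.0, 100.0, 120.0):  # hypothetical 7950X gaming wattages
    ratio = relative_perf_per_watt(0.105, 31.0, base)
    print(f"assumed 7950X gaming draw {base:.0f} W -> 13900K relative gaming perf/W ~ {ratio:.2f}")
```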
 
So you're questioning the expertise of der8auer?
You cannot trust a single data point, no matter who it is from. Science 101.
He's a qualified engineer who advises motherboard, GPU and CPU manufacturers on products. What are your qualifications?
Well, for starters I can read the graphs TPU puts out. If a single review is taken as fact, then why do you disagree with your own website?

[attached: TPU review charts]

I'm challenging your assertions with evidence.

You are doing the same.

This is neither petty nor disingenuous.
Handwaving proof that disagrees with your own, and insinuating that readers of your website are complaining simply to complain because they are "not the target market", is both petty and disingenuous. That's already been written up in comments in this thread. Look at the way W1zzard responds to criticism vs the way you do; it's rather eye-opening. The way you responded to me, claiming that I was questioning the expertise of a reviewer because I don't take one data point as holy text, and asking what my qualifications are, is incredibly inappropriate for a TPU staff member, and it reflects poorly on this site as a whole.
 
The average Joe shouldn't be buying a K-series CPU. The i9-13900 (non-K) and other i5/i7/i9 chips will be an excellent choice for them, and will be what is used in most OEM systems that aren't marketed towards gamers. The average Joe isn't going to be building a PC either. The people who do, for the most part, know how to tune or to ask for advice.

I agree with your statement about pushing chips past their efficiency curves, but everyone (Intel, AMD, NVIDIA, etc.) is doing this. For one player to go the efficient route when the rest are going the performance route is bad marketing.

There's nothing BS about anything I've said.


Personally, I think if you have the kind of use case that can fully utilize and profit from a 24-core CPU, you probably also have the budget for a Quadro equivalent, or see the 24 GB frame buffer of the xx90-series cards as being worth whatever NVIDIA chooses to sell them for.

I think the main issue is that everyone is evaluating these chips from their own budgets and needs. If you're not an HFR gamer or a person who actually makes money from the processing power of their computer, this chip isn't for you.
I don't think it's for you or me to decide what people should buy though I do agree with your last point.
I think they could have done better though. It's a good chip, I'm clear on that; I just hoped they would have put a more intelligent workload and resource-utilisation algorithm and system in play, in-chip. AMD has machine-learning tech to optimise workflow, and it works quite well, if not perfectly. I hoped Intel would have done more than a trim and shrink, I suppose.
 
Nice review, and good job Intel nonetheless (bringing on the heat, pun intended /s), but I am glad I have a worthy upgrade on AM4 in the form of the 5800X3D if it continues to drop in price and reaches sub-450 CHF (it's quite close atm), since my res is 1620p.
 
You cannot trust a single data point, no matter who it is from. Science 101.

Well, for starters I can read the graphs TPU puts out.

Handwaving proof that disagrees with your own, and insinuating that readers of your website are complaining simply to complain because they are "not the target market", is both petty and disingenuous. That's already been written up in comments in this thread. Look at the way W1zzard responds to criticism vs the way you do; it's rather eye-opening. The way you responded to me, claiming that I was questioning the expertise of a reviewer because I don't take one data point as holy text, and asking what my qualifications are, is incredibly inappropriate for a TPU staff member, and it reflects poorly on this site as a whole.
You're the one questioning results from a very respected source; the burden of evidence is on you to provide proof as to why we should take that questioning seriously, hence why I asked what your qualifications are. TPU results are great and contextually provide really reliable datasets, which are useful for comparing different hardware. The issue is that we're in the middle of several huge generational product releases; CPUs and GPUs are all being refreshed at practically the same time.

There's a good reason why the TPU graphs are somewhat skewed: it's because we had to use a 3080. Why exactly do you think we're retesting with a 4090? For fun? It's incredibly difficult to put modern CPUs in a position where they are actually the limiting factor in a system, which makes testing hard and justifies the use of academic 720p testing.

Just look at the 4K results; you can infer the same thing from the other end of the extreme: why bother getting anything above a 3700X when you'll only get a 4% performance increase for your money? Obviously the actual performance difference is much larger, but in that context you wouldn't think so. Just like in the context of not using a 4090, the 13900K seems silly. People upgrade their GPU a lot more often than they upgrade their CPU for this reason. And it's also why we try to have so many different tests, so you can get an idea of actual chip performance, instead of chip performance when it's being bottlenecked by a different component.

You're still part of a community, buddy. This was feedback.

For a staff member, a more open stance could be expected. We (or at least I) aren't jabbing at you; it's about the content we're discussing and what might make it more valuable. Insights, you know. I think that gaming energy efficiency result is a very interesting thing to highlight, because it points out the limited necessity of going this high up the Intel stack for 'better gaming'. Seeing as we are speaking of a gaming champ. Could less actually be more? ;)
Exactly why the 13600K is titled "best gaming CPU"; it's an overclocking beast, and I see no reason why it can't achieve the same clocks as the 13900K when tuned a bit. It goes from 628 to 712 FPS in CS:GO with a 500 MHz OC to 5.6 GHz, which is pretty sweet. The 13900K gets 736 FPS at the same clocks (5.6 GHz), so the difference of 24 FPS is probably down to the two extra P-cores and the extra cache.
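A quick back-of-the-envelope check on those CS:GO numbers (the stock clock is inferred as 5.6 GHz minus the 500 MHz overclock; everything else is quoted above):

```python
# CS:GO scaling from the figures quoted above.
stock_clock_ghz, oc_clock_ghz = 5.1, 5.6      # 5.6 GHz OC = +500 MHz over stock (inferred)
fps_13600k_stock, fps_13600k_oc = 628, 712
fps_13900k_oc = 736

clock_gain = oc_clock_ghz / stock_clock_ghz - 1    # ~+9.8% clock
fps_gain = fps_13600k_oc / fps_13600k_stock - 1    # ~+13.4% FPS
lead = fps_13900k_oc - fps_13600k_oc               # 24 FPS from the extra P-cores/cache

print(f"clock +{clock_gain:.1%}, FPS +{fps_gain:.1%}, "
      f"13900K lead at 5.6 GHz: {lead} FPS ({lead / fps_13600k_oc:.1%})")
```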

I believe I have a pretty open stance; I'm not planning on upgrading to this or any other next-gen CPU released by Intel or AMD for the next few years. I just dislike the criticisms of people who don't even have the use case a product is aimed at (the best HFR gaming experience, or making money with their processing power). If anyone has doubts about my own feelings on performance/energy efficiency, have a look at my specs: I use a 5800X3D. As you said, I'm part of a community, and these posts are my own; just because I'm a staff member doesn't mean that every opinion or post I make is an official representation of TPU, nor should it be IMO. The responsibility I have as a staff member is to ensure my TPU work is of good quality and unbiased, which I believe it is. It's not like I'm slandering people here. The responsibility I have as a member of this community is to present my statements as best I can in order to foster greater understanding, which I will continue to do.

To clarify and restate: if you're not chasing 240 Hz gaming, don't care about e-peen, and don't have the time-is-money attitude where a little extra on the power bill is irrelevant next to each minute saved in your workflow, this CPU isn't really for you.
 
I don't think it's for you or me to decide what people should buy though I do agree with your last point.
I think they could have done better though. It's a good chip, I'm clear on that; I just hoped they would have put a more intelligent workload and resource-utilisation algorithm and system in play, in-chip. AMD has machine-learning tech to optimise workflow, and it works quite well, if not perfectly. I hoped Intel would have done more than a trim and shrink, I suppose.

I agree all these new CPUs are just making the 5800X3D look better and better for gaming.


I can't believe Amazon is trying to charge $800 for this CPU, lmao. No wonder it is currently sitting at 38th on their best CPU sellers list, smh.
 
I agree all these new CPUs are just making the 5800X3D look better and better for gaming.


I can't believe Amazon is trying to charge $800 for this CPU, lmao. No wonder it is currently sitting at 38th on their best CPU sellers list, smh.
If I only gamed I would have one tbh.
 
It's pretty much in line with what I thought would be the case. It comes with caveats; Intel chased benchmark scores a bit over-aggressively, but it still performs, and with appropriate manual tuning it can still be quite good. I don't really see the issue if you can't hold out longer for Zen 4 X3D, which might tilt things back the other direction.

Also, let's see how well the iGPUs perform on these new chips at 720p/1080p.
 
I agree all these new CPUs are just making the 5800X3D look better and better for gaming.


I can't believe Amazon is trying to charge $800 for this CPU, lmao. No wonder it is currently sitting at 38th on their best CPU sellers list, smh.
Heck yeah, the 5800X3D looks like AMD's freaking achievement of the year (or more).

Intel has gone hybrid for three generations, and that "fiasco of a non-OC CPU" (the 5800X3D, obviously) still holds strong... and AMD's current gen is also putting up a good fight.

Hilariously, the 7700X is not even that expensive, even for me @ 449 CHF, given its place in the ranking (although it's a bloody shame, since the 5700X is 249 CHF and the 5800X is 299 CHF).

And a tray 5800X3D is 449.90 right now (the boxed version is 499.90). Oh wait... it's sub-450 CHF...
if it continues to drop in price and reaches sub-450 CHF (it's quite close atm)
alright... placing an order right now...


As if... my 3600 will last a bit more, I reckon... just a little more of a price drop and I'll pull the trigger.

Well, if I were to do a full platform change, the 7X00X3D would be where my money would go... if I had money, though. I need an e-bike; mobility comes first :laugh:
 
You cannot trust a single data point, no matter who it is from. Science 101.

Well, for starters I can read the graphs TPU puts out. If a single review is taken as fact, then why do you disagree with your own website?

[attached: TPU review charts]

Handwaving proof that disagrees with your own, and insinuating that readers of your website are complaining simply to complain because they are "not the target market", is both petty and disingenuous. That's already been written up in comments in this thread. Look at the way W1zzard responds to criticism vs the way you do; it's rather eye-opening. The way you responded to me, claiming that I was questioning the expertise of a reviewer because I don't take one data point as holy text, and asking what my qualifications are, is incredibly inappropriate for a TPU staff member, and it reflects poorly on this site as a whole.

I can't understand what he's even talking about at this point; I didn't even notice that he's the supposed proofreader of articles here. That's wild, considering he's going against what TPU's own article plainly states.

Very unprofessional
 
Pretty impressive chip minus the power consumption..

I have been running my 7950X at 105 W (Eco mode) and loving it; it doesn't lose much performance at all.
I also run my 12900K machine at a 200 W limit.

These could also be limited in the BIOS, of course. I feel like BOTH Intel and AMD should have released their latest chips at much lower power levels, considering how little performance you lose by capping them to something reasonable.
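For what it's worth, on the Intel side this can also be done from software rather than the BIOS; on Linux there's the RAPL powercap interface. A minimal sketch, assuming the usual /sys/class/powercap layout (run as root; paths and permissions vary by platform):

```python
# Hypothetical sketch: cap the sustained (PL1) package power limit on an Intel CPU
# under Linux via the RAPL powercap sysfs interface, instead of doing it in the BIOS.
# Intel-only; AMD's PPT limits are normally set through the BIOS or Ryzen Master.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package-0 domain (assumed present)

def set_pl1_watts(watts: int) -> None:
    # constraint_0 is the long-term (PL1) limit; the file takes microwatts.
    (RAPL / "constraint_0_power_limit_uw").write_text(str(watts * 1_000_000))

def read_pl1_watts() -> float:
    return int((RAPL / "constraint_0_power_limit_uw").read_text()) / 1_000_000

set_pl1_watts(200)  # e.g. the 200 W limit mentioned above for the 12900K
print(f"PL1 now {read_pl1_watts():.0f} W")
```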
 
Pretty impressive chip minus the power consumption..

I have been running my 7950X at 105 W (Eco mode) and loving it; it doesn't lose much performance at all.
I also run my 12900K machine at a 200 W limit.

These could also be limited in the BIOS, of course. I feel like BOTH Intel and AMD should have released their latest chips at much lower power levels, considering how little performance you lose by capping them to something reasonable.

Agree. I was more impressed with the 7950X at 125 W/65 W than I am with its stock performance. I'm sure once more people get to grips with the 13900K, we will see similarly impressive numbers.
 
It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.

I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.

[attached: screenshots of der8auer's 4090 results]

Not that a direct comparison to der8auer's work is the best idea (different test systems)…

Far Cry 6 1080p 3080 vs 4090

With the 3080, the TPU results are 162 FPS with the 13900K and 131 with the 7950X for Far Cry 6.

With der8auer's 4090, his results are 169 with the 13900K and 138 with the 7950X.

Considering ONLY the change in FPS, test configs aside, the 13900K actually gains less from the 4090 than the 7950X does, relative to TPU's results…

The FPS gain for the 7950X when using a 4090 is 5.3%.

The FPS gain for the 13900K when using a 4090 is 4.3%.
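For anyone who wants to check the arithmetic, those percentages fall straight out of the four quoted numbers:

```python
# Far Cry 6 1080p figures quoted above (TPU with a 3080, der8auer with a 4090).
fps = {
    "3080": {"13900K": 162, "7950X": 131},
    "4090": {"13900K": 169, "7950X": 138},
}

for cpu in ("13900K", "7950X"):
    gain = fps["4090"][cpu] / fps["3080"][cpu] - 1
    print(f"{cpu}: +{gain:.1%} moving from the 3080 to the 4090")

for gpu in ("3080", "4090"):
    lead = fps[gpu]["13900K"] / fps[gpu]["7950X"] - 1
    print(f"{gpu}: 13900K leads the 7950X by {lead:.1%}")
```

In this one title, the 13900K's relative lead actually shrinks slightly with the faster GPU (about 23.7% with the 3080 vs about 22.5% with the 4090), which is the point being made above.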
 
Not that a direct comparison to der8auer's work is the best idea (different test systems)…

Far Cry 6 1080p 3080

With the 3080, the TPU results are 162 FPS with the 13900K and 131 with the 7950X for Far Cry 6.

With der8auer's 4090, his results are 169 with the 13900K and 138 with the 7950X.

Considering ONLY the change in FPS, test configs aside, the 13900K actually gains less from the 4090 than the 7950X does, relative to TPU's results…

The FPS gain for the 7950X when using a 4090 is 5.3%.

The FPS gain for the 13900K when using a 4090 is 4.3%.
I'm looking forward to the TPU testing.
 
It's still strange this K CPU is marketed as a 125 W CPU, which it really isn't.

Also, the perf/watt isn't better, it's worse; objectively worse at MT, and at ST-focused gaming the 5800X3D is still better, as is even a 7600X.

And note: this is not architecture. It's pure and plain stock power/clock settings. If you go over an Intel xx600K, you're in silly land in that regard.

[attached: efficiency chart]
I hope you realize that the graph you just posted about efficiency basically goes against every other review out there. I'm not saying it's wrong - it might be that every other review is wrong - but most reviews have the 13900K at 38.5k to 39.5k points @ 253 W, bringing it to around 160 points/watt instead of the 113 that your graph has it at.
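Working those cited figures through (a quick check using only the numbers in the post above):

```python
# Points-per-watt implied by the scores cited above (38.5k-39.5k points at a 253 W limit).
for points in (38_500, 39_500):
    print(f"{points} points @ 253 W -> {points / 253:.0f} points/W")
# roughly 152-156 points/W, versus the ~113 points/W shown on the quoted graph
```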
 
It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.

I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.

[attached: screenshots of der8auer's 4090 results]

Wonder why you have hidden the resolution where we see this 10% difference...

At resolutions that are sensible, the difference goes from 7% down to 6%, while using 30 W more, which is significant when the power draw is already so high.
 
There's a good reason why the TPU graphs are somewhat skewed: it's because we had to use a 3080. Why exactly do you think we're retesting with a 4090? For fun? It's incredibly difficult to put modern CPUs in a position where they are actually the limiting factor in a system, which makes testing hard and justifies the use of academic 720p testing.

Just look at the 4K results; you can infer the same thing from the other end of the extreme: why bother getting anything above a 3700X when you'll only get a 4% performance increase for your money? Obviously the actual performance difference is much larger, but in that context you wouldn't think so. Just like in the context of not using a 4090, the 13900K seems silly. People upgrade their GPU a lot more often than they upgrade their CPU for this reason. And it's also why we try to have so many different tests, so you can get an idea of actual chip performance, instead of chip performance when it's being bottlenecked by a different component.
Oh, okay, I thought we were discussing your proofreading not too long ago since you brought it up. But then I'm wrong :) Np!

By the way, on the bit I quoted up there specifically, I'm on that exact same page with you. But again, you confirm it yourself: the gaming purpose of a CPU this high up the stack is still extremely limited; you can't even properly max it out in gaming, and this echoes throughout CPU history. Going this far out with your CPU for future GPU upgrades really never paid off, except in periods of complete stagnation; quad-core i7s proved their value over i5s in the late days of Skylake. And why? Only - and I do mean only - because of core/thread count. Even that isn't in the picture anymore with these CPUs; you can drop to near the bottom of the stack and still have enough. And... are there outliers where you do find the bonus FPS in actual gaming? Sure! But again, it's so limited.
 
It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.
And what's the total wattage in % over the 7950X? Because that matters.
 
It's just a CPU, guys... please don't get all antsy :)

Intel seems to have it in the games section, but for most everything else I think AMD still has the edge; then again, even in games it trades blows depending on the game, from what reviews/results I've seen.

I'm not in the market for either CPU at the moment, but I agree with the video Jayztwocents put out, which said "whichever platform you go for, you'll have an amazing gaming experience with", and I think it's true. I honestly feel that Intel hasn't really done anything special with this release; I think it's purely the higher clocks and the IPC increase that are pushing it forward, in effect turning it up to 11 from 10, or from 10 to 11, whichever way you want it :) The higher temp limit seems to help with that, because without it, I think the performance might be similar to or less than the AMD counterpart...

I think the hours that have gone into this review, and every review that @W1zzard does, are amazing; the data is good and solid, and there's a massive variety of it.
 
Wonder why you have hidden the resolution where we see this 10% difference...

At resolutions that are sensible, the difference goes from 7% down to 6%, while using 30 W more, which is significant when the power draw is already so high.
"at resolutions where the GPU is the bottleneck, not the CPU..." is what I believe you mean.

I didn't hide anything - the review is there for everyone to see. I snipped up to the next relevant CPU after the 7950X - the 5800X3D.

Oh, okay, I thought we were discussing your proofreading not too long ago since you brought it up. But then I'm wrong :) Np!

By the way, on the bit I quoted up there specifically, I'm on that exact same page with you. But again, you confirm it yourself: the gaming purpose of a CPU this high up the stack is still extremely limited; you can't even properly max it out in gaming, and this echoes throughout CPU history. Going this far out with your CPU for future GPU upgrades really never paid off, except in periods of complete stagnation; quad-core i7s proved their value over i5s in the late days of Skylake. And why? Only - and I do mean only - because of core/thread count. Even that isn't in the picture anymore with these CPUs; you can drop to near the bottom of the stack and still have enough. And... are there outliers where you do find the bonus FPS in actual gaming? Sure! But again, it's so limited.
I didn't bring it up - people have been criticising my opinion, which supposedly should be different since I'm the proofreader? I disagree that the gaming purpose of this CPU is limited; maybe for now, when tested with a two-year-old 3080, it's overkill, but bear in mind people keep their CPU/mobo platforms a lot longer than they keep their GPUs. Lots of people are still on Skylake derivatives, for instance, but rocking RTX 3xxx chips. Having CPU power in excess is useful, since you'll probably keep the platform for several GPU generations, where the CPU will then be able to stretch its legs more. You can make the same argument with AM5 and its long-lasting platform, but that assumes you're willing to also spend money to upgrade the CPU.
 
"at resolutions where the GPU is the bottleneck, not the CPU..." is what I believe you mean.

I didn't hide anything - the review is there for everyone to see. I snipped up to the next relevant CPU after the 7950X - the 5800X3D.


I didn't bring it up - people have been criticising my opinion, which supposedly should be different since I'm the proofreader? I disagree that the gaming purpose of this CPU is limited; maybe for now, when tested with a two-year-old 3080, it's overkill, but bear in mind people keep their CPU/mobo platforms a lot longer than they keep their GPUs. Lots of people are still on Skylake derivatives, for instance, but rocking RTX 3xxx chips. Having CPU power in excess is useful, since you'll probably keep the platform for several GPU generations, where the CPU will then be able to stretch its legs more.
Your opinion is yours alone. Be wary it doesn't reflect in your job ;)
 
Your opinion is yours alone. Be wary it doesn't reflect in your job ;)
If the opinion I (and other people) share ever biases my work, people can criticize as much as they want, and I'm sure it would come up a lot faster internally too.
 