
Intel Core i9-13900K

I guess this is on reviewers, in the sense that it usually isn't explained well.
No, trust me, that's not the issue. It's fanboys (usually of one specific company, I won't name which) who know why 720p tests exist, but come up with all kinds of excuses when the company they support is losing in them. Guru3d (I'm not sure, but I think that was the one) was forced to REMOVE (I'm not even kidding) 720p results from one of their reviews because supporters of that same company went freaking nuts in the comments section when their favorite company's CPU was losing badly. Holy freaking cow :eek:
 
If your aim is only gaming there are much better vfm options than the 5800x 3d.
Probably, but this one is better in price and performance than a 13600K.
The 3D on its own, yes, it's super expensive. You don't even have to compare it to Intel's, which are decently priced. Even against the overpriced Zen 4 CPUs, the 3D is wildly overpriced. Think about it: a 7600X is faster in games, way faster in ST workloads, and equal in MT workloads. And yet, even though it's already overpriced, it's cheaper than the 3D. In fact, it's 100€ cheaper. You might argue that AM4 motherboards are cheaper, but then that's exactly my point: you are paying for the mobo upgradability, it's not free. Instead of paying for a new motherboard, you are overpaying for the CPU.
Not really. The 12900K is super expensive. The X3D is fairly cheap if you buy a combo, even compared with a 13600K. I guess you have omitted the screenshots I posted. If you have an AM4 board, then it becomes even cheaper. Fact is, it is a great gaming CPU and it has no equal in performance per watt or performance per $. The downside is the lack of an upgrade path, but for what it is and what it offers, it will last you a long time. Long enough that you can literally wait until you change your entire rig.
What do you mean by a lot? The difference in gaming performance is 5%, yet the 13600KF is 90€ cheaper (359€ vs 449€ at a big EU retailer). With that price difference you could get faster RAM, for example, and tie it in gaming, while still getting vastly better MT and ST performance and a better upgrade path.
Either way it is faster, and by that logic there is no point in replacing anything, since any noticeable difference can be seen only with stupidly fast graphics cards at the ridiculous resolution of 1080p.
I disagree. The 13600KF IS a monster at those. It's the 3rd-fastest CPU in ST performance, and faster than a 5900X and a 7700X in MT performance. It's pretty freaking strong, actually.
Not saying its performance sucks, since it is approaching the 12900K's and that is huge, but there are better options, like the 13700K, which is much more capable in MT; if MT matters to you, I think you should go with that one. It surpasses the 12900K, so it is a great upgrade if MT workloads are what you are looking for.
 
Not really. The 12900K is super expensive. The X3D is fairly cheap if you buy a combo, even compared with a 13600K. I guess you have omitted the screenshots I posted. If you have an AM4 board, then it becomes even cheaper.
The 12900K is twice as fast in multithreaded performance; again, you are comparing completely different products. That's like saying the 12900K is good value for money because it's much cheaper than the Threadripper 7990WX. I mean... what?

Fact is, it is a great gaming CPU and it has no equal in performance per watt or performance per $.
No it does not. That's just absolutely false.

These are the results from the TPU review


But even if we go by the ones you posted from Hardware Unboxed, it definitely isn't good in performance per $; I don't even know why you would say something like that. The vast majority of CPUs have better performance per $, even if we are just talking about gaming. Heck, the insanely overpriced 7600X has way better performance per $ in games. LOL
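For reference, the "performance per $" figures being argued about here are just a relative performance score divided by street price. A quick sketch with hypothetical numbers (the scores and prices below are illustrative placeholders, not data from TPU or any other review):

```python
# How the "performance per $" comparisons in this thread are computed:
# relative gaming performance divided by price. All numbers below are
# hypothetical placeholders, not figures from any review.

def perf_per_euro(relative_perf: float, price_eur: float) -> float:
    """Higher is better: performance points bought per euro."""
    return relative_perf / price_eur

cpus = {
    "CPU A": (100.0, 449.0),  # faster, but pricier
    "CPU B": (95.0, 359.0),   # ~5% slower, 90 euros cheaper
}

for name, (perf, price) in cpus.items():
    print(f"{name}: {perf_per_euro(perf, price):.3f} perf/euro")
```

With these placeholder numbers, the slightly slower chip wins on value, which is the shape of the argument being made here: a small performance gap can be outweighed by a large price gap.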
Either way it is faster, and by that logic there is no point in replacing anything, since any noticeable difference can be seen only with stupidly fast graphics cards at the ridiculous resolution of 1080p.
No, it is not. The 13600KF is faster when you pair it with DDR5. Actually, according to TPU, it's faster with DDR4 as well.
 
I guess this is on reviewers, in the sense that it usually isn't explained well. So naturally, the majority of people have no idea why 720p tests exist at all, and of course they retort with "no one plays at 720p anymore, duh". Reviews should state that it's there to gauge longevity, not actual present-day gaming performance. To be honest, even I didn't know until not too long ago, even though it's logical as heck.

Yeah I don't think it's made explicit often enough. FWIW, what I like to do is look at the CPU bottlenecked scores out of what you might call academic or long-term interest, and then shift over to the 4k scores as a sanity check. At least up until the launch of the 4090, 4k benchmarks were reliably GPU limited, and thus they were a good proxy for, "ok, if I pair one of these mid-range CPUs with a sensible GPU, this is what I'm most likely to see most of the time, i.e. no effective difference."

I think it's still generally true that CPU upgrades are overrated for gamers, though that was easier to argue back when everyone was targeting 60 Hz. Certainly most gamers don't need a CPU upgrade with every new generation, nor even every two or three or four. "Good enough" is the only target that matters; it can be difficult to keep that in mind if you immerse yourself in tech-enthusiast news/discussion.

Bottom line is buy the fastest CPU you can for your preferred price point and ride it until its performance becomes noticeably disappointing.
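The "CPU-limited at 720p, GPU-limited at 4K" reasoning above boils down to a simple bottleneck model: the frame rate you see is roughly the lower of the CPU's and the GPU's frame-rate ceilings. A minimal sketch with made-up numbers (none of these figures come from any review):

```python
# Simple bottleneck model: effective FPS is capped by the slower of the
# CPU-limited and GPU-limited frame rates. All numbers are hypothetical.

def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """The frame rate you actually see is bounded by whichever
    component finishes its share of the frame last."""
    return min(cpu_limit_fps, gpu_limit_fps)

# Hypothetical CPU ceilings (what a 720p test approximates):
fast_cpu, midrange_cpu = 240.0, 180.0

# With today's GPU at 4K (GPU-limited), both CPUs look identical:
print(effective_fps(fast_cpu, 120.0))      # 120.0
print(effective_fps(midrange_cpu, 120.0))  # 120.0

# With a much faster future GPU, the CPU ceiling re-emerges:
print(effective_fps(fast_cpu, 300.0))      # 240.0
print(effective_fps(midrange_cpu, 300.0))  # 180.0
```

This is why 4K bars look flat across CPUs while the 720p gap only starts to matter once the GPU stops being the limit.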

No, trust me, that's not the issue. It's fanboys (usually of one specific company, I won't name which) who know why 720p tests exist, but come up with all kinds of excuses when the company they support is losing in them. Guru3d (I'm not sure, but I think that was the one) was forced to REMOVE (I'm not even kidding) 720p results from one of their reviews because supporters of that same company went freaking nuts in the comments section when their favorite company's CPU was losing badly. Holy freaking cow :eek:

Come on now. This isn't a partisan issue, and no sociopathic mega-corporation deserves a free pass. While we're on the subject of CPU value, you could just as easily say that without competition from that "one specific company," the other big company would still be pumping out 4-core i7s with a 5% generational uplift every year or two.

The objections to low-resolution CPU testing go back way farther than you suggest, and they have nothing to do with pro-AMD or pro-Intel factions. Bringing fanboy wars into any discussion is extreme cringe, bro. Can't imagine hitching my wagon to some faceless globocorp. Good god.
 
Come on now. This isn't a partisan issue, and no sociopathic mega-corporation deserves a free pass. While we're on the subject of CPU value, you could just as easily say that without competition from that "one specific company," the other big company would still be pumping out 4-core i7s with a 5% generational uplift every year or two.
I wasn't criticising the corporation though, just the fans. Funny, though, how did you realize which one I was talking about? :roll:
 
It's not; you just don't understand the point. Going by 4K results, I should buy a 3600X. It performs almost identically to a 13900K and costs like a quarter of the price. What happens next year when I replace my 3080 with a 5080, though? Exactly. That's why I'm looking at 720p results: to know what's going to happen next year / which CPU will last me.


And I'm saying they are not comparable. It's like comparing the 3D to a 7950X. The 3D is closer to the 12600KF, both in games and in other workloads. At least that's what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600KF with a brand-new mobo costs as much as the 3D on its own.

First part: if you had a 3600X, you don't need to upgrade yet. If you're buying new, you'd want that headroom for sure.

I'm not confident in TPU's results here; other websites show really different results for the X3D. It may come down to what's being measured, since there are no 0.1% lows.

The X3D also had major price drops, and it's the flagship of the series. Imagine if you could slap a 13400F into your 9th-gen mobo; that's what's on offer here.
 
It's probably getting time to upgrade my 8700K. The new AMD CPUs are very good, and Intel still holds a small advantage in gaming performance. But the elephant in the room is the potential 7800X3D. For strictly gaming (which is me), I think I'll wait till it comes out (I thought it was supposed to be November) before I make a choice on an upgrade.


Do you run your rig at 100%, 24/7? Unless you are a power user doing a lot of content creation, with the system running at full peak all the time, any difference on your electric bill will be negligible compared to whatever you currently run.

Gaming is not even close to a full load on the system, and how much do you game? 12-15 hours a day? Most gamers (especially adults who can afford this equipment) won't be spending that much time gaming. So your complaints about power bills and lights dimming are silly.

If you can afford the 4090 or the cost of a platform upgrade, you can afford a few extra kWh on your electric bill.
Yep, sure thing boss. If you say so.
 
The 12900K is twice as fast in multithreaded performance; again, you are comparing completely different products. That's like saying the 12900K is good value for money because it's much cheaper than the Threadripper 7990WX. I mean... what?
Yes, it is faster, but I was talking specifically about gaming. The 12900K is hardly a gaming CPU, but it's an MT CPU for sure.
No it does not. That's just absolutely false.

These are the results from the TPU review

https://tpucdn.com/review/intel-core-i9-13900k/images/relative-performance-games-1280-720.png
But even if we go by the ones you posted from hwunboxed, it definitely isn't good in performance per $, I don't even know why you would say something like that. The vast majority of CPUs have better performance per $ even if we are just talking about gaming. HECK, the insanely overpriced 7600x has way better performance per $ in games. LOL
Yes, the 13900K is faster, and it costs way more with all the mobos and DDR5; even with DDR4 it is still $350 more (if I'm not mistaken). The 5800X3D is faster than the 13600K and can be cheaper. I posted pictures from HWUB and GN. Why are we talking about this again? The 5800X3D is a better option for gaming than the 13900K. It costs way less, and even though it is a tad slower, it does not make a huge difference. Not to mention, the power consumption of the 13900K is literally twice that of the 5800X3D. The 13600K has higher gaming power consumption than the X3D. So, I'd pick the X3D for myself anytime instead of moving to Intel.
No, it is not. The 13600KF is faster when you pair it with DDR5. Actually, according to TPU, it's faster with DDR4 as well.
It is faster if equipped with DDR5, but in gaming, according to HWUB's 12-game test, the 5800X3D is faster.
Even if you pair it with DDR5, it's not faster than the 5800X3D in gaming.
 
Yes, it is faster, but I was talking specifically about gaming. The 12900K is hardly a gaming CPU, but it's an MT CPU for sure.

Yes, the 13900K is faster, and it costs way more with all the mobos and DDR5; even with DDR4 it is still $350 more (if I'm not mistaken). The 5800X3D is faster than the 13600K and can be cheaper. I posted pictures from HWUB and GN. Why are we talking about this again? The 5800X3D is a better option for gaming than the 13900K. It costs way less, and even though it is a tad slower, it does not make a huge difference. Not to mention, the power consumption of the 13900K is literally twice that of the 5800X3D. The 13600K has higher gaming power consumption than the X3D. So, I'd pick the X3D for myself anytime instead of moving to Intel.

It is faster if equipped with DDR5, but in gaming, according to HWUB's 12-game test, the 5800X3D is faster.
Even if you pair it with DDR5, it's not faster than the 5800X3D in gaming.
Not to mention the extra cost of buying a new motherboard and DDR5 RAM. That's what upgrade costs ultimately come down to these days and that's why AM5 isn't selling well. I bet the vast majority of 12th (and now 13th) gen Intel Core systems out there are using DDR4.
 
No, trust me, that's not the issue. It's fanboys (usually of one specific company, I won't name which) who know why 720p tests exist, but come up with all kinds of excuses when the company they support is losing in them. Guru3d (I'm not sure, but I think that was the one) was forced to REMOVE (I'm not even kidding) 720p results from one of their reviews because supporters of that same company went freaking nuts in the comments section when their favorite company's CPU was losing badly. Holy freaking cow :eek:
The red cult lol
 
First part: if you had a 3600X, you don't need to upgrade yet. If you're buying new, you'd want that headroom for sure.

I'm not confident in TPU's results here; other websites show really different results for the X3D. It may come down to what's being measured, since there are no 0.1% lows.

The X3D also had major price drops, and it's the flagship of the series. Imagine if you could slap a 13400F into your 9th-gen mobo; that's what's on offer here.
Sure, but imagine if the 13400F cost 400€ in order to allow you to slap it in there. I mean, that's what I'm saying: the 3D is nice, but judging the price as a standalone CPU, it's way overpriced. Think about it, it's more expensive (by 100€) than the already overpriced 7600X, and it loses in everything, even games. So upgradability costs; the cost is included in the price of the CPU, otherwise the 3D should be cheaper than the 7600X, right?

It's funny how you're not confident in TPU's results. Remember what you told me back when I said the same about the 12900K power-limited numbers?
 
Sure, but imagine if the 13400F cost 400€ in order to allow you to slap it in there. I mean, that's what I'm saying: the 3D is nice, but judging the price as a standalone CPU, it's way overpriced. Think about it, it's more expensive (by 100€) than the already overpriced 7600X, and it loses in everything, even games. So upgradability costs; the cost is included in the price of the CPU, otherwise the 3D should be cheaper than the 7600X, right?
I agree that the X3D is overpriced, but its main selling point is not having to buy a motherboard and RAM if you're already on AM4. The 7600X + motherboard + DDR5 RAM combo is way more expensive. If you're on an older DDR4 Intel platform, a 13th-gen Core i5 with a DDR4 motherboard is a good option.
 
Good news: I think I found a fix for the low Cinebench score (and possibly others). Doing more testing and will update the review soon, I hope.
 
Could you please clarify what you mean by the "stock" setting? I see a stock power consumption of 283 W, while Intel's stock setting should be 253 W for the 13900K. Thank you.
 
Could you please clarify what you mean by the "stock" setting? I see a stock power consumption of 283 W, while Intel's stock setting should be 253 W for the 13900K. Thank you.
PL1 = 253, PL2 = 253. Do you have these values too?
 
I wonder a bit: in gaming the 13900K performs about 10% better than the 7950X, but in all the other reviews I have read they perform very similarly, or the 13900K is less than 5% faster. Why does TPU get better results for the 13900K than others?
 
I wonder a bit: in gaming the 13900K performs about 10% better than the 7950X, but in all the other reviews I have read they perform very similarly, or the 13900K is less than 5% faster. Why does TPU get better results for the 13900K than others?
They don't. Club386 tested with a 4090, and wherever there is no GPU bottleneck the difference is 25%.

[Club386 benchmark charts]
 
They don't. Club386 tested with a 4090, and wherever there is no GPU bottleneck the difference is 25%.

[Club386 benchmark charts]
I know the difference will increase with a 4090, but TPU uses a 3080; those are odd results compared to other test sites. I know the 13900K is faster, not doubting that; I just find it a bit strange that TPU gets so much of a difference with a significantly slower GPU than the 4090, when others using a similar GPU don't.

I seriously doubt that there will in general be a 25% difference when there is no GPU bottleneck; in some games yes, but not all. Some games fare better on the 7950X vs the 13900K as well, but the majority prefer the 13900K.

Sure, but imagine if the 13400F cost 400€ in order to allow you to slap it in there. I mean, that's what I'm saying: the 3D is nice, but judging the price as a standalone CPU, it's way overpriced. Think about it, it's more expensive (by 100€) than the already overpriced 7600X, and it loses in everything, even games. So upgradability costs; the cost is included in the price of the CPU, otherwise the 3D should be cheaper than the 7600X, right?

It's funny how you're not confident in TPU's results. Remember what you told me back when I said the same about the 12900K power-limited numbers?
Remember that RAM and motherboards cost a lot more for the 7600X than for the 5800X3D. Where I live, the cheapest option for the 7600X costs 800 USD (400+250+150) vs 650 USD (500+70+80) for the 5800X3D. I would pick the 3D any day at that pricing, but I prefer to wait for the 7000-series 3D.
 
PL1 = 253, PL2 = 253. Do you have these values too?
I was referring to your test

[screenshot of the power consumption results]

Why are you seeing 283W at stock if PL2 is 253W ?
 
TechPowerUp's CPU-only power measurements are taken from the motherboards' 12V inputs, so they include the power consumption of motherboard components as well, notably the VRMs (which generate heat under load = wasted power).
 
I was referring to your test

[screenshot of the power consumption results]

Why are you seeing 283W at stock if PL2 is 253W ?
The CPU sensor isn't 100% accurate, and my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the motherboard. It's still much better than relying on the CPU's own sensors, which are inaccurate, not calibrated, vary between batches, and vary between AMD and Intel.
 
The CPU sensor isn't 100% accurate, and my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the motherboard. It's still much better than relying on the CPU's own sensors, which are inaccurate, not calibrated, vary between batches, and vary between AMD and Intel.
Then every website/reviewer has a different way of reporting CPU power consumption, hence the wattage differences (different testing software/components). It's fair to say all of their reports are right in their respective ways, and readers should do heavy research before deciding what is what.
 
The CPU sensor isn't 100% accurate, and my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the motherboard. It's still much better than relying on the CPU's own sensors, which are inaccurate, not calibrated, vary between batches, and vary between AMD and Intel.
Thank you. I wasn't expecting a 30 W difference anyway. That's interesting.
 
Thank you. I wasn't expecting a 30 W difference anyway. That's interesting.

Some VRMs supposedly only have efficiencies in the 80-85% range under full load, so if anything I would have expected an even larger difference:
253 W / 0.825 = 306.7 W
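That back-of-the-envelope arithmetic can be sketched as follows; the efficiency figures are assumptions for illustration, not measured values:

```python
# Power seen at the ATX 12 V cables vs. power delivered to the CPU, for an
# assumed VRM efficiency. The 82.5% figure is just the midpoint of the
# 80-85% range mentioned above, not a measured value.

def power_at_12v_input(cpu_power_w: float, vrm_efficiency: float) -> float:
    """Input-side power = delivered power / efficiency; the rest is VRM heat."""
    return cpu_power_w / vrm_efficiency

# 253 W PL2 through a VRM at an assumed 82.5% efficiency:
print(round(power_at_12v_input(253.0, 0.825), 1))  # 306.7

# Conversely, the observed 283 W at the cables would imply roughly:
print(round(253.0 / 283.0, 3))  # 0.894, i.e. ~89% efficient
```

So the 283 W reading is consistent with a VRM somewhat more efficient than the 80-85% range assumed above.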
 
Phoronix finally got around to posting their review of the 13900K, and perhaps most interesting to me was the discrepancy between their SVT-AV1 numbers and yours. I realize there are a lot of variables between the two tests, e.g., a different OS, and Phoronix didn't test 4K at preset 10 (the default) as you did. However, the difference is so large it got me wondering: was your SVT-AV1 built with AVX-512 support?

Note that this has to be explicitly enabled when compiling it, even if you're doing a standard release build.

Edit: Just found this article which might also explain some or all of the discrepancy.
 