Intel Core i7-13700K

W1zzard

With the Core i7-13700K, Intel has built a formidable jack-of-all-trades processor. Our review confirms that it offers fantastic application performance, beating the more expensive Ryzen 9 7900X, and in gaming it gets you higher FPS than any AMD processor ever released, delivering an experience very similar to the Core i9-13900K.

 
Even my i7-12700K is still a king at 1440p gaming. :cool:

[Chart: relative performance in games, 2560×1440]
 
This makes me love my i7-12700K even more. I got it for ~$290 brand new; never been happier.

Thanks for the review
 
Another CPU that I won't be recommending to average users due to its insane cooling requirements at stock.
 
Good performance, but at the cost of huge power draw...
 
Takeaways

No point OC'ing a modern CPU
Games love big cache
E-cores are better than Hyper-Threading
For 4K gaming, the CPU doesn't really matter

Decent price for what it is. However, with Zen 3 deals, I just got a 5600, and the mobo and 2x16 GB of RAM I'm looking at all together cost less than that CPU (and less than AM5 as well), so that will be my upgrade path. Which is fine, since I have a lot of maintenance to do on the vehicle :|
 
With 6xx motherboards and DDR4 memory support being mentioned as a pro for budget friendliness, I wonder what performance hit you take going that route? It'd be nice to see some numbers with that setup for comparison.
 
Definitely the best Raptor Lake CPU of the bunch so far, but I'm still not happy about a full-load power draw of over 250 W.

I'm wondering how it would look with a 150 W PL2. Chances are it would still match or beat a 7700X, and 150 W is basically what a "105 W" Zen 3 pulls at full boost in all-core loads.
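If anyone wants to try that experiment on Linux without a BIOS trip, the intel-rapl powercap interface exposes the package limits via sysfs. A minimal sketch, assuming the usual layout where intel-rapl:0 is the CPU package domain and constraint_1 is the short-term (PL2) constraint; needs root, and paths may differ per system:

```python
# Sketch: cap the package short-term power limit (PL2) to 150 W via
# Linux's intel-rapl powercap interface. Run as root.
# Assumption: intel-rapl:0 is the package domain and constraint_1 is
# the short-term (PL2) constraint, which is the common layout.
RAPL = "/sys/class/powercap/intel-rapl/intel-rapl:0"
PL2_WATTS = 150

def set_pl2(watts: int) -> None:
    with open(f"{RAPL}/constraint_1_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))  # RAPL takes microwatts

def read_pl2() -> float:
    with open(f"{RAPL}/constraint_1_power_limit_uw") as f:
        return int(f.read()) / 1e6

if __name__ == "__main__":
    set_pl2(PL2_WATTS)
    print(f"PL2 now {read_pl2():.0f} W")
```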
 
One day I'm sure the Linux scheduler will support big.LITTLE-style cores, but until then I won't move away from my 5900X.
 
Will wait to see perf with DDR4, but this is another great toaster for a warm winter :s
Why is there a fking Editor's Choice in every test? No matter the cons: next-gen CPUs will be at 500 W and GPUs at 1000 W, and they will still get this award (only Intel Arc didn't get it).
 
There's something seriously wrong with TPU's testing regime when this CPU runs cooler at a 5.5 GHz overclock than it does at stock speeds.
 
Oi, what's with those CPU temps?

Mine hovers around 70 °C after 10+ minutes of Cinebench R23.
 
Can you test game benchmarks under Win11/Win10 for the 13900K/7950X?
 
There's something seriously wrong with TPU's testing regime when this CPU runs cooler at a 5.5 GHz overclock than it does at stock speeds.
No, it just means that a static voltage and clock speed draw less power than Intel's stock voltage curve.
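Back-of-the-envelope, that holds up: dynamic power scales roughly with V² × f, so shaving voltage more than offsets the extra clock. A quick sketch with illustrative, made-up voltages (not measured values):

```python
# Why a static-voltage OC can draw less power than stock:
# dynamic CPU power scales roughly with V^2 * f.
# The voltages below are illustrative assumptions, not measurements.
def relative_dynamic_power(volts: float, ghz: float) -> float:
    return volts ** 2 * ghz

stock = relative_dynamic_power(volts=1.35, ghz=5.4)   # stock boost-curve peak
manual = relative_dynamic_power(volts=1.25, ghz=5.5)  # static 5.5 GHz OC

print(f"manual OC ~ {manual / stock:.0%} of stock dynamic power")  # ~87%
```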
 
@W1zzard

is faste rthan all AMD Zen 4 (and Zen 3 processors). The differences are not huge though, I'd even say that in a blind test you won't be able to notice any difference betwen

The 13700K does offer considerably performance
(missed more).

Please take some rest.

I only slightly disagree with "Very high power usage" - this is valid only for highly multi-threaded workloads. Otherwise the CPU is just fine.

Another CPU that I won't be recommending to average users due to its insane cooling requirements at stock.

Average users never load all 24 threads. Average users do not need 24 threads. Not recommending this CPU to the average user is simply a no-brainer; the average user will be just fine with a Core i5-13400. This comment is a fine example of "I wanna say something bad about a product I'm not even considering buying or recommending".

I guess you'll happily recommend the 7600X, right? Along with super expensive AMD motherboards and DDR5-6000 RAM (recommended by AMD).

Definitely the best Raptor Lake CPU of the bunch so far, but I'm still not happy about a full-load power draw of over 250 W.

I'm wondering how it would look with a 150 W PL2. Chances are it would still match or beat a 7700X, and 150 W is basically what a "105 W" Zen 3 pulls at full boost in all-core loads.

Will wait to see perf with DDR4, but this is another great toaster for a warm winter :s
Why is there a fking Editor's Choice in every test? No matter the cons: next-gen CPUs will be at 500 W and GPUs at 1000 W, and they will still get this award (only Intel Arc didn't get it).

For those who prefer to live under a rock:

One day I'm sure the Linux scheduler will support big.LITTLE-style cores, but until then I won't move away from my 5900X.

It's not just about the scheduler. User space must support it as well, and I don't see that happening any time soon unless someone steps in and writes a daemon to hint the kernel which tasks require which cores.
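For what it's worth, you can already approximate such a daemon from user space with plain affinity calls. A minimal sketch, assuming Linux enumerates a 13700K's 16 P-core threads as CPUs 0-15 and its 8 E-cores as CPUs 16-23 (verify with lscpu --extended), and with a made-up list of background process names:

```python
import os

# Sketch of a user-space "hinting daemon": pin known background
# processes to the E-cores so the P-cores stay free for foreground work.
# Assumption: CPUs 16-23 are the 13700K's E-cores (check `lscpu --extended`).
E_CORES = set(range(16, 24))
BACKGROUND = {"tracker-miner-fs", "baloo_file", "updatedb"}  # hypothetical list

def pid_name(pid: int) -> str:
    try:
        with open(f"/proc/{pid}/comm") as f:
            return f.read().strip()
    except OSError:
        return ""

def hint_background_to_ecores() -> None:
    for entry in os.listdir("/proc"):
        if entry.isdigit() and pid_name(int(entry)) in BACKGROUND:
            try:
                os.sched_setaffinity(int(entry), E_CORES)
            except (PermissionError, ProcessLookupError):
                pass  # not our process, or it already exited

if __name__ == "__main__":
    hint_background_to_ecores()
```

A real implementation would loop and watch for new PIDs; this one-shot version just shows the kernel interface involved.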
 
I had intended to get the 7900X at one point, as I also do a lot of non-gaming stuff, especially photo editing and running simulations (MATLAB, COMSOL, Mathematica, fluid sims, etc.), and frankly, in these non-gaming scenarios, Raptor Lake is much better. I currently have a 5800X and it gets destroyed by the 13600K and 13700K in things like COMSOL. Based on this, 7900X sales will be virtually non-existent now without a huge price cut of at least $100. In fact, the 7900X3D needs to be $479 max.

I will almost certainly grab the 13700K, which IMO almost makes the 13900K look redundant. I would power limit the 13700K to a max of 180 W or so; for gaming, 90 W would probably be more appropriate. As shown by der8auer on the 13900K, it barely makes any difference, and I'm sure it's the same for the 13700K.

An i7-14700K could run 24 E-cores and a 15700K 32 E-cores, as the top-line spec on Arrow Lake is 40 E-cores. But AMD will be using Zen 5c cores for their hybrid architecture, and Zen 4c cores have been leaked as being very strong, almost the same as Zen 4, just with half the L3 cache and lower clocks. Zen 5 is also a huge architectural shift. Zen 5 vs. Meteor Lake and Arrow Lake will be fun. If AMD just continues with a non-hybrid design, they are going to be left behind, IMO.
 
Average users never load all 24 threads. Average users do not need 24 threads. Not recommending this CPU to the average user is simply a no-brainer; the average user will be just fine with a Core i5-13400. This comment is a fine example of "I wanna say something bad about a product I'm not even considering buying or recommending".

I guess you'll happily recommend the 7600X, right? Along with super expensive AMD motherboards and DDR5-6000 RAM (recommended by AMD).
No, I do not recommend the 7600X either, due to insane motherboard + DDR5 costs.

I know that the average user doesn't need 24 threads. Nor does the average user need a CPU that eats twice as much power in gaming as a 12600 or 5800X.

Don't assume one thing just because I commented on another.

Your comment is a "fine example" of projecting your ideas about a person based on one single comment that, in fact, doesn't contain anything to base your projection on. Not everybody who dislikes the 13700K is an AMD fanboy, you know.
 
Even my i7-12700K is still a king at 1440p gaming. :cool:

[Chart: relative performance in games, 2560×1440]
Really tempting. I wish Intel extended motherboard support two more generations. Doing so would render X3D and buying an AMD flagship pointless, since every gamer would jump on Intel without a second thought.
With Z690/Z790 being a dead-end platform, I'm forced to go with the 7800X3D instead of the 13600K/13700K.
 
I had intended to get the 7900X at one point, as I also do a lot of non-gaming stuff, especially photo editing and running simulations (MATLAB, COMSOL, Mathematica, fluid sims, etc.), and frankly, in these non-gaming scenarios, Raptor Lake is much better. I currently have a 5800X and it gets destroyed by the 13600K and 13700K in things like COMSOL. Based on this, 7900X sales will be virtually non-existent now without a huge price cut of at least $100. In fact, the 7900X3D needs to be $479 max.

I will almost certainly grab the 13700K, which IMO almost makes the 13900K look redundant. I would power limit the 13700K to a max of 180 W or so; for gaming, 90 W would probably be more appropriate. As shown by der8auer on the 13900K, it barely makes any difference, and I'm sure it's the same for the 13700K.

An i7-14700K could run 24 E-cores and a 15700K 32 E-cores, as the top-line spec on Arrow Lake is 40 E-cores. But AMD will be using Zen 5c cores for their hybrid architecture, and Zen 4c cores have been leaked as being very strong, almost the same as Zen 4, just with half the L3 cache and lower clocks. Zen 5 is also a huge architectural shift. Zen 5 vs. Meteor Lake and Arrow Lake will be fun. If AMD just continues with a non-hybrid design, they are going to be left behind, IMO.

Talking about power has become a fad with many bandwagon jumpers.

As you say, power consumption can be anything you configure it to be, for one.

I've run HWiNFO64 for multiple days on my old 10850K, and recently on this 12700KF OC'd to 5.3/5.2/5.1 GHz, and got the same result: max around 150 W, average 35 W.

But even if we just take TPU's numbers at face value, and
  1. assume you run a heavily threaded rendering operation 12 hrs/day, 300 days a year,
  2. compare the 13900K vs. the 7950X, where there is a 48 W difference at stock, then:
  3. 48 W × 12 hrs = 0.576 kWh/day
  4. 0.576 kWh/day × 300 days = 172.8 kWh/year
  5. 172.8 kWh/year × $0.15/kWh = $25.92/year
So the difference in cost to operate is about $26/year if you run a 100% all-core load 12 hrs/day, 300 days per year.
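As a quick sanity check of that arithmetic (the 48 W delta, the duty cycle, and the $0.15/kWh rate are the assumptions from the list above):

```python
# Sanity check of the operating-cost math above.
# Assumptions from the post: 48 W stock delta (13900K vs. 7950X),
# 12 hrs/day of all-core load, 300 days/year, $0.15 per kWh.
delta_w = 48
hours_per_day = 12
days_per_year = 300
usd_per_kwh = 0.15

kwh_per_day = delta_w * hours_per_day / 1000   # 0.576 kWh/day
kwh_per_year = kwh_per_day * days_per_year     # 172.8 kWh/year
cost_per_year = kwh_per_year * usd_per_kwh     # $25.92/year

print(f"{kwh_per_day:.3f} kWh/day -> {kwh_per_year:.1f} kWh/yr -> ${cost_per_year:.2f}/yr")
```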

None of which is realistic.

And even if it were, $26 is not worth talking about on a rig that costs $1000+++.

Far more realistic is people actually using a computer all day either idle or under single/light-thread loads. It's super rare for any normal user to max out all cores, and even for professionals it's not the normal state; it typically takes a lot of work to get to the point where one is ready to render, compile, or simulate.
Of course, nobody talks about this one.

[Attached chart: idle / light-load power consumption]
 
Talking about power has become a fad with many bandwagon jumpers.

As you say, power consumption can be anything you configure it to be, for one.

I've run HWiNFO64 for multiple days on my old 10850K, and recently on this 12700KF OC'd to 5.3/5.2/5.1 GHz, and got the same result: max around 150 W, average 35 W.

But even if we just take TPU's numbers at face value, and
  1. assume you run a heavily threaded rendering operation 12 hrs/day, 300 days a year,
  2. compare the 13900K vs. the 7950X, where there is a 48 W difference at stock, then:
  3. 48 W × 12 hrs = 0.576 kWh/day
  4. 0.576 kWh/day × 300 days = 172.8 kWh/year
  5. 172.8 kWh/year × $0.15/kWh = $25.92/year
So the difference in cost to operate is about $26/year if you run a 100% all-core load 12 hrs/day, 300 days per year.

None of which is realistic.

And even if it were, $26 is not worth talking about on a rig that costs $1000+++.

Far more realistic is people actually using a computer all day either idle or under single/light-thread loads. It's super rare for any normal user to max out all cores, and even for professionals it's not the normal state; it typically takes a lot of work to get to the point where one is ready to render, compile, or simulate.
Of course, nobody talks about this one.

[Attached chart: idle / light-load power consumption]
Yup. ST is always relevant, MT is sometimes relevant, as I've said many times. Synthetic 100%-load MT is even more 'sometimes' relevant :D.
 
AMD still has the better product, but their pricing demands are off. Bad pricing equals bad product.

Having built many Intel systems recently (10900K, 11700K, 12900K, and a bunch of Ryzens), I much prefer the more efficient Ryzen chips. Ryzen 5000 was awesome: price drops happened quickly after 12th gen launched, and Ryzen stayed number one.

I'm probably buying a new system tomorrow, and it might be an Intel system; AMD is taking too long to lower their prices. Thankfully there's a Ryzen + Gigabyte MB bundle with $130 off at my local store, otherwise I wouldn't buy it, as the MSRP is crazy. The Ryzen 7700X for $130 off is great, but come on.

AMD needs to take action.
 
To me, the PCIe halving when an M.2 drive is used is utter idiocy... not that PCIe 5.0 x8 isn't enough, but motherboard makers could split it 8+8 for another slot...

For a flagship SKU to be crippled this much in the PCIe department again... just NO.
 
AMD still has the better product, but their pricing demands are off. Bad pricing equals bad product.

Having built many Intel systems recently (10900K, 11700K, 12900K, and a bunch of Ryzens), I much prefer the more efficient Ryzen chips. Ryzen 5000 was awesome: price drops happened quickly after 12th gen launched, and Ryzen stayed number one.

I'm probably buying a new system tomorrow, and it might be an Intel system; AMD is taking too long to lower their prices. Thankfully there's a Ryzen + Gigabyte MB bundle with $130 off at my local store, otherwise I wouldn't buy it, as the MSRP is crazy. The Ryzen 7700X for $130 off is great, but come on.

AMD needs to take action.
If AMD has the better product, then the price difference really is irrelevant… what's even an extra $400? About $33 a month over one year, $17 a month over two years, etc. Nobody needs a new CPU every year.
 
There's something seriously wrong with TPU's testing regime when this CPU runs cooler at a 5.5 GHz overclock than it does at stock speeds.

I thought I made that sufficiently clear in the text.

Oi, what's with those CPU temps?

Mine hovers around 70 °C after 10+ minutes of Cinebench R23.
With a 240 mm AIO, 70 °C sounds about right. What are your temps on air in R23?

Why is there a fking Editor's Choice in every test?
Which CPU would you buy for $400-$500?

Please take some rest.
Impossible; flying to an AMD event next week, NVIDIA launch after that, AMD launch after that, Xmas, then CES, then more launches. Also need to get you those 50-game articles with the 4090, 5800X3D, 7700X and 13900K, then work on the new SSD bench for Gen 5, and start retesting all CPUs with the 4090 and all GPUs with the 13900K. Wanna make the big bucks? Gotta work hard.
 
I really wish the MMOs and older games that are still super popular with large player bases would also be tested in the games section, to see the effect of L3 cache: Final Fantasy, WoW, etc. They are not niche use cases for gamers. I see plenty of reports from people upgrading to a 5800X3D and seeing large FPS jumps in scenarios where lots of data from other players in the same zone is being processed. At the moment there's no info on how the new AMD and Intel processors fare against the 5800X3D in those types of games.
 