
Intel Core i7-13700K

I think the overall PC market is in a downturn, both companies released decent products.
 
I think the overall PC market is in a downturn, both companies released decent products.

Intel's client compute group was +5% quarter over quarter - still down 17% from Q3 2021.

But AMD's updated guidance for its Q3 client compute group was -53% quarter over quarter, -40% from Q3 2021.

We'll find out Tuesday what really happened. Their server and gaming (Xbox/PlayStation) business guidance was still very good even with their client business getting kicked around.
 
I think the overall PC market is in a downturn, both companies released decent products.
I think the problem is that Intel RPL is too hot and hungry, and AMD Zen 4 is too expensive as a platform upgrade. Besides, if you have Coffee Lake, Zen 2 or newer, you don't need an upgrade even as a gamer. Games aren't evolving as fast as hardware is. So yeah, decent products that are made for no one.
 
There is no instruction set difference. Read the primer article on Raptor Lake.

This is reworked silicon with more cache and more E-cores on a tweaked Intel 7 process that allows higher peak clocks (at the cost of higher power consumption).

Actual IPC improvements are very low, did you somehow skip the 13900K review?
I didn't skip it, but for all the hype, RL is not that much different from AL chips. Thought maybe I missed some important info.
 
I didn't skip it, but for all the hype, RL is not that much different from AL chips. Thought maybe I missed some important info.
Nope. It's officially "just a refresh" of Golden Cove, not a new microarchitecture. I don't think you'll see an actual new microarchitecture until they move on from the name "Cove".

The actual microarchitecture differences over Golden Cove come down to a tweaked dynamic prefetch algorithm and more cache, which is tech jargon for "they picked the low-hanging fruit, and phoned this one in".
 
Slightly off topic, but if you do end up upgrading to another platform, I'll buy your 9900KS lol
Once the 13900KS is released mine will be on sale.
 
Great article, generally good discussion. Couple of points around one meta observation.

PC enthusiasts need to learn how to kill sacred cows. This has always been an issue, honestly, going back to the dawn of the hobby.

Why is it relevant here? Three reasons I see...

1) "more power bad!" Isn't always true. Just like you can't get 40MPG (or 250wh/mi for EVs ) at 165MPH, you can't expect x86 to achieve the kinds of insane computational power at peak we're seeing from both Zen 4 and RL without significant power draw. And like for like, at the same power draw as their predecessors, they both perform significantly better, so the efficiency work IS being done. Full stop. As for heat, heat is relative. Something designed to run at 60C is in the danger zone at 70. Something designed to run at 100C is fine at 95. It's not absolute.

2) "CPU doesn't matter for 4k". This is only true if you're GPU bound which, as of ADA (and presumably RDNA3), you're not. It all always matters, it just depends which subsystem is under the most pressure. As an example if your goal is 1440P, 120Hz, no RT, you could have stopped upgrading a while ago. RDNA2/3000 and any reasonably modern CPU is fine .

3) "Microcenter is relevant" look, I get it. I love Microcenter too. But nearly all of their stuff, including deals, is "in store only" and like 2% of the planet lives near one. Unless you're volunteering to make trips for people, stop considering them indicative of pricing, availability, or really anything honestly.
 
Reality kicks in; below are some interesting Microcenter offers!
Some examples if you break down the bundle prices:
AMD Ryzen 7 7700X $299
AMD Ryzen 9 7900X $449
AMD Ryzen 9 7950X $599
G.Skill Flare X5 Series 32GB (2 x 16GB) DDR5-5600 CL36 $139.99
ASUS B650-PLUS Prime $109.99
Gigabyte B650 AORUS Elite AX $139.99
ASRock X670E Pro RS $189.99

[Image: free 32 GB DDR5-5600 kit and $50 off AM5 motherboards if you purchase an AMD Ryzen 7000 CPU at Microcenter]

Below are additional promos applicable to both Intel & AMD (not included in the above prices that I quoted as an example):
View attachment 267675
View attachment 267676
Naw, 12700K + Asus TUF Z690 WiFi for $350 at Microcenter is the God deal right now (out of stock in some stores already, huge stock in others)

Another CPU that I won't be recommending to average users due to its insane cooling requirements at stock.
So idk, maybe... undervolt it? It is a K SKU, after all. I'm pulling 31k in R23 multi-core at 180W with a 13700K. I'm also definitely, without question, idling at at least half the power of Zen 3/4.
 
The strange thing is that no one is dropping Zen 4 prices despite obviously bad sales numbers.

That tells me one of two things.

Either AMD has restricted retailers from selling below MSRP - something Apple has long done with iPhones and such - *or* there is no margin on Zen 4 chips. Of those two, I would bet that the first is in play, with the second a far distant possibility.

Why? If retailers were free to sell at whatever price they wanted, we'd see various combinations of 'deals' to move product, not fixed prices and stuff like free RAM on the side.

This becomes a big problem for them, because their capital is tied up in those chips and can't be used to order other stuff - that's the way buyers/purchasing works in every retail company, whether you're selling underwear or $50,000 cars. If you can't keep merchandise turning over so you can get the next big thing on the shelf, you're sunk; that's how retailers go under.

I certainly expected it to sell pretty well in the DIY space, at least at first; I even expected it to drive up DDR5 prices for a month or two. Not like 2020, no, but it really didn't enter my mind that it would flop like this. AMD only releases new CPUs every two years. The almost total lack of interest is, frankly, shocking.
I imagine that it's all about how it looks on the earnings reports. Even though it has the same financial outcome, offering promotions probably looks better to investors than directly writing down goods.
 
Naw, 12700K + Asus TUF Z690 WiFi for $350 at Microcenter is the God deal right now (out of stock in some stores already, huge stock in others)
Great catch, much better than the Ryzen bundles. Was it only up for a day or so?
 
So idk, maybe... undervolt it? It is a K SKU, after all. I'm pulling 31k in R23 multi-core at 180W with a 13700K. I'm also definitely, without question, idling at at least half the power of Zen 3/4.
I said "for average users". They're definitely not going to undervolt.
 

I was reading this, and also this:


Now I am slightly worried there will be older games, not even on this list, that I simply won't be able to play... anyone have any thoughts on this? @P4-630, I know you said you have had no problems, and most of the games on this list have probably been patched by now... but there are literally thousands of games... I am sure there is not too much to worry about. Just wondering if anyone has had any bad experiences?
 

I was reading this, and also this:


Now I am slightly worried there will be older games, not even on this list, that I simply won't be able to play... anyone have any thoughts on this? @P4-630, I know you said you have had no problems, and most of the games on this list have probably been patched by now... but there are literally thousands of games... I am sure there is not too much to worry about. Just wondering if anyone has had any bad experiences?
From December:


And then by January:


I was reading your link and scratching my head, because I've tried at least a couple of those games (mostly just to benchmark) and noticed no problem. But I bought my Alder Lake rig in April, which was apparently late enough to miss all of the issues I've seen reported. I wouldn't worry about older, unknown games, by the way, as the issue seemed to be confined to a specific, newish, batch of DRM software.
 
From December:


And then by January:


I was reading your link and scratching my head, because I've tried at least a couple of those games (mostly just to benchmark) and noticed no problem. I bought my Alder Lake rig in April, which was apparently late enough to miss all of the issues I've seen reported.

Thank you so much for this! This is great to hear. Hopefully Denuvo in older games has also been tested, though; a lot of games have it and aren't popular anymore, so they may have slipped through the cracks simply because no one plays them anymore (but I might when going through my backlog)... but maybe not, too!

Regardless, this is great news to read. Thanks! :toast:
 
Thank you so much for this! This is great to hear. Hopefully Denuvo in older games has also been tested, though; a lot of games have it and aren't popular anymore, so they may have slipped through the cracks simply because no one plays them anymore (but I might when going through my backlog)... but maybe not, too!

Regardless, this is great news to read. Thanks! :toast:
Anytime, man. Yeah, you may be right that there are older games; I'm far from an expert on the topic of game DRM. But presumably Denuvo knows which products use their software, and I imagine they'd be very thorough when working to appease the 800-lb Intel gorilla.
 
Talking about power has become a fad with many bandwagon jumpers.

As you say, power consumption can be anything you configure it to be, for one.

I've run HWiNFO64 for multiple days on my old 10850K, and recently on this 12700KF OC'd to 5.3/5.2/5.1, and got the same result: max around 150W, average 35W.

But even if we just take TPU's numbers at face value, and
  1. Assume you run a heavily threaded rendering operation 12hrs/day, 300 days a year.
  2. 13900K vs 7950X, there is a 48W difference at stock.
  3. 48W * 12hrs = 0.576 kWh/day
  4. 0.576 kWh/day * 300 days = 172.8 kWh/year
  5. 172.8 kWh/year * $0.15/kWh = $25.92/year
So the difference in cost to operate is about $26/year if you run a 100% all-core load 12hrs/day, 300 days per year.
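
For anyone who wants to replay that arithmetic with their own numbers, here's a rough Python sketch (the function name and layout are mine; the 48W delta, 12hrs/day, 300 days and $0.15/kWh figures are taken from the steps above):

```python
# Rough sketch: yearly running-cost difference from a constant power delta.
def annual_cost_difference(delta_watts: float,
                           hours_per_day: float,
                           days_per_year: float,
                           price_per_kwh: float) -> float:
    """Return the yearly cost difference for a sustained power delta."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * days_per_year
    return kwh_per_year * price_per_kwh

# 13900K vs 7950X at stock, heavy all-core load, US average electricity price.
print(f"${annual_cost_difference(48, 12, 300, 0.15):.2f} per year")  # -> $25.92 per year
```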

None of which is realistic.

And even if it were, $26 is not worth talking about on a rig that costs $1000+++.

This is far more representative of people actually using a computer all day - either idle, or single/light-threaded loads. It's super rare for any normal user to run an all-core max load, and even for professionals it's not the normal state. It typically takes a lot of work to get to a point where, for example, one is ready to render, compile, or simulate.
Of course, nobody talks about this one.

View attachment 267523
You should try that calculation of yours with kWh pricing in Europe.

My energy contract now has a unit price of €0.60/kWh.
That's a full 100 EUR per year on a rig of 1000. Suddenly we're talking about 10% additional cost of ownership. And that's even when limiting to 150W - you only calculated the difference against another top-end part.
Say you run the rig five years. That's half the purchase price for a few % extra perf.
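
A quick sanity check in Python, re-running the numbers from that example at my €0.60/kWh unit price (just a sketch; same 48W delta, 12hrs/day, 300 days/year assumptions):

```python
# Same arithmetic as the ~$26/year example above, at a €0.60/kWh unit price.
kwh_per_year = 48 / 1000 * 12 * 300   # 48W delta, 12h/day, 300 days -> 172.8 kWh
cost_per_year = kwh_per_year * 0.60   # €0.60 per kWh
print(f"€{cost_per_year:.2f} per year, €{cost_per_year * 5:.2f} over five years")
# -> €103.68 per year, €518.40 over five years
```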

Oh... and let's not forget about cooling requirements too; they also add to the initial cost.

Bandwagons, indeed. Even at 26 dollars per year (still about $130 in additional expenses over 5 years, about 13% on top if the rig were 1k) the added performance doesn't outweigh the cost, but at 4x that expense, we're in silly land; thanks for confirming this. The power consumption 'bandwagon' was always there, and it will always matter, despite what you like to think based on a very low cost for electricity. This isn't an Alder Lake & onwards point of discussion either, and it's still a fact that CPU TDPs are exploding.

Regardless, you do have a point, especially (and perhaps exclusively) when it comes to K-parts: users can indeed tweak those and are expected to do so. But then one might wonder why the higher-end parts are even in the picture; for most, it's a completely pointless exercise. And those who do need those cores are the specific group that is liable to run the all-core load we're calculating here ;) For that group the inefficiency question stands, and total cost of ownership is going to take a hit.

I think the most important part of the TDP increases we see is the sentiment that goes along with them. The industry is moving in a direction that seems counterintuitive, and frankly IS counterintuitive if we consider the challenges ahead of us. Why are we not trending towards equal performance at lower stock TDPs, or the same TDP with better perf/W? After all, the performance isn't actually needed anywhere on consumer parts. We need a massive turnaround in our approach to everything if we even want to stabilize our harmful output on the planet.

Maybe that unit cost for energy needs to be much higher still before we get the memo.
 
You should try that calculation of yours with kWh pricing in Europe.

My energy contract now has a unit price of €0.60/kWh.
That's a full 100 EUR per year on a rig of 1000. Suddenly we're talking about 10% additional cost of ownership. And that's even when limiting to 150W - you only calculated the difference against another top-end part.
Say you run the rig five years. That's half the purchase price for a few % extra perf.

Oh... and let's not forget about cooling requirements too; they also add to the initial cost.

Bandwagons, indeed. Even at 26 dollars per year (still about $130 in additional expenses over 5 years, about 13% on top if the rig were 1k) the added performance doesn't outweigh the cost, but at 4x that expense, we're in silly land; thanks for confirming this. The power consumption 'bandwagon' was always there, and it will always matter, despite what you like to think based on a very low cost for electricity. This isn't an Alder Lake & onwards point of discussion either, and it's still a fact that CPU TDPs are exploding.

Regardless, you do have a point, especially (and perhaps exclusively) when it comes to K-parts: users can indeed tweak those and are expected to do so. But then one might wonder why the higher-end parts are even in the picture; for most, it's a completely pointless exercise. And those who do need those cores are the specific group that is liable to run the all-core load we're calculating here ;) For that group the inefficiency question stands, and total cost of ownership is going to take a hit.

I think the most important part of the TDP increases we see is the sentiment that goes along with them. The industry is moving in a direction that seems counterintuitive, and frankly IS counterintuitive if we consider the challenges ahead of us. Why are we not trending towards equal performance at lower stock TDPs, or the same TDP with better perf/W? After all, the performance isn't actually needed anywhere on consumer parts. We need a massive turnaround in our approach to everything if we even want to stabilize our harmful output on the planet.

Maybe that unit cost for energy needs to be much higher still before we get the memo.

Well, that gets a bit political and regional. Europe made its own bed; now it must sleep in it. Not letting 15-year-olds lecture an entire continent about energy policy might be a good start.

I chose the national average for the USA.

This is what I pay - 3 year contract, won't change until 2024 :

[screenshot: electricity rate from the 3-year contract]
 
Well, that gets a bit political and regional. Europe made its own bed; now it must sleep in it. Not letting 15-year-olds lecture an entire continent about energy policy might be a good start.

I chose the national average for the USA.

This is what I pay - 3 year contract, won't change until 2024 :

View attachment 269506
Maybe now you can understand 'the bandwagon'. That was the point ;)
 
Maybe now you can understand 'the bandwagon'. That was the point ;)

My example was hyperbolic - meaning an unreasonable use case for a desktop CPU - in the first place.

No sane person is going to run one of these unlocked 300 days/year, 12hrs/day at anything like max power. My average power draw is about 30W *total*, and my CPU is overclocked. I've run monitoring to see that, and my calculation was based on a 48W *difference*.

Obviously you're not going to get a 48W difference between CPUs when your average total is 30W on the 'less efficient' machine.

The difference, if there is one, is more likely to be 2-3W.

That means, even in Europe, we're now talking about pocket change over the course of a year - a couple of Euro per year, maybe.

Your selection of GPU, even at idle, will literally be an order of magnitude (10X) more important. And that is just based on idle power draw (rough numbers in the sketch below the charts).

I mean, you really can't get away from this single-monitor high idle power draw: a difference of 15W continuous between a 3090 and a 6800+:

[chart: single-monitor idle power draw comparison]


Even worse, multi-monitor idle:

[chart: multi-monitor idle power draw comparison]
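
Putting rough numbers on that idle gap (just a sketch: the 15W delta is from the single-monitor chart above, while the 12hrs/day, year-round duty cycle is purely an assumption on my part):

```python
# Rough sketch: yearly cost of a constant 15W idle-power gap between GPUs.
kwh_per_year = 15 / 1000 * 12 * 365   # 15W for 12h/day, 365 days -> ~65.7 kWh
print(f"~${kwh_per_year * 0.15:.2f}/year at $0.15/kWh, "
      f"~€{kwh_per_year * 0.60:.2f}/year at €0.60/kWh")
# -> roughly $10/year at the US average rate, roughly €39/year at €0.60/kWh
```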
 
This is what I pay - 3 year contract, won't change until 2024 :

View attachment 269506
That's not the full price, though. There's also a distribution charge, which is usually at least equal to, if not more than, the cost of generating the electricity.
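
In other words (a rough sketch, reusing the 172.8 kWh/year and $0.15/kWh generation-only figures from earlier in the thread):

```python
# If distribution at least matches generation, the effective per-kWh price
# (and the earlier ~$26/year delta) roughly doubles.
generation_cost = 172.8 * 0.15        # $25.92/year on generation alone
all_in_cost = generation_cost * 2     # assuming distribution == generation
print(f"at least ~${all_in_cost:.2f} per year all-in")   # -> at least ~$51.84
```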
 
W1zzard, really appreciate the virtualization benchmarks.

I'm looking towards the 7900X at current pricing as I heavily use VMware Workstation, but the 13700K is so tempting; I wonder if Intel can optimize VM use.
 
W1zzard, really appreciate the virtualization benchmarks.

I'm looking towards the 7900X at current pricing as I heavily use VMware Workstation, but the 13700K is so tempting; I wonder if Intel can optimize VM use.
Wait for the upcoming reviews of the 7800X3D and its variants before pulling the trigger, as they might be something special.

I'm normally an Intel man, but these are looking really good, and you don't have to put up with E-cores.
 