
Intel Core i9-13900K

It's probably getting time to upgrade my 8700K. The new AMD CPUs are very good, and Intel still holds a small advantage in gaming performance. But the elephant in the room is the upcoming 7800X3D. For strictly gaming (which is me), I think I'll wait until it comes out (I thought it was supposed to be November) before I decide on an upgrade.
What games are choking your 8700K? I'm curious as I have yet to find one, though of course I'm on an 'ancient' GPU.
 
Looking forward to future CPU testing with a 4090
It is worse than the 12900K, let alone the 5800X, if 1% lows are included (super relevant for gaming), so better not.
 
What games are choking your 8700K? I'm curious as I have yet to find one, though of course I'm on an 'ancient' GPU.
I'm on an 8700K with a 3080 and haven't noticed any "choking"
 
And I think AMD outdid Intel in power and temperature; both teams managed to deliver a fairly meh generation. I think my Zen 3 will be fine until the 3D V-Cache parts arrive.
 
Well, I guess those who claimed how great this new Intel would be can bite their tongues now. Some improvement over the 12900K, but definitely not the claimed 40% (or whatever it was), and yet the power consumption... holy crap. I guess the main improvement was the frequency, which means power through the roof, just to be slightly ahead of the competition. Sorry, Intel, but the power consumption is way too high, whether in applications or gaming.
I literally just glanced at the charts, and it would seem the 13900K is just a 12900K on steroids power-wise, hence the results. As I see it? No real improvement, just higher power consumption for the higher gains. I hoped for better results, to be honest.

Just noticed.
The real fun starts when the 13600K has a power draw of 255 W, which is where the 3900XT sits. The 7600X has a 183 W power draw. Not to mention during gaming.
No, I'm trying to like it for some reason, but I simply can't, whether it's the 13600K or the 13900K.
 
Something is not right with the Cinebench results. All other reviews show 38k–40k. I wonder how affected the other tests are by whatever is causing it in your setup.
I'm researching this .. something strange is going on .. going away for the weekend in an hour though, more testing on Monday
 
Honestly, with those temps, I'm waiting until someone cooks on one of these. 115 °C! 300 W! Both AMD and Intel are pretty close this gen. Maybe Intel leads in gaming, but I think Zen 4 X3D will retake the top.
 
RDR2, Watch Dogs 2, Elden Ring and Forza can't run at 144 Hz on any CPU ... is this "the choking"? :laugh:
 
Well, I guess those who claimed how great this new Intel would be can bite their tongues now. Some improvement over the 12900K, but definitely not the claimed 40% (or whatever it was), and yet the power consumption... holy crap. I guess the main improvement was the frequency, which means power through the roof, just to be slightly ahead of the competition. Sorry, Intel, but the power consumption is way too high, whether in applications or gaming.
I literally just glanced at the charts, and it would seem the 13900K is just a 12900K on steroids power-wise, hence the results. As I see it? No real improvement, just higher power consumption for the higher gains. I hoped for better results, to be honest.

Just noticed.
The real fun starts when the 13600K has a power draw of 255 W, which is where the 3900XT sits. The 7600X has a 183 W power draw. Not to mention during gaming.
No, I'm trying to like it for some reason, but I simply can't, whether it's the 13600K or the 13900K.
Maybe you should check reviews from other sites. The 13900K, according to every review except this one, is more than 40% faster than the 12900K with both at similar wattage (240 W vs 253 W).
 
Maybe you should check reviews from other sites. The 13900K, according to every review except this one, is more than 40% faster than the 12900K with both at similar wattage (240 W vs 253 W).
I did. The power figures are from Guru3D, for instance. It is not impressive, and you have been telling people differently, all but assuring them they'd be mind-blown once they saw the new Intel. To be fair, I am mind-blown, and I'm sure others are as well, but not in the way you think. I'm sorry, but it's not 40% faster. Not in gaming. Maybe there are instances (applications) where it may reach 40%, but in general it is not 40% faster, and I think that is pretty clear.
 
I did. The power figures are from Guru3D, for instance. It is not impressive, and you have been telling people differently, all but assuring them they'd be mind-blown once they saw the new Intel. To be fair, I am mind-blown, and I'm sure others are as well, but not in the way you think. I'm sorry, but it's not 40% faster. Not in gaming. Maybe there are instances (applications) where it may reach 40%, but in general it is not 40% faster, and I think that is pretty clear.
Guru3D has the 13900K scoring 38k+ at 253 W in CB R23. That's more than 40% faster at similar wattage. So wtf are you talking about?

Who said it's going to be 40% faster in gaming? Lol, of course it won't be; how did you ever expect that to be possible?
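As a rough sanity check on the numbers being thrown around: the 38,000-point / 253 W figure for the 13900K is the Guru3D result cited above, while the ~27,000-point / 240 W baseline for the 12900K is an assumption based on commonly reported CB R23 results, not a number from this thread.

```python
# Back-of-the-envelope comparison of CB R23 multi-core results.
# 13900K figures are from the Guru3D result quoted above; the 12900K
# baseline (~27,000 pts at 240 W) is an ASSUMED typical score.
score_13900k, watts_13900k = 38_000, 253
score_12900k, watts_12900k = 27_000, 240

speedup = score_13900k / score_12900k - 1    # relative performance gain
eff_new = score_13900k / watts_13900k        # points per watt, 13900K
eff_old = score_12900k / watts_12900k        # points per watt, 12900K

print(f"13900K is {speedup:.1%} faster")     # → 40.7% under these numbers
print(f"points/W: {eff_new:.0f} vs {eff_old:.1f}")
```

Under these assumed numbers the ">40% at similar wattage" claim holds for this one benchmark; it says nothing about gaming, where the gap is far smaller.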
 
Guru3D has the 13900K scoring 38k+ at 253 W in CB R23. That's more than 40% faster at similar wattage. So wtf are you talking about?

Who said it's going to be 40% faster in gaming? Lol, of course it won't be; how did you ever expect that to be possible?
Oh, so in the Cinebench score it is 40% faster. That is just one score for one benchmark. Then you have other apps where it is not 40% faster, and gaming is the same story, so you can't say it is 40% faster, since that is misleading. In general it is not 40% faster, and that was my point.
 
Oh, so in the Cinebench score it is 40% faster. That is just one score for one benchmark. Then you have other apps where it is not 40% faster, and gaming is the same story, so you can't say it is 40% faster, since that is misleading. In general it is not 40% faster, and that was my point.
It's not just Cinebench; in every multithreaded workload the difference is 40% or more.

So are you saying the 7950X is 5% faster than the 12900K?
 
Oh, so in the Cinebench score it is 40% faster. That is just one score for one benchmark. Then you have other apps where it is not 40% faster, and gaming is the same story, so you can't say it is 40% faster, since that is misleading. In general it is not 40% faster, and that was my point.
It's not just Cinebench; in every multithreaded workload the difference is 40% or more.

So are you saying the 7950X is 5% faster than the 12900K?
Guys, you are comparing ridiculously fast CPUs at ridiculously high power and temperature levels. May I suggest that maybe... it doesn't matter? No sane person should push a CPU beyond 200 W for gaming.
 
It's not just Cinebench; in every multithreaded workload the difference is 40% or more.

So are you saying the 7950X is 5% faster than the 12900K?
Dude, I'm not saying anything about AMD products; wrong thread. I was pointing something out about your comments on my post.
I honestly don't care about your glaring problems with Intel's praise, and my comments are not for anyone's amusement.

Guys, you are comparing ridiculously fast CPUs at ridiculously high power and temperature levels. May I suggest that maybe... it doesn't matter? No sane person should push a CPU beyond 200 W for gaming.
I'm not comparing anything. I just expressed something about a product that just came out.
 
Ugh, I was hoping for IPC improvements or process-node improvements, but what we have is just an overgrown Alder Lake with yet more power consumption to push the clocks higher.

I could have given Raptor Lake some slack if it resulted in lower-tier models like an i5-13500 with 6P+8E at reasonable power consumption, but no; the 13600K will be the smallest, cheapest Raptor Lake, and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.

Given the global economic recession, cost-of-living increases, and rapidly rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650 (non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling will be a point in AMD's favour.
 
Ugh, I was hoping for IPC improvements or process-node improvements, but what we have is just an overgrown Alder Lake with yet more power consumption to push the clocks higher.

I could have given Raptor Lake some slack if it resulted in lower-tier models like an i5-13500 with 6P+8E at reasonable power consumption, but no; the 13600K will be the smallest, cheapest Raptor Lake, and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.

Given the global economic recession, cost-of-living increases, and rapidly rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650 (non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling will be a point in AMD's favour.
From my crystal ball: 65 W 13300F/13100F on DDR4 + H610 will be the ultimate budget options, with better gaming performance all around. AM5 on DDR5 just can't cut it unless prices are massively cut.
 
Steve from HardwareUnboxed: "The power consumption of the 13900K is downright hideous..."

 
Ugh, I was hoping for IPC improvements or process-node improvements, but what we have is just an overgrown Alder Lake with yet more power consumption to push the clocks higher.

I could have given Raptor Lake some slack if it resulted in lower-tier models like an i5-13500 with 6P+8E at reasonable power consumption, but no; the 13600K will be the smallest, cheapest Raptor Lake, and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.

Given the global economic recession, cost-of-living increases, and rapidly rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650 (non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling will be a point in AMD's favour.
The economy is slowly but surely going down the toilet, energy prices are soaring all around the world, and here we have Intel, Nvidia and AMD brute-forcing their way into the highest tiers of computer hardware with never-before-seen power consumption and heat, and no innovation on the IPC and efficiency fronts beyond what a node shrink naturally brings. What is going on? :kookoo:
 
Steve from HardwareUnboxed: "The power consumption of the 13900K is downright hideous..."

Yeah, I just finished watching what HUB's Steve had to say. It was not pretty. Not to mention the constant thermal throttling on the 13900K and the ridiculous power consumption in most scenarios.
 
Yeah, I just finished watching what HUB's Steve had to say. It was not pretty. Not to mention the constant thermal throttling on the 13900K and the ridiculous power consumption in most scenarios.
I just finished the 13700K video which paints a similar picture, unfortunately. And we (or at least some of us) thought Zen 4 was bad...
 
Many thanks for an excellent review @W1zzard
Call me crazy, but the term 'frames per watt' rubs me the wrong way...
You've probably used that phrase to say 'frames per second per watt' in a shorter way, but it still feels wrong.

If anyone is not asleep yet and wants the detail:
"frames per watt" is wrong because a 'frame' is an amount of work done, while a 'watt' is power (energy per unit of time).
A correct measure of efficiency would be either:
"amount of work done per amount of energy", i.e. "frames per joule" or "frames per kWh" (to give a familiar unit of energy),
or
"rate of work being done per unit of power", i.e. "FPS per watt"
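The unit distinction above can be made concrete with a quick conversion; the 144 FPS and 200 W figures below are made-up illustration values, not measurements from the review.

```python
# Convert a gaming measurement into dimensionally correct efficiency units.
# Since a watt is a joule per second, dividing a rate (frames/s) by power
# (J/s) cancels the seconds and yields frames per joule.
fps = 144.0       # frames per second (rate of work done)
power_w = 200.0   # watts = joules per second

frames_per_joule = fps / power_w                # (frames/s) / (J/s) = frames/J
frames_per_kwh = frames_per_joule * 3_600_000   # 1 kWh = 3.6 million joules

print(f"{frames_per_joule:.2f} frames per joule")  # → 0.72
print(f"{frames_per_kwh:,.0f} frames per kWh")     # → 2,592,000
```

Numerically, "FPS per watt" and "frames per joule" come out identical (0.72 here); the quibble is purely about which units the label names.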
 
But you do see the cost of the platform.
If both perform the same and one is cheaper, by some hundreds of dollars if you go DDR4 and Z690, then the choice is easier.
What you save today by not buying a motherboard, if you come from Alder Lake, you save again when Ryzen 5 comes. Either way you end up at more or less the same result.
 