
Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

Woo Hoo Gooooooo Intel. Nice, hopefully the 13700K will be just as good.

People run GPUs that use twice the power of the CPU and don't mind. I don't give a crap about power use as long as I can cool it and it's fast. If you don't want high power use, don't build a high-end gaming rig, simple.
 
"If you don't want high power use, don't build a high-end gaming rig, simple."

Or perhaps demand better from the manufacturers? Just a thought, of course.
 
Couldn't care less whether the framerate improves by 25% or 35% if the power consumption also increases accordingly.
 
"Couldn't care less whether the framerate improves by 25% or 35% if the power consumption also increases accordingly."

I'm not judging anything as of now, but considering Intel, I wouldn't be surprised if the power went up as well.
 
Ok wait, so this 13900K is all-core 5.5 GHz? So it's OC'd? Most likely running on liquid nitrogen to get all those mega-hot cores at 5.5 GHz, and what, using 600 W? Pointless results are pointless?
 
"Prove me wrong? Otherwise your reply is also pointless..."

Prove me wrong? You seriously think they had it cooled by liquid nitrogen? :laugh: :laugh: :laugh:

Also, 5.5 is not OC; IIRC it is the stock boost speed for Raptor Lake CPUs --> "5.50 GHz was assumed to be the max boost frequency of the retail chip." Maybe you should have tried actually reading it.
 
looool
i9s are a joke. Intel keeps doing the same thing: increasing the power consumption until they get half an FPS over the AMD counterparts... Ridiculous.
A 4.8 GHz 5800X3D will destroy all these jokes for CPUs.

On the other hand, the i5 and i7 are very good CPUs and are the real threat to AMD.
 
"On the other hand, the i5 and i7 are very good CPUs and are the real threat to AMD."

Even the i3 is a very competitive offer for budget gaming. I have to agree it's on the very high end (i9) that they have a problem; with the rest, not so much, and pricing is even better on Intel.
 
"A 4.8 GHz 5800X3D will destroy all these jokes for CPUs."

Oh... well... I see you are a 5800X3D owner. No wonder we got such a comment.
 
"Prove me wrong. You seriously think they had it cooled by liquid nitrogen :laugh: :laugh: :laugh: Maybe you need the same for your brain, the Australian heat is messing it up."

It doesn't matter, LN or not; the point is, as seen from the previous generation, these CPUs are furnaces.
What your boost clock speed doesn't do is entitle you to insult a person for no reason. Supporter or not. Reported for insults.
 
"It doesn't matter, LN or not; the point is, as seen from the previous generation, these CPUs are furnaces."

Well, mine is not a furnace. People who don't have one say that crap. They may run hot when running R23 maxed out, but who runs their CPU like that all the time? My CPU uses less than 60 W gaming.

Boo hoo. He commented without actually reading the OP and got his facts wrong, just like you have about them running like a furnace. Typical crap from an AMD user. You enjoy your 3600; I will enjoy my 12700K.
 
"Most likely running on liquid nitrogen to get all those mega-hot cores at 5.5 GHz, and what, using 600 W?"

The new Intel CPU is using the same node, so it's pretty obvious where the increase in clock speed is coming from; you get the idea.
Anyway, we will see all of it when the reviews come.
 
Btw, I thought smaller nodes = lower power consumption. Does this mean lower consumption per transistor, and so higher consumption overall on the whole chip due to more transistors per square mm? I know this chip uses the same node as the 12xxx series; just asking in general how it works.
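For the intuition: dynamic switching power scales roughly as P ≈ αCV²f. A shrink cuts capacitance and voltage per transistor, but packing in more transistors and raising clocks can still push whole-chip power up. A rough back-of-the-envelope sketch in Python, where every number is made up purely for illustration:

```python
# Rough illustration of the classic dynamic-power relation P ~ a * C * V^2 * f.
# Every number here is made up for illustration; none is real process data.

def dynamic_power(cap_per_transistor_f, transistors, voltage_v, freq_hz, activity=0.01):
    """Switching power in watts: activity * C_total * V^2 * f."""
    c_total = cap_per_transistor_f * transistors
    return activity * c_total * voltage_v ** 2 * freq_hz

# Hypothetical chip on an "old" node.
old = dynamic_power(cap_per_transistor_f=1.0e-16, transistors=10e9,
                    voltage_v=1.2, freq_hz=5.0e9)

# Hypothetical shrink: ~30% less capacitance and slightly lower voltage per
# transistor, but twice the transistors and a higher clock.
new = dynamic_power(cap_per_transistor_f=0.7e-16, transistors=20e9,
                    voltage_v=1.1, freq_hz=5.5e9)

print(f"old node: {old:.0f} W total, {old / 10e9 * 1e9:.2f} nW per transistor")
print(f"new node: {new:.0f} W total, {new / 20e9 * 1e9:.2f} nW per transistor")
# Per-transistor power drops, yet whole-chip power still goes up.
```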
 
"people who don't have one say that crap."

Yeah, we were talking about top end SKUs, not the step below which has lower boost.

"He commented without actually reading"

Actually reading what, a rumor?

I had more Intel CPU based systems than AMD ones for two decades already, and i still have.
My 3600 isn't cool as i would expect, due to static OC (1.3V), as PBO and other boost stuff doesn't work for me.
 
The video claims that the 13900K can be overclocked to 5.5 GHz on the P-cores using a 360mm AIO, which would be a feat in itself. Even if such frequencies could only be sustained in games, which hardly ever peg 16 threads at 100%, it's already an improvement over Alder Lake. Games should also benefit from Raptor Lake's increased cache.

However, I question the methodology of these tests. The 12900KF was gimped by setting it to an all-core 4.9 GHz, which is below the 5.1-5.2 GHz boost. Games in general do not benefit from manual overclocks, as they cannot fully utilize all the available threads. The 12900KF would have performed better with stock settings here, especially in ST limited titles.

The 13900K, on the other hand, was given an unfair advantage with a 5.5 GHz overclock across all cores, which is assumed to be its maximum boost frequency. And the perf/watt looks atrocious. I have no idea what quality settings were used, but here's a quick reference: a stock 5800X3D peaks at 71 W and averages 51 W in the Forza Horizon 5 1080p benchmark, with maximum detail, courtesy of a 6600 XT.
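Since "atrocious perf/watt" is doing a lot of work here, a quick sketch of how one might actually compare the two. The 51 W average comes from the figures above; the FPS values and the 13900K ES wattage are placeholders I've assumed, not measurements:

```python
# Quick perf-per-watt comparison sketch. The 51 W average comes from the post
# above; the FPS values and the 13900K ES wattage are assumed placeholders,
# not measurements.

def fps_per_watt(avg_fps, avg_cpu_watts):
    return avg_fps / avg_cpu_watts

x3d = fps_per_watt(avg_fps=150, avg_cpu_watts=51)    # 51 W from the post; FPS assumed
es = fps_per_watt(avg_fps=170, avg_cpu_watts=120)    # both values assumed

print(f"5800X3D  : {x3d:.2f} FPS/W")
print(f"13900K ES: {es:.2f} FPS/W")
# A chip can win on raw FPS and still lose clearly on FPS per watt.
```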
 
"A stock 5800X3D peaks at 71 W and averages 51 W in the Forza Horizon 5 1080p benchmark."

That doesn't say much; it depends on whether the 6600 XT is bottlenecking the 5800X3D and it's just sitting there. Utilisation is what you need for those kinds of arguments.
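Checking that is easy enough; a minimal logging sketch using the third-party psutil package (my assumption, not something from the thread):

```python
# Minimal per-core utilisation logger (assumes psutil is installed:
# pip install psutil). Run it while gaming; if the GPU is the bottleneck,
# per-core CPU load stays low and a low CPU power draw proves little.
import psutil

for _ in range(10):                                  # sample for ~10 seconds
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}%  busiest core {max(per_core):5.1f}%")
```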
 
"I don't give a crap about power use as long as I can cool it and it's fast."

I love all the power comments lol -- same wattage, and these don't use much power at all during gaming. It's honestly just the cache bump; this arch is so memory-starved it's insane - tiny tweaks to memory yield massive framerate increases.

Definitely looking forward to the 13700K.
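To illustrate the memory-starved point, a toy Python experiment contrasting cache-friendly and cache-hostile access. CPython overhead mutes the effect compared to native code, so treat it as a sketch of the idea rather than a real benchmark:

```python
# Toy demo: identical arithmetic, different memory access patterns.
# CPython object overhead dampens the effect versus native code, but the
# strided walk should still be measurably slower on most machines.
import time
from array import array

N = 1 << 22                      # ~4M 64-bit integers
data = array("q", range(N))

def walk(indices):
    t0 = time.perf_counter()
    s = 0
    for i in indices:
        s += data[i]
    return time.perf_counter() - t0

t_seq = walk(range(N))                             # sequential: cache-friendly
t_str = walk([(i * 4097) % N for i in range(N)])   # strided: defeats the cache

print(f"sequential: {t_seq:.3f} s, strided: {t_str:.3f} s")
# The slowdown is time spent waiting on memory, which is exactly what a
# bigger cache (or faster RAM) shaves off.
```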
 
Well, regarding power: say whatever you want about the 5800X3D's performance, but its power consumption figures are amazing. For the performance it puts out, it literally sips power. Even better is the fact that its performance and power efficiency actually improve when undervolted via Curve Optimizer. Even AMD's own other designs, not to mention Intel, can't match this performance per watt.
 
With my power company telling me lately that it has to increase energy prices quite significantly, I couldn't care less about RL. The best decision this year: buying a tablet instead of a new GPU or CPU. With new generations increasing performance at the cost of higher power consumption, I will most likely change my hardware usage habits instead of buying new stuff that will make my annual power bill even more terrifying.
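For anyone curious what that actually comes to, a back-of-the-envelope calculation; the wattages, daily hours, and the 0.40/kWh price are all assumptions for illustration, not anyone's real tariff:

```python
# Back-of-the-envelope annual energy cost, gaming rig vs tablet. The wattages,
# hours per day, and the 0.40/kWh price are assumptions, not a real tariff.

def annual_cost(avg_watts, hours_per_day, price_per_kwh=0.40):
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

rig = annual_cost(avg_watts=450, hours_per_day=3)    # CPU + GPU under load, assumed
tablet = annual_cost(avg_watts=10, hours_per_day=3)

print(f"rig: ~{rig:.0f} per year, tablet: ~{tablet:.0f} per year at 0.40/kWh")
```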
 
"The best decision this year: buying a tablet instead of a new GPU or CPU."

If a tablet is enough, an APU is too, and those are cheap, cool, and don't draw much power.
 