
Intel Enthusiast-Grade K Processors in the Comet Lake-S Family Rumored to Feature 125 W TDP

Intel says their chips are better than AMD's. Here's the problem: NOT ONLY do they say in the footnotes that the apps are optimized for Intel CPUs, they ALSO won't mention the TDP that is in place when running all that turbo stuff. 210W, versus AMD's 105W all-the-time TDP. For Intel, turbo mode has a different power limit that they never mention in public docs; the only way to truly find out is through XTU, and it shows how much more power Intel CPUs take for those performance numbers. Yes, my AMG V12 engine does 660 hp. Peak. I don't want to tell you how much worse it is than a V8 at all other RPMs, because we only focus on maximum performance, right? (Btw, I don't have an AMG V12, it was just an example. Normally, V8s offer more consistent performance than V12s.)
Well, every manufacturer tells us theirs is better.
I prefer to look at performance and temperatures over TDP. The TDP serves the purpose of better performance and temps/noise; it's not a metric one should pursue on its own.
 
It helps with noise levels.


Well, that's not true, since boost is temp related.



So TPU can't make a CPU review? :rolleyes:
Performance-wise it doesn't matter. Fan speed, sure, but not necessarily performance; that's an indirect comparison. Yes, boost is temp related, but I'm pretty sure you don't get more than 200 MHz from a 20C drop. Depends on the chip, the cooling, and a lot of stuff. And explain how a CPU designed for 210W, with 2 extra cores and higher frequencies, RUNNING ON THE SAME ARCHITECTURE (yes, 6th-9th gen is basically the same arch) as an 8700K (which is rated for, I think, 120-131W) runs cooler?
 
Performance-wise it doesn't matter. Fan speed, sure, but not necessarily performance; that's an indirect comparison. Yes, boost is temp related, but I'm pretty sure you don't get more than 200 MHz from a 20C drop. Depends on the chip, the cooling, and a lot of stuff. And explain how a CPU designed for 210W, with 2 extra cores and higher frequencies, RUNNING ON THE SAME ARCHITECTURE (yes, 6th-9th gen is basically the same arch) as an 8700K (which is rated for, I think, 120-131W) runs cooler?
Cause the 9900K is soldered, like every 9th gen K chip, while neither the 9700K nor the 8086K are.
 
Quite a lot of people run these with no manual OC. The CPU takes care of itself pretty well.

No. TDP is given in specs and that's it. It's measured by Intel for a non-overclocked processor under some "real life scenario".
The 210W figure at 4.7 GHz all core is power consumption.
Yeah, 95W and 3.6 GHz is totally realistic for a stock 9900K running stock BIOS settings. You have to disable turbo boost in order to get the 95W limit working all the time.

Cause the 9900K is soldered, like every 9th gen K chip, while neither the 9700K nor the 8086K are.
The 9700K is soldered; 8th gen is not. 9th gen locked CPUs are not soldered, and the 9600K may depend on the revision, not sure.
 
Hello, Intel Defender. AMD's TDP is much closer to spec than Intel's. The 2700X, for example, is a 105W TDP chip but can exceed 140W once (unlimited) boost kicks in.

As an AMD fan I can defend him: AMD is just as guilty in this. Linus and Gamers Nexus have videos on this.
 
Let me be very clear: I don't care if it's Intel or AMD. If a processor is stated as a 95W product, I expect that to be delivered and not exceeded. If the boost exceeds it, then the product is falsely advertised. I don't mind if they give the TDP with its boost clocks, even if it's 150W, as long as consumers know what they are getting into, and right now both companies are nowhere near innocent.
 
Let me be very clear: I don't care if it's Intel or AMD. If a processor is stated as a 95W product, I expect that to be delivered and not exceeded. If the boost exceeds it, then the product is falsely advertised. I don't mind if they give the TDP with its boost clocks, even if it's 150W, as long as consumers know what they are getting into, and right now both companies are nowhere near innocent.
I agree with that; however, Intel DOES HAVE a turbo TDP, they just don't state it. With AMD it's variable, because the boost is variable, while Intel CPUs follow a fixed boost table. That's why AMD can't really state a boost TDP: there isn't one. However, for EXPECTED USE, Intel's power consumption is much more than 95W, while AMD's 3950X under expected loads stays much closer to its TDP rating. Oh, and let's not forget how many times Intel has changed its definition of TDP.
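For illustration, that fixed Intel boost table is essentially a lookup from active core count to a clock ceiling. A minimal Python sketch: the 5.0 GHz two-core and 4.7 GHz all-core entries match figures cited elsewhere in this thread for the 9900K, while the intermediate steps are assumptions, not official values:

# Toy Intel-style turbo ratio table: active core count -> clock ceiling (GHz).
# 5.0 GHz (1-2 cores) and 4.7 GHz (all-core) match figures cited in this thread
# for the 9900K; the 3-6 core steps are assumed for illustration only.
TURBO_TABLE_GHZ = {1: 5.0, 2: 5.0, 3: 4.8, 4: 4.8, 5: 4.8, 6: 4.8, 7: 4.7, 8: 4.7}

def turbo_ceiling_ghz(active_cores: int) -> float:
    """Return the table-defined boost ceiling for the given number of active cores."""
    return TURBO_TABLE_GHZ[min(max(active_cores, 1), 8)]

for n in (1, 4, 8):
    print(n, "active core(s): up to", turbo_ceiling_ghz(n), "GHz")

AMD's boost, by contrast, is opportunistic (temperature, current, and power headroom), so there is no fixed table like this to publish a "boost TDP" from.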
 
Pick one set of standards for every brand, will you?
I expected those Intel temps from a D14 or D15, not an NH-U12S. The NH-U12S isn't the world's strongest cooler that could cool a 9900K at turbo at 57C. If you can at least keep the core speeds constant, it would be more consistent. Perhaps drop the 3.6/5.0 figure, because that sounds like turbo was used; disable turbo and run 3.6 GHz constantly, at a constant voltage. That is a true test. And obviously lower temps are better, HOWEVER that only applies within the same specific CPU. A 9700K at 75C isn't better than a 3900X at 80C, because then you need to look at leakage, and leakage can vary from CPU to CPU. That can affect power consumption pretty substantially; the extra heat alone can potentially make the CPU draw another 5-10W (going from, say, 60C to 80C on a high-core-count CPU).
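As a rough sanity check on that leakage point: leakage power grows roughly exponentially with die temperature. A toy Python model, where the 10W baseline and the 25C doubling interval are made-up illustrative values, not measurements of any real chip:

# Toy leakage model: leakage power grows roughly exponentially with die temperature.
# The 10W baseline at 60C and the 25C doubling interval are illustrative
# assumptions, not measured values for any specific CPU.
def leakage_watts(temp_c, base_w=10.0, ref_c=60.0, doubling_c=25.0):
    return base_w * 2 ** ((temp_c - ref_c) / doubling_c)

extra = leakage_watts(80.0) - leakage_watts(60.0)
print(f"Extra leakage going from 60C to 80C: {extra:.1f} W")  # ~7.4 W with these values

With these assumed constants the 60C-to-80C jump adds about 7W, which lands in the 5-10W ballpark the post mentions.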
 
I expected those Intel temps from a D14 or D15, not an NH-U12S. The NH-U12S isn't the world's strongest cooler that could cool a 9900K at turbo at 57C. If you can at least keep the core speeds constant, it would be more consistent. Perhaps drop the 3.6/5.0 figure, because that sounds like turbo was used; disable turbo and run 3.6 GHz constantly, at a constant voltage. That is a true test. And obviously lower temps are better, HOWEVER that only applies within the same specific CPU. A 9700K at 75C isn't better than a 3900X at 80C, because then you need to look at leakage, and leakage can vary from CPU to CPU. That can affect power consumption pretty substantially; the extra heat alone can potentially make the CPU draw another 5-10W (going from, say, 60C to 80C on a high-core-count CPU).
I prefer to trust TPU in matters like these.

If they can't get temp testing right, what are we even talking about?
The difference between the 9700K and the 3900X is 26 degrees, not 5 degrees.
It will affect noise levels substantially.
 
they ALSO won't mention the TDP that is in place when running all that turbo stuff. 210W, versus AMD's 105W all-the-time TDP. For Intel, turbo mode has a different power limit that they never mention in public docs; the only way to truly find out is through XTU, and it shows how much more power Intel CPUs take for those performance numbers.
Ryzen 3000 105W CPUs have a 142W power limit in usual circumstances. 65W CPU models will consume up to 88W.

Intel's K models have motherboard-specific "optimized" settings that generally mean disabled power limits. Non-K models generally perform as per spec: a 25% increased power limit for 8 seconds. For example, a 65W Intel non-K CPU will do 81W for 8 seconds at load.

Any Intel 14nm CPU will lose out to AMD's 7nm CPUs in all-core loads, especially at 6+ cores. Frequencies in these circumstances will not favor Intel CPUs any more; not because they are not capable of it, but because they hit power limits.
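Putting the numbers from this post together, here is a minimal Python sketch of the two stock schemes. The 1.35x AMD package-power multiplier is inferred from the 142W/88W figures above, the 1.25x/8-second Intel behavior is as stated in the post, and real motherboards routinely override all of it:

# Sketch of stock power-limit behavior as described above (board vendors often
# override these). AMD Ryzen 3000: package power limit (PPT) ~= 1.35 x TDP.
# Intel non-K at spec: PL2 = 1.25 x PL1 for a short tau window, then back to PL1.

def amd_ppt_watts(tdp_w):
    """Stock package power limit for a Ryzen 3000 part, per the 142W/88W figures above."""
    return round(tdp_w * 1.35)

def intel_limit_watts(tdp_w, seconds_at_load, tau_s=8.0):
    """Spec-ish Intel behavior: PL2 within the tau window, PL1 (= TDP) afterwards."""
    pl1, pl2 = tdp_w, tdp_w * 1.25
    return pl2 if seconds_at_load <= tau_s else pl1

print(amd_ppt_watts(105))          # 142 -> the 105W Ryzen figure above
print(amd_ppt_watts(65))           # 88  -> the 65W Ryzen figure above
print(intel_limit_watts(65, 5))    # 81.25 -> ~81W during the first 8 seconds
print(intel_limit_watts(65, 30))   # 65    -> falls back to TDP once tau expires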
 
I prefer to trust TPU in matters like these.

If they can't get temp testing right, what are we even talking about?
The difference between the 9700K and the 3900X is 26 degrees, not 5 degrees.
It will affect noise levels substantially.

The 3rd gen temp numbers are slightly misleading. Average load temperatures are much lower than the instantaneous peak temps. I was super nervous until I looked at what the rolling load temp was.
 
Yeah, 95W and 3.6 GHz is totally realistic for a stock 9900K running stock BIOS settings. You have to disable turbo boost in order to get the 95W limit working all the time.
Once again: TDP is a figure connected to heat dissipation, not power consumption. If you don't know this by now, there's no way I can convince you.
 
Once again: TDP is a figure connected to heat dissipation, not power consumption. If you don't know this by now, there's no way I can convince you.
That is a technicality and a matter of nitpicking on terminology. In practice, sticking to the TDP would mean sticking to the power limit, as the vast majority of the power consumed by a CPU is converted to heat. That is the entire point of being angry with Intel's (and current AMD's) idea of TDP: the stated amount does not match the actual heat output of the CPU.
 
Don't Ryzen CPUs report their temps differently than Intel chips? I seem to recall Zen incorporating an array of temp sensors across the CPU as part of Precision Boost.
 
That is a technicality and a matter of nitpicking on terminology. In practice, sticking to the TDP would mean sticking to the power limit, as the vast majority of the power consumed by a CPU is converted to heat. That is the entire point of being angry with Intel's (and current AMD's) idea of TDP: the stated amount does not match the actual heat output of the CPU.
No, and this is a very common misunderstanding.
But you're right about one thing: pretty much all of the energy consumed is emitted as heat.

So, you have a consumer CPU with TDP=65W. You run it as intended. It stays near base clocks most of the time and boosts for a few seconds from time to time. During that boost it consumes 200W.
So, what cooler do you need? A 95W one. Someone tested it with a 95W cooler and it was fine.
What PSU do you need? A 200W one (just for the CPU).
That's the difference.

You should not look at it as if Intel or AMD deceived you. They've sold you a CPU with a 65W or 95W TDP. You can buy a cooler based on that TDP.
But they've also given you a bonus (not a lie!). Because they've made their CPUs so rapid and flexible, they can boost instantly for a short time when you need it in typical consumer-ish PC usage: to load a website, open a file, apply an effect in a photo editor, etc. It's so short that the extra heat produced is tiny and your 95W cooler won't explode.
And if, during that short boost, your PSU can provide just 150W, not 200W? It won't explode either. The CPU knows when to stop pushing. It's all thought out really well.

We test consumer CPUs by running hours of 100% load benchmarks, which is not how these CPUs are used in real life. Of course that's how we learn their performance limits (which is good), but the resulting average power consumption figures are unrealistic.
This is exactly why workstation/server CPUs turn out (in similar tests) to be very conservative when it comes to TDP: their purpose is exactly to run at 100% all the time. That's how their TDP was calculated.

And now, moving on to the 7nm Zen 2 issue, which I really can't pass over.
The coolers we have today were tested on CPUs available before 7nm arrived.
It turns out that these CPUs, despite consuming under 150W, are so tightly packed that the heat concentration is much higher than anything we've seen earlier. That's why the 3900X and 3950X run so hot.
And that's why AMD recommends pairing a CPU with TDP=105W with coolers whose TDP ratings are 2-3 times larger.
So suddenly the TDP stops making any sense at all. It's lower than what these CPUs actually pull (140-150W) and has absolutely no meaning when it comes to choosing a sufficient cooler.

When Intel joins in with desktop 10nm CPUs, the whole TDP rating will have to be adjusted. The Dark Rock 4 will not be a 200W cooler anymore; it'll be called a 100W cooler, maybe less.
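The transient-boost argument above can be sanity-checked with a single-node thermal model. A minimal Python sketch, where the thermal resistance and heat capacity are round made-up numbers, not measurements of any real cooler:

# Single-node thermal model of CPU + cooler: dT/dt = (P - (T - T_amb) / R) / C.
# R (thermal resistance, C/W) and C (heat capacity, J/C) are illustrative
# assumptions chosen so a 95W sustained load sits near a typical load temp.
AMBIENT_C = 25.0
R_C_PER_W = 0.45
C_J_PER_C = 120.0

def final_temp_c(power_profile_w, dt_s=0.1):
    temp = AMBIENT_C + 95 * R_C_PER_W  # start at the 95W steady state (~67.8C)
    for p in power_profile_w:
        temp += dt_s * (p - (temp - AMBIENT_C) / R_C_PER_W) / C_J_PER_C
    return temp

sustained_95w = [95] * 600              # 60 seconds of sustained 95W load
short_boost   = [200] * 20 + [95] * 20  # 2-second spike to 200W, then back to 95W
print(f"60s at 95W: {final_temp_c(sustained_95w):.1f} C")
print(f"2s at 200W: {final_temp_c(short_boost):.1f} C")

With these assumed constants, the two-second 200W spike raises the die under 2C above its 95W steady state, which is exactly the post's point: short boosts are a cooler-sizing non-issue, while sustained all-core load is a different story.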
 
lol, another TDP debate.
Why doesn't anyone pay attention to what actually matters?

[attached chart: cpu-temperature.png]



K-series will be enthusiast-only; I like the mainstream ones, though.
4.5+ GHz out of the box, HT on every chip. Fast, cool and quiet.
Load temps don’t really matter though. CPU stability is stability.

Love seeing people argue about TDP.

It's easy:

Intel TDP = base clock
AMD TDP = all-core boost

Simple.

AMD's TDP is based on P0, which is the max clock without boost, per AMD.

here is a link confirming that information

 
It looks acceptable. With good pricing this will make Intel competitive in the mass "up to 8 cores" market for another year. That's all they can hope for until 7/10nm arrive.
250W+ mainstream parts incoming. Yeah. Very "acceptable" indeed. My 3800X does not exceed 140W (according to HWiNFO) even when overclocked to 4.5 GHz. Comet Lake will probably double that for a 5 GHz+ overclock on the 10c/20t part and will lose to Ryzen 4000 in performance regardless.

Ah, and don't forget to buy your new motherboards too...
 
Load temps don’t really matter though. CPU stability is stability.



AMD's TDP is based on P0, which is the max clock without boost, per AMD.

here is a link confirming that information


How dare you bring manufacturer data into this discussion...
 
A 10-core 9900K running at 5.2 GHz+? With an iGPU? (Accelerated video rendering.) Yes please. Sucks that I'll have to change out the mobo, though.
 
No, and this is a very common misunderstanding.
But you're right about one thing: pretty much all of the energy consumed is emitted as heat.

So, you have a consumer CPU with TDP=65W. You run it as intended. It stays near base clocks most of the time and boosts for a few seconds from time to time. During that boost it consumes 200W.
So, what cooler do you need? A 95W one. Someone tested it with a 95W cooler and it was fine.
What PSU do you need? A 200W one (just for the CPU).
That's the difference.

You should not look at it as if Intel or AMD deceived you. They've sold you a CPU with a 65W or 95W TDP. You can buy a cooler based on that TDP.
But they've also given you a bonus (not a lie!). Because they've made their CPUs so rapid and flexible, they can boost instantly for a short time when you need it in typical consumer-ish PC usage: to load a website, open a file, apply an effect in a photo editor, etc. It's so short that the extra heat produced is tiny and your 95W cooler won't explode.
And if, during that short boost, your PSU can provide just 150W, not 200W? It won't explode either. The CPU knows when to stop pushing. It's all thought out really well.

We test consumer CPUs by running hours of 100% load benchmarks, which is not how these CPUs are used in real life. Of course that's how we learn their performance limits (which is good), but the resulting average power consumption figures are unrealistic.
This is exactly why workstation/server CPUs turn out (in similar tests) to be very conservative when it comes to TDP: their purpose is exactly to run at 100% all the time. That's how their TDP was calculated.

And now, moving on to the 7nm Zen 2 issue, which I really can't pass over.
The coolers we have today were tested on CPUs available before 7nm arrived.
It turns out that these CPUs, despite consuming under 150W, are so tightly packed that the heat concentration is much higher than anything we've seen earlier. That's why the 3900X and 3950X run so hot.
And that's why AMD recommends pairing a CPU with TDP=105W with coolers whose TDP ratings are 2-3 times larger.
So suddenly the TDP stops making any sense at all. It's lower than what these CPUs actually pull (140-150W) and has absolutely no meaning when it comes to choosing a sufficient cooler.

When Intel joins in with desktop 10nm CPUs, the whole TDP rating will have to be adjusted. The Dark Rock 4 will not be a 200W cooler anymore; it'll be called a 100W cooler, maybe less.
So if I understand this correctly, a 95W CPU has more cooling efficiency than a 65W chip? If that is the case, let's compare the temps of the 9900KS vs the 9900K, since one has a 127W TDP and the other 95W. The 9900KS should theoretically offer lower temps, correct?

A 10-core 9900K running at 5.2 GHz+? With an iGPU? (Accelerated video rendering.) Yes please. Sucks that I'll have to change out the mobo, though.
I don't think the 10-core chips were supposed to come with iGPUs; I believe those were all F/KF CPUs, but I may be wrong. Then again, if you need 10 cores, you're probably going to get yourself a GPU anyway.
 
So if I understand this correctly, a 95W CPU has more cooling efficiency than a 65W chip? If that is the case, let's compare the temps of the 9900KS vs the 9900K, since one has a 127W TDP and the other 95W. The 9900KS should theoretically offer lower temps, correct?


I don't think the 10-core chips were supposed to come with iGPUs; I believe those were all F/KF CPUs, but I may be wrong. Then again, if you need 10 cores, you're probably going to get yourself a GPU anyway.
Think 5775...

Either way, TDP is a metric used by the manufacturers for coolers. It's not a viable criterion for choosing a CPU.
The Intel CPU is a potato.
 
lol, another TDP debate.

GamersNexus already explained both Intel's and AMD's TDP.
They are either calculated from the base clock or from cooler thermal resistance.
They are both OFF.
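For reference, the cooler-thermal-resistance version works out like this. This is AMD's definition as reported in that GamersNexus coverage, with the example numbers quoted there for a 105W part (cited here from memory):

% AMD's TDP definition per the GamersNexus coverage: a target case temperature,
% an assumed ambient, and the thermal resistance (theta_ca, in C/W) of a
% reference cooler. Note that no electrical power term appears anywhere.
\[
  \mathrm{TDP} = \frac{T_{\mathrm{case}} - T_{\mathrm{ambient}}}{\theta_{ca}},
  \qquad
  \frac{61.8\,^{\circ}\mathrm{C} - 42\,^{\circ}\mathrm{C}}{0.189\,^{\circ}\mathrm{C/W}} \approx 105\,\mathrm{W}.
\]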
 