
Intel lying about their CPUs' TDP: who's not surprised?

To anyone who might want something to play with...

Intel has a little tool called Power Gadget that shows how much power a CPU is pulling. It'll show my 2680v2s at 95-100W under full load and my 4790K at around 60W in normal use. Maybe some of you guys can give it a try on the 10 series.
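For anyone on Linux without Power Gadget, here's a rough sketch of reading the package energy counter through the kernel's RAPL powercap interface instead. It assumes /sys/class/powercap/intel-rapl:0/energy_uj exists on your system (usually needs root to read), so treat it as a starting point rather than a finished tool:

# Rough sketch: estimate CPU package power from the RAPL powercap
# interface on Linux. Assumes the intel-rapl:0 (package) domain is exposed.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package domain

def read_energy_uj():
    with open(RAPL_ENERGY) as f:
        return int(f.read())

def package_power_watts(interval=1.0):
    # Average package power over `interval` seconds.
    e0 = read_energy_uj()
    time.sleep(interval)
    e1 = read_energy_uj()
    delta_uj = e1 - e0  # the counter wraps eventually; this sketch ignores that
    return delta_uj / 1e6 / interval  # microjoules -> joules -> watts

if __name__ == "__main__":
    while True:
        print(f"Package power: {package_power_watts():.1f} W")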
 
Doing power consumption comparisons at max turbo all the time is about as logical as testing a car's fuel efficiency at full throttle and max speed for an extended period of time. Unless you are a track racer, that's meaningless.

TDP is for determining *average* minimum heat dissipation in stock form. Max power consumption under turbo is by default something that only lasts 5-8 seconds, and then drops so that it can maintain that TDP average.

If these sites were interested in coming up with a useful power metric, they would use some standard benchmark representing a typical workload to measure overall power consumption in the real world, and in the real world most PCs are sitting around under 5% CPU usage. For cars, there's the EPA, which is why no one would get away with measuring MPG on a race track. We don't have anyone defining that in this space, so people get to make these idiotic hyperbolic arguments.
 
The worst part is I know Intel fanboys who rabidly defend these stats and say it's lies
I disagree. The worst part is pi$$-poor journalism misrepresenting the facts with falsehoods, and the opposing fanboys who rapidly pile on to defend the article and its falsehoods, then use that to attack the competition without even doing any fact checking to see if the article is biased or factual!

Note where the article says (my bold underline added),
Extreme Tech said:
On paper, an Intel CPU’s TDP is the maximum power consumed under a sustained workload at base frequency.

Anybody can see in seconds that is false! That is NOT how Intel defines TDP! Using the same CPU as the article did, and as seen in the ARK entry for the Core i7-10700K, if you hover over TDP to see Intel's definition, it clearly says (again, my bold underline added),
Intel said:
Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload.

Come on everyone! Power "consumed" does NOT and never has equaled power "dissipated"! Nothing made by Man is 100% efficient! The CPU would not generate any heat if it was. Nor does maximum equal average. :(

It is pretty clear the purpose of that article is simply to launch another bashing session against the big bad Intel even though AMD's published TDP specs are vague and deceptive too!

The fault does NOT lie with Intel or AMD, but with the entire processor industry - which includes VIA, NVIDIA, Qualcomm, Motorola and others. The industry needs to get together and come up with an industry standard for these terms and for how such values are measured and published - in a similar way they all came together years ago to create the ATX form factor standard.
 
@RandallFlagg, "typical" is very difficult to pin down. Intel's big numbers are usually what you get with Prime95. Even heavier productivity workloads trail by a considerable margin. Anything lower than that - desktop usage scenarios, gaming - will probably fit within TDP anyway, and will vary by a large margin.

Come on everyone! Power "consumed" does NOT and never has equaled power "dissipated"!
What percentage of the power that goes into a CPU is used for anything other than radiating heat?
Btw, how this works in reality is just the opposite of what you said. CPUs are incredibly inefficient - they use a small amount of power to do useful work (i.e. compute stuff) and the rest is wasted as heat.
 
I know this was a joke but it actually seems to be the other way around. When idle, just showing the desktop and running the few background processes I have, the i5 was at 6W but the R5 is at 30W. Thankfully the B550 board I have is a bit more efficient than my Z370 board (and the B450 I had previously), so the overall difference for the entire computer is ~15W.

Ryzen's IO die seems to consume a good 10-15W, and this has a considerable effect at idle.


For CPUs today? Unfortunately yes.
In other contexts, it is a perfectly valid engineering concept. Thermal Design Power should indicate the maximum amount of heat a component needs to dissipate so that cooling can be designed properly.
I was looking at power draw at the wall. They are pretty similar. Not bad for 6 vs 4 cores. The quad was getting a buttload of voltage too. Z77 vs B550...
 
TDP was also never meant, and still isn't meant, to be a measure of power consumption. It is a measure of thermal output, to determine heatsink size.
And I did say engineers did use it correctly back in the day, didn't I, and that marketing has confused its use dramatically.
So we agree then, yes or no?!

Looking back, I might not have expressed my point adequately before :).
 
What percentage of the power that goes into a CPU is used for anything other than radiating heat?
That's the problem, isn't it? There is no industry standard dictating how such values are to be determined. It is not like a motor, for example, where you can accurately measure the power consumed and compare it to the turning power of the spinning motor.

It is not like a power supply, where you can measure the voltage and current at the wall and compare it to the output voltage and current.

How do you accurately measure CPU output power and then compare that to the amount consumed, then use that to compare against competing processors, AND THEN use that data to determine which processor can do more "work" in a given amount of time?
Btw, how this works in reality is just the opposite of what you said. CPUs are incredibly inefficient - they use a small amount of power to do useful work (i.e. compute stuff) and the rest is wasted as heat.
:( NO!!!!!!! I NEVER said anything of the sort! I was pretty clear that CPUs generate heat - that clearly means they are inefficient. I specifically said nothing man-made is 100% efficient. That includes CPUs.
 
:( NO!!!!!!! I NEVER said anything of the sort! I was pretty clear that CPUs generate heat - that clearly means they are inefficient. I specifically said nothing man-made is 100% efficient. That includes CPUs.
Sorry, I misunderstood what you meant :)
 
Doing power consumption comparisons at max turbo all the time is about as logical as testing a car's fuel efficiency at full throttle and max speed for an extended period of time. Unless you are a track racer, that's meaningless.

TDP is for determining *average* minimum heat dissipation in stock form. Max power consumption under turbo is by default something that only lasts 5-8 seconds, and then drops so that it can maintain that TDP average.

If these sites were interested in coming up with a useful power metric, they would use some standard benchmark representing a typical workload to measure overall power consumption in the real world, and in the real world most PCs are sitting around under 5% CPU usage. For cars, there's the EPA, which is why no one would get away with measuring MPG on a race track. We don't have anyone defining that in this space, so people get to make these idiotic hyperbolic arguments.
Good point. That would explain why the TechPowerUp gaming consumption figures (average, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 is only showing 8 watts higher than the 5600X when they're both at full clocks/strength, yet if you only measured the short spikes the 10700 would be way higher.
 
That's the problem, isn't it? There is no industry standard dictating how such values are to be determined. It is not like a motor, for example, where you can accurately measure the power consumed and compare it to the turning power of the spinning motor.
Physics dictates the power consumed must go somewhere. In an IC, there really are not many places for the energy to go but heat. It might give off some minor RF radiation (hopefully not), but even indirectly all the other conversion chains end up in heat. Anything else is a very minor fraction of a percent, if even that. For our purposes - the same power that goes into a CPU will come out as heat.
Good point. That would explain why the TechPowerUp gaming consumption figures (average, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 at max turbo is only showing 8 watts higher than the 5600X when they're both at full clocks/strength.
Gaming is not a heavy load. Even games that we consider to properly load CPU cores and threads are not using large parts of the actual CPU die. What's more, when it comes to Intel CPUs, not using AVX2 (which almost no games use) will bring power consumption down by a lot.
 
For a chip there is no real difference, is there? Practically all the power that goes in comes out as heat.

From a thermal solution standpoint there is. The peak power can be way above the rated TDP and the cooler can still handle it in bursts. Intel has been taking advantage of this since Turbo Boost was invented. It has never been an absolute limit on power consumption. It is also why turbo was, until recent generations, governed completely by temperature. Now it is governed by power and temperature to keep things at least somewhat reasonable. Thermal solutions don't really care about peaks in heat output, they just absorb them and keep going. However, if you have a chip that says it is going to output 95W and it constantly outputs 125W, and you put a heatsink designed for 95W on that CPU, then it is going to have thermal problems. But TDP is a rating for heatsinks; it is basically saying that if you want the advertised performance out of this CPU, your heatsink had better be able to handle this much heat.

For example, my 8700K will boost to 4.6GHz when under full load (Cinebench) and consume almost 140W. This is the default behavior of the Z390 motherboard I have it in. The motherboard decides the power limit, because the motherboard manufacturer knows what their board is capable of delivering and for how long. In that Z390 motherboard, that 140W only lasts for about 60 seconds before it starts to dial back, as long as the CPU cooler can keep up (which mine has no problem doing). By the end of a Cinebench run the CPU is running at 4.4GHz and the power consumption is back down below 100W. However, if I take that same 8700K and put it in the B365 motherboard that I have, it never goes over 100W and 4.3GHz at full load. But if at any point I put a heatsink on the CPU that can't handle that higher heat output, it will detect the higher temperatures and throttle back to 95W or less if needed.

But the entire point I'm trying to make is that the TDP was never an absolute power limit on the CPU, and there is no guarantee that the CPU won't consume more than that, and this goes back to Nehalem.
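To make that burst-then-settle behavior concrete, here is a rough toy simulation of the power-limit scheme (a short-term limit, a long-term limit and a time window) that governs it. The PL1/PL2/TAU numbers below are placeholders I picked for illustration; the real values come from the CPU defaults and the motherboard firmware, and real hardware regulates more smoothly than this:

# Toy simulation of the burst-then-settle turbo behavior described above.
PL1 = 95.0    # long-term power limit, roughly the rated TDP (watts)
PL2 = 140.0   # short-term turbo power limit (watts)
TAU = 28.0    # time constant of the running power average (seconds)
DT = 1.0      # simulation step (seconds)

def allowed_power(avg_power):
    # The CPU may draw up to PL2 while the running average stays under PL1.
    return PL2 if avg_power < PL1 else PL1

avg = 0.0  # exponentially weighted moving average of package power
for second in range(120):
    demand = 140.0                      # heavy all-core load wants full turbo
    draw = min(demand, allowed_power(avg))
    alpha = DT / TAU                    # older samples decay with TAU
    avg = (1 - alpha) * avg + alpha * draw
    if second % 10 == 0:
        print(f"t={second:3d}s  draw={draw:6.1f} W  avg={avg:6.1f} W")

Run it and you see the draw sit at the short-term limit for roughly half a minute, then fall back so the running average holds around the long-term (TDP-ish) figure, which is exactly the Cinebench behavior described above.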
 
Turbo was not governed completely by temperature. There have been power limits in place for a long while. Power limits simply were not hit or were not hit in a significant way.
Stock 8700K will not boost to 4.6GHz on all cores, not even with the fudged power settings. Frequency table is 4.3GHz for max all-core turbo. If yours does, it's MCE or equivalent in motherboard BIOS.
 
Good point. That would explain why the TechPowerUp gaming consumption figures (average, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 is only showing 8 watts higher than the 5600X when they're both at full clocks/strength, yet if you only measured the short spikes the 10700 would be way higher.

I've run Windows perfmon for several different days myself, with a 5-second resolution. Example below.

100% usage for >=5s is not a scenario for me. I know 100% sometimes happens during things like file decompression, but it doesn't last long enough to show up here. I'm sure some will come in talking about encoding or some such, but that's red herring crap IMO; it's like someone talking about how often they do tractor pulls with their Hyundai.

The big spikes at the start of the day are from running a VM, which is not a typical workload. At the end of the day, that's gaming. Note it never goes much over 50% on any core for >5s. The rest of the time, while working and doing normal stuff like browsing / listening to iTunes / YouTube and so on, it's pretty damn near zero.


(perfmon screenshot attached)
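If anyone wants to collect the same kind of trace without perfmon, here's a rough sketch that logs per-core usage at a 5-second resolution to a CSV. It assumes the third-party psutil package is installed, and the file name is just a placeholder:

# Rough sketch: log per-core CPU usage every 5 seconds, similar to the
# perfmon trace above. Requires the psutil package.
import csv
import time

import psutil

with open("cpu_usage_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp"] + [f"core{i}" for i in range(psutil.cpu_count())])
    while True:
        # cpu_percent(interval=5) blocks for 5 s and returns the average
        # utilization of each logical core over that window.
        per_core = psutil.cpu_percent(interval=5, percpu=True)
        writer.writerow([time.strftime("%H:%M:%S")] + per_core)
        f.flush()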
 
For our purposes - the same power that goes into a CPU will come out as heat.
Oh bullfeathers! That is NOT true at all. Also not true is you speaking for "our" purposes.

You are essentially dismissing all the "work" a CPU does. That's just silly and does NOT accurately reflect the laws of physics you call upon to justify your claims.

A 65W CPU today does a heck of a lot more "work" in the same amount of time while consuming significantly less energy than a 65W CPU from years past. That would be impossible if what you claimed was true.

Practically all the power that goes in comes out as heat.
So what? That is NOT the point - despite how much you want it to be. You keep dismissing, ignoring, or don't understand (I don't know which) the most important point and that is the amount of work being done with the amount of energy that is NOT going up in heat.

"Machine 1" consumes 100W of energy per minute and gives off 95W in the form of heat. It moves 10 buckets of water 10 feet in that minute.

"Machine 2" consumes 100W of energy per minute and gives off 95W in the form of heat. But it moves 20 buckets of water 10 feet in that minute.

See the difference? That's what matters for "our" purposes.
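Put as a back-of-the-envelope calculation, using the same purely illustrative numbers as the bucket example above:

# Same illustrative numbers as the bucket example: identical power draw and
# heat output, but twice the useful work, so twice the efficiency.
power_in = 100.0   # watts drawn by each machine
heat_out = 95.0    # watts given off as heat by each machine

buckets_per_min_1 = 10
buckets_per_min_2 = 20

print(f"Heat to dissipate: {heat_out} W for both machines")
print("Machine 1:", buckets_per_min_1 / power_in, "buckets per minute per watt")
print("Machine 2:", buckets_per_min_2 / power_in, "buckets per minute per watt")
# 0.1 vs 0.2 -- the heat load a cooler sees is the same, but the work done
# per unit of energy is doubled.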
 
Oh bullfeathers! That is NOT true at all. Also not true is you speaking for "our" purposes.
You are essentially dismissing all the "work" a CPU does. That's just silly and does NOT accurately reflect the laws of physics you call upon to justify your claims.
OK, let me go back to definitions. Work as in what happens inside a CPU. Transistors switch, electrons move and all that stuff. CPU performance does not really come into play at this stage. It could be an arbitrary number of transistors switching back and forth (well, ideally staggered switching to get even remotely steady consumption over time).

We were talking about TDP, power consumption and resulting heat output, no?

Edit: CPU performance does not play a part in how ICs use power. Unless you are saying that higher CPU performance will result in consumed power going to something other than heat. I would really like to see a source, or at least the reasoning, for that.

A 65W CPU today does a heck of a lot more "work" in the same amount of time while consuming significantly less energy than a 65W CPU from years past. That would be impossible if what you claimed was true.
Split this into a separate quote. The major factor for this is evolution towards smaller manufacturing processes, making transistors smaller and more efficient (less energy to switch).
If you want to nitpick then yes, this is very simplified and does not account for many other factors. The first things that come to mind are the voltages used, along with their efficiency curves, and potential architectural efficiency gains.
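For a rough sense of why smaller transistors and lower voltages dominate, here is the textbook dynamic-power relation (P ≈ C·V²·f) with completely made-up numbers; they are not measurements of any real chip, just an illustration of the scaling:

# Textbook dynamic switching power: P ~ C_eff * V^2 * f.
# Capacitance, voltage and frequency values are invented for illustration.
def dynamic_power(c_eff_nf, voltage, freq_ghz):
    # effective switched capacitance (nF) * volts^2 * clock (Hz) -> watts
    return (c_eff_nf * 1e-9) * voltage**2 * (freq_ghz * 1e9)

older_node = dynamic_power(c_eff_nf=15.0, voltage=1.30, freq_ghz=3.5)
newer_node = dynamic_power(c_eff_nf=9.0, voltage=1.05, freq_ghz=3.5)  # smaller, lower-voltage transistors
print(f"older node: {older_node:.0f} W    newer node: {newer_node:.0f} W")

Same clock speed, but the smaller switched capacitance and the squared voltage term cut the dynamic power dramatically, which is why a modern 65W part does so much more per watt than an old one.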
 
CPU performance does not really come into play at this stage.
Of course it does. Performance determines how much "work" can be accomplished in a given amount of time with a given amount of energy.
For all intents and purposes it could be an arbitrary number of transistors switching back and forth
What??? Do you think those gates are just flipping and flopping back and forth for fun or for no reason? NOOOOO! They are doing "work"! Crunching numbers. Processing data.

I go back to my previous statement. You keep dismissing, ignoring, or just plain don't understand that the amount of work being accomplished cannot just summarily be omitted from the equation when determining a processor's (or any machine's) efficiency. Work must be factored in too!

For the purpose of this thread, in relation to Intel's definition of TDP, that value is used to determine how much cooling is required. It is NOT meant as a means to compare that Intel CPU to an AMD CPU. That's why if you go to that Intel CPU's ARK again (see here) and click on the "?" next to TDP, you will see where it directs readers to the datasheet for "thermal solution requirements". It does not mention efficiency or work accomplished. Workload, yes, but that is not the same as work accomplished.
 
OK, my statement that I stand by is that power going into an IC will come out as heat.
This was a response to what you said above:
Power "consumed" does NOT and never has equaled power "dissipated"!
 
OK, my statement that I stand by is that power going into an IC will come out as heat.
It is still wrong, or at least incomplete. Why? Because some of the power going in is being consumed to do work (flip gates, crunch numbers, etc.) too.

I don't understand why you can't or refuse to see that.

It is like an incandescent light bulb. No argument (at least I hope not) that "most" of the energy consumed is being converted into heat and not light. But it is still an indisputable fact that some (even if a small amount) of the energy being consumed is indeed, being used for "work", or in this case, to create light.
 
TDP was also never meant, and still isn't meant, to be a measure of power consumption. It is a measure of thermal output, to determine heatsink size.

Bingo... but how do you review something with a random heatsink rated at exactly the TDP they put in the specs?

It will either not perform optimally, or it will brutally exceed TDP. Usually the CPUs do the latter and then start doing the former. Yoyo'ing to keep up, and if you remove the lock on it, they go all over the place. What used to be a simple vcore adjustment is now a whole range of tricks to keep thermal headroom and still extract some semblance of an OC.

In the end it's just the same thing. Power = heat.

It is still wrong, or at least incomplete. Why? Because some of the power going in is being consumed to do work (flip gates, crunch numbers, etc.) too.

I don't understand why you can't or refuse to see that.

It is like an incandescent light bulb. No argument (at least I hope not) that "most" of the energy consumed is being converted into heat and not light. But it is still an indisputable fact that some (even if a small amount) of the energy being consumed is indeed, being used for "work", or in this case, to create light.

Yes, and then we touch upon the issue of 'efficiency'. Intel has, over the past generations, constantly nudged its processors to clock higher 'when they can' which is an efficiency killer, and a heatwave guarantee. The why behind that is only to look good on spec sheets and in reviews with optimal conditions, while the quality of life of using such a CPU has steadily gone to the shitter. Aggressive temperature cycling doesn't really prolong the lifetime of any component in a system either.

That's a steep price for 5 Gigahurtz to look good. And that is why the TDP as it is being used now is a complete lie, when combined with the specs they show us. If you don't read the Intel Bible on Turbo states that is.

But if you think this through... the work being moved is irrelevant in a discussion about TDP. Performance per watt does not relate to output temperature. The only thing that relates to temps is the actual power going in. After all, in a comparison you're looking at an infinite amount of work. No matter how much it moves, you will need all the power it can parse through, and this will result in the same temperatures.
 

Lying about power consumption numbers to make your products look good is just despicable.
Thank God I have a 4690k which means I don't have to deal with this mess.
Nowhere in that article was the word "lying" ever used. Anybody who buys into K processors with 4+ cores should already know what they have to cool and not be surprised at temp spikes to 99C under inadequate heat dissipation. Overclockers have known this for a decade now. If you intended to cast a blanket net, you've missed quite a few other fish.
 
The why behind that is only to look good on spec sheets and in reviews with optimal conditions, while the quality of life of using such a CPU has steadily gone to the shitter. Aggressive temperature cycling doesn't really prolong the lifetime of any component in a system either.
Quality of life? I have seen nothing to suggest Intels have a shorter life expectancy than AMDs. Got a link?

And of course Intel wants their CPUs to look good. That's called marketing. It's why Truck Maker A claims their truck is #1 because it gets better gas mileage and Truck Maker B claims theirs is #1 because it can pull more weight and why Truck Maker C claims theirs is #1 because it has more horsepower - and they all are right!

Aggressive temperature cycling? What does that even mean? EVERY CPU can and does go from cold (ambient) when idle to full temperature when pushed in just a few clock cycles, and then back to cold again just as quickly when the load drops back to idle. Temperature cycling is dependent on the load and cooling.
 
Quality of life? I have seen nothing to suggest Intels have a shorter life expectancy than AMDs. Got a link?

And of course Intel wants their CPUs to look good. That's called marketing. It's why Truck Maker A claims their truck is #1 because it gets better gas mileage and Truck Maker B claims theirs is #1 because it can pull more weight and why Truck Maker C claims theirs is #1 because it has more horsepower - and they all are right!

Aggressive temperature cycling? What does that even mean? EVERY CPU can and does go from cold (ambient) when idle to full temperature when pushed in just a few clock cycles, and then back to cold again just as quickly when the load drops back to idle. Temperature cycling is dependent on the load and cooling.
- Quality of life: high temperature peaks are low quality of life; your fans get noisy. Your hands on a laptop get hot. I didn't mean durability/endurance. Laptop CPUs did always get hot, but it's a difference whether they slowly creep to 80C and then even slower to 85C, or boost straight to 85C and then cool back to 50 to start it all over again, all the time. The behaviour has changed, and Sandy Bridge was, for Core, in the optimal position. 22nm made a big dent, partly due to increased density. But when Intel started needing those last few hundred megahertz to keep competing, the limits have been stretched further and further. Yes, I do believe devices with Intel CPUs that boost aggressively are liable not to last as long as they used to in the past. Time will tell, but the average lifetime of recent laptops is nothing to write home about in general. Is AMD different? I don't think that is the subject, and I think they have a lot of work left to do, especially on mobile CPUs.

- Aggressive temp cycling means what is described above. The limits are moved ever closer to the absolute boundaries of what the chip can do without burning to a crisp. What used to peak briefly at 80C, now peaks to 85C or more. At the same time, idle temps have actually dropped due to more efficient power states, and because idle requires lower clocks than it used to due to IPC gains.

As always the devil is in the details, and Intel is doing a fine job creating a box of details that cross the line.
 
- Quality of life: high temperature peaks are low quality of life; your fans get noisy. Your hands on a laptop get hot. I didn't mean durability/endurance. Laptop CPUs did always get hot, but it's a difference whether they slowly creep to 80C and then even slower to 85C, or boost straight to 85C and then cool back to 50 to start it all over again, all the time. The behaviour has changed, and Sandy Bridge was, for Core, in the optimal position. 22nm made a big dent, partly due to increased density. But when Intel started needing those last few hundred megahertz to keep competing, the limits have been stretched further and further. Yes, I do believe devices with Intel CPUs that boost aggressively are liable not to last as long as they used to in the past. Time will tell, but the average lifetime of recent laptops is nothing to write home about in general. Is AMD different? I don't think that is the subject, and I think they have a lot of work left to do, especially on mobile CPUs.

- Aggressive temp cycling means what is described above. The limits are moved ever closer to the absolute boundaries of what the chip can do without burning to a crisp. What used to peak briefly at 80C, now peaks to 85C or more. At the same time, idle temps have actually dropped due to more efficient power states, and because idle requires lower clocks than it used to due to IPC gains.

As always the devil is in the details, and Intel is doing a fine job creating a box of details that cross the line.

Quality of life? Really?

One of the most ridiculous things I've ever read on a tech site - and that's saying a lot.
 
my locked i7-8700 be like 120w while gaming (advertised 65w)

my unlocked 3900x be like 95w while gaming (advertised 105w)

double the cores lol

if Intel had started measuring their rated TDP from boost clocks, it'd be a different story
 
- Quality of life: high temperature peaks are low quality of life; your fans get noisy. Your hands on a laptop get hot. I didn't mean durability/endurance.
Nah! Yes, you are right. Things that "annoy" humans may affect our quality of life. But is that really the criterion you want to use to decide which CPU is better?

Are you really suggesting AMDs don't get hot too?

What you are describing to me is poor design by the laptop maker or PC builder. Poor choice of fans, inadequate case cooling, etc.
but the average lifetime of recent laptops is nothing to write home about in general.
That may be true but you are suggesting they are failing because the processors are failing and in particular, that those with Intels are failing at a faster rate! Not buying it. Show us evidence.

Frankly, I cannot recall the last time I saw a CPU (Intel or AMD) that just decided to die.
 