
Intel lying about their CPUs' TDP: who's not surprised?

And this is right on Intel's website:
You are absolutely right.

TDP
Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements.

With Turbo Boost disabled, my 10850K runs Prime95 Small FFTs at 95W so it is well under the 125W TDP rating.

Now can we all agree that Intel is not lying about TDP?
 
For Intel, yes. AMD uses a TDP that won't be exceeded in the default config while loaded and boosting as high as it can; I think that's the point. Intel uses nebulous bull###t...
@Arctucas with the stock cooler, or a 125W one?!

AMD configures its 65W "TDP" parts with an 88 Watt PPT - max power to the socket.

The 5600X is a 65W TDP part that drew 74W avg in this test. In other words, it drew more than its rated TDP.

The 10600K in this image is a 125W TDP part that actually drew 103 watts avg.

By default the Intel rig is doing a better job staying inside its TDP.

Both of these can go way beyond that limit if you power unlock them and OC them.

The main difference being that AMD maxes out at 88W PPT. At that point you are done on AMD platforms with a 65W part.

Intel has no such limit imposed; it will just shut down when it overheats.

In other words, Intel is a way better platform for a tweaker/tuner.


(screenshot attached)
 
AMD configures its 65W "TDP" parts with an 88 Watt PPT - max power to the socket.

The 5600X is a 65W TDP part that drew 74W avg in this test. In other words, it drew more than its rated TDP.

The 10600K in this image is a 125W TDP part that actually drew 103 watts avg.

By default the Intel rig is doing a better job staying inside its TDP.

Both of these can go way beyond that limit if you power unlock them and OC them.

The main difference being that AMD maxes out at 88W PPT. At that point you are done on AMD platforms with a 65W part.

Intel has no such limit imposed; it will just shut down when it overheats.

In other words, Intel is a way better platform for a tweaker/tuner.


View attachment 187035
I'd like to look into how Tom's achieved that table. Got a link?
One still pulls less and does more, and as for overclocking, that's debatable with Infinity Fabric clocking working so well.
 
I'd like to look into how Tom's achieved that table. Got a link?
One still pulls less and does more, and as for overclocking, that's debatable with Infinity Fabric clocking working so well.

Efficiency? That's quite different from TDP, so it smacks a bit of an early lead-in to goalpost shifting. If you can't win on one thing, just change the topic, right?

But here's your link:
 
The 5600X power limit is 76W (Package Power).
The way these limits are set up, it can in theory change based on load and temperature and all that, but I have not yet seen it be anything other than 76W.
 
I'm so confused... AMD and Intel have been doing the same thing FOREVER... the Phenoms were rated at 94W and sucked down over 200W... Zen 3, while extremely efficient, also consumes over its rated TDP... OP is posting on a chip rated for a TDP of 88W that at stock config will eat over 150W. Thermal Design Power (TDP) != power consumption.

What exactly is the problem? Is it that motherboards are yolo-boosting to the moon because they can? Is it because Intel can't get its sh*t together and is still on 14nm? I guess I am missing the part where we decided this was Intel's fault for lying...
Careful there, being impartial... you wouldn't want to let your inner AMD/Intel fandom show.
 
140W is royally over 125W? And those are whole-system numbers, not just the CPU.
This is a 65W TDP CPU as per the Intel slide right above it. I even specifically mentioned that even if you subtract all of the idle load (50W), you'll still be grossly out of spec.
 
Here is a 10850K set to run at the same default speed as a 10900K. As R20 is just finishing, the CPU is still running at its full rated speed.

View attachment 186982

The Intel recommended default turbo power limits for the 10900K are 125W long term and 250W short term. The default turbo time limit is 56 seconds. R20 is a short test. A 10900K should have no problem completing R20 at full speed without a hint of throttling.

In a longer test like R23, the turbo power limit will drop to 125W and it will be throttle city.

Instead of "Intel is lying about TDP", the real problem is that Intel CPUs cannot deliver their full rated performance indefinitely once they drop down to their rated TDP. Most consumers do not understand this. Their mobile CPUs do the same thing: long-term throttling so they do not exceed rated TDP.

Intel is like a shady used car salesman. They only tell you what you want to hear. Run a quick R20 test in the store and everything looks great. Head out to the mountains and try to go up a long grade and your shiny new car will be throttling along in the slow lane.
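The short-term vs long-term behavior described above can be sketched with a toy model, using the default limits quoted in the post (PL1 = 125W, PL2 = 250W, turbo window = 56s). Note this is a simplification for illustration: real hardware tracks an exponentially weighted moving average of power, not a hard step.

```python
# Toy model of Intel's two-level turbo power limits: the CPU may draw
# up to PL2 for roughly TAU seconds of sustained load, then falls back
# to PL1. Real silicon uses an exponentially weighted moving average,
# so this step function is only an approximation.
PL1 = 125.0   # long-term power limit, W (10900K default per the post)
PL2 = 250.0   # short-term power limit, W
TAU = 56.0    # turbo time window, s (default quoted in the post)

def power_budget(t_seconds: float) -> float:
    """Approximate allowed package power t seconds into a sustained load."""
    return PL2 if t_seconds < TAU else PL1

# A short benchmark (~30s, like a Cinebench R20 run) fits inside the
# turbo window; a long one (~10 min, like R23) mostly runs at PL1.
print(power_budget(30))   # 250.0 -> full turbo, no throttling
print(power_budget(600))  # 125.0 -> throttled down to TDP
```

This is why R20 finishes at full speed while R23 ends up power-limited: the run length relative to the turbo window decides which limit applies.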

Thank you; at least there is some semblance of sanity and cognitive function left on this forum.
 
Incorrect. The base clock is the clock from which the total operating clock is derived, which is why CPUs have had multipliers for 30+ years. You are talking about Base Operating Frequency. If you are going to insult and attempt (poorly) to correct someone like @unclewebb, who knows a LOT more about tech than you do, try NOT to embarrass yourself in the process.
You're wrong, he is actually right lol
 
This is a 65W TDP CPU as per the Intel slide right above it. I even specifically mentioned that even if you subtract all of the idle load (50W), you'll still be grossly out of spec.
Assuming this runs at official spec, which it surprisingly seems to be doing, for the short-ish turbo time (8s or 28s, depending on which version of Intel's spec we are looking at, with real values from motherboards often at 56s) the CPU can run at 1.25x TDP. That is 81-something W. Compared to idle, some other components also get loaded (plus PSU efficiency, if it was measured at the wall), so whole-system consumption rising by 91W from idle to multi-core load does not seem worrying.

Similarly, the 65W TDP 3700X uses 93W more when comparing 53W at idle to 146W at multi-core load (and the 3700X idles about 10W higher than Intel CPUs). Its spec says 1.35x TDP for the power limit, so 88W. Again, the rest of the components (plus potentially PSU efficiency) make it run at about spec.

I got curious and checked the 3700X review as well: the 2700 went from 46W to 123W. Again, taking the rest of the components etc. into account, Ryzen 2000 is the last generation of CPUs that actually put the power limit where the TDP is set.
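The multiplier arithmetic above is easy to check. A quick sketch using the ratios cited in the post (1.25x TDP for Intel's short-term limit, 1.35x TDP for AMD's PPT; these are the figures quoted here, not universal constants):

```python
# Checking the power-limit multipliers quoted above against 65W-TDP
# parts. Ratios are the ones cited in the post, not official constants
# for every SKU.
tdp = 65.0

intel_pl2 = tdp * 1.25   # Intel short-term limit: "81-something W"
amd_ppt = tdp * 1.35     # AMD PPT: the ~88W figure

print(f"Intel PL2: {intel_pl2:.2f} W")  # 81.25 W
print(f"AMD PPT:   {amd_ppt:.2f} W")    # 87.75 W, i.e. ~88W
```

Both results line up with the numbers in the post, which is why a "65W" part legitimately drawing 74-88W at the socket is in spec by the vendor's own rules.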
 
Assuming this runs at official spec, which it surprisingly seems to be doing, for the short-ish turbo time (8s or 28s, depending on which version of Intel's spec we are looking at, with real values from motherboards often at 56s) the CPU can run at 1.25x TDP. That is 81-something W. Compared to idle, some other components also get loaded (plus PSU efficiency, if it was measured at the wall), so whole-system consumption rising by 91W from idle to multi-core load does not seem worrying.

Similarly, the 65W TDP 3700X uses 93W more when comparing 53W at idle to 146W at multi-core load (and the 3700X idles about 10W higher than Intel CPUs). Its spec says 1.35x TDP for the power limit, so 88W. Again, the rest of the components (plus potentially PSU efficiency) make it run at about spec.

I got curious and checked the 3700X review as well: the 2700 went from 46W to 123W. Again, taking the rest of the components etc. into account, Ryzen 2000 is the last generation of CPUs that actually put the power limit where the TDP is set.

And in both cases, this is a bad move that passes a bill of some kind to the end user, be it cooling, overall power usage, whatever... they're exceeding the TDPs they put on the spec sheet while their older parts did not.

If all parts had always done so, we'd be having a different discussion, I think, but that wasn't the case, despite what some here are adamant to keep claiming; and it never was, even on the high end. You got better parts there, and the lower parts in the stack simply carried much more headroom, also in voltages. There were tons of Intel Ivy Bridge, Haswell and Broadwell CPUs that could run comfortably at Vcore well below stock, even while running all-core turbo at the turbo frequency specified on the sheet. Maybe in that event some would need stock volts. But higher? Rarely, if ever... I've seen more i7 quads run below the specified 1.2V or even 1.15V than I can count.
 
Don’t confuse TDP with total max power consumption. They are two different things. Users make the mistake of thinking they are equal. There is a reason they use “Thermal” in TDP.

AMD configures its 65W "TDP" parts with an 88 Watt PPT - max power to the socket.

The 5600X is a 65W TDP part that drew 74W avg in this test. In other words, it drew more than its rated TDP.

The 10600K in this image is a 125W TDP part that actually drew 103 watts avg.

By default the Intel rig is doing a better job staying inside its TDP.

Both of these can go way beyond that limit if you power unlock them and OC them.

The main difference being that AMD maxes out at 88W PPT. At that point you are done on AMD platforms with a 65W part.

Intel has no such limit imposed; it will just shut down when it overheats.

In other words, Intel is a way better platform for a tweaker/tuner.


View attachment 187035
The 5600X has a 65W TDP and ~75W PPT.
75W is with PB (Precision Boost) on, meaning all-core boost (whatever that freq is). It's way above base freq. If you disable PB it will drop to maybe 40-50W PPT at base freq.
Now if you turn PB on and PBO (PB Overdrive) also on, it may draw even higher than 75W if temperature (primarily) allows it.

The 65W rating describes the cooler the CPU needs while it is at 75W PPT under specific ambient conditions.
 
And in both cases, this is a bad move that passes a bill of some kind to the end user, be it cooling, overall power usage, whatever... they're exceeding the TDPs they put on the spec sheet while their older parts did not.
If all parts had always done so, we'd be having a different discussion, I think, but that wasn't the case, despite what some here are adamant to keep claiming; and it never was, even on the high end.
Yup.

There were always exceptions, mostly at the high end (the 4790K comes to mind from recent times), but ever since power limits were implemented in CPUs, they have generally been set at TDP. For a long time this did not even matter, because the CPU did not manage to use that much power anyway. Then it gradually changed, with more and more power fed into CPUs. Power limit shenanigans are a fairly recent development as well. It really started with the 8000-series Core for Intel and the 3000-series Ryzen for AMD.

At the same time, there is some merit in the reasoning both Intel and AMD (and, if we look beyond desktop x86, other CPU manufacturers too) use to accompany these changes. The explanation basically boils down to averaging the power consumption over time so as not to exceed TDP "by much", so that whatever cooling is on the thing can manage.

For an average user (which is the vast majority of users) it does not really matter. Outside synthetic tests or some (not all) productivity workloads, even big hungry Intel CPUs have reasonable power consumption.

The 5600X has a 65W TDP and ~75W PPT.
75W is with PB (Precision Boost) on, meaning all-core boost (whatever that freq is). It's way above base freq. If you disable PB it will drop to maybe 40-50W PPT at base freq.
Now if you turn PB on and PBO (PB Overdrive) also on, it may draw even higher than 75W if temperature (primarily) allows it.

The 65W rating describes the cooler the CPU needs while it is at 75W PPT under specific ambient conditions.
76W is the power limit on the 5600X at bone stock, and it will hold that indefinitely under CPU load.
What exactly do you mean by Precision Boost? Precision Boost is Zen's internal clocking technology. If you turn that off (can you?), it should be a noticeable performance hit because the CPU would not boost.
Precision Boost Overdrive is, for the most part, simply a raised power limit.
 
Precision Boost is AMD's terminology for Intel's SpeedStep, I believe. It's the same idea: rapid, dynamic, adaptive voltage and frequency switching adjustments. It's pretty much a voltage LFO from high to low that quickly scales frequency and voltage up and down based on CPU load. Intel, if I'm not mistaken, is a little more advanced in that particular aspect between the two, but AMD has certainly made progress and gotten better in that region.
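The load-based frequency/voltage scaling described above can be caricatured as a lookup over operating points. This is only a toy sketch: the table values are made up for illustration, and real implementations (SpeedStep, Precision Boost) live in firmware/hardware and react in microseconds, not in Python:

```python
# Minimal caricature of dynamic voltage/frequency scaling (DVFS):
# pick an operating point based on load. The thresholds, frequencies
# and voltages below are invented purely for illustration.
OPERATING_POINTS = [  # (load upper bound, frequency MHz, voltage V)
    (0.25, 800, 0.70),   # idle: low clocks, low voltage
    (0.50, 2200, 0.90),
    (0.75, 3600, 1.10),
    (1.01, 4300, 1.25),  # full load: boost clocks, high voltage
]

def pick_point(load: float):
    """Return the (freq_mhz, volts) pair for a given load in [0, 1]."""
    for upper_bound, freq, volts in OPERATING_POINTS:
        if load < upper_bound:
            return freq, volts
    return OPERATING_POINTS[-1][1:]

print(pick_point(0.1))  # (800, 0.7)   -> idle clocks
print(pick_point(0.9))  # (4300, 1.25) -> boost clocks
```

Frequency and voltage move together because higher clocks need more voltage to stay stable, which is also why power rises much faster than frequency at the top of the curve.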
 
76W is the power limit on the 5600X at bone stock, and it will hold that indefinitely under CPU load.
What exactly do you mean by Precision Boost? Precision Boost is Zen's internal clocking technology. If you turn that off (can you?), it should be a noticeable performance hit because the CPU would not boost.
Precision Boost Overdrive is, for the most part, simply a raised power limit.
Of course you can disable it. When we talk about boost, "we" mean clocks over base freq. Has anything changed in the last few years?
Maybe I should have said Performance Boost (boost over base freq). This is how BIOS settings have it. If you turn this off, the max clock will be the base freq.

So if you turn off Precision Boost, the CPU freq will fluctuate between the minimum (idle) freq (example: 2200MHz) and the base freq (ex. 3600MHz).

Precision Boost is the performance enhancement tech by AMD, as they state it. And Overdrive is potential further boosting headroom.
Clocking from min to base (2.2GHz~3.6GHz) is not boost.

 
So do PSU calculator websites need to account for these boosts in power?
 
Question: I want to know how much the CPU draws in a sustained gaming or video encoding session at stock settings, with all the boost/auto-overclocking thingees enabled. Does TDP give a guesstimate of that, or should I look at some other spec? (Or just look at a god damn review lol)
 
Question: I want to know how much the CPU draws in a sustained gaming or video encoding session at stock settings, with all the boost/auto-overclocking thingees enabled. Does TDP give a guesstimate of that, or should I look at some other spec? (Or just look at a god damn review lol)
Just look at a god damn review :)
It varies quite noticeably by game, and power consumption does not necessarily follow the CPU usage %.
Generally it should be within TDP, but I am sure we can find some exceptions :)
 
Question: I want to know how much the CPU draws in a sustained gaming or video encoding session at stock settings, with all the boost/auto-overclocking thingees enabled. Does TDP give a guesstimate of that, or should I look at some other spec? (Or just look at a god damn review lol)
Your best bet is HWiNFO in sensors mode.
For AMD you look at the “CPU PPT” value, and for Intel the “CPU Package Power” value. At least those apply to CPUs from the last 2-3 years.

If you reset sensor monitoring right before you start the game, then when you exit after X hours you can see the min/max and also the avg value, which is the most important IMO.

If you don’t have the CPU, then a review that puts the CPU under different loads and measures it individually is the only way.
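The reset-then-summarize workflow described above boils down to collecting power samples over a session and reporting min/max/avg. A minimal sketch, where `read_package_power()` is a hypothetical stand-in for whatever sensor source you actually use (HWiNFO logs, RAPL, etc.):

```python
# Sketch of the "reset sensors, play, then check min/max/avg" workflow.
# read_package_power() is a made-up placeholder sensor; in practice the
# samples would come from HWiNFO's logging/CSV export or similar.
import random

def read_package_power() -> float:
    """Placeholder sensor: pretend package-power reading in watts."""
    return random.uniform(40.0, 90.0)

def summarize(samples):
    """Return (min, max, avg) over a list of power samples."""
    return min(samples), max(samples), sum(samples) / len(samples)

# "Reset" = start with an empty sample list, then collect for the session.
samples = [read_package_power() for _ in range(1000)]
lo, hi, avg = summarize(samples)
print(f"min {lo:.1f} W / max {hi:.1f} W / avg {avg:.1f} W")
```

The average matters most for cooling and power-bill questions, while the max tells you what the transient spikes look like.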
 
Incorrect. The base clock is the clock from which the total operating clock is derived, which is why CPUs have had multipliers for 30+ years. You are talking about Base Operating Frequency.
For my 4690K there is no Base Operating Frequency.
I have set a Balanced power profile in Windows, which means it goes down to 800MHz at idle and up to 4.3GHz under load, because those are the clocks I've set in the BIOS.
The base frequency is 3.5GHz, which of course I could get by disabling Turbo Boost, but I haven't, so it's just indicative. I could force-set the multiplier to 28 in the BIOS and have it run at 2.8GHz, but that'd be a waste of a good chip.
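The multiplier arithmetic in this exchange is simply core clock = base clock (BCLK) x multiplier. A quick sketch, assuming the 100MHz BCLK that is typical on these Intel platforms:

```python
# Core clock = BCLK x multiplier, as discussed above.
# 100 MHz BCLK is the typical value on modern Intel desktop platforms.
BCLK_MHZ = 100

def core_clock(multiplier: int) -> int:
    """Effective core frequency in MHz for a given multiplier."""
    return BCLK_MHZ * multiplier

print(core_clock(28))  # 2800 MHz -> the forced 2.8 GHz example
print(core_clock(43))  # 4300 MHz -> the 4.3 GHz load clock
print(core_clock(8))   # 800 MHz  -> the 800 MHz idle clock
```

This is why power-saving states and turbo both work by changing the multiplier on the fly while BCLK stays fixed.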
 
If you reset sensor monitoring right before you start the game, then when you exit after X hours you can see the min/max and also the avg value, which is the most important IMO.
It's also possible to let something draw you some graphs, either in real time, or by logging the data and doing some analysis afterwards.

I have Rainmeter drawing stuff based on HWiNFO64 monitoring data:
(screenshot attached)
 
For my 4690K there is no Base Operating Frequency.
I have set a Balanced power profile in Windows, which means it goes down to 800MHz at idle and up to 4.3GHz under load, because those are the clocks I've set in the BIOS.
The base frequency is 3.5GHz, which of course I could get by disabling Turbo Boost, but I haven't, so it's just indicative. I could force-set the multiplier to 28 in the BIOS and have it run at 2.8GHz, but that'd be a waste of a good chip.
Your Base Operating Frequency is 3.5-3.9GHz. You are overclocked right now. My 3770K did 4.3GHz with stock volts too.
 