
Intel lying about their CPUs' TDP: who's not surprised?


Lying about power consumption numbers to make your products look good is just despicable.
Thank God I have a 4690k which means I don't have to deal with this mess.
 

Lying about power consumption numbers to make your products look good is just despicable.
Thank God I have a 4690k which means I don't have to deal with this mess.

1) that's been true for a while - for both vendors, the chips will boost past spec if allowed to,
2) peak power consumption depends on the mobo's turbo limits, and most motherboard manufacturers 'cheat' and allow the chip to boost beyond spec (which is fine, it's a type of overclock), as per that article.
3) yes, 10 cores on 14nm is power hungry, but it's still more efficient than a 4690k in perf/watt. A 4690k can peak at 152W, which is really not that far behind the 10700k and slightly over the 10600k.

Neither of those is a 'mess', nor is it a reason to stick to a 4690k.
 
That's turbo boost for you. TDP is rated at "base frequency", that depressingly low figure well under the turbo speed. For example, take the 10900k: base frequency 3.7GHz, 125W. All bets are off once turbo kicks in.
 
Also the idle -- the 4690K doesn't have all the newer power-saving tech, so it idles at like 56W stock, whereas the 10 series sits at 15-35W depending on flavor... if you leave your computer on a lot, you would actually save a bit of power going to the newer chips.
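To put that in rough numbers, here's a back-of-the-envelope sketch; the idle wattages are the ones quoted above, while the hours per day and the electricity price are just illustrative assumptions:

```python
# Back-of-the-envelope idle-power savings, using the wattages quoted above.
# Hours per day and the electricity price are illustrative assumptions.
old_idle_w = 56       # 4690K-era system at idle, per the post above
new_idle_w = 25       # 10th-gen system at idle, middle of the quoted 15-35 W range
hours_per_day = 12    # assumed on-but-idle time
price_per_kwh = 0.15  # USD, assumed

kwh_saved = (old_idle_w - new_idle_w) * hours_per_day * 365 / 1000
print(f"~{kwh_saved:.0f} kWh/year saved, roughly ${kwh_saved * price_per_kwh:.0f}/year")
# -> roughly 136 kWh/year, on the order of $20/year
```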
 
Their TDP figures should have been considered a joke for a few years already. But I wouldn't mind, if only the cooler were fit to do its thing.
 
The worst part is I know Intel fanboys who rabidly defend these stats and say it's all lies.


One is still on an i7 970: "Intel's done me great all these years, I trust them!"
 
The worst part is I know Intel fanboys who rabidly defend these stats and say it's all lies.


One is still on an i7 970: "Intel's done me great all these years, I trust them!"

I'm so confused... AMD and Intel have been doing the same thing FOREVER?... the Phenoms were rated for 94W but sucked down over 200W... Zen 3, while extremely efficient, also consumes over its rated TDP... OP is posting on a chip rated for a TDP of 88W that at stock config will eat over 150W. Thermal Design Power (TDP) != Power Consumption.

What exactly is the problem? Is it that motherboards are yolo boosting to the moon because they can? Is it because Intel can't get its sh*t together and is still on 14nm? I guess I am missing the part where we decided this was Intel's fault for lying...
 
What exactly is the problem? Is it that motherboards are yolo boosting to the moon because they can? I guess I am missing the part where we decided this was Intel's fault for lying...
My opinion would be that

a) the cheap motherboards have a hard time with their crappy VRMs on higher-end chips
b) getting better cooling always costs more
 
My opinion would be that

a) the cheap motherboards have a hard time with their crappy VRMs on higher-end chips
b) getting better cooling always costs more

Ok but... this is partially true. Cheap Gigabyte and ASRock boards have severe VRM issues and are terrible, but you can get $140-150 Z490 boards from Asus and MSI that will all run a 10850K/10900 at 5+ GHz with no VRM issues whatsoever.

Cooling a hot chip is expensive for sure -- but considering that the 5600X is going for roughly the same price as a 10850K right now (within about $25), you're still looking at the "budget" option. The 5800X comes with no cooler either and is about $120 more.
 
I'm so confused... AMD and Intel have been doing the same thing FOREVER?... the Phenoms were rated for 94W but sucked down over 200W... Zen 3, while extremely efficient, also consumes over its rated TDP... OP is posting on a chip rated for a TDP of 88W that at stock config will eat over 150W. Thermal Design Power (TDP) != Power Consumption.

What exactly is the problem? Is it that motherboards are yolo boosting to the moon because they can? Is it because Intel can't get its sh*t together and is still on 14nm? I guess I am missing the part where we decided this was Intel's fault for lying...
Like I was saying, TDP is measured at "base frequency". Both Intel and AMD are using some form of turbo boost that will go very close to the limit of the silicon, provided the power delivery and cooling are good enough.

TDP wasn't terrible with older chips, like my 2600k, because turbo boost wasn't as aggressive and we were stuck with quad cores until the 8th Core generation. With 8 and even 10 core chips pushed to the limit, you're going to see a lot of power consumption. I could make my 2600k draw tons of power too if I slapped a huge cooler on it, clocked it to 5GHz and flooded it with enough voltage to keep up.
 
Ok but... this is partially true. Cheap Gigabyte and ASRock boards have severe VRM issues and are terrible, but you can get $140-150 Z490 boards from Asus and MSI that will all run a 10850K/10900 at 5+ GHz with no VRM issues whatsoever.

Cooling a hot chip is expensive for sure -- but considering that the 5600X is going for roughly the same price as a 10850K right now (within about $25), you're still looking at the "budget" option. The 5800X comes with no cooler either and is about $120 more.
I'd still get a 3600 for a bang-for-buck setup like I did several months ago.. :toast:
 
I'm so confused... AMD and Intel have been doing the same thing FOREVER?... the Phenoms were rated for 94W but sucked down over 200W... Zen 3, while extremely efficient, also consumes over its rated TDP... OP is posting on a chip rated for a TDP of 88W that at stock config will eat over 150W. Thermal Design Power (TDP) != Power Consumption.

What exactly is the problem? Is it that motherboards are yolo boosting to the moon because they can? Is it because Intel can't get its sh*t together and is still on 14nm? I guess I am missing the part where we decided this was Intel's fault for lying...
It's because the numbers have meant less and less every generation, to the point that a 65W CPU uses more power than a 125W one in *INTEL'S OWN PRODUCT STACK*

AMD's current chips are more in line with the accepted norm, with TDP being an 'average' and the reality being a little higher (65W → ~85W, 105W → ~140W).

Intel's, OTOH... 65W → 214W and 125W → 204W
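Taking the figures above at face value, the gap works out like this; the numbers are the ones quoted in this post, nothing more:

```python
# Ratio of observed package power to rated TDP, using the figures quoted above.
chips = {
    "AMD 65 W class":       (65, 85),
    "AMD 105 W class":      (105, 140),
    "Intel 10700 (65 W)":   (65, 214),
    "Intel 10700K (125 W)": (125, 204),
}
for name, (tdp_w, observed_w) in chips.items():
    print(f"{name}: {observed_w} W observed = {observed_w / tdp_w:.2f}x rated TDP")
```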
 
I'm so confused... AMD and Intel have been doing the same thing FOREVER?
AMD's definition of TDP is something else & doesn't really translate into power consumption directly. We've been over this; I'll also add that AMD does enforce their "TDP limits" more stringently, & there's a whole host of other settings that affect it as well, like PPT, EDC, TDC et al.
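For anyone who hasn't run into those acronyms, here's roughly what the AMD limits mean - the numbers below are the commonly cited stock defaults for a 105 W TDP Ryzen desktop part, quoted from memory rather than from anything in this thread:

```python
# AMD's package limits, with commonly cited stock defaults for a 105 W TDP part.
# Values are illustrative; boards and BIOS versions can and do change them.
limits = {
    "PPT": (142, "W", "Package Power Tracking: total socket power cap"),
    "TDC": (95,  "A", "Thermal Design Current: sustained current cap (thermally limited)"),
    "EDC": (140, "A", "Electrical Design Current: peak/burst current cap (VRM limited)"),
}
for name, (value, unit, meaning) in limits.items():
    print(f"{name} = {value} {unit} - {meaning}")
```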
 
I really don't see how any of this is "lying" if one has the brain capacity to put two and two together, and figure out that TDP doesn't govern jack shit related to actual power consumption for either company, and hasn't ever in recent memory. The only reason it's in the spotlight now is because of the wide gulf between base clock and boost clock. PL1 isn't some new kid on the block.

Now for mobile chips, Intel does some nefarious marketing manipulation using its TDP-up and TDP-down mechanism to misrepresent what its chips actually do in a practical TDP configuration. Now THAT's borderline lying.

As for the "but my 4690K is efficient" LOL nice one, try putting an AVX load on that chip for once and see what "TDP" means. Sitting here looking over at my 4790K, trying to think of all the times when 88W ever meant anything to it
 
I really don't see how any of this is "lying" if one has the brain capacity to put two and two together, and figure out that TDP doesn't govern jack shit related to actual power consumption for either company, and hasn't ever in recent memory. The only reason it's in the spotlight now is because of the wide gulf between base clock and boost clock. PL1 isn't some new kid on the block.

Now for mobile chips, Intel does some nefarious marketing manipulation using its TDP-up and TDP-down mechanism to misrepresent what its chips actually do in a practical TDP configuration. Now THAT's borderline lying.

As for the "but my 4690K is efficient" LOL nice one, try putting an AVX load on that chip for once and see what "TDP" means. Sitting here looking over at my 4790K, trying to think of all the times when 88W ever meant anything to it

please use your logic to explain how the 65W 10700 uses more power than the 125W 10700k
 
That's turbo boost for you. TDP is rated at "base frequency", that depressingly low figure well under the turbo speed. For example, take the 10900k: base frequency 3.7GHz, 125W. All bets are off once turbo kicks in.

This is all there is to it.

Disable boost and it's fine.

please use your logic to explain how the 65W 10700 uses more power than the 125W 10700k

Boosts slightly higher?
 
Boosts slightly higher?

That explains why it uses the power.

I want you to explain why Intel's marketing has the lower-TDP CPU using more power than the higher-wattage one.

The marketing and the TDP ratings are the problem here, not the technical reasons why they use the electricity - the magical lightning inside the melted sand makes zappy zappy hot, but Intel's fudging the numbers really badly here.
 
Lower quality silicon.
Best answer so far, tbh.

Still doesn't excuse Intel from advertising half the TDP (THERMAL design power) for the chip that uses more power and produces more heat.
 
please use your logic to explain how the 65W 10700 uses more power than the 125W 10700k

Most likely it's down to most Z motherboards running the chip without power limits to begin with (or the reviewer lifting them manually), plus the non-K chip having a worse IHS and most likely being a lesser bin, so it needs more voltage to hit whatever boost frequency it targets.

Asus is the only company AFAIK that enforces power limits, at least until you enable XMP lol
 
I'd still get a 3600 for a bang-for-buck setup like I did several months ago.. :toast:

Hope you didn't pay more than $180 or so... I think I lucked out at ~$150-160, however you want to calculate the $20 combo savings at Microcenter... (before the fall hardware stock madness)
 
That article is about 3.5 years too late.
The CPU that started this was the 8700K, which ran at ~130W at full blast (the "boost" 4.3GHz on all cores), and that came out in autumn 2017. High-end Intel CPUs have only gotten worse since.
AMD isn't a slouch anymore either; since the Ryzen 3000 series there has been a complex system of power limits which, in the big picture, boils down to a power limit of 1.35x TDP.
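As a quick sanity check of that multiplier (the TDP values are AMD's stock ratings, and the 1.35x figure is the rule of thumb above):

```python
# AMD's effective package power limit (PPT) is roughly 1.35x the rated TDP.
for tdp_w in (65, 105):
    print(f"{tdp_w} W TDP -> ~{tdp_w * 1.35:.0f} W package power limit")
# 65 W TDP  -> ~88 W
# 105 W TDP -> ~142 W
```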

I'm so confused... AMD and Intel have been doing the same thing FOREVER?... the Phenoms were rated for 94W but sucked down over 200W...
This has not been going on forever. There have been CPUs that exceed TDP to some extent, but it was not a major issue until a few years ago. The top-end Phenom was the 1100T and it generally stayed within its 125W TDP. There was the FX-9590 that went over 200W, but it also had a TDP of 220W :)

TDP wasn't terrible with older chips, like my 2600k, because turbo boost wasn't as aggressive and we were stuck with quad cores until the 8th Core generation.
TDP was not terrible back then because it was enforced as the power limit. The fact that there was no need to fudge the numbers or outright lie, because the chips genuinely did not draw more than TDP, of course helped :D
 