
Intel lying about their CPUs' TDP: who's not surprised?

I think anyone still following this thread, if they haven't already done so, should look at this post by Zach_01 for Intel's own summary of TDP for their chips, as well as watch (as in the whole thing; I know it's long-ish) the GN video on AMD TDP linked a couple posts later.

Things we've learned over the course of this thread (YMMV):
  • TDP does not mean, and has never meant, maximum power consumption
    • It can, however, resemble average power consumption at base frequency, particularly with Intel
  • Intel and AMD calculate TDP differently, in ways that don't necessarily produce a useful value for an end user
  • Modern turbo and boost strategies can push power consumption well past TDP for short periods
    • Certain computational loads can also drive it higher over longer periods
  • Overclocking completely obviates TDP as a useful value
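The turbo point in the list above can be sketched as a toy model. This is a simplified, hypothetical illustration of Intel-style PL1/PL2/Tau power limiting, not Intel's actual algorithm (which works on a weighted moving average of package power); the wattages and time constant here are made-up round numbers for illustration only.

```python
# Toy model: instantaneous draw can sit at PL2 (well past "TDP")
# until the running average of package power catches up to PL1.
PL1 = 125.0   # long-term limit, roughly what gets marketed as TDP (assumed value)
PL2 = 250.0   # short-term turbo limit, watts (assumed value)
TAU = 56.0    # averaging time constant, seconds (assumed value)

def allowed_power(avg_power):
    """CPU may draw up to PL2 as long as the running average stays under PL1."""
    return PL2 if avg_power < PL1 else PL1

avg = 0.0     # running average of package power
dt = 1.0      # one-second time steps
draws = []
for t in range(120):
    draw = allowed_power(avg)           # assume a fully loaded CPU takes all it can get
    draws.append(draw)
    # exponential moving average: recent draw weighs more, older draw decays with TAU
    avg += (draw - avg) * (dt / TAU)

# Early in the run the chip draws PL2 (double "TDP"); once the average
# reaches PL1, draw drops back and the average settles at PL1.
```

Under this model, "TDP" describes the long-run average a cooler must handle, while the wall-socket peak is governed by PL2, which is why short-duration power measurements routinely blow past the spec-sheet number.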
My takeaway is this: neither manufacturer is lying about TDP, AFAICT. It's more that Thermal Design Power doesn't actually mean what it sounds like it should, and we DIY-ers latch onto it because it's all there is (outside of reviews and such, of course). Something more meaningful would be nice, but I'm not sure there's a compelling reason for either company to provide it. If they do, it certainly won't be on behalf of a "handful" of enthusiasts on forums. I mean, if even the cooler manufacturers are unhappy with it, and the chipmakers won't provide something better for them, it's probably a lost cause.
Agreed.
What a perfect way to end a thread.
 
What about the 10900K? Why do Z590 boards have VRMs that rival Threadripper? Yesterday on The Full Nerd a question was asked: would a 1200-watt PSU be enough to run two 3090s and a 10900K? The answer was noncommittal. I have run two Vega 64s with a 2920X on my 1200-watt PSU with no concern.
Yes, I'm sure it's the CPU and not the fact that the GPUs in those two scenarios are wildly different. According to TPU's own numbers, a 3090 will draw 350-450 W, whereas a Vega 64 will draw 290-310 W. This is why we don't compare completely different setups.
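The comparison above is simple enough to put numbers on. The GPU figures are the TPU ranges quoted in the post; the CPU figures are rough assumptions for illustration (the 2920X's 180 W is its rated TDP, and the 10900K turbo draw is a ballpark guess), not measurements.

```python
# Back-of-the-envelope PSU budget for the two builds being compared.
PSU_W = 1200

# Build A: two 3090s + 10900K, worst case per the quoted ranges
dual_3090 = 2 * 450            # 900 W at the top of the quoted range
i9_10900k = 250                # assumed short-term turbo draw, not a spec value
build_a = dual_3090 + i9_10900k

# Build B: two Vega 64s + 2920X
dual_vega = 2 * 310            # 620 W at the top of the quoted range
tr_2920x = 180                 # rated TDP of the 2920X
build_b = dual_vega + tr_2920x

print(build_a, build_b)  # 1150 800
```

Before drives, fans, and conversion losses, one build already sits within 50 W of the PSU's rating while the other leaves roughly 400 W of headroom, which is why the same 1200 W unit can be comfortable for one setup and marginal for the other.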
 