
Intel Posts 10th Gen Core Power Limit and Tau Values

I can't buy an Intel processor anymore. The power bill alone puts me off, since my PC is used by multiple people during most of the day.

Add to that a new motherboard (even if I had an Intel motherboard to start with) and the cost of the processor itself, and for me it's an AMD clean sweep from budget to high end.
You do realize that average power consumption (at stock at least) is still lower on Intels when comparing chips with same core/thread counts, right? That's especially true if you use X570 with AMD.
 
You do realize that average power consumption (at stock at least) is still lower on Intels when comparing chips with same core/thread counts, right? That's especially true if you use X570 with AMD.
What does the chipset have to do with it?

This is a scenario most of us will find most useful: a mid-range chip in gaming.
Start at 2:00.
About 10-15 degrees cooler and 10 W less power.

With K-SKUs you're throwing power efficiency out of the window, though, and you'd better be ready to get a good cooler.
 
A bit more overall power because the 2080 Ti can better stretch its legs with Intel; the CPU alone is actually slightly lower. In games where both CPUs fully utilize the GPU, the overall draw will be slightly lower as well, hence we get this average:
[chart: power-gaming.png]
 
You do realize that average power consumption (at stock at least) is still lower on Intels when comparing chips with same core/thread counts, right? That's especially true if you use X570 with AMD.

Ryzen 7 3800X: 95 W TDP, 3.9 GHz base clock
Core i7-10700KF: 125 W TDP, 3.8 GHz base clock

The 3800X has a higher IPC, a higher base clock, and possibly lower power consumption too.
 
Is the XT line out yet?

Ryzen 7 3800X: 95 W TDP, 3.9 GHz base clock
Core i7-10700KF: 125 W TDP, 3.8 GHz base clock

The 3800X has a higher IPC, a higher base clock, and possibly lower power consumption too.
IPC is an elusive term; if you take Cinebench as your only method, then yes.
Base clocks don't really matter, neither for Intel nor AMD, so I don't know why you're bringing them up.
Peak power consumption is lower on Ryzen's 7 nm, but per-use power consumption can vary.
I linked this video earlier, start at 2:00.
Sorry, I don't have one with the 10700F vs the 3800X, not that many are out yet. This is as close to the same core/thread config at similar frequencies as I can find.
 
IPC is an elusive term; if you take Cinebench as your only method, then yes.
Base clocks don't really matter, neither for Intel nor AMD, so I don't know why you're bringing them up.
Peak power consumption is lower on Ryzen's 7 nm, but per-use power consumption can vary.
I linked this video earlier, start at 2:00.
Sorry, I don't have one with the 10700F vs the 3800X, not that many are out yet. This is as close to the same core/thread config at similar frequencies as I can find.
Your "IPC" means gaming only?

I bring up the base clock because TDP is measured at base clock (according to Intel).
 
No, that was clearly a commentary about per-use power vs peak power, not IPC; I itemized my response from top to bottom.
Should I have used numbers?
I forgot to write that you should take a look at the package power readings, sorry.
 
A bit more overall power because the 2080 Ti can better stretch its legs with Intel; the CPU alone is actually slightly lower. In games where both CPUs fully utilize the GPU, the overall draw will be slightly lower as well, hence we get this average:
[chart: power-gaming.png]
You kinda played yourself:
[attachment: 1591430967232.png]

It's odd, but the 8-core 3700X got lower power consumption than the 6-core 3600/X, while having slightly higher clocks and more cores. Probably something to do with AMD using higher-binned chips as you go up the line-up.
 
You kinda played yourself:
[attachment 158015]
It's odd, but the 8-core 3700X got lower power consumption than the 6-core 3600/X, while having slightly higher clocks and more cores. Probably something to do with AMD using higher-binned chips as you go up the line-up.
Well, as I understand it, it has a lot to do with binning.
I think the way Ryzen 3000 manages CCXs in gaming is to maximize the load on the fastest cores within one CCX. The reason the differences between the 3600, 3700X and 3900X are so often fractional in gaming is not actually bad core-count scaling but the fact that each Ryzen 3000 SKU is a higher bin.
On the 3700X you just have two more cores sitting at low load; that's not going to contribute much to power draw, especially at Ryzen's efficiency levels. The main cores being more efficient make up for that.
 
So basically all 10th-gen CPUs, ranging from Pentium to Core i9, can only hold their max turbo speed for a certain amount of time in order to stay within their (raised) power limits during that window. This is pretty sad considering that my Core i7-7700 runs at its max turbo speed all the time while also staying within its specified 65 W TDP the whole time.

I'm sorry Intel, this is not good enough for me.
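For anyone wondering what that time limit actually looks like, here's a very rough sketch of the PL1/PL2/Tau idea (Python, and not Intel's exact algorithm): the chip may draw up to PL2 while a moving average of package power over the Tau window stays below PL1 (= TDP); once the average catches up, it drops back to PL1. The values below are what Intel posted for the 65 W i7-10700 class parts (PL1 = 65 W, PL2 = 224 W, Tau = 28 s), if I'm reading the table right, so treat them as example numbers only.

Code:
# Simplified model of a Tau-limited turbo budget - NOT Intel's exact algorithm.
PL1 = 65.0    # long-term power limit (W), equals TDP
PL2 = 224.0   # short-term boost limit (W)
TAU = 28.0    # averaging window (s)
DT  = 1.0     # simulation step (s)

avg_power = 0.0       # moving average of package power
alpha = DT / TAU      # smoothing factor for the moving average

for t in range(0, 31):
    # Boost to PL2 while the moving average is under PL1, otherwise fall back to PL1.
    draw = PL2 if avg_power < PL1 else PL1
    avg_power += alpha * (draw - avg_power)
    if t % 5 == 0:
        print(f"t={t:2d}s  draw={draw:5.0f} W  avg={avg_power:5.1f} W")

# With these simplified assumptions the chip holds 224 W for roughly 10 s before
# settling at 65 W - the "boost hard, then sag back to TDP" behaviour described above.

On most enthusiast boards the firmware simply overrides Tau and PL2 with much higher (or unlimited) values, which is part of why review numbers and the spec sheet disagree so much.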
 
I thought no one paid attention to TDP once everyone watched Steve's video lol :confused:

I can't say I'm shocked. If I did stick with Intel I could at least cool them, since it doesn't look too far off the numbers I see with my first gen. :laugh:
 
Their power consumption issue is completely overblown; it only really starts to get up there when you OC them to the limit and put them under a stress test or a very demanding rendering workload. In all other instances it's not bad at all and you'll be fine with any half decent motherboard and air cooler.
 
I can't buy an Intel processor anymore. The power bill alone puts me off, since my PC is used by multiple people during most of the day.

Add to that a new motherboard (even if I had an Intel motherboard to start with) and the cost of the processor itself, and for me it's an AMD clean sweep from budget to high end.
This should set this straight:

[chart: Wh per 60 s of idle]

[chart: Wh per 25-minute run of Blender, big scene]

Energy cost of a big Blender scene: the difference between the 10900K without power limits and the 3900X is the same as 6 minutes of idle-time power draw. (A back-of-the-envelope version of this arithmetic follows after this post.)

[chart: small Blender scene]

[chart: 3-minute run of x264 encoding]

[chart: 90-second run of 7-Zip]

[chart: 4-minute run of Stockfish]

[chart: whole PC in The Witcher 3]

[chart: whole PC during video playback] - ridiculous

Gee, I hope you've got a good place to put all those power savings.

The whole article is in Polish, use a translator of your choice:

Oddly enough, the tested processors are not very far apart in terms of energy efficiency.

And the most amazing thing is that Intel has been able to extract a little more from the fabrication process it has used for years, even though a year or two ago we were pretty sure that nothing more could be done with the Skylake architecture and the 14 nm-class manufacturing process.

Measuring peak power draw is worthless in the face of tests like this one.
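Since those charts are images, here is the promised back-of-the-envelope arithmetic with made-up example numbers (not the article's data), just to show the scale a per-run Wh difference works out to on a power bill:

Code:
# Illustrative only - every number here is an assumption, not measured data.
ENERGY_DIFF_WH = 10.0   # assumed extra Wh per 25-minute Blender run
RUNS_PER_DAY   = 4      # assumed number of such runs per day
DAYS_PER_YEAR  = 365
PRICE_PER_KWH  = 0.20   # assumed electricity price per kWh

extra_kwh_per_year = ENERGY_DIFF_WH * RUNS_PER_DAY * DAYS_PER_YEAR / 1000.0
extra_cost = extra_kwh_per_year * PRICE_PER_KWH
print(f"Extra energy: {extra_kwh_per_year:.1f} kWh/year "
      f"-> {extra_cost:.2f} per year at {PRICE_PER_KWH}/kWh")
# ~14.6 kWh and ~2.92 per year with these assumptions - noise on a power bill.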
 

I think at least in this case AMD was honest about the TDP. Intel slapped a big 65 W sticker on the specs, but in actual fact it draws at least double that at a minimum, and almost 3.5 to 4x. The worst part is that the stock cooler can barely manage the power requirements of even the non-K i5 models, so I doubt a user will even see the supposed performance with the stock paperweight cooler from Intel.
 
Looks good to me, great actually. All of those systems are sipping power :laugh:
 
Precisely. If you are a somewhat average user who spends quite a bit of time on the PC doing undemanding tasks like browsing, watching YouTube, checking email, etc., and/or leaves it idling for more than a couple of minutes, a comparable Intel system will actually still consume less electricity overall, and that's against AMD's "latest and greatest" 7 nm chips; against previous Zen(+) ones there was no contest at all (not to even mention the Faildozers). Unless the red team pulls a miracle with Zen 3 and Zen 4, they'll be pretty much toast when Intel finally moves to 7 nm as well (and they will sooner or later), and even 3 nm won't be able to help them then...
 
Precisely. If you are a somewhat average user who spends quite a bit of time on the PC doing undemanding tasks like browsing, watching YouTube, checking email, etc., and/or leaves it idling for more than a couple of minutes, a comparable Intel system will actually still consume less electricity overall, and that's against AMD's "latest and greatest" 7 nm chips; against previous Zen(+) ones there was no contest at all (not to even mention the Faildozers). Unless the red team pulls a miracle with Zen 3 and Zen 4, they'll be pretty much toast when Intel finally moves to 7 nm as well (and they will sooner or later), and even 3 nm won't be able to help them then...
Correct.
Peak power draw is lower on AMD.
Typical power draw is the same or lower on Intel, due to AMD's boost.
I mean, look at the video playback power draw values. That Ryzen 3000 is probably boosting like crazy; you'd need manual tuning, and you'd end up at the same level as Intel in the end (maybe). Their CPU resource management is way too aggressive - it ruins the arch/node efficiency for normal tasks.
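To put the peak-vs-typical point into numbers: what you pay for is energy, i.e. power integrated over time, so the average draw is what matters and the peak by itself tells you very little. The two 60-second "video playback" traces below are purely hypothetical, just to illustrate the idea:

Code:
# Hypothetical per-second power traces (W) - not measured data.
def energy_wh(trace_w, step_s=1.0):
    """Integrate a per-second power trace (watts) into watt-hours."""
    return sum(trace_w) * step_s / 3600.0

chip_a = [8.0] * 50 + [45.0] * 10   # bursty: low baseline, short high boosts
chip_b = [15.0] * 60                # flat: higher baseline, no spikes

print(f"chip A: peak {max(chip_a):.0f} W, energy {energy_wh(chip_a):.3f} Wh")
print(f"chip B: peak {max(chip_b):.0f} W, energy {energy_wh(chip_b):.3f} Wh")
# chip A peaks at 45 W yet uses 0.236 Wh here, while the "tamer-looking"
# chip B uses 0.250 Wh - the time-averaged draw is what decides the bill.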

 
I'd say for 2020, it is pretty terrible.
 
I think at least in this case AMD was honest about the TDP. Intel slapped a big 65 W sticker on the specs, but in actual fact it draws at least double that at a minimum, and almost 3.5 to 4x. The worst part is that the stock cooler can barely manage the power requirements of even the non-K i5 models, so I doubt a user will even see the supposed performance with the stock paperweight cooler from Intel.
It's not only about cooling. Even with a decent cooler, your CPU won't boost past the time limit unless your motherboard's UEFI has options for disabling the restraints. But then, your CPU is not a 65/95/125 W unit anymore. So either the TDP or the boost frequency is a lie. Everything is "up to", nothing is guaranteed. This is what I find utterly repulsive in this generation of Intel CPUs.
 
I'd say for 2020, it is pretty terrible.
I hear you, man. Mine is sipping 210 W at the wall just sitting there looking at this page :oops:
 