
Intel Alder Lake Doesn't Look Like an Overclocker's Dream

400W, BABY! My PC uses 250W while working in Lightroom and I'm already worried about the power bill. But if you're buying this sucker, you're not going to care about the money it sucks out of you, so go for it. It sure looks like it performs very well, I'll give you that, but at least wait for the reviews.

(PS: I have one of these [ZM-MFC3] and my Ryzen + 1070 never went over 350W total)

 
That is awesome, man! I have the ZM-MFC2, and with my current rig I have seen just over 630W. Just running WCG on the CPU, the system draws 300W lol..
 
Your computer is a beast, though. Mine is more low-range (R7 3700X, 32GB DDR4-2933, GTX 1070, an NVMe SSD for Windows and an HDD for files). Most of the time I use the PC on the "Power saver" Windows profile, so when I'm just watching a movie or browsing the web it's around 160~180W total system power.
When I see news like this about 300W in the CPU alone, it really shocks me!
 
I have a nice AMD system too, also overclocked but undervolted lower than yours, so I am not disappointed in the slightest.

I didn't pledge allegiance to them or Intel, though.. I have more Intel systems than AMD ones..

But honestly, if you are bored with hardware news, take up a new hobby for a while. I stepped away for a few years to play with other toys too.

So why make fun of someone who undervolts if you are doing it yourself? Makes zero sense to me.

Plus, I'm not bored with hardware, I'm just bored with hardware discussions nowadays. You know, discussions like this one, or seeing people on forums using 100 watts more on their CPUs/GPUs for 4 FPS more, and PBO at 1.5V for a 1% improvement. Or people judging a CPU by its AIDA64 FPU power consumption at the maximum possible, barely-stable super overclock, etc. It's just boring, man, and this kind of news only feeds that poor mentality.
 
I wasn't making fun of anyone. You felt compelled to tell me about your AMD system, so I did the same.

That may be, but your attitude said otherwise. Feel free to discuss away, I am all for it, but don't start calling people shills and the like just because they chose one company over another. That isn't directed entirely at you, btw, so don't feel singled out. I will be more choosy with my words next time; today I am a bit tired.. late night.
 
Is this something new, or....? The 11900K did the same, the 10900K the same, the 9900K the same. Do people really have such short-term memory these days?
 
With the complexity of current CPUs and GPUs, overclocking is starting to get built right into the design. We saw it with the 5700 XT. AMD basically said each card would hit different boosts based on a series of sensor readings, made in milliseconds, and the variability of each GPU. Intel threw something similar out there with Rocket Lake, where lifting the power restrictions lets the chip boost as long and as hard as thermal limits allow. This just seems to be the way both companies plan to arrive at peak performance numbers now.

There is nothing wrong with this idea in that it gives the user the most possible performance out of the hardware within its design limits. However, it means that peak power consumption needs to be taken into account; it's becoming a key differentiator in how these companies reach peak performance with their products. If one company requires 400W to outperform a competitor, that has to be factored in when you are speccing the parts for a system build. If you don't plan to need this performance, then it's a lot of expense for nothing. Energy isn't getting cheaper or more abundant; it's getting scarcer and more expensive. I'm honestly a little surprised we are seeing this approach to achieving more performance. I guess the assumption is most folks don't buy the highest-end products.
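To put a rough number on that energy argument, here's a back-of-the-envelope sketch in Python. The wattages, daily hours, and electricity price are hypothetical placeholders, not measurements; swap in your own figures.

```python
# Back-of-the-envelope yearly electricity cost of a CPU under load.
# All inputs are hypothetical examples -- substitute your own load
# wattage, daily hours at load, and local price per kWh.

def annual_cost(load_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly cost of running a given sustained load."""
    kwh_per_year = load_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

PRICE_EUR_PER_KWH = 0.30   # example rate; check your own bill
HOURS_AT_LOAD = 4          # example: four hours of heavy work per day

cost_250 = annual_cost(250, HOURS_AT_LOAD, PRICE_EUR_PER_KWH)
cost_400 = annual_cost(400, HOURS_AT_LOAD, PRICE_EUR_PER_KWH)
print(f"250 W: ~{cost_250:.0f} EUR/year")             # ~110 EUR/year
print(f"400 W: ~{cost_400:.0f} EUR/year")             # ~175 EUR/year
print(f"Extra: ~{cost_400 - cost_250:.0f} EUR/year")  # ~66 EUR/year
```

With those example numbers, the 150W gap costs roughly 66 EUR a year; at 24/7 crunching loads it would be several times that.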
 

I personally take power consumption very seriously when I buy hardware. Yes, price versus performance is the first factor, but power is very important to me. However, I don't measure that power with Furmark or FPU benchmarks, because I buy computers to play games and edit music. I want to know the power usage in those tasks, not when I am attempting to beat a benchmark world record.

People used to make a big deal about Ryzen 5000 power consumption compared to Intel, but if you play a game and compare the power usage of a 10700K vs a 5800X, you will see maybe 15-20W more on the Intel side, while you will see 150W more in AIDA64. Useless, irrelevant.
 
Hardware discussions are so boring these days, man...

Really, no one cares if some people are trying to make Intel look bad. I would be more interested in undervolt numbers: keep stock clocks and go as low as possible on voltage, or even run 100MHz less.

I don't care if the CPU has 3% more performance using 300 or 400 watts; that's irrelevant to me. Plus, we don't buy CPUs to run Cinebench, AIDA or any other useless benchmark all day long. Show me a single game even using 225W.

Just wait for Buildzoid or that Frame Chasers guy to make a 50-minute video rambling about motherboards and telling you that you need to spend €400 if you want to use a 12900K, as if anyone would need such a VRM.

Boring
Yeah, except some people actually do use their PC for more than occasional gaming. Since 2005 I think I have had a PC heater/folder-cruncher; it used to be an FX-8350, and it looks like this could show that heater a thing or two.

Point being, even underclocked as mine always are, running flat out 24/7 requires either a chip that sips power (100W max) or spending the price of a f£#@&£ GPU on cooling, as I have.
And trust me, don't go cheap unless you're a casual gamer.

Incidentally, my entire system uses about 400 watts to fold and crunch, downclocked a bit but with the GPU and CPU loaded up.
 
5200 MHz all core for what?
 
I like to overclock my laptop's processor ³% or 82W to run porn .00003% faster.
 
If someone needs a CPU for intensive tasks that take advantage of every instruction and every core on the chip, my advice is: don't bother with mainstream platforms.

There are wayyy better options for that kind of stuff.
 

I legit wasted a minute of my life trying to find a way to put those numbers into a formula that resulted in the number 69 :D :laugh:
 
Am I seeing X99 and X299-system-like OC power consumption here? Or FX-like? (the 5960X and FX-9590, for example), albeit this one has more cores, especially it being an i9!

So, I guess there's a way around it, especially for gamers.....
 
My god, and I thought that my 3930K consumed a lot of power with a strong overclock. This is insane.
 
Like an FX? Be serious, the FX used less power to get to 5.5. It would only just about do the work of those E cores, though, so there's that.
 
It's the E cores I'd be more interested in overclocking, but how far can they be pushed beyond stock? Is it like Haswell's 102MHz on the BCLK, or Skylake's BCLK overclocking that could be pushed much harder!? I don't see much headroom to overclock the P cores; they are already clocked high, and the wattage to push them further has to be substantial, and appears to be. I don't see the point in that, but the E cores at 3.7GHz I could totally see trying to push towards 4GHz with a modest bump in voltage and wattage.
 
So FX turned out to be better this time, like I feared. Especially the 8370s! (which I never got a chance to own, but I apparently had a golden-sample-looking FX-8350 from 2014 (a late run) that I doubt I maxed out, as I only tried it at a paltry 4.4) (The VID was only 1.2-something volts on my FX-8350!)
 
NGL, I have already accepted that "CPU overclocking" is dead and buried for anything beyond merely demonstrating how far one can take a CPU, at least if you're not willing to invest in some very beefy cooling (step aside, 240mm radiators, 360mm is the new minimum for trying your hand at overclocking). Plus, with the way CPUs are getting smarter at handling their own clocks, it might as well become an exercise in futility soon.
My thoughts exactly. CPUs have known how to overclock themselves for a few generations now, to the point that casual overclocking is all but dead. I mean, just look at some recent CPU reviews right here on TPU: quite often a manually overclocked CPU will score lower than one left at default, because the latter can boost just one or two cores when the workload demands it.
 
It's probably more the case that PWM controllers have gotten better at supplying and managing voltage to the VRMs.
 
Yeah, IMO it's more fun to find out what all architectures can do with some undervolting, including GPUs. I have CPUs and GPUs from all vendors, and every one is undervolted except this 9700F, which simply will not run stably at its top turbo with any undervolt; the first pyrite sample I think I've ever received. It works at spec though, so I can't complain, and with a -0.05V offset it'll run 4.2GHz all-core for a lot of power savings and a minimal performance reduction from its typical 4.5GHz all-core turbo.

When a CPU or GPU is taped out, they are not going to tune a custom profile for each and every chip. Nope. All chips from one wafer aren't identical, so one chip might be perfectly fine at 1.2V while another requires 1.25V to operate. They just set something in the middle that works under all conditions. This is why there's headroom left in CPUs and GPUs; some GPUs might work without problems at 1090mV while others require 1140mV to do the very same. It's just luck of the draw, really.
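Roughly speaking, the reason that voltage margin matters so much is the standard CMOS dynamic-power approximation: switching power scales linearly with frequency but with the square of voltage. A quick sketch with made-up but plausible numbers:

```latex
P_{\text{dyn}} \approx C \cdot V^{2} \cdot f
% Same clock, undervolted from a hypothetical 1.25 V down to 1.20 V:
\frac{P_{\text{new}}}{P_{\text{old}}} = \left(\frac{1.20}{1.25}\right)^{2} \approx 0.92
```

So a 4% undervolt buys roughly an 8% cut in dynamic power at the same performance, which is exactly why that binning headroom is worth chasing.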

Other than that, the above doesn't surprise me. Intel's TDP isn't the TDP you're getting with AMD; the PL stages are what make these CPUs so damn hungry. And on top of that, the small node pretty much makes them very hard to cool. Lapping, liquid metal, a high-end water cooler, for example: that's the territory you need to start looking at if you want marginal improvements over stock. I mean, we came from eras where your 300MHz CPU could be overclocked to 450MHz, or your 600MHz Athlon was just a rebranded 750MHz part, or the FX went from 3.2GHz up to 5GHz if your cooling and board allowed it.

Now it's just: ramp up big cooling and let the chip decide what's best for itself while keeping the silicon healthy. This is pretty much how AMD's boost works. Keep it constantly under 60 degrees and that boost will be there a lifetime.

Well, in order to get past 4.4GHz~4.5GHz, your board had to support the current the chip needed and be free of any of AMD's pre-determined overcurrent limits. If I'm correct, it was 25A on the 12V line or so; higher-end boards could yield all the way up to 35 to 40 amps. The whole FX line were great overclockers, not just core-clock wise but also on the CPU/NB, which was responsible for the L3 cache speed as well. That is something most reviewers never really highlighted, but it was the money shot in overclocking for cranking up the minimum FPS in games.
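If that 25A figure is right and refers to the 12V CPU power input, the implied package power ceiling is simple arithmetic (ignoring VRM losses, so treat these as ballpark values):

```latex
P = V \cdot I:\qquad
25\,\mathrm{A} \times 12\,\mathrm{V} = 300\,\mathrm{W},\qquad
40\,\mathrm{A} \times 12\,\mathrm{V} = 480\,\mathrm{W}
```

which lines up with why only the higher-end boards could feed an FX pushed toward 5GHz.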

FX was just badly timed, really, in an era where single-core performance still held the crown in applications. You can tell because the FX still holds up to this day in various games when paired with a higher-end graphics card.
 
I felt the same way, too. Of course, I didn't expect it to do better in Halo, LOL.

4.4 was without a Vcore increase. I needed more cooling before I even felt like giving it a run with an x264 two-pass encode. I think I had the Sabertooth 990FX R2.0 set to aggressive VRM settings.
 
I legit wasted a minute of my life trying to find a way to put those numbers into a formula that resulted in the number 69 :D :laugh:
I ran your calculations through NASA's supercomputer; 69.69 is correct.
 
FX was just badly timed, really, in an era where single-core performance still held the crown in applications.
What it really lacked was a Windows scheduler that knew the FX was a hybrid 2+1 ALU/FPU design. Windows wouldn't dispatch an FPU thread to an idle FPU, but would instead assign it to one of the ALUs that was already waiting on its shared FPU to complete a task. I think I got that right, anyway. Basically, the chips could have performed better in their day but just didn't get the OS support; AMD was essentially broke at that time and couldn't get that kind of support. I bet Alder Lake will perform worse on Windows 10; it's just a question of how much, and that remains to be seen.
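For what it's worth, the manual workaround in that era was pinning threads yourself so two FPU-heavy threads didn't land on the same module. Here's a minimal sketch using the third-party psutil library; the core numbering assumes the common FX layout where logical cores (0,1), (2,3), (4,5), (6,7) are module pairs, so verify it against your own topology.

```python
# Restrict the current process to one core per FX module, so no two
# FPU-heavy threads have to share a module's single FPU.
import psutil

proc = psutil.Process()           # handle to the current process
proc.cpu_affinity([0, 2, 4, 6])   # assumed layout: first core of each module
print("Pinned to cores:", proc.cpu_affinity())
```

A scheduler that understands the topology does this automatically, which is the support the FX never got (and what Windows 11's scheduler is supposed to provide for Alder Lake's P and E cores).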
 
That is awesome, man! I have the ZM-MFC2, and with my current rig I have seen just over 630W. Just running WCG on the CPU, the system draws 300W lol..

Just got home from work; here's my old ZM-MFC3. Still working (barely), though 3 of the 4 temp sensors are dead. 39°C on the exhaust. CPU die at 51°C with light work and 78°C at full load.
 