
14900k - Tuned for efficiency - Gaming power draw

Joined
Jun 14, 2020
Messages
2,678 (1.87/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
So, since most people and reviewers are pushing the 14900k to over 200 watts and drawing their conclusions based on that, let's see what a properly set up 14900k can do with minimal effort. Game of choice is TLOU because it's the game with the highest power draw right now; in every other game power drops to between 50 and 70w after the tuning, so testing those is pointless.

First video is a 14900k running stock, out of the box. Power draw almost hits 200 watts. DC load lines are calibrated, so the reported power draw is accurate.


Second video is after spending 10 minutes in the BIOS, with no undervolting done; basically I turned off HT and locked the cores to 5.5 GHz (and also tuned the memory). Now performance is around 15% higher, while peak power draw dropped from 200w to 120w.



What are the negatives of turning HT off? Well, you lose around 10% multithreaded performance, CBR23 score went from 41k to 36-37k, but temperatures and power draw dropped considerably. If maximum multithreaded performance isn't a priority (remember, even with HT off the 14900k is still one of the fastest CPUs in MT workloads) and gaming is more of your thing, turning it off is worth it.

I also have some results after tuning and undervolting, power draw dropped to below 100w even in TLOU, but that requires stability testing so I don't feel it's relevant here.
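
For a rough sense of the efficiency gain in perf-per-watt terms, here's a quick back-of-the-envelope calculation using only the numbers above (the ~15% fps gain and the 200w / 120w peak figures); the absolute fps value is just a placeholder:

# Rough fps-per-watt comparison using the figures quoted above (illustrative only;
# the baseline fps number is a placeholder, only the ratios matter).
stock_fps = 100.0              # placeholder baseline fps
stock_power_w = 200.0          # stock peak package power in TLOU

tuned_fps = stock_fps * 1.15   # ~15% higher performance after HT off + 5.5 GHz lock + memory tuning
tuned_power_w = 120.0          # tuned peak package power

stock_eff = stock_fps / stock_power_w
tuned_eff = tuned_fps / tuned_power_w
print(f"Stock: {stock_eff:.2f} fps/W, tuned: {tuned_eff:.2f} fps/W")
print(f"Perf-per-watt gain: {(tuned_eff / stock_eff - 1) * 100:.0f}%")   # roughly +92%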

Also, below is a tuned 12900k just for comparison.

 
Joined
Dec 14, 2023
Messages
10 (0.06/day)
Location
United States
System Name Overbuilt Intel P.O.S. (In White)
Processor i9 14900k 6.2Ghz
Motherboard MSI Z790 MPG
Cooling Lian Li 360
Memory T-Create Expert DDR5 7200 34
Video Card(s) MSI 4080 OC 3Ghz
Storage Samsung 990 Evo 2Tb
Display(s) 27" flat 2k 144hz G sync
Case Lian Li o11 Dynamic Evo
Power Supply ThermalTake 1000W
Mouse Razer Basilisk Ultimate
Keyboard Keychron K2 75%
Crazy results! I didn't know you could turn off hyper-threading in the BIOS.
I play a lot of Tarkov, and that would actually help a lot.
 
Joined
Feb 18, 2005
Messages
5,329 (0.76/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
"My 14900K will draw less power if I disable hyper-threading" is about as smart as "my body will require less food if I cut off both of my legs". If you buy a $600 CPU only to immediately disable half of its features just to get acceptable power draw, you're not being smart - you are, in fact, being the exact opposite.

I'm getting really tired of seeing these "Intel CPUs can be power efficient too" threads/posts. Nobody cares that they can be, the point is that, at stock, they are not. The fact that it's possible to make these CPUs consume sane amounts of power is not the saving grace that everyone who uses them seems to think it is. If it's not good out of the box, i.e. how the vast majority of users will experience it because most users don't tweak CPU power consumption, it's not good period.
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,319 (2.51/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
If you're going to turn off hyperthreading in an i7/i9, just buy an i5 instead.
 
Joined
Dec 31, 2020
Messages
785 (0.64/day)
Processor E5-2690 v4
Motherboard VEINEDA X99
Video Card(s) 2080 Ti WINDFORCE OC
Storage NE-512 KingSpec
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Not really. HT mostly provides around a 25% uplift at best, and it can do more damage than good. In this case it's not even close: 37k to 41k is roughly 11%, not 25%. Well, sometimes it's good to have it.
 
Joined
Aug 9, 2019
Messages
1,522 (0.87/day)
Processor Ryzen 5600X@4.85 CO
Motherboard Gigabyte B550m S2H
Cooling BeQuiet Dark Rock Slim
Memory Patriot Viper 4400cl19 2x8@4000cl16 tight subs
Video Card(s) Asus 3060ti TUF OC
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores CB20 4710@4.7GHz Aida64 50.4ns 4.8GHz+4000cl15 tuned ram SOTTR 1080p low 263fps avg CPU game
Good results. I would opt for UV as well; you'd probably get below 100W :)
 
Joined
Dec 14, 2023
Messages
10 (0.06/day)
Location
United States
System Name Overbuilt Intel P.O.S. (In White)
Processor i9 14900k 6.2Ghz
Motherboard MSI Z790 MPG
Cooling Lian Li 360
Memory T-Create Expert DDR5 7200 34
Video Card(s) MSI 4080 OC 3Ghz
Storage Samsung 990 Evo 2Tb
Display(s) 27" flat 2k 144hz G sync
Case Lian Li o11 Dynamic Evo
Power Supply ThermalTake 1000W
Mouse Razer Basilisk Ultimate
Keyboard Keychron K2 75%
"My 14900K will draw less power if I disable hyper-threading" is about as smart as "my body will require less food if I cut off both of my legs". If you buy a $600 CPU only to immediately disable half of its features just to get acceptable power draw, you're not being smart - you are, in fact, being the exact opposite.

I'm getting really tired of seeing these "Intel CPUs can be power efficient too" threads/posts. Nobody cares that they can be, the point is that, at stock, they are not. The fact that it's possible to make these CPUs consume sane amounts of power is not the saving grace that everyone who uses them seems to think it is. If it's not good out of the box, i.e. how the vast majority of users will experience it because most users don't tweak CPU power consumption, it's not good period.
I think the point was gaming performance. If you notice, the gaming performance increased... If all you're using the CPU for is gaming, then you're not really "disabling HALF the features"; you're optimizing what you have for what you do. Also, it may be on a per-game basis. I'm not saying the 14900k is the best thing in the world, it's just the fastest Intel chip available right now. Most people with a new i9 have a mobo with OC capabilities, so this is actually really valuable to a large number of people.

I pretty much only game on my PC. What an idiot I am for buying the best available CPU for gaming, then tweaking it for gaming while playing games that don't utilize HT... LOL
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,319 (2.51/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Some of y'all are forgetting this.

 
Joined
Nov 16, 2023
Messages
490 (2.68/day)
Location
Woodstock IL
System Name I don't name my rig
Processor 13700K
Motherboard MSI Z690 D4
Cooling Air/water/DryIce
Memory Corsair 3600mhz something die cl18 at 4000mhz
Video Card(s) RX 6700 XT
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Antec 850w Continous Power Series (since 2009)
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce. Max all time Freq FX-8300 7685mhz LN2
So, since most people and reviewers are pushing the 14900k to over 200 watts and drawing their conclusions based on that, let's see what a properly set up 14900k can do with minimal effort. Game of choice is TLOU because it's the game with the highest power draw right now; in every other game power drops to between 50 and 70w after the tuning, so testing those is pointless.

First video is a 14900k running stock, out of the box. Power draw almost hits 200 watts. DC load lines are calibrated, so the reported power draw is accurate.


Second video is after spending 10 minutes in the BIOS, with no undervolting done; basically I turned off HT and locked the cores to 5.5 GHz (and also tuned the memory). Now performance is around 15% higher, while peak power draw dropped from 200w to 120w.



What are the negatives of turning HT off? Well, you lose around 10% multithreaded performance, CBR23 score went from 41k to 36-37k, but temperatures and power draw dropped considerably. If maximum multithreaded performance isn't a priority (remember, even with HT off the 14900k is still one of the fastest CPUs in MT workloads) and gaming is more of your thing, turning it off is worth it.

I also have some results after tuning and undervolting, power draw dropped to below 100w even in TLOU, but that requires stability testing so I don't feel it's relevant here.

Also, below is a tuned 12900k just for comparison.


An interesting way to approach some tuning. And there is Nothing wrong with it at all. You produced some results, and they look decent.

I usually cut the E-cores for most benchmarks, because I don't really game. And for some, like, say, 3DMark IceStorm, I cut HT and the E-cores. It's similar to 3DMark06; it doesn't scale past 6 cores in the CPU test.

Now, with that much power draw slashed, there's room for CPU overclocking. It's much easier to obtain an all-core 5.7/5.8 GHz with only 8c running. And the per-core epeen is nice with HT off. So yeah, some games and benchmarks are going to respond well to HT being turned off, as long as you have enough cores to cover the game engine, physics, and so forth.

Thermal headroom is good, but I feel you could fill that headroom in a little by increasing cpu p-core frequency. If you've slashed enough power, you could add 1x multi most likely. Yeah the power draw goes up, but the cpu returns the favor with better performance.

Does it make sense to manage your system on the fly, or are you the set-it-and-forget-it kind of guy? If I were gaming and wanted a frame rate increase with a power reduction, I'd say your approach is good. If the game is never using the E-cores, might as well cut those too.

Keep on keepin on!
 
Joined
Nov 13, 2007
Messages
10,256 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
This is exactly right. Here are my results with and without HT:

SOTR = Shadow of the Tomb Raider benchmark -- it's pretty indicative of how most games will scale. 13700KF.



Basically, turning HT off is one of the best things you can do -- my overall gaming performance increased substantially while power draw and temps decreased. I lost about 18% in full multithread, but then again I'm still pushing ~24,500 in CB23, which is pretty damn fast. It also allows you to run a less aggressive LLC stably as well.
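
As a quick sanity check on those two figures (back-of-the-envelope only, using the ~18% loss and ~24,500 score quoted above):

# Implied with-HT CB23 score from the numbers above (illustrative arithmetic only).
score_without_ht = 24_500      # CB23 multithreaded score with HT off, as quoted
mt_loss_fraction = 0.18        # ~18% full-multithread loss from disabling HT

implied_with_ht = score_without_ht / (1 - mt_loss_fraction)
print(f"Implied CB23 score with HT on: ~{implied_with_ht:,.0f}")   # roughly 29,900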

HT on CPUs with E-cores doesn't have nearly as large a performance impact as it does on designs without E-cores.

I have more data, but basically every game so far is benefiting without HT. Cyberpunk and Far Cry 5 are playing really smoothly -- even Starfield seems to like it. Much better minimum FPS.
 
Joined
Feb 1, 2019
Messages
2,678 (1.39/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
If you're going to turn off hyperthreading in an i7/i9, just buy an i5 instead.
Does an i5 have the same clocks and L3 cache as the i7 and i9 chips, then? You seem to be implying it's the same but without HT.

I have no problem with the default shipped state of the CPU being reviewed as-is; that's on Intel. But I don't see an issue with users helping each other out to run their chips in a more efficient configuration.
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,319 (2.51/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Does an i5 have the same clocks and L3 cache as the i7 and i9 chips, then? You seem to be implying it's the same but without HT.

I have no problem with the default shipped state of the CPU being reviewed as-is; that's on Intel. But I don't see an issue with users helping each other out to run their chips in a more efficient configuration.
They're lower, but not by enough to hurt performance so much that it warrants paying more and then turning off HT.
 
Joined
Feb 1, 2019
Messages
2,678 (1.39/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
14700K + 3070 Ti
Default settings versus my XTU profile, activated by simply pressing those keys when playing.

Nice, what does your profile change?

The video reminded me of when I chopped off about 70-80w of my power draw on the Spice Wars map screen by changing my GPU profile on the fly. :)
 
Joined
Jun 6, 2022
Messages
622 (0.87/day)
PL1/2: 125W
IccMax:170A
Voltage Offset -0.050
Nothing spectacular.

According to the TPU review, the 14900K@125W loses 2% in 1440p and 0% in 4K, with an RTX 4090. If you lose 6 fps out of 300, the tragedy only exists in the review graphs.
With the 3070 Ti, the 14900K doesn't lose anything even in 720p, I suppose.
 
Joined
Feb 1, 2019
Messages
2,678 (1.39/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
PL1/2: 125W
IccMax:170A
Voltage Offset -0.050
Nothing spectacular.

According to the TPU review, the 14900K@125W loses 2% in 1440p and 0% in 4K, with an RTX 4090. If you lose 6 fps out of 300, the tragedy only exists in the review graphs.
With the 3070 Ti, the 14900K doesn't lose anything even in 720p, I suppose.
I have no idea what IccMax does, but I've seen it mentioned more than once now; some reading for me to do. :)
 
Joined
Aug 13, 2010
Messages
5,396 (1.07/day)
IccMax is the chip's current (amperage) limit.

The 14900K is just another chip that behaves the way Intel chips have been behaving since the Core i9-12900K, really.
The efficiency curve of these chips can be optimized quite a lot with some simple undervolting and a TDP limit.

This isn't just restricted to gaming. Multi-tile rendering can also see quite heavy reductions in CPU power draw if properly tuned for efficiency.
"Ok, so why aren't Intel doing this at the factory?"
Well, this has to do with how chips are verified and tested. There's a lot of fat to cut, because that's how going from wafer to product works in the industry today, on AMD's side too.

What I would suggest Intel do, and have suggested in the past: prepare an optimization algorithm and put it in XTU for users who want the best efficiency they can get per TDP.
Let users select from a set of TDP options and have the CPU run under that testing algorithm to see how efficiently it can run.
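
Purely as a sketch of what such a routine could look like (hypothetical; none of the function names below are a real XTU or Intel API, they're just stand-ins for whatever the tool would actually call):

# Hypothetical per-TDP efficiency-tuning loop, sketched in Python.
# apply_settings, run_benchmark and is_stable are placeholders, not real APIs.

def tune_for_tdp(tdp_watts, apply_settings, run_benchmark, is_stable,
                 offsets_mv=(0, -25, -50, -75, -100)):
    """Step the voltage offset down at a fixed power limit and keep the best stable result."""
    best = None
    for offset in offsets_mv:
        apply_settings(pl1_w=tdp_watts, pl2_w=tdp_watts, offset_mv=offset)
        if not is_stable():              # quick stability screen at these settings
            break                        # the previous offset was the last stable one
        score = run_benchmark()          # fixed multithreaded workload
        best = {"offset_mv": offset, "score": score,
                "points_per_watt": score / tdp_watts}
    return best

# The user would run this once per TDP choice (say 88 W, 125 W, 253 W) and compare
# the reported score and points-per-watt for each before picking one.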
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.93/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
 
Joined
Feb 1, 2019
Messages
2,678 (1.39/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
Please don't bring the fanboy stuff from that thread into this one; a rational technical discussion is preferred to CPU bashing.

The problem with many CPUs and GPUs nowadays is that they seem to be tuned for outright performance as if efficiency is of no concern. Not all of us agree with this, and it applies to AMD as well; on my AMD rig I have XFR disabled to make it much more efficient.

Some of us are OK with losing a bit of performance if efficiency increases, and of course it can actually gain performance if you're bouncing off a thermal or TDP limit. My GPU performs better after I configured my own custom curve to undervolt it.

In my day-to-day use I haven't seen my chip go much above about 70w package power, but for me efficiency is my version of overclocking; I do it on all my hardware. If I had 170w package power in a game like that video, I would definitely be looking at what I could do about it.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.93/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Please don't bring the fanboy stuff from that thread into this one; a rational technical discussion is preferred to CPU bashing.

The problem with many CPUs and GPUs nowadays is that they seem to be tuned for outright performance as if efficiency is of no concern. Not all of us agree with this, and it applies to AMD as well; on my AMD rig I have XFR disabled to make it much more efficient.

Some of us are OK with losing a bit of performance if efficiency increases, and of course it can actually gain performance if you're bouncing off a thermal or TDP limit. My GPU performs better after I configured my own custom curve to undervolt it.

In my day-to-day use I haven't seen my chip go much above about 70w package power, but for me efficiency is my version of overclocking; I do it on all my hardware. If I had 170w package power in a game like that video, I would definitely be looking at what I could do about it.
That's not the point. It's that the 14900k out of the box consumes as much power as my 3930k with a heavy overclock. I mentioned out of the box right? People shouldn't have to be fiddling with their CPUs to get them within their advertised TDP to avoid having it overvolt like crazy when you're not utilizing the entire CPU. The 14900k was released without any efficiency in mind beyond the E-cores. Can it do it? Sure, but you shouldn't have to.

Also, don't start slinging the term fanboy around. The only AMD products I use right now are a Vega 64 and a Radeon Pro 5600M. It's not like I have an AMD CPU and I'm shilling for them. I like Intel and they make good chips, but for the last 3 or 4 generations, Intel has just been flat-out stupid when it comes to power consumption OOTB. It's laughable when the rated TDP is exceeded by double without any modifications whatsoever.

The 14900k in stock form is flat out an inefficient chip regardless of what AMD has in their lineup. The numbers show that for themselves.

Another case in point: show me a 14th gen SKU that's intended to be low power. You know, like the previous T-series CPUs? Right now everything is K and KF, which I have to assume is because they don't care about power consumption outside of the server market. Sorry, but for ~$500 USD, I expect better from Intel.
 
Joined
Sep 23, 2022
Messages
979 (1.63/day)
System Name Windows
Processor 13900K
Motherboard Pro Z790-A WiFi
Cooling Noctua NH-D15s
Memory 32GB 6600 CL32
Video Card(s) RTX 4090
Display(s) MSI MAG401QR
Case Phanteks P600s
Power Supply Vertex GX-1000
Software Win 11 Pro
Benchmark Scores They suck.
That's not the point. It's that the 14900k out of the box consumes as much power as my 3930k with a heavy overclock. I mentioned out of the box right? People shouldn't have to be fiddling with their CPUs to get them within their advertised TDP to avoid having it overvolt like crazy when you're not utilizing the entire CPU. The 14900k was released without any efficiency in mind beyond the E-cores. Can it do it? Sure, but you shouldn't have to.

Also, don't start slinging the term fanboy around. The only AMD products I use right now are a Vega 64 and a Radeon Pro 5600M. It's not like I have an AMD CPU and I'm shilling for them. I like Intel and they make good chips, but for the last 3 or 4 generations, Intel has just been flat-out stupid when it comes to power consumption OOTB. It's laughable when the rated TDP is exceeded by double without any modifications whatsoever.

The 14900k in stock form is flat out an inefficient chip regardless of what AMD has in their lineup. The numbers show that for themselves.

Another case in point: show me a 14th gen SKU that's intended to be low power. You know, like the previous T-series CPUs? Right now everything is K and KF, which I have to assume is because they don't care about power consumption outside of the server market. Sorry, but for ~$500 USD, I expect better from Intel.

The point is that this is a thread about tuning for efficiency, not whether you should or shouldn't have to. It was started weeks before the current jerk-off session in the other thread began.

Nobody wants or cares about your off topic comments and your yelling at the clouds rants. Take it somewhere else.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.93/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
The point is that this is a thread about tuning for efficiency, not whether you should or shouldn't have to. It was started weeks before the current jerk-off session in the other thread began.

Nobody wants or cares about your off topic comments and your yelling at the clouds rants. Take it somewhere else.
You do realize that there is an entire review talking about this, and that nobody replied to this thread until Thursday. I'm simply calling out the elephant in the room, because you wouldn't need to do this if Intel and motherboard manufacturers didn't treat the long-term boost power as something you can run 100% of the time, 24/7/365. It is related, and at least I'm not taking pot shots at you personally for your opinion.

Just because you don't like what I'm saying doesn't mean that I'm wrong or that it's not related.
 
Joined
Sep 23, 2022
Messages
979 (1.63/day)
System Name Windows
Processor 13900K
Motherboard Pro Z790-A WiFi
Cooling Noctua NH-D15s
Memory 32GB 6600 CL32
Video Card(s) RTX 4090
Display(s) MSI MAG401QR
Case Phanteks P600s
Power Supply Vertex GX-1000
Software Win 11 Pro
Benchmark Scores They suck.
and at least I'm not taking pot shots at you personally for your opinion.
You just did ^^

Just because you don't like what I'm saying doesn't mean that I'm wrong or that it's not related.
Read my post again. It has nothing to do with whether I agree with you or not. You've come into an entirely different thread bringing your ranting about the OOB experience with Intel. There's a whole other thread about that. Again, take it somewhere else.
 
Joined
Feb 20, 2020
Messages
9,340 (6.03/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
Hi,
Yeah, the OP got kicked off that one, so he made his own Intel love thread.

I don't mind tweaking in the BIOS, so more power (or in the OP's case, less power) to you all; out of the box really doesn't matter to them :laugh:

The people I feel sorry for are the owners/buyers of H mobile chips instead of the more expensive HX chips; they're stuck with locked chips.
I found this out just in time, so I canceled and went with a locked AMD mobile chip instead, but at least it's not a space heater :cool:

I don't get disabling HT either, frankly.
The enemy is the thermally defective E-cores, not HT lol
 