
Cyberpunk 2077 Patch 2.11 Adds Hybrid CPU Utilization Setting on PC

Joined
Mar 21, 2016
Messages
2,508 (0.80/day)
I haven't tried it out yet, but my Asus ROG Strix Z790-H board has a BIOS option that, once enabled, lets you use the Scroll Lock key to toggle the E-cores on and off on demand, which looks interesting. I'll have to give it a test sometime. It's a pretty slick option. They also have a way to reassign what the reset switch does, so they could probably add that E-core toggle to the reset switch in a BIOS update, kind of like an old-school turbo button. Yeah, everything old is new again: the turbo button and those pesky Atom cores.
 
Joined
Sep 15, 2011
Messages
6,681 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Tested it, and it's a COMPLETE MESS!!! The game stutters and lags like no tomorrow, sometimes becoming unplayable!
Seriously now, do those guys ever properly test newly released features?? Do they even have a proper testing department??
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
Joined
Nov 8, 2017
Messages
229 (0.09/day)
I do not understand why so much dev time is still poured into Cyberclunk

It won't get better. The game's done. I guess it's the culmination of millions spent on marketing and attracting partnerships, and having to deliver on it somehow. Man, what a shitshow. What a waste of time. No content additions, just 'polish'. Patch after patch. And then you play the game and wonder what's changed. The vast majority of systems are still clearly not up to snuff, and the base game is what it has been since 1.0.
Have you played the game since Phantom Liberty launched? Quite a few things have been overhauled: they cleaned up the talent tree, removed useless skills like underwater combat, and made your character much more mobile (you can slide, dash, vault, get in and out of your car while it's moving, and shoot while doing so). The cyberware system has been overhauled and rebalanced, gangs now start non-scripted car chases with you, you can spend time with your romance outside of quests, the metro system is functional, bars are more interactive, there are mini-games on arcade cabinets, and the racing quest is a repeatable mini-game. Smasher has been buffed, and PL also added a few encounters that aren't clearable with brute force alone (spoiler: one encounter is closer to the gameplay of Alien: Isolation). Sure, the base game is still fairly linear, but reworking and expanding the base storyline on top of all that would probably have taken too much dev time.

It's still not BioShock meets GTA, but 2.0 wasn't just a bug-fix pass; they added, cleaned up, and reworked quite a few things. 2.0 gives a few hints that the gameplay team is getting better at first-person games. CDPR was never exactly known for the mastery of, say, Capcom or Platinum Games when it comes to combat design (TW3 was better than Skyrim for sure, but the combat alone isn't that memorable compared to other action RPGs; still, they had become much better at third-person games compared to TW1).

If they've learned their lesson about the development hell of CP2077, they should be able to add more depth, more weight to what they started to do with 2.0. (And they should also ignore the people who complained about TW3 being too long, too overwhelming)
 
Joined
Jun 1, 2010
Messages
356 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
I do not understand why so much dev time is still poured into Cyberclunk

It won't get better. The game's done. I guess it's the culmination of millions spent on marketing and attracting partnerships, and having to deliver on it somehow. Man, what a shitshow. What a waste of time. No content additions, just 'polish'. Patch after patch. And then you play the game and wonder what's changed. The vast majority of systems are still clearly not up to snuff, and the base game is what it has been since 1.0.

Yeah I've never been interested in Cyberpunk... pretty much for that reason. No matter how much they polish around the edges, the foundation of the game is still broken. But it has found something of a niche as an nvidia tech demo. And since people keep buying it, I guess, they keep developing it :rolleyes:.

I imagine from a business perspective it makes sense not to abandon the game, especially a company like CD Projekt Red that had such a good reputation before the release. They're probably trying to claw some of it back.
Yes. These patches are good, but they surely come too late; this should have been the release state. The game is indeed a tech demo, kept alive only on life support, most likely through "incentives" from the "green" GPU manufacturer. Otherwise CDPR would have abandoned this pile of shame ASAP and wouldn't even bother fixing the game, since their new projects aren't even on their in-house engine, but on UE instead.

The same thing happened back in the Crysis days, when Intel and Nvidia showed off their most expensive hardware with it and dragged its corpse along for as long as possible with Crysis Warhead.
But even though Crysis development was cut off abruptly (which supports the tech-demo idea even more), with essential bugs never fixed, it got many mods, even total conversions like MWLL. There are mods that fix the buggy netcode, graphics rendering issues, performance problems, and so on, so the game is still being played. Heck, there were even standalone games built on CryEngine 2 (MWO), because Crysis laid the foundation for that to happen. It still felt like a solid game.

I strongly doubt Cyberpunk will get the same fate, replayability, and treatment outside of CDPR itself (it's basically a locked ecosystem), or any use beyond Cyberpunk itself and mods solely within its context. This again suggests that CP2077 is just another tech demo that companies profit from by steadily adding "new" features for "particular" GPU series and carrying it along with new GPU/CPU releases, just to get people looking at the same old game over and over to see what's changed.
 

Fietser

New Member
Joined
Feb 1, 2024
Messages
17 (0.06/day)
My biggest gripe with this update is that they failed to fix Sonic Shock! Losing the Blood Pump every time you eat is just annoying, but you can work around it. Sonic Shock, however, is pretty much essential for doing netrunner stuff against large groups, something that sets this game apart. So this certainly won't be the last update. Waiting for the next one impatiently.
 
Joined
Dec 14, 2011
Messages
1,005 (0.21/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS RTX 3070 Ti TUF Gaming OC Edition
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Software Microsoft Windows 11 Pro (64-bit)
Yeah I've never been interested in Cyberpunk... pretty much for that reason. No matter how much they polish around the edges, the foundation of the game is still broken. But it has found something of a niche as an nvidia tech demo. And since people keep buying it, I guess, they keep developing it :rolleyes:.

I imagine from a business perspective it makes sense not to abandon the game, especially a company like CD Projekt Red that had such a good reputation before the release. They're probably trying to claw some of it back.

Their efforts are commendable; I wish every game studio devoted as much effort to what they make. It's definitely a good look when you consider future sales.
 

Fietser

New Member
Joined
Feb 1, 2024
Messages
17 (0.06/day)
Their efforts are commendable; I wish every game studio devoted as much effort to what they make. It's definitely a good look when you consider future sales.
True, but they're not doing this because they are Samaritans. Not long ago Phantom Liberty came out, and it sold well! So supporting that for a while should not be that odd.
Plus, they are working on the new Orion project, for which they want to keep a healthy user base.
 

UTVOL06

New Member
Joined
Jan 6, 2024
Messages
9 (0.03/day)
So the gaming world should only buy AMD CPUs?....
Thank you! I swear all I ever see nowadays are AMD fanboys bashing Intel and trying to make a point of how their more efficient CPU is better than Intel's, all because it consumes less power while still maintaining a competitive edge. From a gamer/hardware/overclocking enthusiast's perspective, all I care about is pure performance. The only area where efficiency really matters is with laptops, and that is purely for the increased battery life and reduced heat in such a small, confined space that lacks proper ventilation. I would much rather have a better-performing CPU that sucked down some power over a lesser one that tries to brag about how it's more efficient.

Most importantly, I am big into overclocking, and Intel is by far the best choice when it comes to purchasing an unlocked CPU that will allow you to tweak and overclock to your heart's content. Also, it seems people don't realize that competition between two large CPU manufacturers is a good thing for the consumer, because it inspires and drives an ever-improving CPU product. This is needed to help prevent the stagnation of technology in one particular company. If a CPU manufacturer held a monopoly on the CPU market due to not having any real competition, then said company would eventually stop striving to innovate and improve itself over time, or would do so at a much slower rate.

The most important thing to take from all this is that no one should be loyal to one specific brand of CPU. You should always go with the CPU model/brand that performs the best for your needs at that given time and make sure it provides the features you desire. Lastly, it's important that you also get a CPU that is low maintenance and just works the majority of the time. What you don't want is a finicky CPU that has immature drivers and inconsistent management of which processing cores to use or park for specific content. No end user should have to resort to a program like Process Lasso just to get their CPU to use the right cores and/or CCDs for a specific task/app/game. That's just too much effort and stress for an average CPU end user to deal with, especially when they could just purchase the other brand (Intel) and not have to screw around at all for it to work like it should out of the box.

The problem is that AMD does not advertise that it's going to take an end user extra effort, going through specific instructional steps, just to get their new gaming-oriented CPU to work like it should in the first place. I'm referring to the AMD 7950X3D here; I owned one briefly before I ended up giving it and the X670E motherboard to a good friend just so I could switch back to a low-hassle Intel CPU with more mature drivers and a hardware-based core allocation technology like Thread Director. AMD uses immature, software-based core/CCD allocation/direction. I apologize ahead of time for this lengthy comment/rant, but this blind AMD fanboyism and bandwagon loyalty has got to stop in order for there to be a healthy, competitive, and more honest CPU marketplace for the consumer.
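For what it's worth, the core-steering that Process Lasso automates can also be sketched in a few lines. This is only a rough illustration, assuming Python with the psutil package; the executable name and the CPU list are illustrative, and it assumes logical CPUs 0-15 are the P-core threads on a hypothetical 8P-core hybrid chip, so check your own topology first:

```python
# Rough sketch of pinning a game's threads to a chosen set of logical CPUs,
# i.e. the kind of thing tools like Process Lasso automate. Assumes the psutil
# package is installed; the executable name and CPU list below are illustrative.
import psutil

GAME_EXE = "Cyberpunk2077.exe"   # illustrative process name
P_CORE_CPUS = list(range(16))    # assumed P-core logical CPUs on an 8P + HT chip

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_CPUS)   # restrict scheduling to these CPUs
        print(f"Pinned PID {proc.pid} to CPUs {P_CORE_CPUS}")
```

Affinity isn't the same as a BIOS toggle or core parking (the E-cores stay online for everything else), but it is the same kind of steering being argued about here.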
 
Joined
Oct 17, 2019
Messages
56 (0.03/day)
Location
The Last Frontier
Friendly reminder that AMD doesn't sell an AM5 CPU without an iGPU, and the majority of the die is in fact made up of P-cores and the uncore :D. I can understand the hate for the E-cores, since those are more beneficial to productivity, but I've never understood the hate around the iGPU on desktop CPUs. I've only heard hearsay about how it keeps the chip from reaching the highest possible clocks, but I haven't seen proof that KF chips are faster than K chips. I never keep my old GPUs around, so I appreciate the peace of mind of having the iGPU if the GPU needs to be repaired or replaced. I think that's one of the reasons why AMD decided to add one to all their mainstream CPUs.

The Ryzen 5 7500F is the sole AM5 part (at this time) that ships with no iGPU.
 

UTVOL06

New Member
Joined
Jan 6, 2024
Messages
9 (0.03/day)
Or Intel just give us what we wanted: an 8 P-core only, large L2+L3 CPU.
Intel's upcoming Arrow Lake-S high-end gaming-oriented CPUs will have a much larger L2 cache than the current 13th/14th-gen Raptor Lake CPUs, and thanks to the new process node (20A) and other technologies, both the P-cores and E-cores should be able to operate at higher frequencies with better efficiency and less voltage droop. Despite what most people think about hybrid core architecture in recent Intel CPUs, there are some games that actually perform better with the E-cores in use than without them. One such game is Battlefield 2042, which fully utilizes all the P-cores and some of the E-cores. The faster the E-core frequency, the higher and more consistent the 1% lows tend to be, which equates to a more stable and fluid gaming experience with fewer dips in the framerate.

I have seen such a correlation when overclocking my 13900KS. The default E-core frequency on the 13900KS is 4.3 GHz, but I have OCed mine to 4.7 GHz, and in particular games I have noticed a moderate boost not only to the average framerate but especially to the 1% lows, which came up higher than before. The stock P-cores are set to 6 GHz for 1-2 active P-cores and 5.6 GHz for 3-8 active P-cores. My P-cores are all synced to 5.9 GHz up to a 60°C threshold and then run at 5.8 GHz past that temperature. In gaming I rarely exceed 60°C unless I'm precompiling shaders on a brand-new game installation. Running my 13900KS at these frequencies has netted me roughly a 7% increase in single-threaded performance and a 9% increase in multithreaded performance over a stock 13900K, all according to the latest CPU-Z built-in benchmark.

Also, the upcoming Arrow Lake-S CPUs are reported to officially support DDR5-6400, whereas 13th/14th-gen Raptor Lake officially supported only 5600 MT/s. We all know that most 13900K/14900K owners could run at least 7200 MT/s on a 4-DIMM motherboard and 8000+ MT/s on a 2-DIMM board with the right 13th/14th-gen CPU that had a good memory controller. With Arrow Lake-S having an apparently stronger memory controller, we should start seeing 2-DIMM motherboard users reaching 9000 MT/s DDR5 without needing a golden memory controller.
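As a quick reference for those memory numbers, here is the peak theoretical bandwidth each speed works out to, assuming a standard dual-channel desktop configuration (a rough sketch; real-world throughput is lower):

```python
# Peak theoretical DDR5 bandwidth for the transfer rates mentioned above,
# assuming 8 bytes per transfer per channel and two channels (typical desktop).
def ddr5_bandwidth_gbs(mts: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    """Peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return mts * bytes_per_transfer * channels / 1000

for mts in (5600, 6400, 7200, 8000, 9000):
    print(f"DDR5-{mts}: {ddr5_bandwidth_gbs(mts):.1f} GB/s peak")
# DDR5-5600:  89.6 GB/s   DDR5-6400: 102.4 GB/s   DDR5-7200: 115.2 GB/s
# DDR5-8000: 128.0 GB/s   DDR5-9000: 144.0 GB/s
```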

I haven't tried it out yet, but my Asus ROG Strix Z790-H board has a BIOS option that, once enabled, lets you use the Scroll Lock key to toggle the E-cores on and off on demand, which looks interesting. I'll have to give it a test sometime. It's a pretty slick option. They also have a way to reassign what the reset switch does, so they could probably add that E-core toggle to the reset switch in a BIOS update, kind of like an old-school turbo button. Yeah, everything old is new again: the turbo button and those pesky Atom cores.
I tried testing the Scroll Lock legacy gaming function on my 13900KS, and it does not behave the same performance-wise as just manually turning off the E-cores for a specific game. Turning off the E-cores manually for Star Citizen yielded a much smoother gaming experience than enabling the Scroll Lock legacy gaming feature, which tells the CPU to park the E-cores rather than turning them completely off. In theory the Scroll Lock feature should give the same results as disabling the E-cores in the BIOS, but in practice I've seen different results.
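One way to see the difference between the two from the OS side (a minimal sketch, assuming Python with the psutil package): cores disabled in the BIOS disappear from the OS entirely, while parked cores stay enumerated and simply sit idle.

```python
# Cores disabled in the BIOS are not enumerated by the OS at all, whereas
# parked cores remain visible and just report ~0% utilisation.
# Assumes the psutil package is installed.
import psutil

print(f"OS sees {psutil.cpu_count(logical=False)} physical / "
      f"{psutil.cpu_count(logical=True)} logical CPUs")

# Per-CPU load over one second; parked E-cores typically sit near 0%.
for idx, pct in enumerate(psutil.cpu_percent(interval=1.0, percpu=True)):
    print(f"CPU {idx:2d}: {pct:4.1f}%")
```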
 
Joined
Jan 19, 2023
Messages
241 (0.37/day)
I just installed the mod from LukeFZ, and not only does FSR3 with FG work on my 7900XTX, it also fixed the horrible ghosting, and on top of that it finally fixed HDR when using FSR. Now I get 80-100 fps with FG at 4K, with FSR set to Quality and RT set to Ultra, and the game feels smooth. A minor issue is glitching cars when driving fast, but that's nothing compared to the ghosting in the FSR 2.1 that CDPR implemented.

It's a shame that one guy could do it, but a whole company still can't.

BTW it works wonderfully in Witcher 3 with RT as well.
 
Joined
Feb 1, 2019
Messages
3,516 (1.67/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Yes. These patches are good, but they surely come too late; this should have been the release state. The game is indeed a tech demo, kept alive only on life support, most likely through "incentives" from the "green" GPU manufacturer. Otherwise CDPR would have abandoned this pile of shame ASAP and wouldn't even bother fixing the game, since their new projects aren't even on their in-house engine, but on UE instead.

The same thing happened back in the Crysis days, when Intel and Nvidia showed off their most expensive hardware with it and dragged its corpse along for as long as possible with Crysis Warhead.
But even though Crysis development was cut off abruptly (which supports the tech-demo idea even more), with essential bugs never fixed, it got many mods, even total conversions like MWLL. There are mods that fix the buggy netcode, graphics rendering issues, performance problems, and so on, so the game is still being played. Heck, there were even standalone games built on CryEngine 2 (MWO), because Crysis laid the foundation for that to happen. It still felt like a solid game.

I strongly doubt Cyberpunk will get the same fate, replayability, and treatment outside of CDPR itself (it's basically a locked ecosystem), or any use beyond Cyberpunk itself and mods solely within its context. This again suggests that CP2077 is just another tech demo that companies profit from by steadily adding "new" features for "particular" GPU series and carrying it along with new GPU/CPU releases, just to get people looking at the same old game over and over to see what's changed.
It feels like a tech showcase for Nvidia at this point: wait for the next version of ray tracing or DLSS, and sure enough, within a short time it will be patched into the game.
 
Joined
Jan 5, 2006
Messages
18,585 (2.70/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
especially when they could just purchase the other brand (Intel) and not have to screw around at all for it to work like it should out of the box.
Exactly, I'm with you there.
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
I just installed the mod from LukeFZ, and not only does FSR3 with FG work on my 7900XTX, it also fixed the horrible ghosting, and on top of that it finally fixed HDR when using FSR. Now I get 80-100 fps with FG at 4K, with FSR set to Quality and RT set to Ultra, and the game feels smooth. A minor issue is glitching cars when driving fast, but that's nothing compared to the ghosting in the FSR 2.1 that CDPR implemented.
Did it, too. Apparently I did something wrong, because if I pick "DLSS" it won't upscale; the framerate and the image quality stayed the same as native. FSR stayed as horrible as before. XeSS got broken. Frame generation was a complete glitch show.

???????
 
Joined
Jan 19, 2023
Messages
241 (0.37/day)
Did it, too. Apparently I did something wrong, because if I pick "DLSS" it won't upscale; the framerate and the image quality stayed the same as native. FSR stayed as horrible as before. XeSS got broken. Frame generation was a complete glitch show.

???????
For me the latest version (0.10) didn't work: either FG didn't work or the game just crashed. Downloaded 0.9 from his website and just copied the files. Works like a charm. I select DLSS from the settings, FG on. Only sometimes if I change from Quality to Performance the game crashes, but afterwards it works.
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
For me the latest version (0.10) didn't work: either FG didn't work or the game just crashed. Downloaded 0.9 from his website and just copied the files. Works like a charm. I select DLSS from the settings, FG on. Only sometimes if I change from Quality to Performance the game crashes, but afterwards it works.
What does your .toml file contain?
 
Joined
Jan 19, 2023
Messages
241 (0.37/day)
What does your .toml file contain?
Default values from 0.9. Didn't change anything there. I'm not at my computer, but fake GPU and all the other values were set to false. In 0.10 that file looks different; he made some changes there. However, with 0.10 I tried changing everything and it didn't work: either FG didn't start even when it was enabled, or the game just crashed when selecting DLSS, and then after every restart it would crash at the logos.
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
Default values from 0.9. Didn't change anything there. I'm not at my computer, but fake GPU and all the other values were set to false. In 0.10 that file looks different; he made some changes there. However, with 0.10 I tried changing everything and it didn't work: either FG didn't start even when it was enabled, or the game just crashed when selecting DLSS, and then after every restart it would crash at the logos.
Made 0.9 work, but FG was still broken, and on top of that, this bridge messed up my VRR and made ~60 FPS feel like ~20 FPS.

Eh, hope I'll buy a beefier GPU soon.
 
Joined
Jan 19, 2023
Messages
241 (0.37/day)
Don't know what's wrong then. In my case VRR works fine. I also have V-Sync enabled in the drivers to cap the framerate below my 120 Hz limit.
 
Joined
Aug 21, 2013
Messages
1,888 (0.46/day)
From a gamer/hardware/overclocking enthusiast's perspective, all I care about is pure performance.
That is you, specifically. Not everyone will see it that way.
The only area where efficiency really matters is with laptops, and that is purely for the increased battery life and reduced heat in such a small, confined space that lacks proper ventilation.
Efficiency always matters, even on desktops. Also, better efficiency = lower temps = lower noise and a more compact computer.
I would much rather have a better performing CPU that sucked down some power over a lesser one that tries to brag about how it's more efficient.
Better by how much? The 14900K is within 5% of the 7950X3D in performance (temperatures too, but only in gaming), yet consumes two or three times as much power while doing so (even in gaming). That <5% may be worth it to you, but for most people I reckon it is not.
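To put that trade-off into numbers (an illustrative back-of-the-envelope only; the wattages and the 5% figure below are assumptions for the sake of arithmetic, not measurements):

```python
# Hypothetical gaming figures used purely to illustrate the perf-per-watt argument:
# ~5% more performance for roughly 2-3x the package power (assumed values).
intel_perf, intel_watts = 1.05, 150.0   # assumed relative performance / gaming power
amd_perf,   amd_watts   = 1.00,  60.0   # assumed relative performance / gaming power

ratio = (amd_perf / amd_watts) / (intel_perf / intel_watts)
print(f"Performance-per-watt advantage: {ratio:.1f}x")   # ~2.4x with these inputs
```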

Most importantly, I am big into overclocking, and Intel is by far the best choice when it comes to purchasing an unlocked CPU that will allow you to tweak and overclock to your heart's content.
As if that is even advantageous these days. Intel has already pushed it so far that there is no OC headroom left, and modern boost algorithms mean that a manual OC can never match those boost frequencies, so in gaming it will actually lose performance. Not to mention that under load Intel quickly reaches temperature limits that further restrict overclocking.
From TPU's review:
Overclocking the Core i9-14900K is easy, thanks to its unlocked multiplier. The biggest problem is the heat though, even at stock you'll be reaching 100°C and higher. Overclocking the 14900K means setting the thermal limit to 115°C up from 100°C, and then figuring out what's the highest voltage you can give the CPU without hitting throttling at 115°C, depending on your cooling solution. Switching from air to our Arctic AIO helped with controlling the heat, but it wasn't a huge difference. Our maximum all-core OC is 5.5 GHz on the P-Cores, plus 4.4 GHz on the E-Cores, 100% stable. This still isn't enough to beat the stock configuration in lighter applications and most games, because here the CPU will boost two cores up to 6.0 GHz.
Lastly, it's important that you also get a CPU that is low maintenance and just works the majority of the time. What you don't want is a finicky CPU that has immature drivers and inconsistent management of which processing cores to use or park for specific content. No end user should have to resort to a program like Process Lasso just to get their CPU to use the right cores and/or CCDs for a specific task/app/game.
First you say you like overclocking, then low maintenance. These two don't go together. The best low-maintenance CPU is one that does not require overclocking and does not care how fast your RAM is. And that is the 7800X3D, by far.
That's just too much effort and stress for an average CPU end user to deal with, especially when they could just purchase the other brand (Intel) and not have to screw around at all for it to work like it should out of the box.
And later in the thread you describe how you are messing with turning off E-cores to get the best gaming performance, thereby contradicting your earlier statements. In the same way, a user could buy a 7950X3D and turn off the CCD without the V-Cache. For gaming the 7800X3D requires none of that and is cheaper too.
The problem is that AMD does not advertise that it's going to take an end user extra effort, going through specific instructional steps, just to get their new gaming-oriented CPU to work like it should in the first place. I'm referring to the AMD 7950X3D here; I owned one briefly before I ended up giving it and the X670E motherboard to a good friend just so I could switch back to a low-hassle Intel CPU with more mature drivers and a hardware-based core allocation technology like Thread Director. AMD uses immature, software-based core/CCD allocation/direction. I apologize ahead of time for this lengthy comment/rant, but this blind AMD fanboyism and bandwagon loyalty has got to stop in order for there to be a healthy, competitive, and more honest CPU marketplace for the consumer.
Yes, and Intel is low effort with overclocking and turning off E-cores?
For gaming the 7800X3D is best by far: it requires no overclocking, no thread director, and does not care about RAM speeds. Not to mention the incredible efficiency. It is the best option for most users.
 
Joined
Jan 5, 2006
Messages
18,585 (2.70/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
It seems this has become an AMD vs Intel thread...
 
Joined
Aug 21, 2013
Messages
1,888 (0.46/day)
It seems this has become an AMD vs Intel thread...
Of course. As long as someone only writes their side, it's fine; as soon as there is a rebuttal, it's suddenly a VS thread.
I agree with UTVOL06 that we need both AMD and Intel competing. That is how the consumer wins. It's bad if one gains utter dominance.
 