
Intel Core i9-12900K

Joined
Jul 16, 2021
Messages
34 (0.03/day)
I see where you are going with this.
Actually, what you propose is that Intel should cut the P-cores altogether and glue together as many E-cores as possible.
For example, 32 E-cores on a single die.

And see what happens in a 105-watt power budget :D
LOL I'd buy that. Extrapolating from the Anandtech review, a 32 E-core processor would use roughly 192 W at max utilization.
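For anyone curious where a figure like that comes from, here's a back-of-the-envelope sketch. The ~6 W per E-core is simply what the 192 W / 32-core estimate implies, and the uncore term is an assumed placeholder, not a measured value:

```python
# Back-of-the-envelope power scaling for a hypothetical E-core-only die.
# WATTS_PER_E_CORE is implied by the 192 W / 32-core estimate above;
# UNCORE_WATTS is an assumed placeholder for ring/fabric/IO overhead.

WATTS_PER_E_CORE = 6.0
UNCORE_WATTS = 15.0

def package_power(e_cores: int) -> float:
    """Rough full-load package power estimate."""
    return e_cores * WATTS_PER_E_CORE + UNCORE_WATTS

for n in (8, 16, 32):
    print(f"{n:2d} E-cores: ~{package_power(n):.0f} W")
# 8 E-cores: ~63 W, 16: ~111 W, 32: ~207 W (192 W for the cores alone)
```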
 
Joined
Jan 27, 2015
Messages
1,646 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
PC World on power with the 12900K.

Are we done yet?

[attached chart: PC World 12900K power figures]
 
Joined
Jun 14, 2020
Messages
2,678 (1.91/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
So +11% faster (1080p), falling to 7% faster (1440p), in games on average, for +23% higher power consumption on a newer 10 nm process vs the two-gen-old i9-10900K on a 14 nm process, and 92-100°C temps even with a Noctua NH-U14S? That's... not very impressive...
Uhm, it OBVIOUSLY does not consume 23% more power OR run at 100°C in gaming. It actually consumes less than or the same power as AMD CPUs in gaming, and with pretty much the same temperatures.
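Taking the quoted percentages at face value, the perf-per-watt arithmetic is easy to check, though as noted the two numbers come from different workloads, so the division really compares apples to oranges:

```python
# Illustrative only: dividing a gaming fps delta by a stress-test power delta.
perf_gain = 1.11   # +11% average 1080p fps (quoted above)
power_gain = 1.23  # +23% power, but measured in an all-core stress test

print(f"Naive relative perf/W: {perf_gain / power_gain:.2f}")  # ~0.90
# A meaningful perf/W comparison needs perf and power from the SAME workload.
```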
 

Deleted member 215115

Guest
Eggs were once fried on Fermi (GTX 480); now it will be boiling water for coffee or tea on the 12900K (don't thank me for the idea, just run the test, whoever actually has this CPU of course).
Will it boil the coolant in the custom loop that's needed to cool it? Probably. Only time will tell.
 
Joined
Dec 5, 2013
Messages
602 (0.16/day)
Location
UK
Uhm, it OBVIOUSLY does not consume 23% more power OR run at 100°C in gaming. It actually consumes less than or the same power as AMD CPUs in gaming, and with pretty much the same temperatures.
People buy i9s and 24-thread CPUs in general for more than just gaming (and TPU isn't just a pure gaming site). Many gamers do video editing and have other mixed workloads too. If I wanted just a pure gaming chip, given that barely 0.4-4.0% separates the i5-12600K from the i9-12900K (down to just 0.4% at 4K), I'd buy the cheaper chip and spend more on the GPU. Or not upgrade at all and spend even more on the GPU. Put under heavy productivity load, though, the temps and power are what they are, and I don't see why they should be arbitrarily excluded simply because "it's not gaming", or why a GPU bottleneck should be used to measure CPU power usage in a CPU review. The flip side of that would be declaring the RTX 3070 a "73 W card like the 1050 Ti" by pairing it with a really slow CPU that matches the 60 Hz VSync numbers, cherry-picking that as the "real power figures", and throwing out every other measurement that actually loads the component being tested to 100%...
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Apple has had experience with big.LITTLE for close to a decade. Yes, it isn't iOS, but you're telling me that their experience with the Axx chips or ARM over the years won't help them here? Yes, technically MS also had Windows on ARM, but we know where that went.
CPU development cycles for a new arch are in the ~5-year range. In other words, MS has known for at least 3+ years that Intel was developing a big+little-style chip. Test chips have been available for at least a year. If MS hasn't managed to make the scheduler work decently in that time, it's their own fault.
No, of course not, but without the actual chips out there, how can MS optimize for it? You surely don't expect Win11 to be 100% perfect right out of the gate with something that's basically releasing after the OS was RTMed? Real-world user feedback & subsequent telemetry data will be needed to better tune for ADL ~ that's just a reality. Would you say that testing AMD with those skewed L3 results was also just as fair?
Perfect? No. Pretty good? Yes. See above.

And ... the AMD L3 bug is a bug. A known, published bug. Are there any known bugs for ADL scheduling? Not that I've heard of. If there are, reviews should be updated. Until then, the safe assumption is that the scheduler is doing its job decently, as performance is good. These aren't complex questions.
[two attached screenshots]
So why does the GPU test bench use 4000 MHz modules with the 5800X? Also, previous benchmarks show even higher fps: 112 vs 96.
Because the GPU test bench is trying to eliminate CPU bottlenecks, rather than present some sort of representative example of CPU performance? My 5800X gives me WHEA errors at anything above 3800, so ... yeah.
According to Igor's Lab review (linked here), where they measure CPU power consumption when gaming -

and measure watts consumed per fps

Alder Lake is doing very very well.
That looks pretty good - if that's representative, the E cores are clearly doing their job. I would guess that is highly dependent on the threading of the game and how the scheduler treats it though.
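Side note: "watts per fps" has a tidy physical reading, since watts divided by frames per second is just joules per frame. A minimal sketch with placeholder numbers (not Igor's Lab data):

```python
# (J/s) / (frames/s) = J/frame: energy consumed to render one frame.
def joules_per_frame(package_watts: float, fps: float) -> float:
    return package_watts / fps

print(joules_per_frame(80.0, 160.0))   # 0.5 J per frame
print(joules_per_frame(120.0, 150.0))  # 0.8 J per frame: faster, less efficient
```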
Anandtech does that iirc, but I feel for our enthusiast audience that it's reasonable to go beyond the very conservative memory spec and use something that's fairly priced and easily attainable
Yep, as I was trying to say I see both as equally valid, just showing different things. It's doing anything else - such as pushing each chip as far as it'll go - that I have a problem with.
Wait, are those light blue numbers idle numbers? How on earth are they managing 250 W idle power draw? Or are those ST load numbers? Why are there no legends for this graph? I can't even find them on their site, wtf? If the text below is supposed to indicate that the light blue numbers are indeed idle, there is something very wrong with either their configurations or how they measure it. Modern PC platforms idle in the ~50 W range, +/- about 20 W depending on the CPU, RAM, GPU and so on.

LOL I'd buy that. Extrapolating from the Anandtech review, a 32 E-core processor would use roughly 192 W at max utilization.
Well, you'd need to factor in a fabric capable of handling those cores, so likely a bit more. Still, looking forward to seeing these in mobile applications.
 
Joined
Sep 14, 2020
Messages
498 (0.38/day)
Location
Greece
System Name Office / HP Prodesk 490 G3 MT (ex-office)
Processor Intel 13700 (90° limit) / Intel i7-6700
Motherboard Asus TUF Gaming H770 Pro / HP 805F H170
Cooling Noctua NH-U14S / Stock
Memory G. Skill Trident XMP 2x16gb DDR5 6400MHz cl32 / Samsung 2x8gb 2133MHz DDR4
Video Card(s) Asus RTX 3060 Ti Dual OC GDDR6X / Zotac GTX 1650 GDDR6 OC
Storage Samsung 2tb 980 PRO MZ / Samsung SSD 1TB 860 EVO + WD blue HDD 1TB (WD10EZEX)
Display(s) Eizo FlexScan EV2455 - 1920x1200 / Panasonic TX-32LS490E 32'' LED 1920x1080
Case Nanoxia Deep Silence 8 Pro / HP microtower
Audio Device(s) On board
Power Supply Seasonic Prime PX750 / OEM 300W bronze
Mouse MS cheap wired / Logitech cheap wired m90
Keyboard MS cheap wired / HP cheap wired
Software W11 / W7 Pro ->10 Pro
No, it's not, DDR4 vs DDR5.

And it's not about it being unfair; having just one platform on DDR5 isn't enough to infer how good these CPUs actually are. Any CPU with faster memory will also perform better, nothing new here.
ComputerBase has the answer, DDR5 6200 vs DDR4 3800, practically no difference.
 
Joined
Jun 14, 2020
Messages
2,678 (1.91/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
People buy i9s and 24-thread CPUs in general for more than just gaming (and TPU isn't just a pure gaming site). Many gamers do video editing and have other mixed workloads too. If I wanted just a pure gaming chip, given that barely 0.4-4.0% separates the i5-12600K from the i9-12900K (down to just 0.4% at 4K), I'd buy the cheaper chip and spend more on the GPU. Or not upgrade at all and spend even more on the GPU. Put under heavy productivity load, though, the temps and power are what they are, and I don't see why they should be arbitrarily excluded simply because "it's not gaming", or why a GPU bottleneck should be used to measure CPU power usage in a CPU review. The flip side of that would be declaring the RTX 3070 a "73 W card like the 1050 Ti" by pairing it with a really slow CPU that matches the 60 Hz VSync numbers, cherry-picking that as the "real power figures", and throwing out every other measurement that actually loads the component being tested to 100%...
LOL. But YOU mentioned only the gaming performance and then tossed in the power consumption and temperatures from Blender. Now you are telling me CPUs aren't just for gaming. Then why did you use the gaming numbers?

CPUs aren't just for n-threaded workloads either. If my job consists of lightly threaded tasks (like Photoshop/Premiere and the like), the single-thread performance of the 12900K is king, without the power consumption and temperature baggage either. If your workload scales with n threads, then you should be looking at the Threadrippers, I guess.
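That lightly-threaded argument is basically Amdahl's law: once the parallel fraction of a job is modest, single-thread speed dominates and extra cores stop paying off. A quick sketch with illustrative parallel fractions:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the parallel fraction and n the thread count.

def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9):  # lightly threaded vs. well-threaded (illustrative)
    print(f"p={p}: 8 threads -> {speedup(p, 8):.2f}x, "
          f"24 threads -> {speedup(p, 24):.2f}x")
# p=0.5: 1.78x -> 1.92x (cores barely help; ST speed rules)
# p=0.9: 4.71x -> 7.27x (core count starts to matter)
```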
 
Joined
Apr 16, 2019
Messages
632 (0.35/day)
So +11% faster (1080p), falling to 7% faster (1440p), in games on average, for +23% higher power consumption on a newer 10 nm process vs the two-gen-old i9-10900K on a 14 nm process, and 92-100°C temps even with a Noctua NH-U14S? That's... not very impressive...
So you first state the (supposedly small) increase in gaming performance, then in the same sentence you quote power and temp figures from an all-core stress test? To use your own phrase - that's not very impressive argumentation...
 
Joined
Jun 29, 2018
Messages
456 (0.21/day)
CPU development cycles for a new arch are in the ~5-year range. In other words, MS has known for at least 3+ years that Intel was developing a big+little-style chip. Test chips have been available for at least a year. If MS hasn't managed to make the scheduler work decently in that time, it's their own fault.

This isn't even the first big.LITTLE CPU from Intel either; Lakefield shipped in Q2'20 with 1P+4E ;)

And ... the AMD L3 bug is a bug. A known, published bug. Are there any known bugs for ADL scheduling? Not that I've heard of. If there are, reviews should be updated. Until then, the safe assumption is that the scheduler is doing its job decently, as performance is good. These aren't complex questions.

The hotfix for the AMD L3 bug isn't perfect either:

There are latency regressions even with the update applied, especially for dual-chiplet models.

Bandwidth is not at Win10 levels either, but it is dramatically better than on the original Win11.
Edit: broken graphs.
 
Joined
Apr 16, 2019
Messages
632 (0.35/day)
People buy i9s and 24-thread CPUs in general for more than just gaming (and TPU isn't just a pure gaming site). Many gamers do video editing and have other mixed workloads too. If I wanted just a pure gaming chip, given that barely 0.4-4.0% separates the i5-12600K from the i9-12900K (down to just 0.4% at 4K), I'd buy the cheaper chip and spend more on the GPU. Or not upgrade at all and spend even more on the GPU. Put under heavy productivity load, though, the temps and power are what they are, and I don't see why they should be arbitrarily excluded simply because "it's not gaming", or why a GPU bottleneck should be used to measure CPU power usage in a CPU review. The flip side of that would be declaring the RTX 3070 a "73 W card like the 1050 Ti" by pairing it with a really slow CPU that matches the 60 Hz VSync numbers, cherry-picking that as the "real power figures", and throwing out every other measurement that actually loads the component being tested to 100%...

LOL. But YOU mentioned only the gaming performance and then tossed in the power consumption and temperatures from Blender. Now you are telling me CPUs aren't just for gaming. Then why did you use the gaming numbers?

CPUs aren't just for n-threaded workloads either. If my job consists of lightly threaded tasks (like Photoshop/Premiere and the like), the single-thread performance of the 12900K is king, without the power consumption and temperature baggage either. If your workload scales with n threads, then you should be looking at the Threadrippers, I guess.
Bingo! It's always the same with that lot - when trying to make Intel look bad and AMD look good, any and every tactic is fair, and the dirtier the better, actually... :rolleyes:
 
Joined
Jan 27, 2015
Messages
1,646 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
ComputerBase has the answer, DDR5 6200 vs DDR4 3800, practically no difference.

Yeah, but DDR4-3200 is a bit too slow. From different reviews it seems that if you are running DDR4-3600 or higher with decent latency (like CL16), then it's fine, zero or almost zero difference, but the tests with DDR4-3200 on ADL are highly variable vs DDR5.
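The speed-vs-CL trade-off is easy to quantify: first-word latency in nanoseconds is 2000 × CL / (transfer rate in MT/s), since CAS latency is counted in I/O clock cycles and the I/O clock runs at half the transfer rate. A rough comparison using kits mentioned in this thread (timing math only, ignoring platform differences):

```python
# First-word latency: t_ns = 2000 * CL / transfer_rate_MTs
def first_word_latency_ns(cl: int, mts: int) -> float:
    return 2000.0 * cl / mts

kits = [("DDR4-3200 CL16", 16, 3200),
        ("DDR4-3600 CL16", 16, 3600),
        ("DDR5-6000 CL36", 36, 6000)]
for name, cl, mts in kits:
    print(f"{name}: {first_word_latency_ns(cl, mts):.1f} ns")
# DDR4-3200 CL16: 10.0 ns, DDR4-3600 CL16: 8.9 ns, DDR5-6000 CL36: 12.0 ns
```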
 
Joined
Nov 24, 2012
Messages
27 (0.01/day)
Compared to this power consumption, Bulldozer seems like a good CPU. It wasn't as fast as Intel's offerings at the time, but then again, it wasn't trying to burn your house down either.
 
Joined
Dec 5, 2013
Messages
602 (0.16/day)
Location
UK
Bingo! It's always the same with that lot - when trying to make Intel look bad and AMD look good, any and every tactic is fair, and the dirtier the better, actually... :rolleyes:
Considering I own a 10th-gen Intel, I've no idea who this dumb fanboyism accusation of yours is even aimed at. I just have zero interest in space heaters of either brand, and 100°C with an $80 Noctua NH-U14S is piss-poor thermals... :rolleyes:
 
Joined
Jan 27, 2015
Messages
1,646 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
Considering I own a 10th-gen Intel, I've no idea who this dumb fanboyism accusation of yours is even aimed at. I just have zero interest in space heaters of either brand, and 100°C with an $80 Noctua NH-U14S is piss-poor thermals... :rolleyes:

Then set the power limit to 88 W on the ADL and still walk all over your neighbor's 5900X.

Computerbase.de :

[attached ComputerBase benchmark chart]
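For anyone wanting to try that cap without touching the BIOS: on Linux the sustained package limit (PL1) is exposed through the intel_rapl powercap driver. A minimal sketch, assuming the driver is loaded and the package domain sits at the usual index 0 (BIOS settings may still override this on some boards):

```python
# Read (and optionally lower) the PL1 package power limit via powercap sysfs.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package domain (assumed index)

limit_uw = int((RAPL / "constraint_0_power_limit_uw").read_text())
print(f"Current PL1: {limit_uw / 1_000_000:.0f} W")

# To cap at 88 W (requires root):
# (RAPL / "constraint_0_power_limit_uw").write_text(str(88 * 1_000_000))
```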
 
Joined
Feb 7, 2006
Messages
738 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
Next week :) Intel CPUs arrived yesterday. I've been rebenching everything else since the W11 AMD L3 cache fix came out. Will rebench Zen 2 and 9th gen too and add them to the reviews.

I understand how much trouble it is to rebench everything. Thanks for the extra effort, we really do appreciate your thoroughness.
 
Joined
May 19, 2009
Messages
1,821 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
This is not going to change my plans to upgrade to Ryzen someday, but good job, Intel.
P.S.
I will never understand complaints about high power draw and heat when you are spending cash on a top-end product. Does the electricity bill really apply to someone who can shell out the cash for this?! I doubt it very much...
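The bill side of it is simple arithmetic, for what it's worth; every input below is an assumption, so scale them to your own usage and rates:

```python
# Rough yearly cost of an extra 100 W under load. All inputs are assumptions.
extra_watts = 100      # additional draw vs. a more efficient chip
hours_per_day = 4      # time spent at full load
price_per_kwh = 0.30   # roughly EU-level electricity price, in EUR

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> {kwh_per_year * price_per_kwh:.0f} EUR/year")
# ~146 kWh/year -> ~44 EUR/year: pocket change for some, real money for others
```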
 
Joined
Feb 7, 2006
Messages
738 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
This is not going to change my plans to upgrade to Ryzen someday, but good job, Intel.
P.S.
I will never understand complaints about high power draw and heat when you are spending cash on a top-end product. Does the electricity bill really apply to someone who can shell out the cash for this?! I doubt it very much...

I'm more concerned with the heat output in one room, though I'm in central Texas, so it's a bigger concern for me than for others. (My solution was a mini-split in my server room and switching to a mini PC at my desk, where I remote into my gaming system. Stays cool, and I don't care how much heat it generates.)
 
Joined
Feb 18, 2017
Messages
688 (0.26/day)
"Fighting for the Performance Crown"

And yet it fails to beat the one-year-old rival with the same number of cores, at HALF the efficiency. The i7 model is also 5% behind the 5900X while having the same number of cores. The only model able to beat its rival is the i5, which has 4 extra cores compared to the 5600X.
 
Joined
Jun 1, 2021
Messages
87 (0.08/day)
System Name NR200 SuperSport Speed Machine yet quiet
Processor Ryzen 7 5800x
Motherboard Gigabyte B550I AORUS PRO AX
Cooling Noctua NH-D15 w/Thermal Grizzly Kryonaut
Memory Corsair Vengeance LPX 32 GB (2 x 16 GB) DDR4 3600Mhz
Video Card(s) Sapphire AMD Radeon RX 6800 Nitro plus 16GB With SAM and OCed
Storage Samsung 970 EVO PLUS 1TB NVMe M.2 Solid State Drive
Display(s) Gigabyte G27Q GAMING MONITOR 1440p 144hz IPS HDR "1ms"
Case Cooler Master NR200P w/Mods w/Full Noctua fans
Audio Device(s) Beyerdynamic DT 770 PRO 80 Ohm/Ifi Zen Dac v2
Power Supply Corsair SF 600 W 80+ Platinum Certified Fully Modular SFX
Mouse Glorious Model D Matte White w/Mousepad: Cooler Master MP511 (Size XL)
Keyboard Cooler Master CK550 Gateron Red Switch w/Tai-Hao Doubleshot PBT Backlit - Cool Gray/Navy
Software Microsoft Windows 10 Pro Full - USB 32/64-bit
Benchmark Scores https://www.3dmark.com/spy/27840416
[attached screenshot from the review]

Did you use a U14S or a U12S for overclocking? @W1zzard :D
 
Joined
Oct 31, 2020
Messages
78 (0.06/day)
Processor 5800X3D
Motherboard ROG Strix X570-F Gaming
Cooling Arctic Liquid Freezer II 280
Memory G Skill F4-3800C14-8GTZN
Video Card(s) PowerColor RX 6900xt Red Devil
Storage Samsung SSD 970 EVO Plus 250GB [232 GB], Samsung SSD 970 EVO Plus 500GB
Display(s) Samsung C32HG7xQQ (DisplayPort)
Case Graphite Series™ 730T Full-Tower Case
Power Supply Corsair RM1000x
Mouse Basillisk X Hyperspeed
Keyboard Blackwidow Ultimate
Software Win 10 Home
Very good work, but if you'll allow me: why did you change the Zen setup from the EVGA X570 DARK with 4000 MHz memory @ 2000 IF that you used in your last reviews to an MSI X570 with 3600 MHz @ 1800 IF memory?
 
Joined
Mar 16, 2017
Messages
1,663 (0.64/day)
Location
Tanagra
System Name Budget Box
Processor Xeon E5-2667v2
Motherboard ASUS P9X79 Pro
Cooling Some cheap tower cooler, I dunno
Memory 32GB 1866-DDR3 ECC
Video Card(s) XFX RX 5600XT
Storage WD NVMe 1TB
Display(s) ASUS Pro Art 27"
Case Antec P7 Neo
Software has always been behind, but maybe this transition to big.LITTLE will change that.
I have my doubts. Because this is the first x86 product like this on Windows, and the hybrid approach is limited to only part of the 12th-gen lineup, 99% of the hardware out there will still be a homogeneous CPU architecture, and for many years to come given how long our hardware can now last. It's going to be on Intel to make this work, then MS, and maybe developers will jump in. I could easily see developers just saying "use different hardware" if you have issues, at least for a while.
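In the meantime, the blunt per-application workaround is manual pinning. A sketch using psutil, assuming the common Alder Lake enumeration where the P-core hyperthreads come first (logical CPUs 0-15 on a 12900K) and the E-cores follow (16-23); verify against your own topology (e.g. with lscpu) before relying on it:

```python
# Restrict the current process to P-cores as a scheduler workaround.
import psutil

P_CORES = list(range(16))  # assumed P-core logical CPUs on a 12900K

proc = psutil.Process()
proc.cpu_affinity(P_CORES)          # pin this process (and its threads)
print("Now restricted to:", proc.cpu_affinity())
```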
ComputerBase has the answer, DDR5 6200 vs DDR4 3800, practically no difference.
Anandtech came to the conclusion that DDR5 does contribute to the performance increase. They have two pages of DDR4 vs DDR5 results that show measurable gains. It's not across the board, but it is significant, especially in multithreaded tests. They even concede that AMD should see similar gains when it implements DDR5, though that is obviously further out.
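The multithreaded gains line up with the raw bandwidth math. A back-of-the-envelope comparison of theoretical peaks (64 bits per channel, dual channel, JEDEC-speed examples; real-world throughput is lower):

```python
# Theoretical peak bandwidth: MT/s * 8 bytes per 64-bit channel * channels.
def peak_gbs(mts: int, channels: int = 2) -> float:
    return mts * 8 * channels / 1000  # GB/s

print(f"DDR4-3200: {peak_gbs(3200):.1f} GB/s")  # 51.2 GB/s
print(f"DDR5-4800: {peak_gbs(4800):.1f} GB/s")  # 76.8 GB/s, +50% on paper
# Bandwidth-hungry multithreaded tests can use much of that; latency-bound
# games see far less benefit, which matches the mixed results above.
```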
 
Joined
Jun 29, 2018
Messages
456 (0.21/day)
I have my doubts. Because this is the first x86 product like this on Windows, and the hybrid approach is limited to only part of the 12th-gen lineup, 99% of the hardware out there will still be a homogeneous CPU architecture, and for many years to come given how long our hardware can now last. It's going to be on Intel to make this work, then MS, and maybe developers will jump in. I could easily see developers just saying "use different hardware" if you have issues, at least for a while.

It's actually the second - Lakefield was released in Q2 2020 with 1P+4E. The difference here is that Intel Thread Director is present to help the Windows scheduler make sensible decisions. The AnandTech article explains in detail what is happening behind the scenes, especially on Windows 10, which lacks ITD support.

I don't think software vendors will ignore the potential issues, but the worst solution would probably be "disable E-cores in BIOS" instead of "use different hardware", because the P-cores are superior to previous Intel cores ;)
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
ComputerBase has the answer, DDR5 6200 vs DDR4 3800, practically no difference.
Only half the story; we still need to see how AMD performs with DDR5.

Software has always been behind, but maybe this transition to big.LITTLE will change that.
Doubt it. big.LITTLE will always produce terrible results in certain situations; it's a problem that's impossible to solve without negative side effects. It's just that on mobile those side effects aren't that noticeable, but on desktop PCs it has now become obvious that they are.
 