
Intel Core i9-12900K

Joined
Dec 12, 2012
Messages
718 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
By the way, can you disable the E-cores in the BIOS? Like if you wanted to push for maximum gaming performance with 8C/16T.
 
Joined
Apr 12, 2013
Messages
6,750 (1.67/day)
I generally dislike public perception, but you have to admit that whatever AMD did with marketing was genius. People drank AMD's Kool-Aid about those barely functional, buggy excuses for CPUs and hyped Zen 1 to the moon, despite it being really similar to how FX launched: a focus on more cores, poor single-threaded performance, worse power consumption than Intel. I genuinely thought Ryzen would be FX v2, but nah. Somehow buying several kits of RAM to find one compatible, having the computer turn on once in ten tries, and being overall inferior to Intel suddenly became acceptable - and not only that, but desirable. Later gens were better, but it was the first gen that built most of Ryzen's reputation. And people quickly became militant about the idea that Intel is still better; some of them would burn you at the stake for saying such heresy. And now people are surprised that Intel is good again, as if Intel hadn't been the clear market leader for half a century.
It's mainly those who follow the narrative that many YTers peddle these days, or that clickbait sites like WTFtech publish each passing week! Those who've followed the tech landscape relatively closely over the last two decades know AMD was only really a big deal post-2000, and that lasted about half a decade until Conroe entered the arena. Then AMD was absent for nearly another decade until Zen materialized. In between there were also really informed people who thought Intel would lower prices even if AMD went bankrupt because ~ ARM; suffice to say that never happened, and Intel would never lower prices, especially if it were the only x86 CPU maker left. They only started lowering their prices after almost a decade and a half because they had to, and the same goes for AMD now!

Basically, public perception isn't made of lies or fantasies, but of what you could term desires :ohwell:
 
Joined
Jan 27, 2015
Messages
1,649 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
Yeah, that same document only talks about 125 W and never mentions the new defaults.

Intel's marketing presentation was 100% clear: PL1 = PL2 = 241 W. I posted their presentation slide a few days ago.

I suspect what happened is that someone in marketing really wanted to win Cinebench R23 (which heats up the CPU first and usually runs at PL1, without turbo, on Intel), so they pushed for that change at the last minute.
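
For anyone wondering how PL1/PL2/Tau actually interact: the CPU may draw up to PL2 until an exponentially weighted moving average of package power reaches PL1, with Tau setting the averaging window. Here's a toy Python model of that behavior - a minimal sketch based on the simplified description in Intel's public datasheets, with illustrative numbers only:

```python
# Toy model of Intel's PL1/PL2/Tau turbo budget (simplified EWMA form).
# All numbers are illustrative, not measured.

def simulate_power_limits(pl1=125.0, pl2=241.0, tau=56.0, seconds=120):
    """Simulate package power under a sustained all-core load.

    The CPU draws PL2 while the moving average of package power is
    below PL1; once the budget is exhausted it settles at PL1.
    """
    ewma = 0.0
    alpha = 1.0 / tau  # per-second smoothing factor derived from Tau
    trace = []
    for t in range(seconds):
        draw = pl2 if ewma < pl1 else pl1  # boost until the average hits PL1
        ewma += alpha * (draw - ewma)      # update the running average
        trace.append((t, draw, ewma))
    return trace

for t, draw, ewma in simulate_power_limits()[::15]:
    print(f"t={t:3d}s  draw={draw:5.1f} W  avg={ewma:5.1f} W")
```

With the 12900K's PL1 = PL2 = 241 W default there is no budget to exhaust, so the clamp never engages - which is exactly what wins the Cinebench run.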

Honestly, this is no different from AMD telling reviewers to use DDR4-3733 / 3800 when what actually ships mostly gets mated with DDR4-3200 C16 at XMP, or worse if it's in an OEM rig.
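
As a side note on why the shipping DDR4-3200 C16 isn't as far behind as it looks: first-word latency is CAS cycles times the clock period, and the data rate is twice the I/O clock. A quick sketch (the CL pairings here are typical retail examples I'm assuming, not anyone's official guidance):

```python
# First-word CAS latency in nanoseconds for some common DDR4 configs.
# CL pairings below are assumed typical retail examples.

def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    # One I/O clock cycle lasts 2000 / (MT/s) nanoseconds,
    # because the data rate is double the I/O clock.
    return cl * 2000 / mt_per_s

for mts, cl in [(3200, 16), (3600, 16), (3733, 18), (3800, 16)]:
    print(f"DDR4-{mts} CL{cl}: {cas_latency_ns(mts, cl):.2f} ns")
```

DDR4-3200 C16 works out to 10 ns flat, so the gap to the reviewer kits is real but not dramatic.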

It's why I always tell people to look at multiple reviews and pay attention to the test configuration. That actually takes work on the reader's part, though.

I personally like to see reviews at three levels -
  • Base 'inexpensive' configs: 'cheap' DDR4-3200 XMP (not the expensive C14 stuff) with the recommended base power settings (technically 125/241 for a 12900K) and maybe just XMP enabled.
  • A midrange setting, which I think TPU does, and which probably reflects what most enthusiasts will actually do, i.e. DDR4-3600 XMP and maybe some limited power-limit tinkering. I actually think a PL1/PL2 of 241/241 is too high for this, and 125/241 too low; your Noctua is probably good for 180/241. Asus AI Tweaker is a good example here: it will tune your power levels to your ability to cool the chip under the workload you actually put it under.
  • A balls-to-the-wall setup for that system (which nobody does outside of a few YouTubers and sometimes computerbase.de, and which means you need the memory, motherboard and settings that put that CPU at its best). These guys run open-loop coolers and high but stable overclocks.

It's mainly those who follow the narrative that many YTers peddle these days, or that clickbait sites like WTFtech publish each passing week! Those who've followed the tech landscape relatively closely over the last two decades know AMD was only really a big deal post-2000, and that lasted about half a decade until Conroe entered the arena. Then AMD was absent for nearly another decade until Zen materialized. In between there were also really informed people who thought Intel would lower prices even if AMD went bankrupt because ~ ARM; suffice to say that never happened, and Intel would never lower prices, especially if it were the only x86 CPU maker left. They only started lowering their prices after almost a decade and a half because they had to, and the same goes for AMD now!

Basically, public perception isn't made of lies or fantasies, but of what you could term desires :ohwell:

You said : "..because ~ ARM"

Don't think you thought that through.

ARM affected Intel far more negatively than AMD ever did.

The desktop is almost dead because of ARM. Mobile totally dominates the chip-production market; only 9% of TSMC's production is AMD (and that includes all of AMD: console, desktop, laptop, GPU, server), while the other 91% of their production is fundamentally ARM. And then there are the other fabs, which are all pretty much 100% ARM as far as core type goes.

Really, what AMD and Intel have right now are the desktop/workstation, laptop productivity, and data center markets. As a percentage of the overall compute-device market these have shrunk, while at the same time growing in absolute terms - largely in order to service the ever-growing mobile segment.

I am no ARM advocate at all, but realistically, unless something changes, ARM will probably destroy the conventional desktop/laptop market. That will likely happen when thin-client computing becomes viable and high-speed network bandwidth becomes ubiquitous. We're already seeing it to some degree with GPUs being put into the cloud. I give it about 10 more years.
 
Joined
Apr 12, 2013
Messages
6,750 (1.67/day)
Don't think you thought that through.

ARM affected Intel far more negatively than AMD ever did.
I did; apparently the ones predicting Intel would lower their desktop/notebook chip prices due to ARM becoming popular didn't! Till MS is fully on board with ARM you can kiss that pipedream goodbye. Also, at the time, Intel was pouring double-digit billions of dollars into entering the mobile market, so I'm not sure you have the context of that debate.
 
Joined
Oct 26, 2018
Messages
58 (0.03/day)
Why are the Database TPC-C scores so low with E-cores enabled? Perhaps the benchmark hasn't been updated to support ADL yet?
 
Joined
Jan 27, 2015
Messages
1,649 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
I did; apparently the ones predicting Intel would lower their desktop/notebook chip prices due to ARM becoming popular didn't! Till MS is fully on board with ARM you can kiss that pipedream goodbye. Also, at the time, Intel was pouring double-digit billions of dollars into entering the mobile market, so I'm not sure you have the context of that debate.

I was speaking of things that have already happened, and simply are. There's no pipedream there. I have several times more invested in devices containing ARM than in x86. Most people do, and don't even realize it, obviously.
 
Joined
Apr 12, 2013
Messages
6,750 (1.67/day)
Of course, but I'm strictly speaking about the desktop/notebook chips Intel sells, since the ones in servers or mobiles are of no relevance to the point I made.
I have several times more invested in devices containing ARM than in x86.
Also, how much of that cost is the ARM processor versus, say, the display? You're making this unnecessarily complex; the mobile industry isn't just (about) ARM ~ in fact the most expensive components that go into it are usually the displays!
 
Joined
Jan 27, 2015
Messages
1,649 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 2S
Keyboard Logitech MX Keys
Software Lots
Of course, but I'm strictly speaking about the desktop/notebook chips Intel sells, since the ones in servers or mobiles are of no relevance to the point I made.

You were talking about ARM like it didn't impact or compete with Intel. It did, it does, and Intel lost most of the mobile and embedded segments to ARM.

And I think you still don't understand what I'm pointing out. Let me put it another way -

PC/laptop sales broke a record last year - a record that had stood for 10 years.

So if there had been no ARM, do you think we would have had to go 10 years, and experience a pandemic that sent everyone scrambling for a home PC, before the market could break that 10-year-old laptop/desktop record?

It's not like the population has been declining, or like big new populations haven't obtained the means to buy a laptop/desktop - they just aren't buying.

If, as you stated, you are talking about PC/laptop sales only, as an enclosed ecosystem, then you are talking about an ecosystem that is slowly dying. ARM and mobile are slowly crushing it. That is competition.
 
Joined
Apr 12, 2013
Messages
6,750 (1.67/day)
Ok, without going into all the specifics, here's what the argument was ~

Intel was pouring billions into getting into the mobile market, AMD was on the verge of bankruptcy & Zen was 2-3 years away.

Now if AMD goes bankrupt at that point in time, what happens? The most likely scenario ~

1) Intel monopolizes the x86 (traditional PC) market & probably increases pricing. They may or may not have done massive price hikes, but we'd certainly not be seeing 8c/16t under $500 by 2017 like we did from AMD.

2) Intel would likely keep subsidizing their mobile SoCs with the increased profits & margins after AMD went bankrupt; heck, they could essentially pay Apple to stay on x86 for as long as Apple thought necessary.

3) Desktops would never tide over to ARM unless MS or Google were subsidizing that push, and with Intel probably paying them both, ARM on desktops (viz. the M1) would likely be a pipedream.
It's not like the population has been declining, or like big new populations haven't obtained the means to buy a laptop/desktop - they just aren't buying.
People today aren't necessarily buying mobiles or tablets as PC replacements - these markets rarely overlap - but they are supplanting their old PCs in a way. So when we talk about the traditional PC (x86) market, do you really think ARM would be much of a threat, or that Intel would lower their chip prices in this segment? Are you forgetting that without AMD, Intel would be in a much stronger position today?

And that's why we need competition ~ AMD going poof & Intel lowering prices, due to ARM, was & will always be a pipedream!
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
By the way, can you disable the E-cores in the BIOS? Like if you wanted to push for maximum gaming performance with 8C/16T.
Yes, it's an option in the BIOS; you can disable any number of E-cores, including all of them. You can also disable any number of P-cores, but at least one P-core has to remain active.

Why are the Database TPC-C scores so low with E-cores enabled? Perhaps the benchmark hasn't been updated to support ADL yet?
As mentioned in the conclusion, Thread Director/Windows 11 puts MySQL onto the E-cores rather than the P-cores.
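
If you'd rather not reboot into the BIOS, you can approximate this from software by restricting a process's affinity to the P-cores. A minimal sketch using psutil - the core numbering is an assumption (Alder Lake commonly exposes the eight P-cores' sixteen hyperthreads as logical CPUs 0-15, with E-cores as 16-23 on a 12900K), so verify your topology first, and note that affinity alone may not satisfy DRM that probes the CPU topology at launch:

```python
# Minimal sketch: pin a process to the P-cores instead of disabling
# E-cores in the BIOS. Assumes logical CPUs 0-15 are the P-core threads
# on a 12900K; check your own topology before relying on this.
import psutil

P_CORE_CPUS = list(range(16))  # assumed P-core logical CPUs on a 12900K

def pin_to_p_cores(pid: int) -> None:
    """Restrict a process (a game, MySQL, etc.) to the P-cores only."""
    proc = psutil.Process(pid)
    proc.cpu_affinity(P_CORE_CPUS)  # requires >= 16 logical CPUs present
    print(f"{proc.name()} (pid {pid}) limited to CPUs {P_CORE_CPUS}")

if __name__ == "__main__":
    pin_to_p_cores(psutil.Process().pid)  # demo: pin this script itself
```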

 
Joined
Dec 12, 2012
Messages
718 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Yes, it's an option in the BIOS; you can disable any number of E-cores, including all of them. You can also disable any number of P-cores, but at least one P-core has to remain active.
That is cool! Does that also help with the DRM issues in certain games? I know there is some compatibility option, but I am curious if simply disabling the E-cores does the job.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
That is cool! Does that also help with the DRM issues in certain games? I know there is some compatibility option, but I am curious if simply disabling the E-cores does the job.
Yeah, turning off the E-cores solves the Denuvo issues
 
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Just goes to show that high end MSDT is taking over where HEDT used to have its niche. The space between "server/datacenter chip" and "16c24t high clocking new arch MSDT chip" is pretty tiny, both in relevant applications and customers…
Back in the day, HEDT used to be mostly about more cores, but as you were saying, these days mainstream has plenty of cores for even most "power users". The issue with mainstream today is I/O, something which has become much more important since the arrival of M.2 SSDs. Having the flexibility to run 2-4 SSDs, a GPU, a 10G network card and/or some special cards (for those doing video) quickly exhausts what these platforms offer. These are the typical concerns I hear from other "power users" such as myself, and it's where it feels like there is a hole between mainstream and HEDT.

XMP is generally 100% stable though, unless you buy something truly, stupidly fast. Of course you should always do thorough testing on anything mission-critical, and running JEDEC for that is perfectly fine - but then you have to work to actually find those DIMMs in the first place.
I think you are missing the point.
XMP is about setting a profile which the memory is capable of, but not necessarily the memory controller. Most of the stability issues people are having are actually related to their CPU sample, and to the fact that the memory controller (like any other silicon) degrades over time, depending on use, etc.
So no, choosing an XMP profile supported by the motherboard and the memory module doesn't guarantee stability; it may not even complete POST. And it will become more unstable over time. Some motherboards may gradually turn down the clock speed or revert to a lower profile after system crashes or failed POSTs, and the end user may not even know it's happening.
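
On the "thorough testing" point: a crude write/readback soak loop can catch gross instability, though a proper pass should use a dedicated tool (MemTest86, memtester, a long Prime95 blend, etc.). A minimal sketch of the idea, with arbitrary sizes:

```python
# Crude memory soak test: write random data, checksum it, copy it,
# and verify the copy. A dedicated tester is far more thorough.
import hashlib
import os

CHUNK_MB = 256  # size of each test buffer (arbitrary)
PASSES = 8      # number of write/verify rounds (arbitrary)

def soak_test() -> bool:
    for i in range(PASSES):
        data = os.urandom(CHUNK_MB * 1024 * 1024)  # random test pattern
        digest = hashlib.sha256(data).hexdigest()  # checksum after writing
        copy = bytes(data)                         # force a second in-RAM copy
        if hashlib.sha256(copy).hexdigest() != digest:
            print(f"pass {i}: mismatch - memory subsystem is not stable")
            return False
        print(f"pass {i}: OK")
    return True

if __name__ == "__main__":
    soak_test()
```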
 
Joined
May 8, 2021
Messages
1,978 (1.83/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
It's mainly those who follow the narrative that many YTers peddle these days, or that clickbait sites like WTFtech publish each passing week! Those who've followed the tech landscape relatively closely over the last two decades know AMD was only really a big deal post-2000, and that lasted about half a decade until Conroe entered the arena. Then AMD was absent for nearly another decade until Zen materialized. In between there were also really informed people who thought Intel would lower prices even if AMD went bankrupt because ~ ARM; suffice to say that never happened, and Intel would never lower prices, especially if it were the only x86 CPU maker left. They only started lowering their prices after almost a decade and a half because they had to, and the same goes for AMD now!

Basically, public perception isn't made of lies or fantasies, but of what you could term desires :ohwell:
Actually, AMD has been a big deal more or less forever. AMD used to make parts similar to Intel's, but later; they clocked them higher, sold them cheaper, and competed that way. Usually they were strong competitors. Around the K5 era, Intel had enough and patent-trolled AMD away, so AMD had to develop its own architecture, as well as its own boards. And while K5 was mostly a loss for AMD, they started to make strides with K6. It flopped, but once they improved it with the K6-II and started to push the 3DNow! instructions, they became strong in the market.

When they moved to the K7 architecture, AMD launched the Athlon chips - the fastest CPUs around - and soon won the 1 GHz race. At that time AMD was doing exceptionally well and crushing Intel. They later made the Athlon XP chips, which were much cheaper, vastly improved and gained another GHz, but weren't faster than the Pentium 4s. Still, with Intel catching loads of flak for the lackluster launch of the Pentium 4, AMD was perceived very well.

Then came the K8 architecture: vastly improved, with slightly higher clocks, great thermals and power consumption, and support for 64-bit software and therefore more than 4 GB of RAM, while still remaining affordable. Intel only managed to raise clock speeds, meddling with RAMBUS and FSB speeds to keep the Pentium 4 on life support, while AMD was beating them on literally all fronts - power use, thermals, performance, value. Only in cache-heavy tasks was AMD a bit behind; in everything else it was a slaughter of the Pentium 4. But AMD was not selling many of those Athlons, due to Intel's OEM exclusivity agreements and very heavy advertising of the Pentium 4 that hyped up its clock speed.

Then AMD came out with the Athlon 64 FX, a fully unlocked, server-derived chip made for consumers. It was very fast and outcompeted the Pentium 4 "Emergency Edition", but at a $1,000 price, and with the requirement to buy a server board, it failed to get traction. Still, AMD soon came out with the first consumer-grade dual-core chips, the Athlon 64 X2s. Intel had nothing to compete with them and had to glue two Pentium 4s together. A single Pentium 4 was already dying from heat (https://arstechnica.com/civis/viewtopic.php?t=752419), with awful thermals and horrendous power usage - why not put two of them into one chip? What a swell idea it was, until reviews came out and proved that the Pentium D was slow, ran awfully hot and literally warped motherboards from the heat. Intel only regained traction with the later Core 2 Duo release.

So all in all, AMD was always a competent rival, with their worst beating of Intel actually happening around 1999/2000 and then going strong until 2005. And it's not like AMD made crap until Ryzen, either. There were underwhelming releases, but generally AMD was not much behind Intel. In the K10 era, AMD was big on value and managed to sell quad-core chips for as little as $100, when that had previously only been possible above $300 - and you could overclock them too. Soon they came out with 6-core chips at $200; not so cheap, but way better than over $500 for an Intel one. And with the FX chips, AMD managed to sell six-core chips for the price of an i3, and 8-core chips for the price of a higher-end i3. That really doesn't sound awful to me. AMD also tried to create some hype by beating overclocking world records with the Phenom II and FX chips - records still unbeaten today.

On the saner side, they made APUs which could run AAA games at respectable resolutions and nearly 60 fps, which was unprecedented for a single chip or for integrated graphics in general. They managed to change the perception of the iGPU from mere display adapter to budget gaming powerhouse. AMD was also making very competent graphics cards at the time, with the Radeon HD 6000 series being a strong competitor to nVidia, who only had the Thermi (aka Fermi) cards out, which ran obscenely hot. Even worse, nVidia launched a second generation of Thermi - Thermi 2.0 - with very modest improvements. The Radeon HD 6950 consumed far less power than the Thermi-based GTX 580. So AMD only lacked prestige because their best CPUs didn't quite measure up to Intel's, but they weren't much behind and were unbeatable on a budget.

Despite all that, and despite the good times, AMD historically managed to sell chips at low prices (except flagships); it is only with Ryzen that AMD went full wanker with pricing, and I'm glad they finally got a kick in the balls to lower prices a bit. To put things in perspective, they sold 6-core chips in 2012 for $120 and 8-core chips for $160. At launch the 5600X sold for $300, while being a physically smaller die. It's really unbelievable how inflated the prices of AMD chips became, mostly due to hype that was partially meritless.
 
Joined
Oct 22, 2014
Messages
13,210 (3.80/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
Actually, AMD has been a big deal more or less forever. AMD used to make parts similar to Intel's, but later; they clocked them higher, sold them cheaper, and competed that way. [...]
My brain just had a fit looking at that wall of text.
I can't even.
 
Joined
May 8, 2021
Messages
1,978 (1.83/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
My brain just had a fit looking at that wall of text.
I can't even.
Well, I'm disappointed then. A short summary would be that AMD has also always been cool, even in the pre-Ryzen era and pre-2000.
 
Joined
Jan 14, 2019
Messages
9,870 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Well, I'm disappointed then. A short summary would be that AMD has also always been cool, even in the pre-Ryzen era and pre-2000.
An even better summary (from my point of view) is that AMD has always been good at offering performance competitive with Intel, usually at lower prices and with lower thermals. They only messed up with the FX series, which took them a good 5-6 years to get out of, while Intel was riding its high horse with 4-core chips. Now that Ryzen is good, AMD receives so much unnecessary hype that they can afford to ride the same high horse Intel did. Now that Alder Lake is good too, AMD will hopefully get the message and lower prices to competitive levels.

In my opinion, a lot of AMD's hype comes from the "underdog effect". Though I agree that new Ryzen is good, it's good in a very different way than Intel is good. This is the kind of competition and diversification that I like to see. :)
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
By the way, can you disable the E-cores in the BIOS?
You should be able to. And if you're running Windows 10 and have problems with glitches or bugs, it might help. The P-cores are the shining part of Alder Lake anyway, so the loss of the E-cores will not hurt your overall performance much. However, if you're on Windows 11, leave them on.

That is cool! Does that also help with the DRM issues in certain games?
That is the general consensus. However, Denuvo is the overwhelming offender here, and most devs are releasing patches to remove it.
 
Joined
May 2, 2017
Messages
7,762 (3.04/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
You should be able to. And if you're running Windows 10 and have problems with glitches or bugs, it might help. The P-cores are the shining part of Alder Lake anyway, so the loss of the E-cores will not hurt your overall performance much. However, if you're on Windows 11, leave them on.


That is the general consensus. However, Denuvo is the overwhelming offender here, and most devs are releasing patches to remove it.
There seems to be a (potentially very cool, or very janky, depending on perspective) BIOS-level fix for this. I don't know if it's on current BIOSes or upcoming, but they've announced a toggle ("Legacy Game Compatibility Mode" or something) that, when enabled, lets you press Scroll Lock to park and effectively disable the E-cores on the fly, avoiding Denuvo issues and the like. I'm a bit shocked that this doesn't hard-crash the system, but I guess it automatically migrates all threads off the cores before shutting them down?
An even better summary (from my point of view) is that AMD has always been good at offering performance competitive with Intel, usually at lower prices and with lower thermals. They only messed up with the FX series, which took them a good 5-6 years to get out of, while Intel was riding its high horse with 4-core chips. Now that Ryzen is good, AMD receives so much unnecessary hype that they can afford to ride the same high horse Intel did. Now that Alder Lake is good too, AMD will hopefully get the message and lower prices to competitive levels.

In my opinion, a lot of AMD's hype comes from the "underdog effect". Though I agree that new Ryzen is good, it's good in a very different way than Intel is good. This is the kind of competition and diversification that I like to see. :)
The underdog effect has a lot to do with AMD's current success, that's for sure. And yes, AMD has historically been everywhere from competitive to clearly ahead to clearly behind, but they have always been a much smaller company than Intel, with all the competitive drawbacks that entails - including (but not limited to) various illegal anticompetitive behaviour on Intel's part. That "Dell is the best friend money can buy" quote from internal Intel communications that came to light in one of the antitrust cases summarizes things pretty eloquently IMO.

There's one major factor that hasn't been mentioned though: the size of the total PC market and how it affects things. Intel's late-2000s performance advantage with Core arrived alongside the tail end of their various dubiously legal exclusivity deals, right at the point where a) laptops became properly viable for everyday use, and b) the internet had become a properly commonplace thing. So Intel came into a rapidly expanding market (lower prices per unit, but volume increased dramatically) while also rapidly becoming the only viable server alternative. They had some damn good luck with their timing there, and IMO that contributed quite a bit to their dominance in the 2010s - it wasn't just that Bulldozer and its ilk were bad. Now, Bulldozer was pretty bad, but in a less drastically unequal competitive situation that could have evened out in some way. Instead AMD nearly went bankrupt, but thankfully stayed alive thanks to their GPU division still churning out great products. As such, I guess we can thank ATI, and AMD's decision to buy them, for the competitive and interesting CPU market we have today. And I don't think we would have gotten the underdog effect without the GPUs either - their ability to deliver good-value GPUs likely contributed significantly to that.

Actually, AMD has been a big deal more or less forever. AMD used to make parts similar to Intel's, but later; they clocked them higher, sold them cheaper, and competed that way. [...]
Paragraphs, man. Paragraphs!
I think you are missing the point.
XMP is about setting a profile which the memory is capable of, but not necessarily the memory controller. Most of the stability issues people are having are actually related to their CPU sample, and to the fact that the memory controller (like any other silicon) degrades over time, depending on use, etc.
So no, choosing an XMP profile supported by the motherboard and the memory module doesn't guarantee stability; it may not even complete POST. And it will become more unstable over time. Some motherboards may gradually turn down the clock speed or revert to a lower profile after system crashes or failed POSTs, and the end user may not even know it's happening.
You're right about that, but my point was based on current reality, which is that both major vendors have had IMCs for at least two generations that can reliably handle XMP settings at 3600 or higher. I don't believe I've ever heard of a recent-ish Intel or Zen 2/3 CPU that can't handle memory at those speeds. Zen and Zen+ on the other hand ... nah. My 1600X couldn't run my 3200c16 kit at XMP, and saw intermittent crashes at anything above 2933. But AMD IMCs have improved massively since then. So while you're right on paper, the reality is that any current CPU is >99% likely to handle XMP just fine (given that, as I've said all along, you stick to somewhat reasonable speeds).
It's a 2600X on a B450 board. I had been looking into moving to a newer-gen Ryzen, but the used market is horrible right now, probably due to the problems in the brand-new market. The BIOS after the one I am using added memory compatibility fixes for Zen+, but since Proxmox is now stable, I decided not to rock the boat.

Also, it is a 4-DIMM setup, and when it was stable on Windows it was only 2 DIMMs (I should have mentioned that), so take that into account. The official spec sheets for Zen+ and the original Zen show a huge supported-memory-speed drop with 4 DIMMs; if I remember right, original Zen only officially supports 1866 MT/s with 4 DIMMs?

My current 9900K handles the same RAM that my 8600K couldn't manage at XMP speeds just fine. I suspect i5s might have lower-binned IMCs than i7s and i9s.
Yeah, that makes sense. Zen and Zen+ IMCs were pretty bad, and 2DPC definitely didn't help. If those DIMMs had been dual-rank as well, you might not have been able to boot at all. Thankfully AMD stepped up their IMC game drastically with Zen 2 and beyond, and hopefully they can keep that up for DDR5.
Well, I used to. At some point I ran 3 machines all day: 2 out of 3 with native Windows BOINC loads, and one with a Linux VM running BOINC loads in both Linux and Windows. I don't do that anymore, but when you start out in crunching, you quickly realize how a generally decent everyday CPU suddenly becomes relatively inadequate. And soon you start to want Opterons or Xeons, and then you realize what a rabbit hole you've ended up in.
I don't doubt it, but that's definitely niche territory :)
That's just one type of decompressing.
Yes, obviously, but my point was that compression/decompression workloads tend to be intermittent and relatively short-lived (unless you're working on something that requires on-the-fly decompression of some huge dataset, in which case you're looking at a much more complex workload to begin with). They might be tens or even hundreds of GB, but that doesn't take all that much time on even a mid-range CPU.
Or did it? An overclocked FX 83xx was slower in the first pass, but faster in the second pass.
I said "roughly even", didn't I? Slower in one, faster in the other, that's roughly even to me.
Don't forget that the FX octa-core chips cost nearly a third as much as an i7. And that was close to an ideal situation for the i7, as that workload clearly benefited from HT; some workloads take a negative performance hit from HT. And FX has real cores, so the performance of an overclocked FX should have been far more predictable than with an i7. But power consumption... :D yeah, that was rough. Still, even at stock speeds the FX was close to the i7, and in the second-pass benchmark it beat the i7. Also, FX chips were massively overvolted from the factory; 0.3 V undervolts were achievable on nearly any chip. For an already power-hungry chip, AMD did themselves no favours by setting the voltage so unreasonably high.
Yeah, they were good if you had a low budget and workloads that were specifically MT-heavy and integer-only (the shared FP unit per two cores made it effectively half the core count in FP loads). So yes, they were great value for very specific workloads, but quite poor overall - and while this gave them a good niche to live in, the lack of flexibility is part of why Intel's perceived advantage grew so huge (in addition to the inefficiency, and thus lack of scaling, of course).
The Phenom II X6 was decent. It had the single-core performance of the first-gen FX chips, which roughly translates to somewhere between a Core 2 Quad and the first-gen Core i series. It was closer to the i7 in that regard than the 3970X is to the 5950X today. And the Phenom II X6 1055T sold for nearly half the price of an i7, so the value proposition was great.

Seems very sketchy; the boards clearly overheated, but I'm not sure if it was just boost that got cut, or speeds even below base. In the 3900X video the CPU clearly throttled below base clock; that's a fail by any definition. On the Intel side it's even worse:

The Gigabyte B560 D2V board throttled the i9 way below base speed, and some boards were just so-so. On the Intel Z490 side:

The ASRock Phantom Gaming 4 was only technically not throttling. I guess it's a pass, but in a hotter climate it would be a fail. And that's not really a cheap board; there are many H410 boards with even worse VRMs, which are a complete gamble as to whether they would work with an i9 or not. I wouldn't have much confidence that they would.

All in all, there are plenty of shit claims from motherboard manufacturers. They mostly don't get flak for it because the media only uses mid-tier or high-end boards, but if the media cared about low-end stuff, board makers would face lawsuits. I guess that's an improvement over the AM3+ era, when certain MSI boards melted or caught fire, and many low-end boards throttled straight to 800 MHz.
Hm, thanks for linking those. Seems the board partners have been even worse than I thought - and, of course, Intel deserves some flak for not ensuring that they meet the requirements of the platform. This is simply not acceptable IMO.
I don't see anything bad about putting an i9K on an H510 board. After all, manufacturers claim that they are compatible. If you are fine with fewer features, a lower-end chipset, etc., you may as well not pay for a fancier board. Also, some people later upgrade an old system that had an i3 to an i7. Today that would be a throttlefest (with an i9). I don't see anything unreasonable about upgrading the CPU later, and I don't think those people deserve to have their VRMs burning.
In theory you're right, but if you're buying a high-end CPU for a heavy, sustained workload, you really should also buy something with a VRM to match. If not, you're making a poor choice, whether due to ignorance or poor judgement. That could still be an H510 board, of course; it's just that nobody makes an H510 board with that kind of VRM. None of which takes away the responsibility of Intel and the board manufacturers to ensure boards meet spec - i.e. every board should be able to handle PL1 power draws indefinitely for the most power-hungry part on the platform, and should also handle PL2 reasonably well as long as conditions aren't terrible (i.e. there is some ventilation cooling the VRMs and ambient temperatures are below 30°C or thereabouts). The issue, and why I said you shouldn't buy that combo, is that all the H510 boards are bargain-basement boards that generally can't be expected to do this, and will almost guaranteed leave performance on the table - which undermines the "I only need CPU perf, not I/O" argument for going that route in the first place.
On the other hand, you could have bought a non-K i5 or i7 and seen it last for nearly a decade with excellent performance. It was unprecedented stagnation, but it wasn't entirely good or bad. Even Core 2 Quad and Phenom II X4 users saw their chips last a lot longer than expected, and game makers kept their games runnable on that hardware too. Now the core race has restarted, and I don't think we will see chips with usable lifespans as long as Sandy, Ivy or Haswell. You may say that's good. Maybe for servers and HEDT it is, but for the average consumer it means more unnecessary upgrading.
*raises hand* I used my C2Q Q9450 from 2008 till 2017 ;)
You're entirely right that these chips lasted a long time. However, a huge part of that is that MSDT platforms went quite rapidly from 1 to 4 cores and then stopped for a decade, meaning software wasn't made to truly make use of more. HEDT platforms had more cores, but were clocked lower, meaning you had to OC them to match the ST perf of MSDT parts - and they still topped out at 6 or 8 cores. Now that we've had 8+ core MSDT for four years, we're seeing a lot more software make use of it, which is great, as it actually makes things faster, rather than the <10% per year ST improvements we got for most of the 2010s.
Well, I made a point about VRMs, more RAM channels, more PCIe lanes, etc. HEDT boards were clearly made for professional use, and those who migrated to mainstream are essentially not getting the full experience, just the performance. Is that really good? Or is it just some people pinching pennies and buying on performance alone?
There's definitely a point there, but the shrinking of the HEDT market shows us that the most important thing for most of these customers has been CPU performance, with I/O being secondary. And of course PCIe 4.0 and current I/O-rich chipsets alleviate that somewhat, as you can now get a lot of fast storage and accelerators even on a high-end MSDT platform. X570 gives you 20 native lanes of PCIe 4.0 plus a bunch off the chipset, and there are very few people who need more than a single GPU/accelerator + 3-4 SSDs. They exist, but there aren't many of them - and at that point you're likely considering going for server hardware anyhow.
Not at all; there used to be the Ryzen 3100, the Ryzen 3200G-3400G, various Athlons. On the Intel side, Celerons and Pentiums were always available without issues; now they've become unobtainium - well, except the Pentium 4 :D. Budget CPUs have been nearly wiped out as a concept, along with budget GPUs. They don't really exist anymore, but they did in 2018.
That's quite different from what I'm used to seeing across these and other forums, plus my home markets. Here, Celerons and Pentiums have been nearly nonexistent since ... I want to say 2017, but it might have been 2018, when Intel's supply crunch started. Ryzen 3000G chips have come and gone in waves, being out of stock for months at a time, particularly the 3000G and 3200G, and the Ryzen 3100, 3300X and 3500 have been nearly unobtainable for their entire life cycle. These chips seem to have been distributed almost entirely to OEMs from what I've seen, but clearly there are regional differences.
Maybe, but AMD has fanboys; never underestimate fanboys and their appetite for being ripped off.
Oh, absolutely. But that just underscores what a massive turnaround this has been. They've always had a small core of fans, but that suddenly turning into a dominant group is downright shocking.
Or is it? I find my country's society mind-boggling at times. I was reading comments in a phone store about various phones and found out that the S20 FE is a "budget" phone and the A52 is basically a poverty phone. Those people were talking about Z Flips and Folds as if they were somewhat expensive but normal, while the average wage in this country is less than half of what the latest Z Flip costs. And yet this exact same country loves to bitch and whine about how everything is bad, how everyone is poor or close to poverty. I really don't understand Lithuanians. It makes me think that buying a 5950X may be far more common than I would like to admit, and that those two stores may be a reasonable reflection of society.
Hm, that sounds ... yes, mind-boggling. At least from that description it sounds like a society with a significant fixation on wealth and prestige - not that that's uncommon, though. Still, in Norway and Sweden too we've gone from a flagship phone at 4,000-5,000 NOK being considered expensive a decade ago to 10,000-15,000 NOK flagships selling like hotcakes these days. Of course people's usage patterns and dependency on their phones have changed massively, so it makes some sense for them to be valued higher and for people to be more willing to spend on them, but that's still a staggering amount of money IMO. PC parts haven't had quite the same rise, possibly due to being a more established market plus more of a niche. There have always been enthusiasts buying crazy expensive stuff, but there's a reason the GTX 1060 is still the most used GPU globally by a country mile - cheaper stuff that performs well is still a much easier sell overall.
I generally dislike public perception, but you have to admit that whatever AMD did with marketing was genius. People drank AMD's Kool-Aid about those barely functional, buggy excuses for CPUs and hyped Zen 1 to the moon, despite it being really similar to how FX launched: a focus on more cores, poor single-threaded performance, worse power consumption than Intel. I genuinely thought Ryzen would be FX v2, but nah. Somehow buying several kits of RAM to find one compatible, having the computer turn on once in ten tries, and being overall inferior to Intel suddenly became acceptable - and not only that, but desirable. Later gens were better, but it was the first gen that built most of Ryzen's reputation. And people quickly became militant about the idea that Intel is still better; some of them would burn you at the stake for saying such heresy. And now people are surprised that Intel is good again, as if Intel hadn't been the clear market leader for half a century.
I think the promise of Intel HEDT-like performance (largely including the ST performance, though of course you could OC those HEDT chips) was much of the reason here, as well as the 5+ year lack of real competition making an opening for a compelling story of an up-and-coming company. If nothing else, Zen was exciting: it was new, different, and competitive. You're mistaken about the power consumption, though: Zen was very efficient from the start. That was perhaps the biggest shock - that AMD went from (discontinued) 250+ W 8c CPUs that barely kept up with an i5 to ~100 W 8c CPUs that duked it out with more power-hungry Intel HEDT parts, and even competed on efficiency with Intel MSDT (though their ST perf was significantly behind).

I also think the surprise at "Intel being good again" is mainly due to Intel first not delivering a meaningful performance improvement for a solid 4 years (incremental clock boosts don't do much), and then failing to get their new nodes working well. Rocket Lake underscored that with an IPC regression in some workloads - though that was predictable from the disadvantages of backporting a 10 nm design to 14 nm. Still, this is the first time in more than half a decade that Intel has delivered a true improvement in the desktop space. Their mobile parts have been far better (Ice Lake was pretty good, Tiger Lake rather impressive), but they've had clear scaling issues above 50-ish watts and beyond the very short boost durations of laptops. They've finally demonstrated that they haven't lost what made them great previously - it just took a long time.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,000 (4.60/day)
Location
Kepler-186f
I really appreciate W1zzard's last sentence of the review, "Just to provide a little bit of extra info on why I gave "Recommended" to the 12900K, but "Editor's Choice" to the 12700K and 12600K. I feel like the super-high power limit on the 12900K is just pushing things too high, probably for the sake of winning Cinebench, with limited gains in real-world usage, and too much of a toll on power and heat."

This type of clarity was missing in past reviews on this site and it is much appreciated. More transparency on the awards is a good thing. I agree with this last sentence 100% as well. The 12700K seems to be the sweet spot if you want the best of the best without compromising your humanity. :roll:
 
Joined
May 8, 2021
Messages
1,978 (1.83/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
An even better summary (in my point of view) is that AMD has always been good at offering performance competitive with Intel's, usually at lower prices and with lower thermals. They only messed up the FX series, which took them a good 5-6 years to get out of, while Intel was riding a high horse with their 4-core chips. Now that Ryzen is good, AMD receives so much unnecessary hype that they can afford to ride the same high horse that Intel did. Now that Alder Lake is good too, AMD hopefully gets the message and lowers prices to competitive levels.

In my opinion, a lot of AMD's hype comes from the "underdog effect". Though I agree that the new Ryzen is good, it's good in a very different way than Intel is. This is the kind of competition and diversification that I like to see. :)
I frankly still don't like Ryzen. I think the K10 era was the best, because AMD had tons of SKUs. Want a single core? Buy a Sempron. Want a triple core? Phenom II X3 it is. Want power-saving chips? Buy the e CPUs. Want a 100 EUR quad core? The Athlon II X4 is there. Want an excellent gaming chip? Get a Phenom II X4. Want a crunching monster? Get the Phenom II X6 1100T. Want the same but cheaper? Get the 1055T. Don't even have an AM3 board? AM2+ is fine. Got an FX chip with an AM3+ board? You can still use all the Athlon IIs, Phenom IIs and Semprons, and maybe K10 Opties too. I loved that diversity and it was fun to find less common models. Too bad I was too young to get into computers then, because back then AMD was the ultimate value provider. Now they only have a few SKUs for each segment, and AMD sure as hell doesn't offer super cheap but downclocked 16-core chips. If you are cheap, your only options are Ryzen 3s, and even then they are overpriced. Athlons now suck. And every single chip is focused on performance, meaning it's clocked as high as it will go. There is no real e-chip availability. And now the equivalent of those old Athlon IIs would be octa-core chips with maybe cut-down L3 cache (Athlon IIs had no L3 cache; that was a Phenom II exclusive feature). There's no way now that AMD would sell an 8C/16T chip at 100 EUR/USD, or one with low heat output. There just aren't deals like that anymore; value AMD has been dead since FX got discontinued. Ryzen is the antithesis of value. First-gen Ryzen was kinda cool, but nothing beyond that.
 
Joined
May 2, 2017
Messages
7,762 (3.04/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I frankly still don't like Ryzen. I think the K10 era was the best, because AMD had tons of SKUs. Want a single core? Buy a Sempron. Want a triple core? Phenom II X3 it is. Want power-saving chips? Buy the e CPUs. Want a 100 EUR quad core? The Athlon II X4 is there. Want an excellent gaming chip? Get a Phenom II X4. Want a crunching monster? Get the Phenom II X6 1100T. Want the same but cheaper? Get the 1055T. Don't even have an AM3 board? AM2+ is fine. Got an FX chip with an AM3+ board? You can still use all the Athlon IIs, Phenom IIs and Semprons, and maybe K10 Opties too. I loved that diversity and it was fun to find less common models. Too bad I was too young to get into computers then, because back then AMD was the ultimate value provider. Now they only have a few SKUs for each segment, and AMD sure as hell doesn't offer super cheap but downclocked 16-core chips. If you are cheap, your only options are Ryzen 3s, and even then they are overpriced. Athlons now suck. And every single chip is focused on performance, meaning it's clocked as high as it will go. There is no real e-chip availability. And now the equivalent of those old Athlon IIs would be octa-core chips with maybe cut-down L3 cache (Athlon IIs had no L3 cache; that was a Phenom II exclusive feature). There's no way now that AMD would sell an 8C/16T chip at 100 EUR/USD, or one with low heat output. There just aren't deals like that anymore; value AMD has been dead since FX got discontinued. Ryzen is the antithesis of value. First-gen Ryzen was kinda cool, but nothing beyond that.
You're disregarding some pretty major factors here though: those old chips were made on relatively cheap production nodes. Per-wafer costs for recent nodes are easily 2x what they were just one or two generations back, and many times the cost of older nodes (yes, the older nodes were likely more expensive when new than the numbers cited in that article, but nowhere near current cutting-edge nodes). And while these nodes are also much denser, transistor counts are also much higher for current CPUs. The new CCDs are just 80-something mm² compared to over 300 mm² for those 45nm Phenom II X6es - but those were made on the relatively cheap 45nm node, and of course the Ryzens also have a relatively large IOD on a slightly cheaper node. Substrates and interconnects are also much more expensive due to the high-speed I/O in these chips. The older stuff was cutting-edge for its time, obviously, but it was still much, much cheaper to produce. Double, triple and quadruple patterning, more lithographic masks, more time in the machines, and the introduction of EUV are all driving costs up. A $100 downclocked 16-core Ryzen wouldn't be even remotely feasible today.
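To put rough numbers on that: below is a quick back-of-the-envelope sketch using the standard dies-per-wafer approximation. Every figure in it is an illustrative assumption (wafer prices aren't public), not a foundry quote.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: gross dies on a circular wafer, minus a
    correction term for partial dies lost around the wafer edge."""
    radius = wafer_diameter_mm / 2.0
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

# Assumed wafer prices for illustration only - not actual foundry quotes.
scenarios = [
    ("~80 mm^2 CCD, leading-edge node", 80.0, 17000.0),
    ("~300 mm^2 die, mature 45nm-class node", 300.0, 3000.0),
]

for name, area_mm2, wafer_cost_usd in scenarios:
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost_usd / n:.0f}/die before yield")
```

Even with made-up prices, the shape of the result holds: the tiny modern CCD can cost more per die than the huge old chip did, and the chiplet parts still need an IOD and expensive packaging on top of that.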

You're also misrepresenting things a bit IMO. The Zen Athlons have been great value and great bang for the buck for as long as they have existed. E and GE parts are available, but mainly for OEMs, as they generally sell in very low numbers on the DIY market, making them not worth the distribution cost - the same goes for Intel's T SKUs, though Intel of course has the advantage of ~80% market share and thus ~5x the product availability. You're also missing that literally every single Ryzen chip has lower power modes built in, which can be toggled through the BIOS - at least 65W for the 105W SKUs and 45/35W for the 65W SKUs, though I've seen screenshots of 45W modes for 105W SKUs as well. You can buy a 5950X and make it a "5950XE" with a single BIOS toggle. The current Zen APUs are also extremely power efficient, and can run at very low power if you toggle that mode or hand-tune them.

You're right that they are currently focusing exclusively on what I would call the upper midrange and upwards, which is a crying shame, as lower-core-count Zen 3 CPUs or APUs would be a fantastic value proposition if priced well (there have been some glowing reviews of the 5300G despite it only being available through OEMs). But AMD is sadly following the logic of capitalism: in a constrained market, you focus on what is most profitable. That means binning and selling parts as the highest possible SKU, rather than dividing them between a full range of products. This will likely continue until we are no longer supply constrained, which will be a while.

After all, under current laws for publicly traded companies they could (and would) be sued by shareholders if they did anything else, as the legislation essentially mandates maximizing profits above all else, on pain of significant monetary losses from lawsuits. And while I have no doubt AMD's executives wholeheartedly buy into this and think maximizing profits is a good thing, they are also conditioned into it through rules, regulations, systems and societies that all adhere to this logic.
 
Joined
May 8, 2021
Messages
1,978 (1.83/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I don't doubt it, but that's definitely niche territory :)
And the fun thing is that Bulldozer-era Opterons are really cheap, and if there had been cheaper boards, they might have been good at crunching. 16 FX cores in a cheap Opteron? Sounds pretty cool, until you realize how woefully slow each one of them is and how they get trashed by Xeons costing the same and having much stronger board availability. I didn't buy anything, because I really had no space to keep something like that running for a long time, but it was very tempting.

Yes, obviously, but my point was that compression/decompression workloads tend to be intermittent and relatively short-lived (unless you're working on something that requires on-the-fly decompression of some huge dataset, in which case you're looking at a much more complex workload to begin with). They might be tens or even hundreds of GB, but that doesn't take all that much time with even a mid-range CPU.
Some compressed files don't have to be big to be hard to decompress; it depends on how complicated the archive is. I still remember suffering for over 4 hours with an Athlon X4 845 and a 5-6 GB archive.
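A minimal sketch (Python standard library only) of why that happens: a solid LZMA stream has to be decoded sequentially on a single core, so decompression time tracks the uncompressed payload and the filter settings chosen at compression time, not the archive's size on disk. Timings will obviously vary wildly between CPUs.

```python
import lzma
import time

# Highly repetitive payload: packs into a tiny archive on disk, but still
# has to be decoded sequentially, byte for byte, on one core.
payload = b"moderately repetitive filler text " * (2 * 1024 * 1024)  # ~68 MB

blob = lzma.compress(payload, preset=6)
print(f"{len(payload) / 2**20:.0f} MB packed into {len(blob) / 2**20:.2f} MB")

start = time.perf_counter()
restored = lzma.decompress(blob)  # single-threaded, start to finish
print(f"decompressed in {time.perf_counter() - start:.2f} s")
assert restored == payload
```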

Yeah, they were good if you had a low budget and workloads that were specifically MT-heavy and integer-only (with that shared FP unit per two cores making it effectively half the core count in FP loads). So, yes, they were great value for very specific workloads, but they were quite poor overall - and while this gave them a good niche to live in, the lack of flexibility is part of why Intel's perceived advantage grew so huge (in addition to the inefficiency and thus lack of scaling, of course).
I bought an FX 6300 because it was awesome. I just couldn't accept that my money was only worth some poopy 2C/4T i3. It felt insulting, so I got the FX 6300 instead. And it aged quite well; the i3 was a stuttering mess a year or two later.

Hm, thanks for linking those. Seems the board partners have been even worse than I thought - and, of course, Intel deserves some flak for not ensuring that they meet the requirements of the platform. This is simply not acceptable IMO.
It's the same with AMD platforms, and to be honest, all brands produce overheating boards. It's just that Gigabyte and ASRock do it more often, though I don't doubt that Biostar would fare quite poorly too. MSI and Asus seem to be generally more robust, and MSI stopped making boards that catch on fire. I remember seeing Dawid's video about an NZXT board, and NZXT proved to be a brand to avoid. Still, the general suggestion is to shop by model, not by brand.

And beyond overheating, you can have a really shit experience for entirely different reasons:

Now that board is completely dead.


In theory you're right, but if you're buying a high-end CPU for a heavy, sustained workload, you really should also buy something with a VRM to match. If not, you're making a poor choice, whether due to ignorance or poor judgement. That could still be an H510 board of course, it's just that nobody makes an H510 board with those kinds of VRMs. That doesn't take away the responsibility of Intel and board manufacturers to ensure boards can meet spec - i.e. every board should be able to handle PL1 power draws indefinitely for the most power-hungry part on the platform, and should also handle PL2 reasonably well as long as conditions aren't terrible (i.e. there is some ventilation cooling the VRMs and ambient temperatures are below 30°C or thereabouts). The issue, and why I said you shouldn't buy that combo, is that all the H510 boards are bargain-basement boards that generally can't be expected to do this, and are almost guaranteed to leave performance on the table - which undermines the "I only need CPU perf, not I/O" argument for going that route in the first place.
But what if you really are fine with H510 features? Or A520 features? I'm aware that this is not a common consideration, but if manufacturers claim that an i9 or a 5950X works with a board and then it doesn't, that's just unacceptable.
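For what it's worth, on Linux you can at least verify what PL1/PL2 the board actually programmed instead of trusting the spec sheet. A minimal sketch reading the stock intel-rapl powercap sysfs files (values are in microwatts; reading may require root on some distros):

```python
from pathlib import Path

# Package-level RAPL domain; constraint_0 is the long-term limit (PL1),
# constraint_1 the short-term one (PL2).
pkg = Path("/sys/class/powercap/intel-rapl:0")

for idx in (0, 1):
    name = (pkg / f"constraint_{idx}_name").read_text().strip()
    limit_uw = int((pkg / f"constraint_{idx}_power_limit_uw").read_text())
    print(f"PL{idx + 1} ({name}): {limit_uw / 1_000_000:.0f} W")
```

Boards that ship with "unlimited" limits out of the box show up here as some absurd value rather than the spec numbers.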


*raises hand* I used my C2Q 9450 from 2008 till 2017 ;) You're entirely right that these chips lasted a long time. However, a huge part of that is that MSDT platforms went quite rapidly from 1 to 4 cores, and then stopped for a decade, meaning software wasn't made to truly make use of them. HEDT platforms had more cores, but were clocked lower, meaning you had to OC them to match the ST perf of MSDT parts - and they still topped out at 6 or 8 cores. Now that we've had 8+ core MSDT for four years, we're seeing a lot more software make use of it, which is great as it actually makes things faster, rather than the <10% perf/year ST improvements we got for most of the 2010s.
And Core 2 Quads lasted, but Core 2 Duos didn't. Just a couple of years after the Core 2 Duo launch, games like Red Faction: Guerrilla were absolutely hammering Core 2 Duos, and you really had to have that Quad to play certain games. Far Cry 2 was another surprisingly CPU-heavy title.


That's quite different from what I'm used to seeing across these and other forums + in my home markets. Here, Celerons and Pentiums have been nearly nonexistent since ... I want to say 2017, but it might have been 2018, when Intel's supply crunch started. Ryzen 3000G chips have come and gone in waves, being out of stock for months at a time, particularly the 3000G and 3200G, and the Ryzen 3100, 3300 and 3500 have been nearly unobtainable for their entire life cycle. These chips seem to have been distributed almost entirely to OEMs from what I've seen, but clearly there are regional differences.
In my country, only some graphics cards were missing for a short time, while pricing became insane. Over 1k EUR for a Vega 56 or 800 EUR for an RX 580 - well, shit happened. It made the C19 situation look meek.

Hm, that sounds ... yes, mind-boggling. At least from that description it sounds like a society with a significant fixation on wealth and prestige - not that that's uncommon though.
And yet historically Lithuanian society has been meek, introverted, extremely independent, hating fellow Lithuanians more than anyone else (unless they are Russians or Poles), poor, industrious and thrifty. If you imagine a Lithuanian living in the middle of nowhere or in a small village and spending most of their time farming, that's exactly what they were. Hating Lithuanians more than anyone else is a modern bit; previously Lithuanians were very conservative, traditional and judgmental.


Still, in Norway and Sweden too we've gone from a flagship phone at 4-5,000 NOK being considered expensive a decade ago to 10-15,000 NOK flagships selling like hotcakes these days. Of course people's usage patterns and dependency on their phones have changed massively, so it makes some sense for phones to be valued higher and for people to be more willing to spend on them, but that's still a staggering amount of money IMO. PC parts haven't had quite the same rise, possibly due to being more of an established market + more of a niche. There have always been enthusiasts buying crazy expensive stuff, but there's a reason why the GTX 1060 is still the most used GPU globally by a country mile - cheaper stuff that performs well is still a much easier sell overall.
My parents went from flip phones straight to flagship phones. My dad has an S6 and my mom has an S9. Somehow they didn't consider going cheaper, despite talking a lot about how valuable money is.


I think the promise of Intel HEDT-like performance (largely including the ST performance, though of course you could OC those HEDT chips) was much of the reason here, as well as the 5+ year lack of real competition making an opening for a compelling story of an up-and-coming company. If nothing else, Zen was exciting, as it was new, different, and competitive. You're mistaken about the power consumption though: Zen was very efficient from the start. That was perhaps the biggest shock - that AMD went from (discontinued) 250+W 8c CPUs that barely kept up with an i5 to ~100W 8c CPUs that duked it out with more power hungry Intel HEDT parts, and even competed on efficiency with Intel MSDT (though their ST perf was significantly behind).
Zen 1 was quite cool, but very problematic at launch.


I also think the surprise at "Intel being good again" is mainly due to Intel first not delivering a meaningful performance improvement for a solid 4 years (incremental clock boosts don't do much), and failing to get their new nodes working well. Rocket Lake underscored that with an IPC regression in some workloads - though that was predictable from the disadvantages of backporting a 10nm design to 14nm. Still, this is the first time in more than half a decade that Intel has delivered a true improvement in the desktop space. Their mobile parts have been far better (Ice Lake was pretty good, Tiger Lake was rather impressive), but they've had clear scaling issues going above 50-ish watts and beyond the very short boost durations of laptops. They've finally demonstrated that they haven't lost what made them great previously - it just took a long time.
We will see if they haven't lost it with the next release. Alder Lake is quite okay, just not like Ryzen was, and not like Sandy Bridge was. The key is continuous strong execution.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
This type of clarity was missing in past reviews on this site and it is much appreciated. More transparency on the awards is a good thing
There has been so much drama about my awards, so now I'm trying to be a bit more explicit where I suspect people will be like "wtf why is this getting an award? wizz must have been bought by <amd/nvidia/intel/all three>". Breaking down every single award recommendation seems overkill though, and would essentially replace the conclusion text.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,000 (4.60/day)
Location
Kepler-186f
There has been so much drama about my awards, so now I'm trying to be a bit more explicit where I suspect people will be like "wtf why is this getting an award? wizz must have been bought by <amd/nvidia/intel/all three>". Breaking down every single award recommendation seems overkill though, and would essentially replace the conclusion text.

I had no idea! I actually wasn't even thinking of your reviews specifically, just reviews in general on this site. Usually they are on target; other times I scratch my head and wonder how something is highly recommended yet has more negatives than pros in the review section... just stuff like that, not necessarily anyone specifically. It all works out at the end of the day though. I think the reviews are just fine; I like the way and format they are done here.
 