
10°C difference between CCD1 and CCD2 on AMD 5950X

System Name R9 5950x/Skylake 6400
Processor R9 5950x/i5 6400
Motherboard Gigabyte Aorus Master X570/Asus Z170 Pro Gaming
Cooling Arctic Liquid Freezer II 360/Stock
Memory 4x8GB Patriot PVS416G4440 CL14/G.S Ripjaws 32 GB F4-3200C16D-32GV
Video Card(s) 7900XTX/6900XT
Storage RIP Seagate 530 4TB (died after 7 months), WD SN850 2TB, Aorus 2TB, Corsair MP600 1TB / 960 Evo 1TB
Display(s) 3x LG 27gl850 1440p
Case Custom builds
Audio Device(s) -
Power Supply Silverstone 1000watt modular Gold/1000Watt Antec
Software Win11 Pro / Win10 Pro / Win10 Home / Win7 / Vista 64-bit and XP Pro
I was planning a repaste of my CPU to see if there would be any difference - and to be honest, I could not remember if I had used my new Noctua NT-H2 paste on my own CPU or only on some machines I have built during the last couple of months... Doing testing afterwards - burn-in, if you can call it that - I noticed that there was a 10°C difference between CCD1 and CCD2
1619778263418.png

Looking at some earlier tests from February, I saw almost exactly the same difference between the two CCDs - with CCD1 up around 78-80°C and CCD2 almost 10°C lower. Now, there's always some difference in mount and paste, but this seems more or less consistent.

My question is: what do other dual-CCD owners experience? E.g. the 3900X/3950X and 5900X/5950X models?

Did the "burn-in" with the CPU-Z stress test
1619778722268.png
 
Same behaviour on my 5900X, and I'm not on water. I'll upload some screenies tomorrow to show what I see. If I play MW that day, CCD1 maxes out at 80-85°C and CCD2 stays around 10°C cooler, if not more.

It's usually just a matter of usage. Which makes sense: CCD1 will always have the best cores on the chip, so Windows keeps tasks on those two cores (Core 0 and 1 for me). Sometimes Windows likes to use one core off CCD2 for background processing (I think it's Core 7 or something for me), but even when Windows needs to fill six cores it will stay within CCD1 until it absolutely needs more cores than CCD1 can provide. Most days I'll see 13-15 W max on Core 0 and 1, around 10-11 W max on Cores 2-5, then single-digit power draw for CCD2.
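That scheduler preference is easy to spot from per-core load samples. A minimal sketch with made-up utilization numbers (real data would come from something like psutil.cpu_percent(percpu=True) or an HWiNFO log - the core layout here is an assumption):

```python
# Rank logical cores by utilization to see which ones the scheduler favors.
def busiest_cores(loads, top=4):
    """Return indices of the most-loaded cores, highest load first."""
    return sorted(range(len(loads)), key=lambda i: loads[i], reverse=True)[:top]

# 12 cores: CCD1 = cores 0-5, CCD2 = cores 6-11 (hypothetical sample)
loads = [92, 88, 45, 40, 38, 35, 12, 8, 5, 4, 3, 2]
print(busiest_cores(loads, top=2))  # → [0, 1], the two "best" cores carry the load
```

On a lightly loaded dual-CCD chip you'd expect the top slots to keep landing on the preferred cores, just like the Core 0/1 behaviour described above.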

The max temp readings on Ryzen 5000 seem to almost always be transient spikes + 10-20C under single threaded load, never average or consistent temps. For example I have HWInfo open on my second screen and while CCD1 stays at 60-75C mostly, I'll occasionally see a spike up to 80-85C for a single polling interval and back down again.
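If you log the sensor data (HWiNFO can export to CSV), the average vs. max makes this obvious. A minimal sketch with made-up per-poll readings, not real telemetry:

```python
# Max temp is dominated by one-poll spikes; the average tells the real story.
def summarize(temps, spike_margin=10.0):
    """Average, max, and count of readings far above the average."""
    avg = sum(temps) / len(temps)
    spikes = [t for t in temps if t > avg + spike_margin]
    return {"avg": round(avg, 1), "max": max(temps), "spikes": len(spikes)}

readings = [62, 64, 63, 65, 61, 84, 63, 62, 66, 64]  # one transient spike
print(summarize(readings))  # → {'avg': 65.4, 'max': 84, 'spikes': 1}
```

The 84°C max looks alarming on its own, but it's a single poll nearly 20°C above the 65°C average - exactly the pattern described above.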

What I instead found interesting was that in some all-core benchmarks I'd have CCD2 running consistently hotter by about 3°C. All 12 cores under max load, same Vcore, etc. I never really figured out why; there's no mounting issue. Not a problem, since benchmarks always top out at 70-72°C max, and I don't run PBO.
 
Could be due to the design - the placement of the cores might benefit from an offset cooler mount.
 
My CCD temps only differ by about 2°C on average in CPU-Z and R20. In day-to-day usage they can obviously differ by 10-15°C, but in consistent threaded tasks they're pretty close together.

R20
temp test r20.png


CPU-Z
temp test cpuz.png
 
A tiny little bubble in the solder and you see results like that.
My 5900X had almost a 25°C difference under high load... until it landed on eBay.
 
My first 3900 had a 15-20 degree difference, and it was caused by CCD0 using more voltage than it needed.

For example, each core on CCD1 was 7-8 W under full load while CCD0 was 10-11 W, so I RMA'd it - on my current 3900, both CCDs use 7-8 W per core.
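A quick sanity check along these lines: average the per-core power by CCD from logged readings and compare. Minimal sketch with illustrative numbers - the 6-cores-per-CCD split and the wattages are assumptions, not real telemetry:

```python
# Compare average per-core package power between the two chiplets.
def ccd_power_avg(core_watts, cores_per_ccd=6):
    """Return (CCD0 avg, CCD1 avg) in watts per core."""
    ccd0 = core_watts[:cores_per_ccd]
    ccd1 = core_watts[cores_per_ccd:]
    return (sum(ccd0) / len(ccd0), sum(ccd1) / len(ccd1))

watts = [10.5, 10.8, 11.0, 10.2, 10.6, 10.9,  # CCD0: suspiciously high
         7.4, 7.8, 7.6, 7.2, 7.5, 7.7]        # CCD1: normal
a, b = ccd_power_avg(watts)
print(f"CCD0 {a:.1f} W/core vs CCD1 {b:.1f} W/core")
```

A 3 W/core gap under the same full load, like in the RMA case above, would point at one chiplet drawing far more than the other.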
 
I can echo what @tabascosauz and @thesmokingman have experienced.

I have a 3900X and a 3950X that are running a constant all-core load 24/7, and the average difference is 2°C over an 86-hour period (time since the last reboot), with CCD1 having the higher temp on both CPUs.

I don't even look at my 5800X because of temperature anxiety. :fear:
 
Reading this thread gets me curious how my new 5950X will behave once I get it up and running. Still awaiting the last parts to arrive.

I will be cooling it with a Noctua NH-D15 chromax.black with Thermal Grizzly Kryonaut Extreme, and I'll swap the stock fans for Noctua industrialPPC 3000 RPM fans.
 
@jesdals there's that temp delta between CCD1 and CCD2. If I run all-core benchmarks only, CCD2 is always hotter by 3°C. If I just use the computer normally and do some gaming, Core 0 and 1 start being used and CCD1 starts outstripping CCD2 by 10-15°C.

Bench temps:

5900x temps.png


Daily temps:

5900x normal temps.png

NH-C14S with a NF-A14 iPPC-2000.
 
I was planning a repaste of my CPU to see if there would be any difference - and to be honest, I could not remember if I had used my new Noctua NT-H2 paste on my own CPU or only on some machines I have built during the last couple of months... Doing testing afterwards - burn-in, if you can call it that - I noticed that there was a 10°C difference between CCD1 and CCD2
View attachment 198624
Looking at some earlier tests from February, I saw almost exactly the same difference between the two CCDs - with CCD1 up around 78-80°C and CCD2 almost 10°C lower. Now, there's always some difference in mount and paste, but this seems more or less consistent.

My question is: what do other dual-CCD owners experience? E.g. the 3900X/3950X and 5900X/5950X models?

Did the "burn-in" with the CPU-Z stress test
View attachment 198628

HWiNFO can tell a lot if you look in the right place.
Your answer is here

HWiNFO_.png


CCD1 uses about 25% more power than CCD2. And also... where the heck is "Core 0 Power (SMU)"...???

It could be differently binned CCDs, or maybe your V/F curve settings (Curve Optimizer).

I suggest getting the latest version of HWiNFO64 and using "Snapshot CPU Polling", as it's more accurate. You can also turn on the tooltip function for the sensors.
 
CCD1 uses +25% power compared to CCD2. And also... where the heck is "Core 0 Power (SMU)"...???

It's a known bug with pre-v7.00 HWiNFO when the system is running AGESA 1.2.0.1 or newer. Same problem with the C-state residency numbers in that image. Man needs to update his HWiNFO.
 
Gave the new HWiNFO a try - still seeing the same picture. It doesn't work with individual curve settings on my CPU, but I wonder if it would be possible to lower the difference with a bigger negative offset on the CCD2 cores.
1619897112800.png
 
You're using Curve Optimizer... that's your problem, I suspect. The cores are not getting the same voltage, nor are they controlled by AMD's logic. I would test at stock and see if the pattern continues.
 
You're using Curve Optimizer... that's your problem, I suspect. The cores are not getting the same voltage, nor are they controlled by AMD's logic. I would test at stock and see if the pattern continues.
Trying to disable Curve Optimizer shows the same difference - but with lower boost. I'll see if I can get individual per-core settings stable with the curve setting.
1619942828061.png
 
Trying to disable Curve Optimizer shows the same difference - but with lower boost. I'll see if I can get individual per-core settings stable with the curve setting.
View attachment 198861
Still, this is not entirely stock, judging by the 186 W PPT. It's PBO enabled with manual limits and a boost override, right?

I think this is differently binned chiplets, assuming the difference also shows up at pure stock settings.
 