
AMD Ryzen 9 9950X3D

Didn't check that specifically. I vaguely remember seeing in some of my data that it's around 5.5 GHz, because at first I was like "uh, didn't they say 5.7?"

The 5.7 GHz boost clock is for the non-3D V-Cache CCD; the 3D V-Cache CCD should have lower clocks (closer to the 9800X3D).

It's not strange at all. More power per core = higher temps.

The 3D V-Cache CCD on the x950X3D is usually the same as the 8c/16t CCD on the 9800X3D/7800X3D/5800X3D. The Non 3D V-Cache one is the one that has higher Core Clocks and better efficiency.

Clock speeds are higher on the 9950X3D under load.

View attachment 389234, View attachment 389235

The clocks on the left are from the non-3D V-Cache CCD, but the 3D V-Cache one should be around the 9800X3D clock-wise. Still, the 9800X3D should definitely not run warmer than the 9950X3D.
Hey @W1zzard could you check again, please?
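For anyone who wants to sanity-check the per-CCD clocks themselves, here is a minimal sketch (Python, reading Linux sysfs) that samples core frequencies under load and reports the peak seen on each CCD. The core_id split below is my assumption (0-7 = V-Cache CCD, 8-15 = frequency CCD); confirm the mapping on your own chip with lscpu -e or HWiNFO before reading anything into it.

```python
# Minimal sketch: sample per-core clocks from Linux sysfs while a game or
# benchmark is running and report the peak seen on each CCD.
# Assumption: core_id 0-7 is the 3D V-Cache CCD and 8-15 the frequency CCD;
# verify the real mapping on your chip (e.g. with lscpu -e) before trusting it.
import glob
import time
from collections import defaultdict

def sample_once():
    peak = defaultdict(float)  # CCD index -> highest clock seen this sample (MHz)
    for cpu_dir in glob.glob("/sys/devices/system/cpu/cpu[0-9]*"):
        try:
            core = int(open(cpu_dir + "/topology/core_id").read())
            mhz = int(open(cpu_dir + "/cpufreq/scaling_cur_freq").read()) / 1000
        except FileNotFoundError:
            continue  # offline CPU or no cpufreq entry
        ccd = 0 if core < 8 else 1  # assumed split, see comment above
        peak[ccd] = max(peak[ccd], mhz)
    return peak

for _ in range(10):  # roughly 10 seconds of sampling while the load is running
    p = sample_once()
    print(f"CCD0 (V-Cache?): {p[0]:7.0f} MHz   CCD1: {p[1]:7.0f} MHz")
    time.sleep(1)
```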
 
Curious to see how the 9950X3D performs against the 9800X3D in PUBG with the second CCD disabled. PUBG is pretty sensitive to CPU topology, and with the 9950X3D having a larger total L3 cache, the comparison could be interesting.
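Short of toggling it in the BIOS, you can approximate "second CCD disabled" for a single title by pinning the game process to the V-Cache CCD. A rough sketch with psutil below; the executable name and the assumption that logical CPUs 0-15 belong to the V-Cache CCD are mine, not from the review, so double-check the mapping on your own machine.

```python
# Rough sketch: pin a running game to the 3D V-Cache CCD only, approximating
# "second CCD disabled" for that single process.
# Assumptions (verify on your own system, e.g. with HWiNFO or Ryzen Master):
#  - logical CPUs 0-15 are the V-Cache CCD's 8 cores plus their SMT siblings
#  - the executable name below is just an example, not taken from the review
import psutil

VCACHE_CPUS = list(range(16))   # assumed logical CPUs of the V-Cache CCD
GAME_EXE = "TslGame.exe"        # hypothetical target process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(VCACHE_CPUS)  # restrict scheduling to those CPUs
        print(f"Pinned PID {proc.pid} to logical CPUs {VCACHE_CPUS}")
```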
 
It is interesting reading the comments from people saying that turning off a CCD would be better for gaming. Did any of them read the actual review? Gaming is not just running the game but also loading, installing mods and creating folders. The review shows that this chip is good enough that people on the 7000 series would feel it in daily use. The charts comparing it with the 7950X3D especially are eye-opening for what AMD is still able to achieve in one generation. They are all sold out where I live right now.
 
It’s a great CPU, but too power hungry for my taste. The 9700X or 9800X3D would be what I’d go for. The big thing this exposes is AMD’s horrible idle power consumption: 30 W higher than Intel, so a lot of room for improvement there.
 

Most modern games are still optimized for 8c/16t (just like on consoles), so having more than 8c/16t doesn't gain you anything in gaming. If anything, the extra CCD adds more latency and therefore lowers performance. And the 2nd CCD doesn't have the 3D V-Cache either, so it doesn't help anyway...
Zen 6 (and probably the PS6 + next Xbox too) should have 12c/24t per CCD, so we should see more and more games using more than 8c/16t in the future. Let's hope Zen 6 3D will pack two 3D V-Cache CCDs so we can have 24c/48t with 3D V-Cache!
 
@W1zzard I appreciate your review. There was something that caught my attention.
View attachment 389531
How can the Intel 285K's performance be so much worse in Elden Ring? Is Easy Anti-Cheat still causing issues for the CPU?
But wait, there's more:
5090 @720p vs 4090 @720p
In Counter-Strike 2 @720p the 285K scored clearly lower with the 5090 than with the 4090 (the 14900K also but marginally). Every other CPU that's present in the 9950X3D review scored clearly higher.
In Elden Ring @720p the 9800X3D scored marginally lower with the 5090 than with the 4090, the 7800X3D about the same, the 7950X3D & 9950X scored higher with the 5090 and finally the 285K & 14900K scored clearly lower with the 5090.
In The Last of Us @720p the 285K, 14900K & 9950X scored a bit higher with the 5090 but the 9800X3D, 7950X3D & 7800X3D scored lower with the 5090.
In Elden Ring + RT @720p the 285K scored lower with the 5090 and everything else was basically the same as with the 4090.
5090 @1080p vs 4090 @1080p
In Counter-Strike 2 @1080p the 285K scored basically the same with the 5090, everything else scored higher.
In Cyberpunk 2077 @1080p the 14900K scored basically the same (marginally higher if we would split hairs) with the 5090, everything else scored higher.
In Elden Ring @1080p the 285K scored clearly lower with the 5090, the 14900K & 9950X basically the same, the 9800X3D & 7800X3D marginally higher and the 7950X3D clearly higher.
In The Last of Us @1080p everything improves apart from the 7800X3D, which stays the same.
In Elden Ring + RT @1080p the 285K scored marginally lower with the 5090 and everything else scored marginally higher.
5090 @1440p vs 4090 @1440p
In Elden Ring @1440p the 285K scored clearly lower with the 5090, everything else scored higher.
5090 @4K vs 4090 @4K
In Elden Ring @4K the 285K scored marginally higher with the 5090 while everything else scored clearly higher.
5090 min. fps vs 4090 min. fps
In Alan Wake 2 @1080p the 285K has lower min. fps with the 5090.
In Counter-Strike 2 @1080p the 285K has lower min. fps with the 5090.
In Elden Ring @1080p the 285K has lower min. fps with the 5090.
In Elden Ring + RT @1080p the 285K has clearly lower min. fps with the 5090 and the 14900K marginally lower min. fps.
In Elden Ring @4K the 285K has the same min. fps with the 5090, everything else clearly improves.
In Elden Ring + RT @4K the 285K has lower min. fps with the 5090, everything else slightly improves.

So in conclusion, I personally am not ready to call Arrow Lake a total failure in gaming. There are obvious issues, and who knows whether they can be fixed or reduced, but the software needs scrutiny as well. We can't just treat apps and games (software in general) as pure, innocent and without fault; if that were the case, it would be easy to point fingers at CPUs or other hardware for having a broken architecture. But simply looking at Elden Ring's engine behaviour tells me this is one of the game engines of all time when it comes to benchmarking.
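If anyone wants to redo this 4090 vs 5090 comparison without eyeballing the charts, a tiny helper like the sketch below computes the relative change per CPU and flags regressions; the fps values in it are placeholders, not TechPowerUp's data.

```python
# Sketch: compute the relative 4090 -> 5090 change per CPU and flag regressions.
# The fps values below are placeholders, not numbers from the review.
results_4090 = {"285K": 180.0, "14900K": 195.0, "9800X3D": 240.0}
results_5090 = {"285K": 171.0, "14900K": 196.0, "9800X3D": 252.0}

for cpu, fps_4090 in results_4090.items():
    delta = (results_5090[cpu] - fps_4090) / fps_4090 * 100
    verdict = "regression" if delta < -1.0 else "ok"
    print(f"{cpu:>8}: {delta:+6.1f}%  ({verdict})")
```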
 
It's crazy how CPU limited Spider-Man 2 + RT is... all CPUs have the same fps at 1080p, 1440p and 4K :cry: ZEN 6 will probably bring a good performance bump in this game!
 
Great review. The only thing I’m going to say moving forward is: test Intel at higher DDR5 speeds, since the architecture supports it. Memory above 7200 MT/s is common and prices are similar now. It’s fairer to test with the best available for each system. No one with an Intel build is going to use “slow” 6000 RAM. AMD is crushing them as is; let’s see a valid Intel test using faster RAM.
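For a rough sense of how much headroom that is on paper (theoretical peak only, dual-channel DDR5 at 8 bytes per 64-bit channel per transfer), a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope: theoretical peak bandwidth of dual-channel DDR5.
# peak GB/s = transfer rate (MT/s) * 8 bytes per 64-bit channel * 2 channels / 1000
for mts in (6000, 7200, 8000):
    peak_gbs = mts * 8 * 2 / 1000
    print(f"DDR5-{mts}: {peak_gbs:.1f} GB/s theoretical peak")
```

Of course real-world gains depend on timings, gear mode and the memory controller, so the gaming delta will be smaller than the raw bandwidth numbers suggest.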
 