Wednesday, January 4th 2023

AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

AMD today announced its Ryzen 7000X3D "Zen 4" desktop processors with 3D Vertical Cache technology. With these, the company claims to have the world's fastest processors for gaming. It claims to have beaten the Intel Core i9-13900K "Raptor Lake" in gaming by a margin it believes is wide enough to remain competitive even against the upcoming Core i9-13900KS. At the heart of these processors is the new "Zen 4" 3D Vertical Cache (3DV cache) CCD, which features 64 MB of L3 cache stacked on top of the region of the "Zen 4" CCD that holds the on-die 32 MB L3 cache. The 3DV cache runs at the same speed as the on-die L3 cache and is contiguous with it, so the CPU cores transparently see 96 MB of addressable L3 cache.

3DV cache proved to have a profound impact on gaming performance with the Ryzen 7 5800X3D "Zen 3" processor, helping it beat "Alder Lake" in gaming workloads despite "Zen 3" being a generationally older microarchitecture; AMD claims to have repeated this magic with the 7000X3D "Zen 4" series, enabling it to beat Intel "Raptor Lake." Unlike with the 5800X3D, AMD doesn't intend to make gaming performance a trade-off against multi-threaded creator performance, so it is introducing 12-core and 16-core SKUs as well, giving you gaming performance alongside plenty of muscle for creator workloads.
The series consists of three SKUs: the 8-core/16-thread Ryzen 7 7800X3D, the 12-core/24-thread Ryzen 9 7900X3D, and the flagship 16-core/32-thread Ryzen 9 7950X3D. The 7800X3D comes with an as-yet-undisclosed base frequency above the 4.00 GHz mark, along with boost up to 5.00 GHz. The 7900X3D has a 4.40 GHz base frequency and boosts up to 5.60 GHz. The flagship 7950X3D ticks at 4.20 GHz base and boosts up to 5.70 GHz.

There's something interesting about the cache setup of the three SKUs. The 7800X3D has 104 MB of total cache (L2+L3), whereas the 7900X3D has 140 MB and the 7950X3D has 144 MB. The 8-core CCD in the 7800X3D has 64 MB of 3DV cache stacked on top of the 32 MB on-die L3 cache, resulting in 96 MB of L3 cache, and with each of the 8 cores having 1 MB of L2 cache, we arrive at 104 MB of total cache. If both CCDs of the dual-CCD parts carried the same 64 MB stack, the 7900X3D and 7950X3D should logically have 204 MB and 208 MB of total cache, respectively, but they don't.

While we await more details from AMD on what's happening here, there are two theories. One holds that the 3DV cache on the 7900X3D and 7950X3D is just 32 MB per chiplet, for 64 MB of L3 cache per CCD. The 7900X3D's 140 MB of total cache would hence come from (2 x 64 MB L3) + (12 x 1 MB L2); and the 7950X3D's 144 MB from (2 x 64 MB L3) + (16 x 1 MB L2).

The second, more radical, theory holds that only one of the two CCDs has 64 MB of 3DV cache stacked on top of its on-die 32 MB L3 cache, while the other is a conventional "Zen 4" CCD with just 32 MB of on-die L3 cache. The math checks out here, too. Dating all the way back to the Ryzen 3000 "Zen 2" Matisse dual-CCD processors, AMD has worked with Microsoft to optimize the Windows 10 and Windows 11 schedulers to localize gaming workloads to one of the two CCDs (using methods such as CPPC2 preferred-core flagging), so if these processors indeed have an asymmetric L3 cache setup between the two CCDs, the OS would prefer the one with the 3DV cache for gaming workloads.
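
Notably, both theories produce identical totals, which is why AMD's published numbers alone can't distinguish them. To make the arithmetic concrete, here is a minimal sketch in Python (our own illustration, using only the cache figures discussed above):

```python
# Total cache (MB) = L3 across all CCDs + 1 MB of L2 per "Zen 4" core.
def total_cache(l3_per_ccd, cores):
    return sum(l3_per_ccd) + cores * 1

print(total_cache([96], 8))        # 7800X3D: 32 on-die + 64 3DV -> 104, matches AMD
print(total_cache([96, 96], 12))   # 204 -> if both CCDs had full 3DV; not what AMD lists
print(total_cache([96, 96], 16))   # 208 -> ditto
print(total_cache([64, 64], 12))   # Theory 1, 7900X3D -> 140, matches
print(total_cache([64, 64], 16))   # Theory 1, 7950X3D -> 144, matches
print(total_cache([96, 32], 12))   # Theory 2, 7900X3D -> 140, also matches
print(total_cache([96, 32], 16))   # Theory 2, 7950X3D -> 144, also matches
```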

In its presentation, AMD attaches the phrase "the world's best gaming processor" to the 7800X3D, not the 7950X3D. This suggests that despite its lower maximum boost frequency, the 7800X3D should offer the best gaming performance among the three SKUs, and very likely features the full 96 MB of L3 cache on its CCD; whereas the 7900X3D and 7950X3D feature either less 3DV cache per CCD, or the asymmetric L3 cache setup we theorized.
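
Should the asymmetric-cache theory hold and scheduler support lag behind, enthusiasts could approximate the localization by hand with processor affinity. Below is a minimal sketch using the psutil Python library; the assumption that the 3DV-cache CCD is CCD0 exposing logical CPUs 0-15, and the process name, are ours, purely for illustration:

```python
import psutil

# ASSUMPTION: the 3DV-cache CCD is CCD0, exposing logical CPUs 0-15
# (8 cores x 2 SMT threads). Verify the actual mapping on your system.
CACHE_CCD_CPUS = list(range(16))

def pin_to_cache_ccd(process_name: str) -> None:
    """Restrict every process with the given name to the cache CCD."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(CACHE_CCD_CPUS)  # limit scheduling to CCD0
            print(f"Pinned PID {proc.pid} to CPUs {CACHE_CCD_CPUS}")

pin_to_cache_ccd("game.exe")  # hypothetical game executable name
```
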
In terms of performance, AMD is claiming anywhere between 21% and 30% gaming performance gains for the 7800X3D over the previous-generation 5800X3D. These can be attributed to the IPC increase of the "Zen 4" core and faster DDR5 memory. AMD says the 7800X3D should particularly shine in CPU-limited gaming scenarios, such as lower-resolution, high refresh-rate setups.
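
As a rough sanity check of that range (our back-of-the-envelope arithmetic, not AMD's methodology): combining AMD's stated ~13% "Zen 4" IPC uplift with the boost-clock step from the 5800X3D's 4.50 GHz to the 7800X3D's 5.00 GHz lands squarely inside it, before any contribution from faster DDR5:

```python
ipc_uplift = 1.13        # AMD's stated ~13% "Zen 4" IPC gain over "Zen 3"
clock_ratio = 5.0 / 4.5  # 7800X3D boost vs. 5800X3D boost
print(f"{(ipc_uplift * clock_ratio - 1) * 100:.0f}%")  # ~26%, within 21-30%
```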

The 7950X3D is claimed to beat the Core i9-13900K in gaming performance by anywhere between 13% and 24% in the four tests AMD showed, while also offering big gains in multi-threaded productivity benchmarks. Especially in workloads involving large streaming data, such as file compression and DaVinci Resolve, the 7950X3D is shown offering 24% to 52% performance leads over the i9-13900K (a gap we doubt the i9-13900KS can close).

The Ryzen 7000X3D processors will be available from February 2023, and should be drop-in compatible with existing Socket AM5 motherboards, with some boards requiring a BIOS update. The USB BIOS Flashback feature is standardized by AMD across motherboard brands, so this shouldn't be a problem.

177 Comments on AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

#151
Vya Domus
fevgatos: Because your card is bottlenecking the crap out of your CPU.
Every card is going to be bottlenecked with RT at high resolutions, genius. I can run the same benchmark at 720p and the same thing happens. "Insanely heavy" usage should be at least observable, because you know, it's insane after all.
fevgatos: Aren't you bored arguing with me and always being proven wrong?
Proven wrong by what? Aren't you tired of making crazy claims with no proof whatsoever? You previously mentioned some links that you said show the insane CPU usage when RT is enabled, so post them.
#152
fevgatos
Vya Domus: Every card is going to be bottlenecked with RT at high resolutions, genius. I can run the same benchmark at 720p and the same thing happens. "Insanely heavy" usage should be at least observable, because you know, it's insane after all.

Proven wrong by what? Aren't you tired of making crazy claims with no proof whatsoever? You previously mentioned some links that you said show the insane CPU usage when RT is enabled, so post them.
Then drop to 720p and show me your fps with RT on and RT off. Then you'll realize that your CPU manages a lower framerate with RT on.

But never mind, you are a lost cause.
#153
Vya Domus
fevgatos: But never mind, you are a lost cause.
Why don't you just stop stalling and either show us those links so we can settle this, or admit you have no proof?
fevgatos: Then drop to 720p and show me your fps with RT on and RT off.
Pointless, it just does what you expect it to: a couple of threads are close to being maxed out while most others do nothing. The combined CPU usage across all threads is like 15% either way; no such thing as insanely heavy usage.
#154
fevgatos
Vya Domus: Why don't you just stop stalling and either show us those links so we can settle this, or admit you have no proof?

Pointless, it just does what you expect it to: a couple of threads are close to being maxed out while most others do nothing. The combined CPU usage across all threads is like 15% either way; no such thing as insanely heavy usage.
I didn't ask you about the usage, I asked you about the framerate. :roll:

Dodging united
#155
Vya Domus
fevgatos: Dodging united
Lmao, that's rich considering the fact that you're literally doing anything except showing the proof. I wonder why; probably because there is none and it's all just utter nonsense.

Just post the links bro.
fevgatos: I asked you about the framerate.
The framerate always goes down with RT on, at all resolutions, how dense can you be?
#156
Dredi
fevgatos: I don't even know what that means.
And it shows. I already stated some things that NEED to be considered when designing and running tests. If you disagree, you are simply wrong. And no, I have no horse in this race; I simply asked you questions in the first place because of the roughly 1000% larger claimed performance difference between the two processors, compared to what GN and TPU, for example, have managed to show, and was intrigued about what it was all about.
#157
fevgatos
Vya Domus: The framerate always goes down with RT on, at all resolutions, how dense can you be?
So the framerate goes down due to the CPU... which is what I'm saying all this time. The proof is right there; what the heck you are talking about is beyond me.
Dredi: And it shows. I already stated some things that NEED to be considered when designing and running tests. If you disagree, you are simply wrong. And no, I have no horse in this race; I simply asked you questions in the first place because of the roughly 1000% larger claimed performance difference between the two processors, compared to what GN and TPU, for example, have managed to show, and was intrigued about what it was all about.
Well, ask the guy above with his 7900X to post a Cyberpunk run like mine and you'll see the 1000% larger claimed performance compared to GN and TPU. Of course he will decline, but it's worth a try, you know.
#158
Dredi
fevgatos: Well, ask the guy above with his 7900X
So you have the same GPU? Don't be an idiot and take the test setup stuff seriously, plz.

And ask TPU/GN/Eurogamer/HUB to test your CPU-taxing scenario. Or do it yourself to at least some quality standard.

And no one claimed 1000% larger (why larger, and not higher?) performance.
#159
fevgatos
Dredi: So you have the same GPU? Don't be an idiot and take the test setup stuff seriously, plz.

And ask TPU/GN/Eurogamer/HUB to test your CPU-taxing scenario. Or do it yourself to at least some quality standard.

And no one claimed 1000% larger (why larger, and not higher?) performance.
Why does the GPU matter if the tests are completely CPU-bound? I mean, come on...
#160
Dredi
fevgatos: Why does the GPU matter if the tests are completely CPU-bound? I mean, come on...
Different drivers, different CPU requirements. Come on.

AMD GPU drivers need to do more calculations on the CPU when computing ray tracing shit, for example.

Pretty basic shit.
#161
Vya Domus
fevgatos: So the framerate goes down due to the CPU...
Nope, that's what you claim and you can't prove it.

Even taking into account the fact that you're clearly trolling, do you realise how dumb this sounds? RT incurs a cost per frame on the GPU side; that's why the frame rate goes down when it's enabled at any resolution.

As a troll you should really step up your game and at least get the absolute basics right.
#162
Dredi
Vya Domus: Nope, that's what you claim and you can't prove it.

Even taking into account the fact that you're clearly trolling, do you realise how dumb this sounds? RT incurs a cost per frame on the GPU side; that's why the frame rate goes down when it's enabled at any resolution.

As a troll you should really step up your game and at least get the absolute basics right.
I mean, you could see his point if the resolution was like 200x100 px; then the framerate would be capped by the CPU regardless of settings. But at 4K it of course won't work as he envisions. I believe the discussion was about 0.1% lows at 4K, before this silly shit about (faux?) YouTube videos.
#163
fevgatos
Vya Domus: Nope, that's what you claim and you can't prove it.

Even taking into account the fact that you're clearly trolling, do you realise how dumb this sounds? RT incurs a cost per frame on the GPU side; that's why the frame rate goes down when it's enabled at any resolution.

As a troll you should really step up your game and at least get the absolute basics right.
Indeed RT incurs a cost per frame on the GPU side if you have an AMD card, since they are bad at RT. If you have an Nvidia card, things change. Drop your resolution to 240p so your card can actually pump those frames, and tell me your framerate with and without RT.
#164
Dredi
fevgatos: Indeed RT incurs a cost per frame on the GPU side if you have an AMD card
It does have a cost per frame on all cards. Come on now.
#165
Vya Domus
fevgatos: Indeed RT incurs a cost per frame on the GPU side if you have an AMD card, since they are bad at RT. If you have an Nvidia card, things change.
Yeah, because if you have an AMD card RT runs on the GPU side, and if you have an Nvidia card it runs where? In outer space?

Dude, the more you post, the stupider your explanations get. Stop.
#166
fevgatos
Vya Domus: Yeah, because if you have an AMD card RT runs on the GPU side, and if you have an Nvidia card it runs where? In outer space?

Dude, the more you post, the stupider your explanations get. Stop.
How can you be so wrong in every thread about every possible thing? That's insane.

Drop to a resolution where your GPU isn't the bottleneck and check your framerates with RT on and off. Then overclock your CPU (or underclock, it doesn't matter) and check your fps again. They will change, because the bottleneck isn't the GPU but your CPU. Then compare your frames with RT on and RT off; you'll realize that enabling RT makes the game much harder on the CPU, not just the GPU. If you still don't get it, I don't know how to help.
#167
OkieDan
This CPU performance with RT on sounds interesting. Hopefully a well-established tester that follows the scientific process as closely as possible and doesn't have a history of bias will do some testing on this.
#168
fevgatos
OkieDan: This CPU performance with RT on sounds interesting. Hopefully a well-established tester that follows the scientific process as closely as possible and doesn't have a history of bias will do some testing on this.
It's been done multiple times. DF did a test on Spider-Man; with RT on it's a CPU hog, and even a 12400 fails to keep a steady 60 fps.
#169
wolf
Performance Enthusiast
fevgatos: How can you be so wrong in every thread about every possible thing?
Should have seen the one the other day, arguing that he could see shimmer in DLSS/FSR from screenshots alone, and that having access to see them both with my own eyes, in motion, was no better than a compressed screenshot comparison :roll: When someone else chimed in to agree with me, it was promptly dropped too.
#170
fevgatos
wolf: Should have seen the one the other day, arguing that he could see shimmer in DLSS/FSR from screenshots alone, and that having access to see them both with my own eyes, in motion, was no better than a compressed screenshot comparison :roll: When someone else chimed in to agree with me, it was promptly dropped too.
What I find notoriously suspicious is that all I hear are claims of insane performance from the 3D and other AMD CPUs in games. But whenever you ask for a video of their PC running that game at those insane framerates, you get nothing. I swear I've been trying for a year; all I get is posts talking about great performance, but when you ask for videos... they all disappear. That is highly suspect :D
#171
wheresmycar
fevgatos: What I find notoriously suspicious is that all I hear are claims of insane performance from the 3D and other AMD CPUs in games. But whenever you ask for a video of their PC running that game at those insane framerates, you get nothing. I swear I've been trying for a year; all I get is posts talking about great performance, but when you ask for videos... they all disappear. That is highly suspect :D
Don't worry, I might be grabbing a 5800X3D in the coming weeks. I wouldn't mind sharing screenshots/vids, but only in the games I play at 1440p... can't be bothered with anything else, really. Or did you have a specific test scenario in mind?
#172
Dredi
OkieDan: This CPU performance with RT on sounds interesting. Hopefully a well-established tester that follows the scientific process as closely as possible and doesn't have a history of bias will do some testing on this.
Yup. Waiting for someone not too deep in the fanboi pit to check up on the claims made by @fevgatos

Anyone here with a 12xxx processor and Cyberpunk? Do you get 200 fps with RT on?
#173
Arco
I got a 7950X and an RTX 4090. Could I help in any way?
#174
TheoneandonlyMrK
wheresmycar: Don't worry, I might be grabbing a 5800X3D in the coming weeks. I wouldn't mind sharing screenshots/vids, but only in the games I play at 1440p... can't be bothered with anything else, really. Or did you have a specific test scenario in mind?
He will have a specific case in mind, one of the few he stands a chance of being right in, i.e. 280p with RT on, or some other nonsense no one's using.

Now this chip isn't out and he's already on his podium spouting shit; imagine the butthurt trolling we will see when it's out and his precious looks less good.

I'm glad I ignore most of his drivel
#175
fevgatos
Arco: I got a 7950X and an RTX 4090. Could I help in any way?
Sure, run the in-game bench at ultra settings + RT, at a low resolution so you are not GPU-bound. If you want to run around in-game as well and post a video, that would also be great.
Dredi: Yup. Waiting for someone not too deep in the fanboi pit to check up on the claims made by @fevgatos

Anyone here with a 12xxx processor and Cyberpunk? Do you get 200 fps with RT on?
I have the 12900K and Cyberpunk; I already posted a video, lol.

@Dredi

I found some reviews that fit your bill. The first one is from PCGamesHardware.de, running Spider-Man: Miles Morales. Ignore the 12900K result, since that one is with fast RAM. The 13900K is 61% faster than the 5800X3D and around 50% faster than the 7950X.

The second one is from PurePC. What this reviewer does is find the hardest, most demanding scenes in each specific game and test there. RAM used is 5200 MHz for all DDR5 CPUs and 3600 MHz for all DDR4.

And we have to take into account that 13th gen has huge room to grow an even wider lead just because of how much faster RAM it supports. Just imagine what the difference would be with a 7600 RAM kit...