Saturday, March 11th 2023

AMD Shows First Ryzen 7 7800X3D Game Benchmarks, Up To 24% Faster Than Core i9-13900K

AMD has finally released some official gaming benchmarks for its 8-core Ryzen 7 7800X3D processor, due in April, and they show it outperforming the Intel Core i9-13900K by up to 24 percent. Officially, AMD is putting the Ryzen 7 7800X3D against the Intel Core i7-13700K, leaving the Core i9-13900K and the Core i9-13900KS to its 16- and 12-core Ryzen 7000X3D SKUs.

Although some of its Ryzen 7000X3D series chips have been available since February 28th, namely the Ryzen 9 7950X3D and the Ryzen 9 7900X3D, AMD has pushed back the launch of its 8-core/16-thread Ryzen 7 7800X3D. This was quite a surprise and a big letdown, especially given its tempting $449 price tag. One of the reasons might be that the Ryzen 7 7800X3D is simply too good and might put a lot of pressure not just on Intel's lineup, but on AMD's own SKUs.
The AMD Ryzen 7 7800X3D has yet another significant advantage over the rest of the Ryzen 7000X3D series: while the 12-core and 16-core SKUs are multi-chip modules with two CCDs and an asymmetric chiplet design, the Ryzen 7 7800X3D has a rather standard design, with a single 8-core CCD with 3D V-Cache.

The Ryzen 9 7950X3D and the Ryzen 9 7900X3D have two CCDs, with only one carrying 3D Vertical Cache, which means they rely on software control, the 3D Vertical Cache Optimizer Driver, to ensure that game workloads are directed to the CCD with the 3D Vertical Cache, using dynamic "preferred cores" flagging for the Windows OS scheduler. You can find more details in our Ryzen 9 7950X3D review.

AMD has released two new slides, putting the AMD Ryzen 7 7800X3D against the Intel Core i9-13900K in four games: Rainbow Six Siege, Total War: Three Kingdoms, Red Dead Redemption 2, and Horizon Zero Dawn. In all four, the Ryzen 7 7800X3D is ahead of the Core i9-13900K, by anywhere between 13 and 24 percent. The second slide puts the Ryzen 7 7800X3D against the previous-generation AMD Ryzen 7 5800X3D in Rainbow Six Siege, Warhammer: Dawn of War III, CS:GO, and DOTA 2, where the new generation is anywhere between 21 and 30 percent faster.
If these benchmarks turn out to be even close to realistic (these are, after all, just a few games handpicked by AMD), the Ryzen 7 7800X3D, priced at $449, might be a big winner for AMD and become one of its best sellers, as it manages to outperform Intel's $579 Core i9-13900K while being $130 less expensive. The Core i7-13700K, which is what AMD is actually putting the Ryzen 7 7800X3D against, is priced at $405.

Of course, these are handpicked benchmarks provided by AMD, so take them with a grain of salt; we would rather wait and check these performance figures ourselves when the chip officially launches on April 6th. In the meantime, you can check out our Ryzen 7 7800X3D preview, which simulates its performance with a single CCD enabled.
Source: Tom's Hardware

85 Comments on AMD Shows First Ryzen 7 7800X3D Game Benchmarks, Up To 24% Faster Than Core i9-13900K

#26
Godrilla
Are the 8-core cached CCD yields in trouble, based on the limited stock of the 7950X3D, which has not been restocked 11 days post-launch on AMD's website, Micro Center, Best Buy, Amazon (besides scalped pricing), Newegg (besides scalped pricing), B&H, and GameStop? In case anyone is interested in the 7900X3D, there are still dozens of CPUs available now across the 25 Micro Center stores.
Posted on Reply
#27
InhaleOblivion
Looking forward to Wizz review on this one. Should be interesting to see how it goes up against the AM4 Champ (5800X3D).
Posted on Reply
#28
Mussels
Freshwater Moderator
mäggäri24h 100% load every day?
going from a 5800x to a 5800x3D took my gaming wattages down about 50W, absolutely noticeable in summer here

you dont need 24/7 load to notice a significant difference in power consumption and heat output
Posted on Reply
#29
evernessince
GodrillaAre the 8-core cached CCD yields in trouble, based on the limited stock of the 7950X3D, which has not been restocked 11 days post-launch on AMD's website, Micro Center, Best Buy, Amazon (besides scalped pricing), Newegg (besides scalped pricing), B&H, and GameStop? In case anyone is interested in the 7900X3D, there are still dozens of CPUs available now across the 25 Micro Center stores.
I don't think so. AMD made the cache die even smaller this time, so they should be yielding more per wafer. It's more likely that AMD is allocating more 3D V-Cache chiplets to higher-margin markets, as it is pushing for a large gain in server market share.

AMD has been prioritizing server CPUs over desktop for the last few years. A good example of this is the 7950X compared to the 5950X. The only thing that made the 5950X very efficient was that it used higher-binned chiplets. The 7950X doesn't see that same jump in efficiency though, as AMD decided it would rather sell those higher-quality chiplets to enterprise customers at a higher markup.
Posted on Reply
#30
wheresmycar
Up to 24% doesn't sound out of whack to me, considering we know this is based on select Ryzen-favoring titles. Looking at TPU's CCD1-disabled 7950X3D 1080p gaming benchmarks, a couple of titles are already showing a 15-23% increase in performance over the 13900K. Actually, it's better to break them up, as the Cyberpunk result may need revisiting.

Battlefield +15%
Borderlands +18%
Cyberpunk +23% (really?)

(AMD loyalists, don't get too excited. Intel loyalists, don't get your panties in a twist; Intel also enjoys similar performance advantages in select titles. Just thought I'd get that out there before the loyalist blah blah ensues.)

I'm more interested in the 5800X3D/13700K and 7800X3D showdown... although I don't expect much more than the already reported (7950X3D) typical generational increase in performance.

In terms of value, at the moment it seems like an average 15% performance uplift over the 5800X3D... not enticing enough to blow $450 plus the AM5 adoption tax. I had actually warmed up to AMD big time and was considering buying into AM5, coming from an Intel 9700K... but nah, it's kinda boring now, and they've done a fine job delaying the living daylights out of an X3D gaming chip, not to mention the asking price (almost half a grand, for heaven's sake).
Posted on Reply
#31
Mussels
Freshwater Moderator
wheresmycarUp to 24% doesn't sound out of whack to me, considering we know this is based on select Ryzen-favoring titles.
Borderlands 3/Wonderlands, for example, show some insane gains with X3D chips - look at how the jump from the 5800X to the 5800X3D went



This is from TPU's emulated 7800X3D testing, but they're showing a possible 50% gain in this game engine, so 25% across multiple titles does seem entirely plausible
Posted on Reply
#32
pressing on
I see from one of the AMD slides attached to this news article that there is a claim that the 7950X3D has a 50% advantage over the 13900K in file compression, see the graphic below



But the TechPowerUp review of February 27 this year shows it as much less than that. In WinRAR it shows a narrow win for the 13900K as this extract shows



and in 7-Zip although the 7950X3D is on top, the advantage is only around 5% as this screenshot from the review shows



I appreciate that the wording "UP TO" is being used, but where did the 50% come from?
Posted on Reply
#33
Mussels
Freshwater Moderator
rar vs zip, type of compressible content...
Posted on Reply
#34
Jism
Who cares.

Competition is good for us consumers.
Posted on Reply
#36
Dirt Chip
If that's the best Zen 4 3D can do, it is 'nice', but nothing special to talk about. The great power efficiency is offset by the high platform cost. In the high end for applications you have real competition; everything else is in Intel's hands. The total cost is still too high with AMD for most, and the lower you go in tier, the more pronounced it is.
We can just hope that Zen 4 irons out all the problems of the new platform so Zen 5 will see a smooth launch.
Posted on Reply
#37
TumbleGeorge
mäggäri3DNow!
It was a good extension of the x86 instruction set, which hasn't existed since 2017. :(
Posted on Reply
#38
Garrus
Dirt ChipIf that's the best Zen 4 3D can do, it is 'nice', but nothing special to talk about. The great power efficiency is offset by the high platform cost. In the high end for applications you have real competition; everything else is in Intel's hands. The total cost is still too high with AMD for most, and the lower you go in tier, the more pronounced it is.
We can just hope that Zen 4 irons out all the problems of the new platform so Zen 5 will see a smooth launch.
What high platform cost? A motherboard is $140. The best 5600 C28 RAM is $130, for 32 GB of it!
Posted on Reply
#39
mb194dc
How many people
m2geekAs someone who is currently using a 5800X3D, there is little reason to upgrade; the big cost of doing so isn't worth it for "20-25%", and I doubt the performance increase would be noticeable in World of Warcraft. Going from a 5800X to the X3D was such a massive improvement that I can't imagine the 7800X3D would be any better without also getting a much stronger GPU than the RTX 3080 I have.
You're at 1440p 144 Hz? You should be able to max that out with a 5800X3D and a 3080.

You'd probably be better off upgrading the 3080 than going AM5, indeed.

These benchmarks are at 1080p, totally useless for pretty much anyone who's dropping the money on AM5 and a top GPU.
Posted on Reply
#41
Dirt Chip
I suggest that for every "up to" figure you should be obligated to add a "down to" figure.
Posted on Reply
#42
fevgatos
mäggäriWould be great if Intel dropped these E-core gimmicks. Just plain P-cores. I have found zero use for those notepad-tier CPU cores; they're disabled in BIOS already.
Try Warzone 2 with E-cores on and off (or Cyberpunk, or Spider-Man) and you'll change your mind. 1% lows fall flat on their face with E-cores off.
Posted on Reply
#43
Garrus
fevgatosTry Warzone 2 with E-cores on and off (or Cyberpunk, or Spider-Man) and you'll change your mind. 1% lows fall flat on their face with E-cores off.
That's not the point. You could have had more P cores instead. Then you'd be better than with E cores on.
Posted on Reply
#44
fevgatos
GarrusThat's not the point. You could have had more P cores instead. Then you'd be better than with E cores on.
That's a completely different issue, and it's debatable. What's not debatable is that disabling E-cores on current CPUs loses you performance in plenty of games. There is a game here and there (I haven't hit any myself) that may work better with E-cores off, but if you find such a game, you can disable the E-cores with the press of a button from within Windows, on the fly.
Posted on Reply
#45
nguyen
Well, being a 4K gamer, I only care about this


Efficiency matters very little when I pay 9 USD a month for my entire PC; 50 W less means I pay 1 USD less per month ;)
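(Editor's note: the commenter's arithmetic roughly checks out under assumed usage. The hours-per-day and electricity rate below are illustrative guesses, not figures given in the comment.)

```python
# Sanity check of the "50 W less is about 1 USD less per month" claim.
# Assumed, not from the comment: 4 hours of gaming per day, $0.20 per kWh.
watts_saved = 50
hours_per_day = 4
days_per_month = 30
rate_usd_per_kwh = 0.20

kwh_saved_per_month = watts_saved * hours_per_day * days_per_month / 1000
usd_saved_per_month = kwh_saved_per_month * rate_usd_per_kwh

print(kwh_saved_per_month)            # kWh saved per month
print(round(usd_saved_per_month, 2))  # USD saved per month, roughly 1 dollar
```

At those assumed numbers the saving lands near a dollar a month, so the comment's ballpark holds; heavier daily load or pricier electricity scales it up proportionally.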
Posted on Reply
#46
spnidel
Ayhamb99To be honest though, I don't see people buying the 7800X3D and then deciding to play at 1080p; I expect people who buy it to want to play at either 1440p or 4K, where it'll probably be GPU-bound.
Testing at 1080p and below is done to see how much CPU performance headroom is available, not because people who buy 4090s will actually play at a resolution like that.
Posted on Reply
#47
Ayhamb99
spnidelTesting at 1080p and below is done to see how much CPU performance headroom is available, not because people who buy 4090s will actually play at a resolution like that.
Precisely why I think arguing about which CPU is better for gaming is pointless, in my honest opinion. In real-world usage, it'll be difficult to notice the differences between the top-performing gaming CPUs with a 4090, and the differences will be even less noticeable with the GPUs people are more likely to use (3080s and 6800/6900 XTs).
Posted on Reply
#48
mäggäri
TumbleGeorgeIt was a good extension of the x86 instruction set, which hasn't existed since 2017. :(
Yeah, I just used the name for fun. :) So similar to this new naming. I clearly remember those K6 CPUs with the 3DNow! text printed on them.
Posted on Reply
#49
SL2
WastelandThere's no reason to expect that the 7800x3d will do substantially better.
Given the price, why would it?

As for benchmarks, I remember the disabled 7950X3D being at least 21 % faster in one of the games in the TPU review, so I don't doubt up to 24 % for a second.
Posted on Reply
#50
Imouto
Ayhamb99Precisely why I think arguing about which CPU is better for gaming is pointless, in my honest opinion. In real-world usage, it'll be difficult to notice the differences between the top-performing gaming CPUs with a 4090, and the differences will be even less noticeable with the GPUs people are more likely to use (3080s and 6800/6900 XTs).
There are people pairing i7-4790Ks with 3080s. Platform longevity is something to look out for.
Posted on Reply