Monday, April 3rd 2023

AMD Shows More Ryzen 7 7800X3D Gaming Benchmarks

AMD has revealed more Ryzen 7 7800X3D gaming benchmarks ahead of the official launch scheduled for April 6th. AMD has previously shared results comparing this 8-core/16-thread Ryzen 7000X3D series SKU with Intel's Core i9-13900K and its predecessor, the Ryzen 7 5800X3D, showing performance increases of up to 24 and 30 percent, respectively.

Now, a new slide has been leaked online, which is part of AMD's Ryzen 7 7800X3D review guide, comparing it once again with the Intel Core i9-13900K, but going head to head in several more games. At 1080p resolution and high settings, the Ryzen 7 7800X3D is anywhere from 2 to 31 percent faster, but there are several games where the Core i9-13900K comes out ahead, such as CS:GO.
As noted, the Ryzen 7 7800X3D is the third SKU in the Ryzen 7000X3D series lineup, and this one comes with a single CCD with 3D V-Cache, rather than the asymmetric design with two CCDs of the Ryzen 9 7950X3D and Ryzen 9 7900X3D, where only one of the CCDs has the 3D V-Cache. It runs at up to a 5.0 GHz boost clock, has a total of 104 MB of L2+L3 cache, and a 120 W TDP.

The Ryzen 7 7800X3D launches at $449, which puts it at $20 more expensive than the Ryzen 9 7900 and about $100 cheaper than the 7900X. It is also around $110 more expensive than the Ryzen 7 7700X. Intel's Core i9-13900K is currently selling at $579.99, which is the lowest price in 30 days, according to Newegg.com.
Source: Videocardz

49 Comments on AMD Shows More Ryzen 7 7800X3D Gaming Benchmarks

#26
fevgatos
oxrufiioxo: Unless the video shows a person with the hardware in hand, I classify it as fake news.
You can tell by the results; I just looked at Cyberpunk and yeah. He didn't even put any effort into hiding the fact that it's fake. It's blatantly obvious, lol
Posted on Reply
#27
The King
fevgatos: You can tell by the results; I just looked at Cyberpunk and yeah. He didn't even put any effort into hiding the fact that it's fake. It's blatantly obvious, lol
That channel has a long history of posting reliable benchmarks. I was also very skeptical when I first came across it a while back, when they posted Alder Lake reviews before anyone else.

Understandable, I don't blame you. Time will tell if you're right or not. ;)
Posted on Reply
#28
oxrufiioxo
fevgatos: You can tell by the results; I just looked at Cyberpunk and yeah. He didn't even put any effort into hiding the fact that it's fake. It's blatantly obvious, lol
Yeah, super fake for sure, but even with legit reviews I only look at performance in general and usually weigh 4-5 different sources I trust. It's too easy, either on purpose or by accident, to skew results in favor of one CPU or another, whether by game choice, choice of other hardware (mobo/RAM), or even a specific point in the game that runs better on one CPU vs another.

I've seen 13900K results that are 10% lower than expected due to a board enabling something it shouldn't, or 7000-series performance 10% lower than expected due to a motherboard vendor's BIOS being trash on memory timings. Most reputable reviewers will cross-reference their results, but not all do.

At this point, given all the variances in testing etc., I consider anything less than 5% a tie and anything less than 10% a toss-up. Only at around 20% or more, at least with CPUs, does it get interesting and I start paying attention.
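
Roughly, in code form (these cut-offs are just my personal rules of thumb, nothing scientific):

def verdict(delta_percent: float) -> str:
    """Bucket a CPU benchmark gap using rule-of-thumb cut-offs."""
    d = abs(delta_percent)
    if d < 5:
        return "tie (within testing variance)"
    if d < 10:
        return "toss-up (depends on the rest of the setup)"
    if d < 20:
        return "real, but not interesting yet"
    return "interesting, now I start paying attention"

print(verdict(3))   # tie (within testing variance)
print(verdict(8))   # toss-up (depends on the rest of the setup)
print(verdict(25))  # interesting, now I start paying attention
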
Posted on Reply
#29
fevgatos
The King: That channel has a long history of posting reliable benchmarks. I was also very skeptical when I first came across it a while back, when they posted Alder Lake reviews before anyone else.

Understandable, I don't blame you. Time will tell if you're right or not. ;)
Nah, the video posted is absolutely fake, you don't need to wait for anything.

His numbers are literally impossible. The same applies to every game he tested, but take a look at Cyberpunk: at the 1:30 mark, for example, the 12700K has the 3090 Ti at 94% usage with 48 fps, while the 3D part has the 3090 Ti at 77% usage with 69 fps. LOL!

Just for your information, a 12700K can do 120 to 150 fps (depending on the area) in this game with RT on. With RT off it can probably hit 180-200 fps. All of these CPUs would get basically the exact same fps at 4K with a 3090 Ti; you need to drop to 720p to see any difference between them. His tests were done with a different card or different settings on each CPU.

This is my 13900K running Cyberpunk, and this is with RT on. With RT off, add around 30 fps!

Posted on Reply
#30
dirtyferret
1) Hold on, I'm about to post my video review of the Intel i5-14600K that totally wipes the floor with the 7800X3D... I just need to get my Sharpie so I can prove I own it.

2) It's $200 more than the Ryzen 5 7600X for what, a 10% gain in non-GPU-limited situations? Is it a great gaming CPU for benchmarks? Undoubtedly. Is it a bit hyped up and overpriced at launch? Obviously people can make a case for either side.
Posted on Reply
#31
oxrufiioxo
dirtyferret: 2) It's $200 more than the Ryzen 5 7600X for what, a 10% gain in non-GPU-limited situations? Is it a great gaming CPU for benchmarks? Undoubtedly. Is it a bit hyped up and overpriced at launch? Obviously people can make a case for either side.
I tend to agree, although if you can afford it without sacrificing anything else in the build, why not? I don't doubt people will pair this with mid-range GPUs like the 4070 Ti, but it seems to be more meant for the 7900 XTX/4080/4090.
Posted on Reply
#32
The King
My bad, seems I confused the YT channel posted above with this one. Anyway, take it with a grain of salt etc. These numbers look more realistic. lol
Posted on Reply
#33
fevgatos
The King: My bad, seems I confused the YT channel posted above with this one. Anyway, take it with a grain of salt etc. These numbers look more realistic. lol
That one is as fake as the previous one :D

He is not even using the same GPU between the CPUs, lol
Posted on Reply
#34
AnarchoPrimitiv
oxrufiioxo: At this point, given all the variances in testing etc., I consider anything less than 5% a tie and anything less than 10% a toss-up. Only at around 20% or more, at least with CPUs, does it get interesting and I start paying attention.
This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
Posted on Reply
#35
Vya Domus
AnarchoPrimitiv: This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
I recently got a 165 Hz monitor and I was shocked at how difficult it was to tell the difference between a game running at 120 fps vs 80-90 fps, for example.
Posted on Reply
#36
dirtyferret
AnarchoPrimitiv: ...FPS is required before an average human being can start discerning the difference. In other words...
1) every person is different so you really can't do "average"
2) people will BS to justify their upgrade anyway

For me personally, if I'm doing a GPU upgrade I look for a minimum of 30% improvement, but often at least 40%. A 25% improvement just means I'm going from 30 fps to 38 fps, 60 fps to 75 fps, or 100 fps to 125 fps. I can't tell the difference in any of those individual comparisons. In fact, once I hit over 90 FPS, I really could not tell you the difference between that and my 144 Hz limit unless I have an FPS tool up.
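
Put in frame-time terms, the same percentage jump buys less and less the higher you start; a quick back-of-the-envelope sketch using the example numbers above:

# Frame time saved per jump: the higher the starting FPS,
# the smaller the gain in milliseconds per frame.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for before, after in [(30, 38), (60, 75), (100, 125)]:
    saved = frame_time_ms(before) - frame_time_ms(after)
    print(f"{before} -> {after} fps: {frame_time_ms(before):.1f} ms -> "
          f"{frame_time_ms(after):.1f} ms (saves {saved:.1f} ms per frame)")

That works out to roughly 7 ms, 3 ms and 2 ms saved per frame respectively, which is why the top-end jumps are so hard to notice without an FPS counter.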
Posted on Reply
#37
oxrufiioxo
fevgatos: That one is as fake as the previous one :D

He is not even using the same GPU between the CPUs, lol
He's also using DDR4 on one of the Intel systems, which is kinda stupid... Not sure why people have to watch these obviously fake benchmarks; most reputable sites already simulated the 7800X3D by disabling a CCD on the 7950X3D... Sometimes it's better than a 13900K, sometimes it's worse.

Although most 13900KS/13900K owners have likely tweaked the shit out of their CPU, likely making up or even surpassing any difference the 7800X3D will have. This CPU is for people who want to buy 6000 CL30/32 memory and call it a day. It also seems to scale pretty well down to 5600 memory, though.

Just make sure to bring a high-end GPU to the party.
AnarchoPrimitiv: This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
I think the informed PC hardware buyer is already well aware of this. RT complicates things as well, because depending on what games you want to play and how much RT is actually used, the differences can be massive game to game, both at the GPU and CPU level.
Vya Domus: I recently got a 165 Hz monitor and I was shocked at how difficult it was to tell the difference between a game running at 120 fps vs 80-90 fps, for example.
I think it depends on the display tech as well. My 165 Hz Nano IPS LG monitor feels/looks hella sluggish compared to my 120 Hz 4K OLED, likely due to the nearly 5x slower pixel response.

Also, at 60 Hz the IPS monitor feels/looks horrible, whereas the OLED still feels pretty good.

It's gotten to the point where I have a hard time even gaming on my IPS monitor anymore, and that's without accounting for how much better the OLED looks vs it.
Posted on Reply
#38
dirtyferret
Vya Domus: I recently got a 165 Hz monitor and I was shocked at how difficult it was to tell the difference between a game running at 120 fps vs 80-90 fps, for example.
It also depends on the games you play; players of competitive multiplayer shooters and MOBAs are going to be used to pushing their FPS as high as possible, as opposed to single-player games like strategy, turn-based, RPGs, etc.
Posted on Reply
#39
BoboOOZ
AnarchoPrimitiv: This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
I think it is important to do that test for yourself, because only you know what you can perceive. If you have a decent GPU, pick a game that's not too demanding, limit the FPS from your driver, and see what you like and what you don't.
I've tried high-FPS gaming; when I play Fortnite I definitely prefer to have 165 Hz. But for single-player games, if frame pacing is consistent, I have no issues playing at 30-40 FPS.

However, once I played a few times on my new LG TV, I had to retire my VA and buy a QD-OLED; that is the biggest improvement to my gaming setup, regardless of GPU. So I can go back to low FPS without any issue, but I cannot go back from OLED. It depends on each of us, I guess.
Posted on Reply
#40
fevgatos
AnarchoPrimitiv: This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
You can't really tell the difference between 90 and 100 fps, but you can clearly tell the difference between, e.g., 240 and 120. Especially if, let's say, you play on a 240 Hz monitor for a week and then go back to the 120, the difference will blow your brains out.
Posted on Reply
#41
Vya Domus
oxrufiioxo: I think it depends on the display tech as well. My 165 Hz Nano IPS LG monitor feels/looks hella sluggish compared to my 120 Hz 4K OLED, likely due to the nearly 5x slower pixel response.

Also, at 60 Hz the IPS monitor feels/looks horrible, whereas the OLED still feels pretty good.

It's gotten to the point where I have a hard time even gaming on my IPS monitor anymore, and that's without accounting for how much better the OLED looks vs it.
The number of frames being shown on the screen is the same no matter what the display technology is, it just doesn't make that much of a difference, that's the truth.
Posted on Reply
#42
oxrufiioxo
Vya Domus: The number of frames being shown on the screen is the same no matter what the display technology is, it just doesn't make that much of a difference, that's the truth.
Either way, I'd likely need a 240 Hz IPS to get the perceived smoothness of my 120 Hz OLED. Regardless of whether it's framerate or pixel response, LCD sucks a$$ comparatively.
Posted on Reply
#43
dirtyferret
Vya Domus: The number of frames being shown on the screen is the same no matter what the display technology is
but it does matter what technology you are using to make those frames.
Is it pure rasterization?
Are you using something like DLSS 3, which generates an entirely unique frame every other frame at the cost of increased latency? Is that a worthy trade-off?
Posted on Reply
#44
kapone32
oxrufiioxo: He's also using DDR4 on one of the Intel systems, which is kinda stupid... Not sure why people have to watch these obviously fake benchmarks; most reputable sites already simulated the 7800X3D by disabling a CCD on the 7950X3D... Sometimes it's better than a 13900K, sometimes it's worse.
Even though I know that is the narrative, it does not make sense to me, as in some games I see all of my cores active in Task Manager (7900X3D). I don't believe that we will see that the 7950X3D will compare to the 5800X3D. It does not even have the same boost clock. It should beat the 5800X3D as long as you are not using 3600 or higher with 15 timings. The thing for me, though, is that regardless of any argument for or against either chip, AMD has power efficiency and there is no argument there. If my CPU pulls 87 watts under load, I expect the 5800X3D to be no more than 70 watts under load.
Posted on Reply
#45
oxrufiioxo
dirtyferret: but it does matter what technology you are using to make those frames.
Is it pure rasterization?
Are you using something like DLSS 3, which generates an entirely unique frame every other frame at the cost of increased latency? Is that a worthy trade-off?
At this point, frame gen only works well in 2 out of the 7 games I've tried it in, and I'd say you probably need to be using a controller as well.

Also, 40-series owners are the only ones who can use it, see how it feels, and decide if a slight input latency hit is worth the improved visual smoothness. Watching shit compressed YouTube videos at 60 fps is pretty much useless.

I'd still say frame gen doesn't count as framerate; it only improves visual smoothness. Don't get me wrong, Cyberpunk with Psycho RT is hella impressive running at 4K on an OLED at 120+ fps, but frames that don't improve latency shouldn't count.

Improved latency is the primary reason people desire high framerates, I'd be willing to bet.
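
To put rough numbers on that, here's a toy model only; it assumes frame gen simply inserts one generated frame between every two rendered ones and that input is still paced by the underlying render rate, ignoring real pipeline overheads:

# Toy model: frame generation doubles what the display shows,
# but responsiveness is still tied to how fast frames are actually rendered.
def toy_frame_gen(rendered_fps: float) -> str:
    displayed_fps = rendered_fps * 2            # one generated frame per rendered frame
    render_frame_ms = 1000.0 / rendered_fps     # roughly what your inputs are paced by
    display_frame_ms = 1000.0 / displayed_fps   # what your eyes see
    return (f"rendered {rendered_fps:.0f} fps -> shows ~{displayed_fps:.0f} fps "
            f"(~{display_frame_ms:.1f} ms per displayed frame), "
            f"but input still paced at ~{render_frame_ms:.1f} ms")

print(toy_frame_gen(60))  # looks like 120 fps, responds like 60 fps
print(toy_frame_gen(40))  # looks like 80 fps, responds like 40 fps
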
kapone32: Even though I know that is the narrative, it does not make sense to me, as in some games I see all of my cores active in Task Manager (7900X3D). I don't believe that we will see that the 7950X3D will compare to the 5800X3D. It does not even have the same boost clock. It should beat the 5800X3D as long as you are not using 3600 or higher with 15 timings. The thing for me, though, is that regardless of any argument for or against either chip, AMD has power efficiency and there is no argument there. If my CPU pulls 87 watts under load, I expect the 5800X3D to be no more than 70 watts under load.
Guessing you mean 7800X3D, but I agree; if efficiency is most important to someone, the X3D chips are in a class of their own. At the same time, if someone's system runs at 600 W with a 4090/13900K while gaming vs 500 W with an X3D chip, I doubt they care. I know I wouldn't.

If that was the case, everyone would buy a 4090/4080, tune it down, and pair it with a 7800X3D, because for the performance you couldn't match the efficiency with anything else, budget be damned.
Posted on Reply
#46
dirtyferret
oxrufiioxo: Improved latency is the primary reason people desire high framerates, I'd be willing to bet.
I would agree, it's just that I can see the whole "what is a frame" question starting to get muddy
Posted on Reply
#47
oxrufiioxo
dirtyferret: I would agree, it's just that I can see the whole "what is a frame" question starting to get muddy
I definitely don't envy hardware reviewers' jobs these days, with all the upscaling tech and now frame generation thrown in to muddy the waters further.

You know how fanboys can be, thinking anything their hardware can use should be represented regardless of how variable it can make image quality. While I love RT/DLSS, I completely understand why others don't, and I even think frame generation is impressive given it's a first-generation technology, but I also don't care if they are included in day-one hardware reviews...

I do hope to see more RT included in CPU reviews, due to it being more demanding on the CPU than rasterization alone, but if reviewers don't have time for that, I don't blame them.
Posted on Reply
#48
fevgatos
oxrufiioxo: I definitely don't envy hardware reviewers' jobs these days, with all the upscaling tech and now frame generation thrown in to muddy the waters further.

You know how fanboys can be, thinking anything their hardware can use should be represented regardless of how variable it can make image quality. While I love RT/DLSS, I completely understand why others don't, and I even think frame generation is impressive given it's a first-generation technology, but I also don't care if they are included in day-one hardware reviews...

I do hope to see more RT included in CPU reviews, due to it being more demanding on the CPU than rasterization alone, but if reviewers don't have time for that, I don't blame them.
FG should never be in a review alongside normal native framerates. It's a nice feature to have, but it's best used when your framerate without it is already at least 60-70 fps. Lower than that and you can feel the input lag, which is off-putting for me.
Posted on Reply
#49
ratirt
AnarchoPrimitiv: This is why I've been begging sites like TPU and YouTube channels like LTT for YEARS to do some testing to figure out exactly how big of a delta in FPS is required before an average human being can start discerning the difference. In other words, to use controlled testing to figure out if a human being can even accurately notice a difference between 100 FPS and 90 FPS. I think this would be incredibly important and useful, as it would once and for all determine whether one CPU or GPU beating another by an average of 7%, for example, really even matters. I think this information would be priceless for learned consumers when making a purchase.
LTT did that testing as far as I remember. Or a very similar one.
Posted on Reply