
What is the point of 9800X3D in 4k? Isn't 9950X better at the same price?

Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming? | TechSpot
^
Because it's true?

You're posting an average across all games -- that's the "Bill Gates in a room full of nuns" problem: on average, everyone in that room is a billionaire, but that's not the reality for anyone actually in it.
That test is horrible. Steve ran a poll asking how people play at 4K (Native, DLSS Quality, Balanced or Performance), and the results showed the majority use DLSS Quality, with about as many using Performance and Balanced as Native. But he then proceeds to test using DLSS Balanced with a 4090 and calls it "Real-World testing", apparently just to try to prove his point of testing only at 1080p. Besides DLSS Balanced and Performance being less popular, you need to account for the fact that most of the people who answered aren't even using a 4090.
 
So I am building a new PC for an RTX 5090, and I was looking for a CPU.
9950X is the same price as 9800X3D right now.
What's the point of 9800X3D if I will use my PC only in 4k+?
Isn't it better to go X and have 8 extra cores for the same price?

Not necessarily. The thing that many won't tell you is that the X3D CPUs are actually pretty weak on their own. Their saving grace is the large cache, which hides a lot of memory latency; in games, a lot of processor time is spent waiting on RAM to respond. That's the magic behind these CPUs and why they are so good for gaming, 4K or not. But once you begin actually demanding chip resources, they generally fall flat, because they're just 8-core CPUs with relatively mild clock speeds - this is where the Intel chips with their buff P-cores, plenty of E-cores and fast DDR5, or a 9950X with tightened 6000 MT/s (CL26-CL28) memory, will shine.

Ultimately only you know what's a priority for you. I target 4K 120 fps, and my system should be more than capable of handling that with the 5090, while also being great for editing my clips and playing around with my stuff without waiting an inordinate amount of time for the task at hand to finish. If you mean to just play video games, then perhaps the 9800X3D would help reach 4K 144 fps, or something. But I don't think even it would actually net that much of a gain.

If you're building a 5090 system, then clearly budget is no objection - and you should IMHO probably be looking at the 9950X and a tier-S board like the ROG X670E Gene or X870E Apex. If you're going to pair a 5090 with an 8-core CPU, then perhaps do yourself a favor and buy the 5080 instead, using the extra $1000+ you'll pocket to buy higher quality components, more storage, more memory, better cooling, a better case, etc., and you'll probably end up with a better PC. That's my honest opinion as someone who actually did splurge on a '90.

To summarize:

Just gaming > 9800X3D (speedy cache chip that makes games happy; AVX-512 support will also help with some emulators)
Heavy gaming and mixed content creation > Core Ultra 9 285K / 14900K (raw power)
Focused more on content creation > 9950X (lots of muscle, and it brings in the big guns with AVX-512 support across all 16 cores, which will make a lot of renderers very happy)

You can't have your cake and eat it too. If AMD released a dual-X3D-CCD Ryzen 9 you could, but they refuse, likely to protect their server business, so you can't.
 
in what way is it better at 4k

(attached chart: average FPS at 4K)
The chart only tells the story of average fps, but it says nothing about the quality of the experience behind that average. Does it have occasional stutter at those fps numbers because of 1% lows? Perhaps the 9800X3D lifts the lows and allows a smoother experience at the same average fps? If so, I'd get one lol. And I'm thinking of getting a 4K 240 Hz OLED, for which I'll use DLSS 4 of course. And once upscaling adds more GPU performance, having a CPU that won't bottleneck it or distract it with random stutter is the best bet. The 9800X3D is the latest and greatest CPU made primarily for gaming right now. Second to that is the 7800X3D. But that's my opinion lol.

I personally want the 9950X3D, but it's wasted on 4K 144 Hz; with a 5090 you're stuck running 138 fps with G-Sync, and then you'll be comparing frame rates beyond 144 Hz, especially with frame gen, which makes no sense.
You'll only be stuck at 138 fps, even with G-Sync, if you're enabling NVIDIA Reflex in-game OR Low Latency Mode in the control panel is set to Ultra. For single-player games I turn off Reflex and leave Low Latency set to On (not Ultra) on my 4K OLED TV to get the full 144 Hz, WITH G-Sync =-)
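For reference, the 138 number isn't arbitrary: the commonly cited approximation for the Reflex / Ultra Low Latency cap (a community rule of thumb, not an official NVIDIA formula) is refresh minus refresh squared over 3600, which lands right on ~138 fps for a 144 Hz panel. A minimal sketch under that assumption:

# Community rule-of-thumb for the Reflex / Ultra Low Latency fps cap
# (assumed approximation, not an official NVIDIA spec): cap ≈ refresh - refresh^2 / 3600
def reflex_fps_cap(refresh_hz: float) -> float:
    return refresh_hz - (refresh_hz * refresh_hz) / 3600.0

print(round(reflex_fps_cap(144)))  # ~138 fps on a 144 Hz display
print(round(reflex_fps_cap(240)))  # ~224 fps on a 240 Hz display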
 
So I am building a new PC for an RTX 5090, and I was looking for a CPU.
9950X is the same price as 9800X3D right now.
What's the point of 9800X3D if I will use my PC only in 4k+?
Isn't it better to go X and have 8 extra cores for the same price?
I'd pick the 9800X3D anyway; with a 5090 at 4K, you're looking at roughly the same frame rates a 3090/Ti delivers at 1080p, so the CPU matters just as much.
(charts: average FPS at 1920x1080 and at 3840x2160)
 
That test is horrible. Steve ran a poll asking how people play at 4K (Native, DLSS Quality, Balanced or Performance), and the results showed the majority use DLSS Quality, with about as many using Performance and Balanced as Native. But he then proceeds to test using DLSS Balanced with a 4090 and calls it "Real-World testing", apparently just to try to prove his point of testing only at 1080p. Besides DLSS Balanced and Performance being less popular, you need to account for the fact that most of the people who answered aren't even using a 4090.
Hardly anybody ever posts THEIR OWN RESULTS. I do, all the time, with data, cold hard facts, when it comes to 4K max with a 4090 and a 5950X CPU, plus a few games with DLSS Quality only, like Cyberpunk 2077 and Star Wars Outlaws.
 
Aren't we trending towards more cores being utilized? I'm sure some of these newer games (Star Citizen, Indiana Jones) would put them to good use. In a 9900X thread, someone mentioned that cores are prioritized based on their speed (I'm paraphrasing), so you might end up having to utilize both CCDs regardless. Not sure how core parking plays into that?!


I'd personally get a 9950X and pick up a Zen 6 chip in the future, when you'll be able to take advantage of CUDIMMs. Not sure which would hold its value better, a 9900X or a 9800X3D?
 
Wait for the 9950X3D, maybe? If you can afford a 5090, you can likely afford that too.
 
in what way is it better at 4k

(attached chart: average FPS at 4K)

Those 3 fps make all the difference... If the OP is only gaming at 4K native ultra then yes, the CPU won't matter too much... often.

It can depend on the game, but you may as well get a 9700X unless you need the extra cores for something.

X3D doesn't offer much benefit in a lot of real-world use cases; reviews can be very misleading.
 
A 9950X will always be slower at gaming, so unless you do more than gaming (and by more I mean heavy productivity, not YouTube tabs), a 9800X3D makes the most sense. It will likely also have better resale value; last I checked, a used 5800X3D is worth more than a used 5950X, so even the resale argument for the 16-core likely won't hold true.
 
That test is horrible. Steve ran a poll asking how people play at 4K (Native, DLSS Quality, Balanced or Performance), and the results showed the majority use DLSS Quality, with about as many using Performance and Balanced as Native. But he then proceeds to test using DLSS Balanced with a 4090 and calls it "Real-World testing", apparently just to try to prove his point of testing only at 1080p. Besides DLSS Balanced and Performance being less popular, you need to account for the fact that most of the people who answered aren't even using a 4090.
I mean, those are literally the settings I play at lol. I use both Balanced and Quality, whichever gets me to around 150 fps - 150+ fps feels amazing.

Point is this - it depends on what you're trying to do... A lot of these "in the real world" arguments are missing the point -- if you game at high fps, get the 9800X3D; if you do production, get the 9950X.

If you want big numbers and want to overclock a 16 core chip for funsies, then get the 9950x -- totally valid reason. If you want to run as close to 230 gsynced fps as possible get the 9800x3d. There's not really a wrong answer.

The whole "My XXXX CPU is still good when I run fully GPU bound" -- that's great and valid, but not everyone likes to game like that -- my 10850k is probably still good if I run fully GPU bound as 30fps locked 8K, that doesn't change anything. The 3D cache can net you 30% in games, will you use it? Depends on your settings - if you're gonna DLAA w/ Max pathtracing at 4k then get whatever you want because it doesnt matter at that point.

Just don't gaslight yourself into thinking there's no difference between these CPUs because you're "never" gonna turn on DLSS Balanced at 4K... or because more powerful cards aren't going to come out in 12-18 months and put more stress on the CPU.
 
Practically any modern CPU is fine for 4K; you wouldn't see any difference if you didn't have an OSD on.
 
but he then proceeds to test using DLSS Balanced with a 4090 and calls it "Real-World testing"
He tested at both native and 4K DLSS Balanced because of the poll, and he does not call it "Real-World testing"; rather, the fanboys call it that because they can't comprehend why reviewers use lower resolutions for CPU tests:
We often hear arguments like, "I just want to see 1440p and 4K results because they're more 'real-world'; they let me know if a CPU upgrade will give me any additional performance."
apparently just to try to prove his point of testing only at 1080p
You are posting on a website that tests at 720p and explains why:
"All games from our CPU test suite are put through 720p using a RTX 4090 graphics card and Ultra settings. This low resolution serves to highlight theoretical CPU performance, because games are extremely CPU-limited at this resolution. Of course, nobody buys a PC with an RTX 4090 to game at 720p, but the results are of academic value because a CPU that can't do 144 frames per second at 720p will never reach that mark at higher resolutions."
you need to account for the fact that most of the people who answered aren't even using a 4090
It's a CPU test; the RTX 4090 is used to remove any GPU bottleneck to the best of the reviewer's ability, even if DLSS is used. If you want to see DLSS tests on cards below RTX 4090 performance, look at those cards' specific reviews, where the reviewer uses a flagship CPU to remove any possible CPU bottleneck to the best of their ability. Are we starting to see a pattern here?
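To put that bottleneck logic in concrete terms, here's a minimal sketch of the usual mental model (illustrative, hypothetical numbers, not benchmark data): the frame rate you actually see is roughly the lower of the CPU's frame-rate ceiling and the GPU's ceiling at your resolution and settings, which is exactly why reviewers push one of the two out of the way.

# Minimal sketch of the CPU/GPU bottleneck mental model (hypothetical numbers,
# not measured data): observed fps is capped by whichever component is slower.
def observed_fps(cpu_ceiling_fps: float, gpu_ceiling_fps: float) -> float:
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

cpu_ceiling = 140  # what this CPU could feed in some game, regardless of resolution
print(observed_fps(cpu_ceiling, gpu_ceiling_fps=300))  # 720p / DLSS: GPU out of the way -> 140, CPU-limited
print(observed_fps(cpu_ceiling, gpu_ceiling_fps=110))  # 4K native: GPU-limited -> 110, CPU ceiling hidden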

Nothing he did in that test is out of the ordinary from any other reviewer

Practically any modern CPU is fine for 4K; you wouldn't see any difference if you didn't have an OSD on.
hey if you turn off the monitor's power, it's all 0 FPS baby!

You're posting an average across all games -- that's the "Bill Gates in a room full of nuns" problem: on average, everyone in that room is a billionaire, but that's not the reality for anyone actually in it.
mean vs mode
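To illustrate the "Bill Gates in a room" point with numbers (completely made-up fps figures, just to show the shape of the problem): a suite-wide average can look fairly close while one CPU-bound game in the suite is dramatically faster on the X3D chip.

# Made-up per-game 4K fps numbers, purely to show how a suite average can hide
# the games where one CPU is clearly faster (these are not measured results).
suite = {
    "gpu_bound_shooter": {"9950X": 120, "9800X3D": 121},
    "gpu_bound_rpg":     {"9950X":  95, "9800X3D":  96},
    "gpu_bound_racer":   {"9950X":  84, "9800X3D":  85},
    "cpu_bound_sim":     {"9950X":  70, "9800X3D":  98},
}

for cpu in ("9950X", "9800X3D"):
    avg = sum(game[cpu] for game in suite.values()) / len(suite)
    print(cpu, round(avg, 1))   # 92.2 vs 100.0

# The averages sit about 8% apart, yet nearly all of that gap comes from the one
# CPU-bound title, where the X3D chip is ~40% faster; the other three are a wash.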
 
I see that "I play at 2160p, so I don't need the best CPU" mentality is still going strong. :laugh: It might make some sense only if you play at =/<60FPS. But the best rigs in that case are called gaming consoles, with their ever smooth gameplay with variable resolution always on.

The CPU doesn't care what resolution you play at, people. It's just that at higher resolutions it might become constrained by the GPU, so it works less in that scenario. It still has to do the work of feeding the GPU with data, and if the game engine scales properly with higher resolutions, much more data.
 
in what way is it better at 4k

(attached chart: average FPS at 4K)
0.1% and 1% lows (frametimes)


Personally, *if* I were doing media production or other non-gaming many-threaded work
I too would pick a dual-CCD 16-core over the 9800X3D.
Otherwise, the 9800X3D (near) universally will be the faster CPU, even @ 4K.
 
You will still see much tighter frame pacing with the X3D chip - your 1% lows will be closer to the 138 fps cap. UE5 games are so badly optimized that even at 4K you will feel a difference with the faster chip.

My point is that with a 5090 you're very unlikely to see anything but 144 Hz and up, at my current resolution anyway. Thus, if your minimum and maximum fps are both above your display's refresh rate and you're comparing numbers like 160 vs 190 fps, it's kinda irrelevant.

And the dude's point was that TPU's minimum/average/maximum fps data show a margin-of-error difference at 4K... so why is everyone posting from other sources instead of explaining why this is? I and many others use TPU as our preferred and trusted source of information. Someone please explain why the max/avg/min fps in the TPU data show that a Ryzen 7700 is only a few fps off a 9800X3D at 4K.
 
My point is that with a 5090 you're very unlikely to see anything but 144 Hz and up, at my current resolution anyway. Thus, if your minimum and maximum fps are both above your display's refresh rate and you're comparing numbers like 160 vs 190 fps, it's kinda irrelevant.
That's fine, but in the OP's query both CPUs cost the same, so why pay for 160 FPS when you can get 190 FPS for the same money, and something that will work better with your next upgrade or a more demanding game? If the 160 FPS CPU cost significantly less than the 190 FPS CPU, you could make a value argument versus the real-world difference.
And the dude's point was that TPU's minimum/average/maximum fps data show a margin-of-error difference at 4K... so why is everyone posting from other sources instead of explaining why this is?
Because those are 4K numbers, which make the games GPU-limited. Look at the 720p numbers, or the 1080p numbers people are posting, which remove the GPU limitation as much as possible.
 
Because those are 4K numbers, which make the games GPU-limited. Look at the 720p numbers, or the 1080p numbers people are posting, which remove the GPU limitation as much as possible.

The OP's use case is 4K, so please explain the benefit of a 9800X3D over the 7700 at 4K based on TPU's max/avg/min FPS.
 
The OP's use case is 4K, so please explain the benefit of a 9800X3D over the 7700 at 4K based on TPU's max/avg/min FPS.
Those are stock numbers for the other CPUs... you can turn them up quite a bit.
 
He tested at both native and 4K DLSS Balanced because of the poll, and he does not call it "Real-World testing"; rather, the fanboys call it that because they can't comprehend why reviewers use lower resolutions for CPU tests:
We often hear arguments like, "I just want to see 1440p and 4K results because they're more 'real-world'; they let me know if a CPU upgrade will give me any additional performance."

You are posting on a website that tests at 720p and explains why:
"All games from our CPU test suite are put through 720p using a RTX 4090 graphics card and Ultra settings. This low resolution serves to highlight theoretical CPU performance, because games are extremely CPU-limited at this resolution. Of course, nobody buys a PC with an RTX 4090 to game at 720p, but the results are of academic value because a CPU that can't do 144 frames per second at 720p will never reach that mark at higher resolutions."

It's a CPU test; the RTX 4090 is used to remove any GPU bottleneck to the best of the reviewer's ability, even if DLSS is used. If you want to see DLSS tests on cards below RTX 4090 performance, look at those cards' specific reviews, where the reviewer uses a flagship CPU to remove any possible CPU bottleneck to the best of their ability. Are we starting to see a pattern here?

Nothing he did in that test is out of the ordinary from any other reviewer
The test I quoted only covers 1080p Native and 4K DLSS Balanced (which renders at about 1200p). And the article's name is "Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?".

TPU tests at 720p but also at 1080p, 1440p and 4K, so you can pick the data point that makes the most sense for you, or extrapolate to fit your needs. HUB tests only at 1080p; while that works to show the difference between CPUs, it seemingly creates a false impression in many new users that the difference at higher resolutions is similar, or uniform, which isn't the case. Some games will be CPU-bound even with a 5090 at 4K+, while some will be heavily GPU-bound at 1440p. Also, testing at a single resolution doesn't give enough data points to extrapolate the results to other resolutions.

An actual "4K real-world test" would likely show that in most cases even with the fastest GPU at the moment, any $200 CPU from the last 4 years is going to be good enough, but some games (like simulators) would be CPU bound even at 4K, so people that are interested in those should consider going for a faster CPU even if they game at 4K with a non-flagship GPU. But his test pretty much only shows that unsurprisingly 1200P isn't that different compared to 1080p.

I mean, those are literally the settings I play at lol. I use both Balanced and Quality, whichever gets me to around 150 fps - 150+ fps feels amazing.

Point is this - it depends on what you're trying to do... A lot of these "in the real world" arguments are missing the point -- if you game at high fps, get the 9800X3D; if you do production, get the 9950X.

If you want big numbers and want to overclock a 16 core chip for funsies, then get the 9950x -- totally valid reason. If you want to run as close to 230 gsynced fps as possible get the 9800x3d. There's not really a wrong answer.

The whole "My XXXX CPU is still good when I run fully GPU bound" -- that's great and valid, but not everyone likes to game like that -- my 10850k is probably still good if I run fully GPU bound as 30fps locked 8K, that doesn't change anything. The 3D cache can net you 30% in games, will you use it? Depends on your settings - if you're gonna DLAA w/ Max pathtracing at 4k then get whatever you want because it doesnt matter at that point.

Just don't gaslight yourself into thinking there's no difference between these CPUs because you're "never" gonna turn on DLSS Balanced at 4K... or because more powerful cards aren't going to come out in 12-18 months and put more stress on the CPU.
The point is that the poll he ran shows that most people who answered use DLSS Quality to play at 4K. And statistically, most of those are likely running GPUs weaker than a 4090. There's no issue if you use DLSS Balanced, it's just that the majority doesn't.
Also, in most cases the choice isn't "should I go with a 9800X3D or a 9950X with a 5090" like in this topic; it's usually "should I get a 7700X with a 5080 or a 9800X3D with a 5070 Ti". That would be the point of showing "real-world scenarios": showing in which use cases each type of setup makes the most sense.
 
Damn, so many arguments from a simple question. It seems quite simple: for 4K gaming only, get the 9950X or 9700X, unless you're playing Cities: Skylines and a bunch of simulation/strategy games, in which case get the 9800X3D and don't look back. Those will be CPU-bottlenecked at any resolution.

Looking at 1080p results here is only useful if the OP wants to run the same CPU for one or two more GPU cycles, because at present, even with a 5090, it's pretty GPU-limited at 4K with the games in TPU's test suite. Maybe a 6090 will cause a CPU bottleneck at 4K for the 9950X with TPU's test suite, who knows.
 
I see that "I play at 2160p, so I don't need the best CPU" mentality is still going strong. :laugh: It might make some sense only if you play at =/<60FPS. But the best rigs in that case are called gaming consoles, with their ever smooth gameplay with variable resolution always on.

CPU doesn't care what resolution you play at people, It's just that at higher resolutions it might become constrained by the GPU, so it works less in that scenario. It still have to do the work of feeding the GPU with data, and if the game engine scales properly with higher resolutions, much more data.
This debate has been around since 5950X vs 5800X3D - and the 5800X3D peeps are now cruising at +30% fps with their 4090s/4080s.
 
Those are stock numbers for the other CPUs... you can turn them up quite a bit.
9800X3D for the win, because of two things:

1. Higher effective clock rates.
2. It doesn't really care what RAM you use.

More reasons, in varying order of importance:

It's advertised and designed for gaming.
It proves it's better with consistent results.
It's the flagship gaming CPU.

9950X - because you want cores, like to tweak memory settings, and want cores.

Question:
What game is putting either CPU at 100% load on all cores??

Love the clickbait title of this thread. But does 4K have to mean gaming?????
 