
AMD Ryzen 7 9800X3D To Feature Significant Clock Speed Boost

I'm sitting in Remnant 2 at 4K, getting 100-120 FPS on a 240 Hz monitor with DLSS and 70% GPU usage, on a tuned 13700K (5.4 GHz all-core) with fast RAM. I could easily be getting 180-240 FPS if the CPU were strong enough.
Hmm, is there a CPU on the market that actually doubles the 4K FPS you get from a 13700K in a third-person shooter?

As a 4090 gamer at 4K, I would like to hear more.
 
In gaming on lower resolutions, certainly.
If you enable DLSS, enable sharpening from the control panel, and lower pointless "ultra" settings in game on the 4090, you're essentially gaming at lower resolutions. You can get into the 120-200 FPS range in most titles with no perceptible loss in image quality (in some cases it actually looks better).
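The "essentially gaming at lower resolutions" point can be made concrete. A quick sketch, assuming the commonly cited DLSS render-scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance ≈ 0.5; individual games can override these):

```python
# Internal render resolution at 4K output for common DLSS modes.
# Scale factors are the commonly cited defaults, not guaranteed per game.
def internal_res(width, height, scale):
    return round(width * scale), round(height * scale)

modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}
for name, s in modes.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{name}: {w}x{h}")
# Performance mode at 4K renders internally at 1920x1080.
```

So even "4K" with DLSS Performance is internally a 1080p render, which is why the CPU limit shows up sooner than the output resolution suggests.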
 
Yeah, that's not really how things work.

Anyway, it's much simpler to get an air cooler that can take the thermal load and put a static low-RPM profile on the fans, so they always spin without causing any thermal throttling. You want the noise level of the fans in your machine to stay the same at all times; that way you can keep them at, say, 35 dBA and tune out the frequency over time.

Some fans might claim a low dBA rating, but they can still be extremely annoying depending on the individual's hearing and the frequency the fan produces. It's why some people don't hear coil whine from their GPU, while others can, which drives them bonkers.
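Worth noting when budgeting that 35 dBA target: noise from multiple fans doesn't add linearly; sound pressure levels combine logarithmically. A minimal sketch using the standard SPL summation formula (the fan values here are just illustrative):

```python
import math

def combined_spl(levels_db):
    """Combine sound pressure levels (dBA) of independent noise sources."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Two identical 35 dBA fans add up to ~38 dBA (+3 dB), not 70 dBA.
print(round(combined_spl([35, 35]), 1))  # 38.0
```

Doubling the number of identical fans only adds about 3 dB, which is why many slow fans can be quieter overall than one fast one.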
That's how it's been working for me, six years into using AIO cooling on my CPU. I never hear any fan noise, but maybe I'm just crazy...
 
9800X3D vs 265K is gonna be interesting
Doesn't look interesting, at least not when it comes to gaming. Or am I missing something?
 
LOL, this guy is a joke and has been wrong about basically everything. Even this leak says nothing new, and he even got pranked and is selling the prank as a leak, as if his stuff were the most consistent source.

1. We already know the 3D part is being rushed out for release in 2024.
2. We already know they have been working on increasing the frequency of the 3D parts and improving thermal dissipation and the thermal interface.
3. The 104 MB cache was also mentioned by someone else.

He said nothing new.
 
Using an AIO didn't work on my 7800X3D for some reason. It overheated even after several repaste attempts. It's happy under a Dark Rock 4 in a semi-passive operation now.


I get that, but 10% difference still isn't much, imo. If you disagree, that's fine. :)
What were you using?
 
I'm using a 360 mm AIO.
It should still be able to cool a 7800X3D, just like it did a 7700X before that. I have no idea what went wrong.
 
It should still be able to cool a 7800X3D, just like it did a 7700X before that. I have no idea what went wrong.
I agree
 
It should still be able to cool a 7800X3D, just like it did a 7700X before that. I have no idea what went wrong.
Has to be a mounting issue. I know you repasted, but something must be affecting the thermal interface between the chip and the AIO. The TDP is only 15 W higher on the 7800X3D.
That said, the 7800X3D CCD can easily run hotter, since it is essentially insulated by the 3D V-Cache.

Were you at the absolute limit with the 7700X cooling-wise?
 
Has to be a mounting issue. I know you repasted, but something must be affecting the thermal interface between the chip and the AIO. The TDP is only 15 W higher on the 7800X3D.
That said, the 7800X3D CCD can easily run hotter, since it is essentially insulated by the 3D V-Cache.

Were you at the absolute limit with the 7700X cooling-wise?
The 7700X could just about reach its 142 W max PPT at 92-93 °C. The 7800X3D throttled down to 4.5 GHz and 50-60 W at max temp.

I thought it was a mounting issue, too, but it did the same after several remounts. I also bought a Thermal Grizzly offset mount which was a waste of money.
 
Sure, but what GPU does it take to be CPU limited and actually notice the difference? Not to mention you have to be constantly staring at your FPS monitor, which is not a good way to enjoy a game.
Yeah. CPUs may be having a moment; luckily, that's not where most games are limited. I've been playing around with FF16 to try to understand it, since there are so many complaints on Steam and I want to be able to help, but I can't, because I only have one set of hardware and it runs quite well.

At first I was surprised when I changed the core affinity in Windows to just P-cores and got significantly reduced performance. I couldn't believe it. Do E-cores really matter in this game? Then I got a tip that some DX12 games do not like their affinities being messed with, so I disabled E-cores AND Hyper-Threading in the BIOS. Then I went in game to 720p and turned DLSS to Balanced, and guess what: I was STILL GPU limited. I couldn't believe it. And I have a 4090. Still, turning the E-cores and Hyper-Threading back on did improve FPS, but not by much, only 3-4%, so within margin of error.
 
Dunno what CPU you've got, but try BG3 Act 3. It's easy to run into a CPU limit there, at least it was for me.
 
I don't know much about overclocking, but I did a small PBO tweak and the usual easy steps and hit 6,838.8 MHz consistently with the 9800X3D, a Strix 870-A motherboard, and a Strix 4090, with 48 GB of RAM. I'm sure it's the usual stuff, but I was excited to finally do my first overclock, and it brought my Cinebench score up a lot.



 
I don't know much about overclocking, but I did a small PBO tweak and the usual easy steps and hit 6,838.8 MHz consistently with the 9800X3D, a Strix 870-A motherboard, and a Strix 4090, with 48 GB of RAM. I'm sure it's the usual stuff, but I was excited to finally do my first overclock, and it brought my Cinebench score up a lot.
Unless you ran it under LN2 those max clocks are clearly false.
 
Unless you ran it under LN2 those max clocks are clearly false.
I mean, does HWiNFO make up random numbers? It might; it's a serious question. I always watch videos about everything PC, and everyone uses it to run stability tests, which is what I did. I didn't have liquid nitrogen or anything; to be honest, I was just gaming, and I usually run it to keep track of my temps. So it's very possible HWiNFO messed up and put those numbers there, which is why I joined the forum to ask. I wouldn't know the first step in how to fake a picture of HWiNFO. That, mixed with the fact that every time I game or run anything, my max ends up being those numbers or very close to them.
 
I mean, does HWiNFO make up random numbers? It might; it's a serious question. I always watch videos about everything PC, and everyone uses it to run stability tests, which is what I did. I didn't have liquid nitrogen or anything; to be honest, I was just gaming, and I usually run it to keep track of my temps. So it's very possible HWiNFO messed up and put those numbers there, which is why I joined the forum to ask. I wouldn't know the first step in how to fake a picture of HWiNFO. That, mixed with the fact that every time I game or run anything, my max ends up being those numbers or very close to them.
You're using HWMonitor. That program does not explicitly support the 9800X3D, though they did add 9000-series support in the summer.
Their last update was several months ago, but the 9800X3D released just about a month ago.
This could be why it's reading the clocks wrong. Plus, from my personal experience, this developer has always been more Intel-focused.

I suggest you give HwInfo64 a try instead: https://www.hwinfo.com/download/
 
Just use a different program to verify those numbers, like CPU-Z or something from AMD itself. Doesn't Ryzen Master show the clocks too? The CPU has a max out-of-the-box boost of 5200 MHz.

So... when you have a load running (like Prime95, Cinebench, or something), does it show a normal number, like 4.8 GHz? Or 5.2-5.3 GHz when running on only one thread? Because in that case, this could be a simple mistake, with the cores seemingly boosting to 6.5 GHz for like one nanosecond and the program reading that as a valid reading.
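One way to sanity-check a suspicious max-clock figure like that is to log the samples yourself and discard one-off spikes before taking the max. A minimal sketch (the sample list is made up for illustration; a real logger would poll the sensor in a loop):

```python
from statistics import median

def plausible_max(samples_mhz, window=5, tolerance=1.15):
    """Return the highest clock sample that is within `tolerance` of the
    median of its neighbourhood; isolated spikes are discarded."""
    kept = []
    for i, s in enumerate(samples_mhz):
        lo = max(0, i - window)
        hi = min(len(samples_mhz), i + window + 1)
        if s <= tolerance * median(samples_mhz[lo:hi]):
            kept.append(s)
    return max(kept)

# Steady ~5.2 GHz readings with one bogus 6.8 GHz spike mixed in:
samples = [5225, 5250, 5240, 6838, 5230, 5245, 5238]
print(plausible_max(samples))  # 5250
```

The 6,838 MHz sample sits 30% above the median of its neighbours, so it gets thrown out, while the genuine ~5.2 GHz boost readings survive.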
 
CPU-Z is from the same developer as HWMonitor.
 
I don't know much about overclocking, but I did a small PBO tweak and the usual easy steps and hit 6,838.8 MHz consistently with the 9800X3D, a Strix 870-A motherboard, and a Strix 4090, with 48 GB of RAM. I'm sure it's the usual stuff, but I was excited to finally do my first overclock, and it brought my Cinebench score up a lot.



Very suspicious... there must be a problem when reading those clocks. My 9800X3D with just +200 MHz PBO (i.e. 5.4 GHz) and a -10 Curve Optimizer (all cores) can reach 95 °C while doing shader compilation in some games like FF XVI, Horizon ZD/FW, Cyberpunk 2077, GoW: Ragnarök, etc., and I have a Liquid Freezer II 280 mm AIO. So 6.8 GHz with a max of 75.4 °C? Hmm, I doubt it. Try to run some shader compilation at 6.8 GHz and post a video of the temps while using RTSS & HWiNFO64 :)
 