
CPU power in Cyberpunk

System Name Vaksdal Venom
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Aorus Elite AX V2
Cooling Thermalright Peerless Assassin 120 SE
Memory G.Skill Flare X5 DDR5-6000 32GB (2x16GB) CL30-38-38-96
Video Card(s) Gigabyte 5080 GAMING OC
Storage WD Black SN750 500GB, Samsung 840 EVO 500GB, Samsung 850 EVO 500GB, Samsung 860 EVO 1TB
Display(s) Viewsonic XG2431
Case Lian Li O11D Evo XL (aka Dynamic Evo XL) White
Audio Device(s) ASUS Xonar Essence STX
Power Supply Corsair HX1000i
Mouse Logitech G703 Hero
Software Windows 11 Home
Hi, this is more of a guesswork type of question, I guess...

I'm currently using an RTX 3080 and Ryzen 5600X to play Cyberpunk 2077 in 1080p using HUB's Quality settings, except Crowd Density which I have on Medium. SMT is on. All RT is off.
In the area just outside the first apartment at the start of the game (1:31:59 in this video), I'm getting from 72 to about 87 fps (it varies) when looking out over the street.

I was wondering if anyone would take an informed guess at how much I would gain going from the 5600X to a 7800X3D (and the AM5 platform obviously) while still using the 3080. Is my 5600X holding my 3080 back in this game?

I have googled a bit myself but couldn't find a video or benchmark that covered this scenario.
It seems version 2.1 is a bit more CPU intensive - even though it's mostly GPU intensive still - but I'm not sure how much.

I'm trying to keep the fps over 75 because I've vlocked to that frequency.
 
I have a 3080 and it would make no sense to upgrade anything; my 12700K runs at 10% load in the benchmark and the gameplay is fine. The visual fidelity difference between path tracing on and off is barely noticeable.
 
Are you playing in 1080p?
Btw, this is the video that made me start wondering about the whole thing:
 
With HUB quality settings (FSR @ Quality, because HUB doesn't specify) and Crowd Density on Medium, I get over 170 FPS in that scene with a slightly tuned 6800 XT (should still be around the performance of a 3080) paired with a 5800X3D. Judging by the GPU power draw, I assume this is still CPU limited, so a 7800X3D could probably cross 200 FPS in this scene. I don't know if I missed something, because I wouldn't have thought the 5600X was so far behind; I can do further testing if you want. At my usual settings in this game (Ultra preset with SSR off @ 1440p, FSR Quality) I can play mostly locked at 144 FPS (it drops a little in Jumptown). So you should be getting what you want from either X3D chip. In that case, you might want to consider the 5800X3D as well.

Oh, I should mention, I do have two graphics mods installed, but neither of them is for boosting performance (an FSR ghosting fix and a high-res texture pack).

(Patch 2.11)

cpppp.jpg
 
You would gain quite a bit of CPU performance, obviously, but that might prove irrelevant if you lock to 75 FPS. At that framerate you are unlikely to be CPU limited. If you wanted to push as much as you can for high refresh gaming that would be a different story. As mentioned above, you should consider a 5800X3D perhaps. Buying a whole new platform for minor gains in a specific title seems unwise.
Also, can I ask why you elected to turn SMT off? To my knowledge, CDPR patched the issues with AMD's SMT implementation a while ago.
 
With HUB quality settings (FSR @ Quality, because HUB doesn't specify) and Crowd Density on Medium, I get over 170 FPS in that scene with a slightly tuned 6800 XT (should still be around the performance of a 3080) paired with a 5800X3D. Judging by the GPU power draw, I assume this is still CPU limited, so a 7800X3D could probably cross 200 FPS in this scene. I don't know if I missed something, because I wouldn't have thought the 5600X was so far behind; I can do further testing if you want. At my usual settings in this game (Ultra preset with SSR off @ 1440p, FSR Quality) I can play mostly locked at 144 FPS (it drops a little in Jumptown). So you should be getting what you want from either X3D chip. In that case, you might want to consider the 5800X3D as well.

Oh, I should mention, I do have two graphics mods installed, but neither of them is for boosting performance (an FSR ghosting fix and a high-res texture pack).

(Patch 2.11)

View attachment 334729
Thanks, that's quite a leap indeed. I am considering the 5700X3D/5800X3D as well, as that will give me quite a bump in ACC – which I play a lot – for much less money. I do get much higher fps in many other places but want to keep it well above 75 in demanding areas. For example, I hear Dogtown is very demanding. Haven't been there yet.

Thanks! Hadn't seen those.

You would gain quite a bit of CPU performance, obviously, but that might prove irrelevant if you lock to 75 FPS. At that framerate you are unlikely to be CPU limited. If you wanted to push as much as you can for high refresh gaming that would be a different story. As mentioned above, you should consider a 5800X3D perhaps. Buying a whole new platform for minor gains in a specific title seems unwise.
Also, can I ask why you elected to turn SMT off? To my knowledge, CDPR patched the issues with AMD's SMT implementation a while ago.
Well yes, but not irrelevant in the sense that I want to keep the minimum fps well above 75 to avoid stutters.
I have considered a 5800X3D, yes, but at some point I'm going to upgrade to AM5 nonetheless, so I'm still in doubt about when to do it.
If you read a bit slower you will see that I have SMT on. :D;)
 
@adilazimdegilx I think you might have frame gen enabled. I get around 90 fps in that scene at default Ultra settings. With frame gen on I get over 170.

See system specs.

My advice to the OP, @Whitestar

If you want a budget upgrade, grab a 5700X3D or a 5800X3D (pretty sure the 5600X3Ds are out of stock).

If you are on an upgrade cycle, a 7800X3D would be a good next step, and it will allow a graphics card upgrade in the future without having to replace the whole system again.
 
Well yes, but not irrelevant in the sense that I want to keep the minimum fps well above 75 to avoid stutters.
Some stutters might just occur in-engine, regardless of FPS, but I get your meaning. You still probably won't need THAT much headroom.
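For a sense of scale on that headroom, here's the frame-time arithmetic behind a 75 fps lock (a rough sketch assuming plain double-buffered v-sync, where a frame that misses its slot is held for a full extra refresh):

```python
# Frame-time budget at a fixed 75 Hz refresh with v-sync (illustrative numbers).
refresh_hz = 75
slot_ms = 1000 / refresh_hz  # budget per frame, ~13.33 ms
missed_ms = 2 * slot_ms      # a frame that misses its slot displays for ~26.67 ms

print(f"budget: {slot_ms:.2f} ms, missed slot: {missed_ms:.2f} ms")
```

That one doubled frame is what reads as a stutter, which is why a minimum comfortably above 75 matters more than the average.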

I have considered a 5800X3D, yes, but at some point I'm going to upgrade to AM5 nonetheless, so I'm still in doubt about when to do it.
At this point, you might as well wait a bit for what Zen 5 would bring, honestly.

If you read a bit slower you will see that I have SMT on. :D;)
My bad, guess head don’t work so good at the end of the day. I just saw SMT and CP2077 together and instantly got flashbacks to when people on AMD had to turn off SMT for better performance.
 
@adilazimdegilx I think you might have frame gen enabled. I get around 90 fps in that scene at default Ultra settings. With frame gen on I get over 170.

See system specs.

My advice to the OP, @Whitestar

If you want a budget upgrade, grab a 5700X3D or a 5800X3D (pretty sure the 5600X3Ds are out of stock).

If you are on an upgrade cycle, a 7800X3D would be a good next step, and it will allow a graphics card upgrade in the future without having to replace the whole system again.
No, I don't have it enabled. You would see 'frame gen lag' on the OSD. Those were at HUB Quality settings as the OP mentioned, not the Ultra preset. The Ultra preset includes SSR Ultra, which is the most taxing setting in the game after RT. Here's another quick test with my own gameplay settings (Ultra but SSR off, FSR Quality, 1440p). But CPU load may change between scenes due to random events/NPCs/cars etc., so there can be some variation.

cpppp2.jpg

this is with frame gen on (apparently Print Screen doesn't capture the OSD when frame gen is enabled, so I had to take a screenshot with AMD's own software)

Cyberpunk 2077_2024.02.15-19.34.png

edit: here's the same scene with SSR Ultra as well. I also drop to 90 FPS here, but that's due to the GPU limit, not the CPU.
cpppp4.jpg
 

Ahh... yeah, I didn't go through all of that mess.
 
I've got an i5-12400 and it's just barely faster than the 5600X. The only way I dip below 75 FPS is visiting some of the most CPU-intensive Dogtown subregions, like that marketplace in the Stacks. Any X3D does the job much better and you'll notice more smoothness, but not much more, simply because you're locking yourself to 75 FPS. Upgrading the CPU would make all the sense in the world if you had a 100+ Hz monitor.

I'd purchase a high-quality, high-refresh monitor of 1440p or upwards and upgrade the whole thing to an RTX 6070/6080 and then-current CPUs if I were you. In case you go 1440p, don't worry about FPS drops too much: at this resolution, DLSS at Quality usually looks better than native 1080p and performance is similar, albeit a tad worse. Something along the lines of: native 1080p, 70 FPS; 1440p, 45 FPS; 1440p + DLSS, 65 FPS.
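For what it's worth, the pixel math backs this up: DLSS Quality renders internally at roughly two-thirds of the output resolution per axis, so 1440p at Quality actually rasterizes fewer pixels than native 1080p; the upscaling pass then adds some cost back, which is why performance lands slightly below native 1080p. A quick sanity check (assuming the usual ~0.667 scale factor):

```python
# Internal render resolution at DLSS Quality (~2/3 scale per axis, assumed).
def render_pixels(width: int, height: int, scale: float = 2 / 3) -> int:
    return round(width * scale) * round(height * scale)

native_1080p = 1920 * 1080               # 2,073,600 px
dlss_q_1440p = render_pixels(2560, 1440) # ~1707 x 960 internal

print(dlss_q_1440p, native_1080p, dlss_q_1440p < native_1080p)
```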

Cyberpunk 2077 is a cesspool of junk code, though. No matter how much you improve your hardware, the game still manages to occasionally run into stutters, ghosting, blurriness, or even crashes. If you don't intend on buying a new display, then don't bother upgrading your CPU.
 
When you do a search on your monitor, it shows it's a 144 Hz panel, so there is no need to lock it at 75 Hz. I feel the 5600X might be a bottleneck for your 3080. A few YT videos with a 5600X and a 3080 show GPU utilization at around 80-90% with a frame rate of 80 fps (the TPU review shows 113 fps for your card). No optimization, just defaults with DLSS off. So a quick and simple upgrade would be to slap a 5700X3D in and call it a day.
 
Hi, this is more of a guesswork type of question, I guess...

I'm currently using an RTX 3080 and Ryzen 5600X to play Cyberpunk 2077 in 1080p using HUB's Quality settings, except Crowd Density which I have on Medium. SMT is on. All RT is off.
In the area just outside the first apartment at the start of the game (1:31:59 in this video), I'm getting from 72 to about 87 fps (it varies) when looking out over the street.

I was wondering if anyone would take an informed guess at how much I would gain going from the 5600X to a 7800X3D (and the AM5 platform obviously) while still using the 3080. Is my 5600X holding my 3080 back in this game?

I have googled a bit myself but couldn't find a video or benchmark that covered this scenario.
It seems version 2.1 is a bit more CPU intensive - even though it's mostly GPU intensive still - but I'm not sure how much.

I'm trying to keep the fps over 75 because I've vlocked to that frequency.

What is your GPU load % when getting that fps?

If it's around 50%, then you will gain a lot of fps from a faster CPU (which you most likely will), and if it's 97% or higher, then you won't gain any fps from a faster CPU.
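That rule of thumb can be sketched as a tiny helper (the thresholds are illustrative only; real games sit on a spectrum and the limit can shift from scene to scene):

```python
# Rough sketch of the GPU-load rule of thumb above.
# Thresholds are illustrative, not exact cutoffs.

def likely_bottleneck(gpu_util_percent: float) -> str:
    """Guess the limiting component from average GPU load (0-100)."""
    if gpu_util_percent >= 95:
        return "GPU-limited: a faster CPU won't add fps"
    if gpu_util_percent <= 60:
        return "CPU-limited: a faster CPU should add fps"
    return "mixed: the limit likely alternates between CPU and GPU"

print(likely_bottleneck(50))
print(likely_bottleneck(98))
```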
 
What is your GPU load % when getting that fps?

If it's around 50%, then you will gain a lot of fps from a faster CPU (which you most likely will), and if it's 97% or higher, then you won't gain any fps from a faster CPU.

This part right here. I use this all the time to assess CPU limiting in games.

BTW, I just upgraded from a 5600 (OC'd to 4650 MHz to match a 5600X) to a 5800X3D, playing CP2077 with a 6800 XT (similar perf to your 3080) at HUB Quality settings but at 1440p. I don't seem to have a problem maintaining over 75 fps with the 5800X3D, but it's not a night-and-day difference from the 5600 in this game, as it's been updated a lot to be more CPU-efficient and is still pretty heavy on the GPU. However, the 5600 would be CPU-limited in NPC-heavy areas, as GPU use would fall from ~97%.

The games which see a noticeable difference are the (broken) ones that jack the CPU:

Starfield - a noticeable difference but not my fave game
Hogwarts Legacy - a big difference, traversal stutter almost absent, very welcome
Ark: SE - less traversal stutter
Minecraft - less traversal stutter (@32 chunk view distance)

CP2077's improvements were more subtle.
 
No, I don't have it enabled. You would see 'frame gen lag' on the OSD. Those were at HUB Quality settings as the OP mentioned, not the Ultra preset. The Ultra preset includes SSR Ultra, which is the most taxing setting in the game after RT. Here's another quick test with my own gameplay settings (Ultra but SSR off, FSR Quality, 1440p). But CPU load may change between scenes due to random events/NPCs/cars etc., so there can be some variation.

View attachment 334747

this is with frame gen on (apparently Print Screen doesn't capture the OSD when frame gen is enabled, so I had to take a screenshot with AMD's own software)

View attachment 334749

edit: here's the same scene with SSR Ultra as well. I also drop to 90 FPS here, but that's due to the GPU limit, not the CPU.
View attachment 334750

And @Lew Zealand

What you have to take into consideration is that Nvidia has a fair bit more driver overhead, meaning that in CPU-limited situations, where the GPU is in no way a limiting factor, AMD GPUs will get quite a bit more fps with the same CPU vs Nvidia.

This is my 7800X3D with the same settings, but with a 4090 - only a smidge faster than the AMD card on the 5800X3D.

itFPBcF.jpg


In other words, the 5600X is a way bigger limiting factor on his 3080 than it was on your AMD GPUs :)
 
Went back and looked more closely. CPU use is about 45%.
 
I have a 3080 and it would make no sense to upgrade anything, my 12700K runs at 10% load on the benchmark and the game play is fine. The visual fidelity between path tracing on or off is barely noticeable.


CPU usage tells you absolutely nothing in this context.


I am pretty sure he meant to add that to his previous post.
 
I am pretty sure he meant to add that to his previous post.

Still doesn't tell us anything.

What settings is he using, what fps is he getting, what is his GPU usage... that would be useful info, unlike CPU usage.
 
I have a 3080 and it would make no sense to upgrade anything; my 12700K runs at 10% load in the benchmark and the gameplay is fine. The visual fidelity difference between path tracing on and off is barely noticeable.
The 12700K is a LOT faster than the 5600X. My 10700K runs above 50% usage at 1440p with a 3080 Ti.
 
I ran around the game today, and my CPU runs at 100% most of the time at 1440p at 150+ fps (turned SSR to Low). Using super res and setting it to 4K, CPU usage comes down to 60-70% with 80-90 fps.
 
Lots of good feedback here, thanks guys!

About me locking the fps/frequency to 75: might be a bit silly, especially since I now have a Viewsonic XG2431 (I have updated my system specs) connected through HDMI. But here's the thing:
1. I use strobing, which is excellent on the XG2431. That means having a vlock. Well, at least it did on my old XL2720Z. (*see below)
2. What I found is that 75 is perfectly smooth, although far from perfectly responsive. The responsiveness doesn't bother me, though, as I play with a low DPI (800) and don't mind a bit of lag. It also gives me consistent frame pacing/smoothness across different games. Some games I could surely play at a higher vlock, yes, but then I'm afraid I would like that too much and it would be a nuisance going back to 75 in more demanding games. Maybe a bit silly, I know. :)

That said, I welcome suggestions on how to play in a way that makes better use of my monitor + GPU. I can't stand screen tearing though, mind you.

In case you go 1440p, don't worry about FPS drops too much: at this resolution, DLSS at Quality usually looks better than native 1080p and performance is similar, albeit a tad worse. Something along the lines of: native 1080p, 70 FPS; 1440p, 45 FPS; 1440p + DLSS, 65 FPS.
That's interesting and something I hadn't thought about. Btw, when you say "usually", you mean that it varies depending on the game, I assume?

When you do a search on your monitor, it shows it's a 144 Hz panel, so there is no need to lock it at 75 Hz. I feel the 5600X might be a bottleneck for your 3080. A few YT videos with a 5600X and a 3080 show GPU utilization at around 80-90% with a frame rate of 80 fps (the TPU review shows 113 fps for your card). No optimization, just defaults with DLSS off. So a quick and simple upgrade would be to slap a 5700X3D in and call it a day.
See my explanation about vlock above, and like I said I will happily change it if I can do it in a better way.
Btw, since you mention it, there is something strange with vlock at 75: when monitoring with HWiNFO, the fps seems to be a bit higher without vlock, as in it doesn't dip below 75. Which is kinda weird, but I'm sure there is an explanation for it.

What is your GPU load % when getting that fps?

If it's around 50%, then you will gain a lot of fps from a faster CPU (which you most likely will), and if it's 97% or higher, then you won't gain any fps from a faster CPU.
Good point! I will measure that.

Don't upgrade just for one game.
I won't. Like I said, upgrading to an X3D will boost performance significantly in ACC. :)
That's also reflected in HUB's testing, as ACC is one of the games that makes the most use of v-cache.

* I actually tested on my new XG2431 without vlock (but with strobing). The experience was good, with very little screen tearing. I didn't actually run into a scenario where the fps went below the refresh frequency. On my old XL2720Z that meant horrible stutter. We'll see how it is on the XG2431.
 