
Cyberpunk 2077 Benchmark Test & Performance

@W1zzard "At least we're part of the PCMasterRace; for console gamers, the whole game looks like that."

You, sir, are a legend. Great write-up and style :) It's even better because it's so true.

The quality in the screenshots keeps falling as you go through them. From "holy shit that's awesome" to "that's sloppy".

And the cars look absolutely horrible. Their shadows and reflections are beyond awful.

You have to compare this to GTA, and then put that in perspective. The sheer complexity and size of this environment makes those details largely irrelevant. FWIW, GTA V is still chock-full of glitches and low-quality surfaces and texturing. Also, there is meticulous detail in the places where the game wants you to look. Something's gotta give, I think.

But yeah. Most cars do look weird.
 
This game is an absolute train wreck for performance; I'm pretty floored that I'm only getting 60 FPS on my system with DLSS on.
 
Any chance for a CPU comparison? I'm awaiting my RTX 3080's arrival but worried whether it will run smoothly with my 8600K...
 
God, poor W1zzard, this would have taken him all friggin' day to do... This game seems heavily GPU-limited, so it may not show much difference, but it would be interesting.
 
I'm actually not disappointed by the performance...
RT reflections, shadows, emissive lights, global illumination...
And I'm getting framerates in the high 60s and 70s at 1440p with Balanced DLSS.
Without the crazy lighting (i.e. GI) I can get 8-9 FPS more, or instead switch to DLSS Quality for the same FPS.

It is missing Variable Rate Shading, Mesh Shaders, and Sampler Feedback, however; I suspect those would prove crucial. VRS alone would boost FPS significantly in action sequences.

The only thing missing RT-wise is ambient occlusion; one can hope, right? ;o

Oh, and that's with the "bad for games" 3900X; a 5900X should be arriving any minute now (literally) and might... hopefully boost performance, even if only slightly.
CPU utilization is lower than in Metro Exodus, which can quite easily hit over 50%; I haven't noticed it go above 50% even once, hovering at around 30-40%.

I actually want to test on my HDD at some point, as I'm not getting any of those floating-object/missing-people bugs in the world, which makes me think they're related to I/O (2 TB MP600 being used at the moment).
 
Uh, nope; 5800X here boosting past 5 GHz, and 70 FPS is the best I'm seeing on a 3080.
Unless you've got a 60 Hz monitor, that's not a great frame rate at all.
 
RT pictures are fixed now, sorry about that. I wrote a new version of the comparison engine, which now has the dropdowns, but forgot to enable it for the public ;)
Thank you, the new comparison engine is excellent.
The red area callouts and mouse-zoom really do make examining the differences very quick and easy.
5/7 perfect score.
 
For all that RT? The FPS is good in my book for the features.
Maybe it's because I had a 20-series GPU before and kinda knew what to expect from even the single-RT-feature games (i.e. shyet unplayable performance, which with the 30 series has turned playable).
LG 27GL850, VRR working like it should, and no issues.
 
The quality in the screenshots keeps falling as you go through them. From "holy shit that's awesome" to "that's sloppy".

And the cars look absolutely horrible. Their shadows and reflections are beyond awful.
I always randomize the order, so there are no spoilers. I do move a couple of good ones to the start, but yes, there's a bunch that look terrible. I still wanted to include them, to show what you're getting in some areas.
 
Also, W1z, for some reason on my laptop the comparison zoom is broken. If I try to zoom in on a specific area, it zooms on a slightly offset area, so I can never zoom into the edges of the picture. I imagine it has something to do with odd scaling or window sizes. I'm running 125% zoom on a laptop with a native 4K screen and another 1440p screen attached. The bug happens with or without the 1440p screen, though, and even if I manually set Chrome zoom to 100%.
 
RT can be stolen, seriously. The image quality difference isn't that great for having to give up roughly 50% of your FPS.

#TeamAMD.
 
The car-park garage comparison is pretty night-and-day if you ask me:

[two attached screenshots: car-park garage RT comparison]


That's a scene where screen-space reflections really don't cope well. You really should see light underneath that car on the right.

I've said it before and I'll say it again: I think AMD were right to focus primarily on raytraced reflections. IMO 90% or more of the perceived image-quality improvement comes from the DXR reflections. Raytraced shadows, GI, and ambient occlusion are extremely expensive to calculate and offer almost imperceptible IQ differences, largely because the faked raster equivalents work so goddamn well; dynamic shadowmaps these days are high-enough resolution, with enough point lights, that there really isn't a need to raytrace shadows. On top of that, DXR shadows still don't calculate ALL the shadows from ALL the light sources (that would be too expensive), so they cherry-pick which light sources to raytrace, making them just as fake as the dynamic-shadowmap method, but at massive performance cost.
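The failure mode in that car-park shot (screen-space reflections can only sample what is already rendered on screen) can be sketched in a few lines. This is an illustrative 1-D toy, not engine code; the buffer contents and names are made up:

```python
# 1-D toy of screen-space reflection marching: the reflected ray walks
# through the *color buffer*, so anything outside the visible frame (the
# lit floor under the car, a light behind the camera) simply isn't there
# to sample.

def ssr_trace(color_buffer, start, step, max_steps=64):
    """March a reflected ray across the buffer; return the sampled color,
    or None when the ray leaves the screen (SSR has no data there)."""
    x = start
    for _ in range(max_steps):
        x += step
        if not 0 <= x < len(color_buffer):
            return None          # off-screen: the classic SSR failure case
        if color_buffer[x] is not None:
            return color_buffer[x]
    return None

screen = [None] * 8 + ["neon_sign"] + [None]       # one visible emitter
print(ssr_trace(screen, 4, 1))    # 'neon_sign'  (on-screen: reflected fine)
print(ssr_trace(screen, 4, -1))   # None         (off-screen light: lost)
```

Raytraced reflections query actual scene geometry instead of the frame buffer, which is why they can show that light underneath the car.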
 
I think we should all just play the game as intended by the studio and not use an AI to determine how to reflect stuff. It's looking OK, but it's still too immature to fully use. It simply costs 50% of the performance. And here:
reviewers not receiving cards because they don't test or don't prefer RT, which makes Nvidia look better in the numbers. F Nvidia.
 
I've hated Nvidia for years for crap like that - it seems to be their MO for the last 10+ years.

But it doesn't change my opinion on raytraced reflections. I still think they're worth the cost, and DLSS isn't making up the reflections; it's merely interpolating the results to reduce their cost. They are still an order of magnitude more accurate than screen-space reflections, even in their cheapest, blurriest DLSS Performance state.

In fairness, I haven't installed CP2077 myself yet; I'm basing that on videos from others and my experience with other titles on my RTX cards. I'm actually pretty impressed by how good the launch has been, but I'll still try to hold out for at least the first DLC.
 
The game looks better with ray tracing off, and enabling DLSS for performance makes the game look way too blurry, not to mention the serious loss of detail. Look, I appreciate that DLSS is there and can be used on lower-end cards to make games more playable, but Nvidia is marketing it wrong. They are claiming "NO LOSS OF QUALITY", and that is one giant pile of horse manure! If they just advertised it as an image upscaler like any other, one that worsens image quality but allows you to play at higher framerates, it would be accurate and acceptable; otherwise it's just a complete marketing scam!

Looking at the pics in this article confirms my own experience!

The game is very poorly optimized; it runs badly on almost all hardware. I've tested it with my GTX 1060 6 GB, RX 5700 XT, and now RTX 3070. It's about 40-60 FPS on a mix of medium, high, and ultra settings for the GTX 1060 6 GB, about 60-75 for the 5700 XT, and, in the same test for comparison's sake, 80-95 for the 3070.

It can run even higher in very light scenes, but generally, in most of the city, it runs in those ranges!
 
[attached screenshot]


Physically totally impossible, and that's not even taking into account that the distance to the fan is even bigger due to perspective. The reflection would only show maybe some upper part of the fan.
Are these even path-/raytraced after all?
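For what it's worth, the geometry argument can be checked with plane-mirror math: an observer sees a point in a mirror where the line from the eye to the point's virtual image crosses the mirror plane. A hedged toy calculation (all coordinates are invented for illustration, not measured from the screenshot):

```python
# Toy plane-mirror check: an observer sees a point P in a mirror where the
# eye-to-virtual-image line crosses the mirror plane (here x = 0).

def mirror_intersection(eye, point, mirror_x=0.0):
    """Height on the mirror at which the reflection of `point` appears."""
    ex, ey = eye
    vx, vy = 2 * mirror_x - point[0], point[1]   # virtual image of the point
    t = (mirror_x - ex) / (vx - ex)              # parameter where x = mirror_x
    return ey + t * (vy - ey)

# The farther away a point of the same height is, the lower its reflection
# sits on the mirror, so a distant ceiling fan shows less of itself:
near = mirror_intersection((4.0, 1.5), (2.0, 2.0))
far = mirror_intersection((4.0, 1.5), (8.0, 2.0))
assert far < near
```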
 
I'm getting 59-89 FPS using the Titan V: Ultra, RTX off, at 1080p.

Streaming the game seems to be a problem for my 3900X. Turning off SMT helped me gain 10+ FPS. The stream is just not smooth, though, when recording; there are constant hiccups not noticeable when actually playing. The stream quality is problematic.

Time for a new CPU; shame I can't buy one without paying silly money. Thinking a 5950X might give me a couple of frames in-game and sort the streaming issue out.

I think I'm going to try to manage with the Titan V again this round.
[attached screenshot: cyberPunk2077.png]


I'm using OBS; in-game it's smooth as silk, so what's going on?
 
I think we should all just play the game as intended by the studio and not use an AI to determine how to reflect stuff. It's looking OK, but it's still too immature to fully use. It simply costs 50% of the performance. And here:
reviewers not receiving cards because they don't test or don't prefer RT, which makes Nvidia look better in the numbers. F Nvidia.

Ugly. 25 years of cheating: DLSS hidden away under the drivers and "optimizing" the frames per second, dropping features like DX 10.1, enabling features that are a decade ahead of their time like real-time ray tracing, paying partners to drop competitor support, dropping VRAM amounts, lying, cheating, green goblin.

I have rebranded them to Nxtrementia. The black-leather-jacket man should rename the company.
 
Streaming the game seems to be a problem for my 3900X. Turning off SMT helped me gain 10+ FPS. The stream is just not smooth, though, when recording; there are constant hiccups not noticeable when actually playing. The stream quality is problematic.

What can help a lot in this scenario is to manually isolate the game process/threads from the streaming-app process/threads.
If you put obs64.exe (or whatever tool you use to stream) on cores 0-5 and Cyberpunk2077.exe on cores 6-11 via the "Affinity" menu in Task Manager (with SMT, that's logical processors 0-11 and 12-23, of course) and then start the stream, the hiccups and interference should be gone.
edit: At the very least this will take care of any scheduling contention that may appear due to threads spawning all over the place...
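The split above amounts to two affinity bitmasks. A minimal sketch, assuming the layout the post describes (a 12-core/24-thread CPU where logical processors 0-11 cover cores 0-5 plus SMT siblings, and 12-23 cover cores 6-11); the psutil snippet in the comment is a hedged illustration using a third-party package, not something the post itself uses:

```python
# Build the bitmask that Task Manager's Affinity dialog (or Windows'
# SetProcessAffinityMask) represents: bit i set = logical processor i allowed.

def affinity_mask(logical_cpus):
    mask = 0
    for cpu in logical_cpus:
        mask |= 1 << cpu
    return mask

obs_cpus = list(range(0, 12))     # obs64.exe -> cores 0-5 + SMT siblings
game_cpus = list(range(12, 24))   # Cyberpunk2077.exe -> cores 6-11 + siblings

print(hex(affinity_mask(obs_cpus)), hex(affinity_mask(game_cpus)))
# -> 0xfff 0xfff000

# To apply it programmatically instead of via Task Manager, the third-party
# psutil package offers cpu_affinity() (illustrative only):
#   import psutil
#   for p in psutil.process_iter(["name"]):
#       if p.info["name"] == "obs64.exe":
#           p.cpu_affinity(obs_cpus)
```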
 
Holy hell, Pascal cards just got old real quick. Lots of async compute at work?

Even the 2060 is nipping at the heels of the 1080 Ti. IIRC Guru3D has the 2060 Super in their charts and it beats the 1080 Ti.
 
What can help a lot in this scenario is to manually isolate the game process/threads from the streaming-app process/threads.
If you put obs64.exe (or whatever tool you use to stream) on cores 0-5 and Cyberpunk2077.exe on cores 6-11 via the "Affinity" menu in Task Manager (with SMT, that's logical processors 0-11 and 12-23, of course) and then start the stream, the hiccups and interference should be gone.
edit: At the very least this will take care of any scheduling contention that may appear due to threads spawning all over the place...
I'll try this; I have an app installed for this purpose. I'll let you know if it helps. :toast:
 
I've hated Nvidia for years for crap like that - it seems to be their MO for the last 10+ years.
Couldn't agree more :laugh:
Their GPP also turned me right off, and when they got caught they tried pulling the usual "that wasn't what we were trying to do" nonsense.

Aside from highly reflective surfaces, I don't think ray tracing looks better; it only looks different.
Exactly, and some scenes look good with it on and others not so good. It's all personal preference.

It is starting to emerge that AMD processors are seeing major performance deficits in Cyberpunk due to code that prevents them from utilizing all logical cores (whereas Intel CPUs use all logical cores). Is there any chance we could see some testing to validate this?

Sounds like the game or the CPU driver needs some sort of patch. There's no logical reason why Cyberpunk would recognize Intel's Hyper-Threading but not AMD's SMT. Was this also tested on Zen 2 processors to see if the same effect can be reproduced?
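The reported behavior is consistent with a thread-pool sizing helper that returns physical cores for AMD but logical cores for Intel. This is a hedged illustration of that hypothesis, not the game's actual code; only the CPUID vendor strings are real identifiers:

```python
# If worker-thread count is derived like this, an SMT-enabled Ryzen gets
# half as many workers as an equivalent Intel chip, matching the reported
# symptom. The branch itself is an illustration of the hypothesis only.

def worker_thread_count(vendor, physical_cores, logical_cores):
    if vendor == "AuthenticAMD":
        return physical_cores      # SMT siblings ignored -> half the workers
    return logical_cores           # every logical core used

print(worker_thread_count("AuthenticAMD", 12, 24))  # 12 on a 3900X
print(worker_thread_count("GenuineIntel", 8, 16))   # 16 on an i9-9900K
```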
 
I think we should all just play the game as intended by the studio and not use an AI to determine how to reflect stuff.

This statement doesn't make sense to me at all. Are you saying CD Projekt RED intends the game, in its ideal form, to look like it uses SSR and not raytraced reflections? Do you have any evidence for this claim? How do you know that raytraced reflections are not "intended" by them? You realize they chose to support DXR and incorporate it into their game, right?

This far-fetched statement, which seemingly tries to belittle raytracing because of Zen 2's poor performance at it, is pretty pathetic. In fact, it makes all AMD GPU owners look bad.

The game looks better with ray tracing off, and enabling DLSS for performance makes the game look way too blurry, not to mention the serious loss of detail. Look, I appreciate that DLSS is there and can be used on lower-end cards to make games more playable, but Nvidia is marketing it wrong. They are claiming "NO LOSS OF QUALITY", and that is one giant pile of horse manure! If they just advertised it as an image upscaler like any other, one that worsens image quality but allows you to play at higher framerates, it would be accurate and acceptable; otherwise it's just a complete marketing scam!

"NVIDIA's DLSS technology can help with that as it renders the game at a lower resolution and then uses an advanced upscaling algorithm to get you a high-quality anti-aliased output that looks nearly identical to the default TAA anti-aliasing filter at native resolution. "

Funny how you came to a completely different conclusion than W1zzard. No bias, right? I tried to look for the "blurriness" in the screenshots in this review, and there is very little difference between TAA at native resolution and DLSS Quality. In fact, in some of the screenshots, the text is sharper with DLSS Quality.

Are we looking at the same thing?
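On the render-at-lower-resolution point, the per-axis scale factors commonly cited for DLSS 2's quality modes make the trade-off concrete. Treat the factors as approximate community-reported values, not an official spec:

```python
# DLSS renders internally at a fraction of the output resolution and then
# upscales. Per-axis scale factors below are commonly reported approximate
# values (Balanced in particular is often quoted as ~0.58).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    """Approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# At 1440p, Performance mode renders internally at just 720p:
print(internal_resolution(2560, 1440, "Performance"))   # (1280, 720)
```

That quarter-of-the-pixels starting point is why Performance mode can look soft while Quality mode stays close to native TAA.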
 