
NVIDIA Shares Cyberpunk 2077 RT: Overdrive Mode Performance Numbers

The ray tracing performance hit reminds me of the early 2000s, when GPUs took a massive performance hit once anti-aliasing was enabled to smooth out the jaggies...
It took the companies a while before GPUs were powerful enough that enabling it didn't cost much performance; it will be quite a while before the ray tracing hit is small.
 
I see those numbers and what comes to mind are those ratings from TV makers, like Samsung's PQI.
Going from 11 fps to 59 does make everyone (I think) wonder about visual quality. I mean, can we really consider those 59 fps equal in quality to those 11 fps? If not, then this isn't an apples-to-apples comparison.
In this case it's more about responsiveness than visual quality: frames are being interpolated to go from 30 to 60 fps, and that difference in latency will not go unnoticed.
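A rough back-of-the-envelope sketch of that point (the numbers are illustrative, not measurements from the game):

Code:
# Interpolated frames double the *displayed* frame rate, but input is still
# only sampled once per rendered frame, and the generator has to hold a
# frame back before it can produce the in-between one.
def frame_time_ms(fps):
    return 1000.0 / fps

rendered_fps = 30    # what the GPU actually renders
displayed_fps = 60   # after frame generation

render_ft = frame_time_ms(rendered_fps)    # ~33.3 ms between real frames
display_ft = frame_time_ms(displayed_fps)  # ~16.7 ms between displayed frames

print(f"Smoothness: a new image every {display_ft:.1f} ms")
print(f"Responsiveness: input still sampled every {render_ft:.1f} ms, "
      f"plus roughly one held-back frame (~{display_ft:.1f} ms) of extra delay")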

it will be quite a while before the ray tracing hit is small
The RT performance hit will never be small; it just doesn't scale like that.
 
In this case it's more about responsiveness than visual quality: frames are being interpolated to go from 30 to 60 fps, and that difference in latency will not go unnoticed.


The RT performance hit will never be small; it just doesn't scale like that.

Oh it won't? How so, if I may ask?
 
In this case it's more about responsiveness than visual quality: frames are being interpolated to go from 30 to 60 fps, and that difference in latency will not go unnoticed.
In case someone wants responsiveness, they can just lower the settings and get higher framerates. DLSS and FSR might look like a nice way to get both responsiveness and visual quality, but after a point it does make someone (or at least me) wonder: how much quality are we giving up to get from 11 to 59 fps with magic?
There are many cases where someone in a game has difficulty noticing the differences between 1440p and 2160p, or between high settings and ultra. I think those techs play ball with that: we WILL notice a settings change, but we might NOT notice the actual changes in the graphics from that change.
 
DLSS and FSR proved that it's worth losing a percentage of quality (and not always; sometimes it's even better than native) in order to get playable frame rates.

It seems many gamers don't know what to expect when RT or path tracing is applied.
They may expect it to magically transform the scene into something close to reality.

The devs already copy the real lighting and bake it into textures etc. If you put the same lights in the scene and run it with RT, you'll get the same results; in one case the lighting is baked, in the other it's computed in real time.

The problem comes when there are many light sources in the scene, or the light sources don't shine directly on the environment. There you need indirect lighting/ambient occlusion/global illumination etc.
That's where RT is the holy grail of gaming, until path tracing is usable.
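A tiny toy sketch of that last point (the scene and numbers are completely made up, and this is nothing like how a real engine shades; it's only meant to show why the indirect part is the RT-shaped problem):

Code:
import math

light_pos = (0.0, 4.0)
light_power = 100.0

def direct(p, blocked):
    """Direct term: inverse-square falloff from the lamp, zero if occluded."""
    if blocked:
        return 0.0
    d2 = (p[0] - light_pos[0]) ** 2 + (p[1] - light_pos[1]) ** 2
    return light_power / (4 * math.pi * d2)

def one_bounce(p, bounce_point, albedo=0.5):
    """Crude one-bounce estimate: the lamp lights a wall, the wall re-emits toward p."""
    wall = direct(bounce_point, blocked=False) * albedo
    d2 = (p[0] - bounce_point[0]) ** 2 + (p[1] - bounce_point[1]) ** 2
    return wall / (4 * math.pi * d2)

shadowed_point = (3.0, 0.0)  # hidden from the lamp by some geometry
nearby_wall = (3.0, 2.0)     # lit surface the light can bounce off

print("direct only  :", direct(shadowed_point, blocked=True))   # 0.0 -> pitch black
print("with a bounce:", direct(shadowed_point, blocked=True)
      + one_bounce(shadowed_point, nearby_wall))                 # small but non-zero

Direct lighting (baked or traced) leaves the shadowed point black; only the bounced contribution fills it in, and that's the part that gets expensive with many lights and many surfaces.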
 
The ray tracing performance hit reminds me of the early 2000s, when GPUs took a massive performance hit once anti-aliasing was enabled to smooth out the jaggies...
It took the companies a while before GPUs were powerful enough that enabling it didn't cost much performance; it will be quite a while before the ray tracing hit is small.
Just like people hate on DLSS right now, people were hating on MSAA back then. It was all "give me true super-sampling, I will have none of this multi-sampling fake AA".
 
In case someone wants responsiveness, they can just lower the settings and get higher framerates
Why would you lower the quality if the whole point of this is to have the best visual settings irrespective of the compromises you need to make, even if that compromise means getting 10 fps? Sounds stupid, but that's what is being pushed right now.

Oh it won't? How so, if I may ask?

Ray tracing cost is roughly constant per scene, meaning that if you were to double the number of compute units in a GPU it would still incur the same X% performance impact per frame; the only thing that would improve is the number of frames per second, but the relative cost per frame would remain the same.
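A quick numeric sketch of what I mean (the millisecond figures are assumed for illustration, not benchmarks):

Code:
# If both the raster and the RT workloads scale with compute, doubling the
# compute units halves the frame time but leaves RT's share of it, and
# therefore the percentage fps hit, unchanged.
def fps(raster_ms, rt_ms):
    return 1000.0 / (raster_ms + rt_ms)

for compute_scale in (1, 2, 4):
    raster_ms = 10.0 / compute_scale  # hypothetical raster cost per frame
    rt_ms = 15.0 / compute_scale      # hypothetical added RT cost per frame
    hit = 1 - fps(raster_ms, rt_ms) / fps(raster_ms, 0.0)
    print(f"{compute_scale}x compute: {fps(raster_ms, rt_ms):5.1f} fps with RT, "
          f"{fps(raster_ms, 0.0):5.1f} fps without, hit = {hit:.0%}")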
 
Why would you lower the quality if the whole point of this is to have the best visual settings irrespective of the compromises you need to make, even if that compromise means getting 10 fps? Sounds stupid, but that's what is being pushed right now.

Ray tracing cost is roughly constant per scene, meaning that if you were to double the number of compute units in a GPU it would still incur the same X% performance impact per frame; the only thing that would improve is the number of frames per second, but the relative cost per frame would remain the same.
But you DO get lower quality with FSR, XeSS and even DLSS. It's just that the framerate increase is high enough for people to accept and ignore the lower visual quality. Those techs do try to offer better visual quality than what you get by lowering resolution from 2160p to 1800p, for example, or by dropping from ultra to medium-high settings. But they bring their own problems, like ghosting, even if it's minimal. So you do get lower quality. Only DLAA improves quality; everything else lowers it.

It's just psychology.
If I go into a game's settings and change from ultra to high, I KNOW that the quality will be lower, even if I have difficulty spotting the differences.
If I go into a game's settings, change from high to ultra and at the same time enable DLSS Quality, I BELIEVE I am getting ultra quality, even when I do notice ghosting or whatever in the image.


But I do agree about ray tracing (you are not replying to me, obviously). Just finding the edges is probably a very different problem from following the light as it jumps from one 3D surface to another. Nvidia also proved with Overdrive mode that we still haven't seen the maximum performance hit from enabling ray tracing.
 
But you DO get lower quality with FSR, XeSS and even DLSS. It's just that the framerate increase is high enough for people to accept and ignore the lower visual quality. Those techs do try to offer better visual quality than what you get by lowering resolution from 2160p to 1800p, for example, or by dropping from ultra to medium-high settings. But they bring their own problems, like ghosting, even if it's minimal. So you do get lower quality. Only DLAA improves quality; everything else lowers it.

It's just psychology.
If I go into a game's settings and change from ultra to high, I KNOW that the quality will be lower, even if I have difficulty spotting the differences.
If I go into a game's settings, change from high to ultra and at the same time enable DLSS Quality, I BELIEVE I am getting ultra quality, even when I do notice ghosting or whatever in the image.


But I do agree about ray tracing (you are not replying to me, obviously). Just finding the edges is probably a very different problem from following the light as it jumps from one 3D surface to another. Nvidia also proved with Overdrive mode that we still haven't seen the maximum performance hit from enabling ray tracing.

It all comes down to the fact that it's harder to market your product when it can't run a game with all settings maxed out than it is to convince people the image quality is the same.

So 10 fps is justifiable because every RT setting is enabled, even if the image quality is not the same.
 
It all comes down to the fact that it's harder to market your product when it can't run a game with all settings maxed out than it is to convince people the image quality is the same.
Yeap! And increasing performance just by increasing the hardware capabilities probably isn't an option today the way it was 15 years ago. So, who would buy an RTX 4070 Ti to play with Overdrive enabled at 11 fps? But they will buy with the promise of close to 60 fps.

Still, that jump from 11 to 60 is too big to swallow. Maybe it's an indication of the RTX 4070 Ti's VRAM limitations; that would explain the huge jump compared to everything else at both 2160p and 1440p in Nvidia's charts.
 
Nvidia may say whatever they want.
In practice, DLSS (and FSR) is widely beneficial, while DLSS 3 is useful if you already have 60+ fps natively or you want to run tech demos.
It's common logic that when you get 5-10 times more fps than native, you have to give something back: quality, responsiveness, latency, etc.
 
Can you tell the difference between native and DLSS Quality in a blind test? Say I post 5 pictures and you tell me which is which? Everybody I tried it with spectacularly failed, in which case, yeah, they are equal quality. In fact, I'd say in some games DLSS looks better than native, especially with those problematic TAA implementations (CoD comes to mind).
I don't think side-by-side pictures are the right test; it's hard to pick out a difference in side-by-side pics, but actually playing a game I can definitely see the difference, whether it's very obvious ghosting (much less common with later DLSS versions) or a subtle blurriness in the background (especially the case in CP2077). Don't get me wrong, I think it's a great option to have, but personally I prefer to lower quality settings and keep native resolution rather than use DLSS to hit the same target FPS.

Edit: oh, and I'm a big fan of DLAA! That's fantastic, at least in the very few titles I've seen it implemented in. Elder Scrolls Online has horrible shimmering at native resolution, and DLAA almost entirely solves that with only a slight blurring effect vs native.
 
Just tried it. Woo, 4 FPS on my 6800XT at native 3440x1440.

Setting FSR to ultra-performance just barely gets me into the 20s and 30s.

I did spot a mod on Nexus that tweaks the path-tracing values to improve performance while supposedly not affecting visuals too much. Maybe I'll try that.


Realistically I'm never going to sit down and properly play this game with RT enabled. The slight "improvement" in visuals is absolutely not worth the significant performance hit.
 
Just tried it. Woo, 4 FPS on my 6800XT at native 3440x1440.
We're getting into Portal RT territory here. Some developers do everything they can to butcher their game's performance to get that tasty Nvidia paycheck. And to think that I get a stable 60 FPS in The Market of Light UE5 demo... Ridiculous!

I'll try it on my 6750 XT at 1080p, but with zero hopes.
 
It runs and looks great
 
We're getting into Portal RT territory here. Some developers do everything they can to butcher their game's performance to get that tasty Nvidia paycheck. And to think that I get a stable 60 FPS in The Market of Light UE5 demo... Ridiculous!

I'll try it on my 6750 XT at 1080p, but with zero hopes.
It's not about paychecks, it's about staying ahead of the curve. If path tracing is somewhat acceptable today, it will be acceptable with the next generation and probably commonplace after another 2-3 generations. As a developer, it's important to try it early and learn how it can help and what pitfalls it presents.
As for "butchering performance", I don't see how anything got butchered. If you're not interested in this (or, more likely, don't have the hardware), don't activate path tracing.
 
It's not about paychecks, it's about staying ahead of the curve. If path tracing is somewhat acceptable today, it will be acceptable with the next generation and probably commonplace after another 2-3 generations. As a developer, it's important to try it early and learn how it can help and what pitfalls it presents.
As for "butchering performance", I don't see how anything got butchered. If you're not interested in this (or, more likely, don't have the hardware), don't activate path tracing.
Well, CD Projekt RED already said they are switching to Unreal Engine 5 for their next games, so all this work might not be reusable. It's a super nice technical feat, but I'm not sure how much of it they will be able to reuse with the new engine. And since the assets don't change, we can't say they are learning to create assets that work well with path tracing. But I'm pretty sure the engine devs will be able to reuse some of this work somewhere.

To me, that's one of the main reasons (besides the noise and some DLSS hiccups) the game doesn't look that much better. It feels like the game was designed with raster in mind, then basic ray tracing was added, then path tracing.

I suspect that games built with path tracing from scratch will look significantly better, even if the performance is about the same on the same hardware.
 
Well, CD Projekt RED already said they are switching to Unreal Engine 5 for their next games, so all this work might not be reusable. It's a super nice technical feat, but I'm not sure how much of it they will be able to reuse with the new engine. And since the assets don't change, we can't say they are learning to create assets that work well with path tracing. But I'm pretty sure the engine devs will be able to reuse some of this work somewhere.

To me, that's one of the main reasons (besides the noise and some DLSS hiccups) the game doesn't look that much better. It feels like the game was designed with raster in mind, then basic ray tracing was added, then path tracing.

I suspect that games built with path tracing from scratch will look significantly better, even if the performance is about the same on the same hardware.
It's not that simple. They're taking the low-level approach now, using DX12 directly, but they're learning what's possible. Tomorrow, even if they switch to another engine, if that engine is missing some features, or implements them in a particular way, they are in a position to patch the engine or request upstream improvements.
 