
NVIDIA RTX owners only - your opinion on DLSS Image quality

Really? Well, I can't upload the full-size 4K image because it exceeds TPU's size limit (12 MB for a single image).
Here is half of the native 4K image:
View attachment 201007
Converting to lossy JPEG would have blurred the image.
Interesting. The light bouncing off a lot of that is in the wrong direction. Look at the green area to the left of the player, and the top right too. But AI can only do so much.
 
In my experience, stuff gets more blurry in Warzone, but performance skyrockets compared to other methods of AA. As such, I've been using it ever since it was implemented.
 
Not sure any of that would be the fault of DLSS?
Partly, due to it having to fake data in the scene for a higher res and FPS.
 
Well, the 'fake' data is just their upscale of the lower-res image, so in theory, if the lighting is wrong at 4K native, it would also be wrong at 1080p native and at any DLSS setting; there should be no effect on the lighting direction in a scene. Do you mean it's correct in the native image and not correct in the DLSS image? If that's what you mean, I'm not seeing it.
 
The higher the res, the better DLSS works.
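A rough way to see why (not from this thread, just a sketch using the commonly cited DLSS 2 per-axis render scales; exact factors can vary per title):

```python
# Commonly cited per-axis render scales for DLSS 2 quality modes; the
# upscaler reconstructs the output from this smaller internal image,
# so a higher output res means more real pixels to work from.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution DLSS upscales from (approximate)."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out_w, out_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    w, h = internal_res(out_w, out_h, "Performance")
    print(f"{out_w}x{out_h} Performance -> renders at {w}x{h}")
# 4K Performance still works from a full 1920x1080 of real pixels,
# while 1080p Performance only gets 960x540.
```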

I genuinely did not want a 4K monitor when I bought my current one due to low FPS, but knowing DLSS is spreading so much (it's becoming part of the Unity engine!) makes a 4K monitor more likely in the future.

A 4K monitor makes sense too, since so many games are capped at 30 or 60 FPS, and modern GPUs can play older games at native 4K 30/60 fairly easily. I for one am waiting to replay Final Fantasy X when I get a 4K screen... So far no 4K screens impress me. I want a 32" 4K IPS or OLED (never doing VA again, never ever). I'm leaning towards just getting a 4K OLED TV in a year or two... hopefully LG has decent 40" 4K OLED panels in like two years; that's my main hope. I wouldn't mind sitting back a little further at my desk for that.
 

How about trying out the 48in CX :) They are pretty cheap now, around 1000 USD, and the gaming experience they offer is far ahead of any 1440p IPS screen.
 
I feel like I read somewhere there's going to be a 42" 4K LG OLED this year with all the juicy specs we want. Could be even better suited to gaming? I'd 100% rock the 48" already if I were in the market, but I haven't even had my 34" 3440x1440 144 Hz ultrawide for 12 months yet, and it also feels like a great fit for a 3080-class GPU. DLSS, where applicable, can really make or break the 4K experience. I have an HDMI 2.0 4K60 TV now, but without HDMI 2.1 and VRR it's just that much harder to tune the experience, cap at 60 FPS, and NEVER drop below it so it stays seamless. VRR does so much heavy lifting.
 
Do you mean it's correct in the native image and not correct in the DLSS image? If that's what you mean, I'm not seeing it.
The lighting is just wrong in the image.
 
Fair enough, so it's completely decoupled from DLSS image quality then, as the aberration is present no matter the resolution.

It does look unnatural to me, I'll admit, but I've been playing quite a few RTX games lately, especially Metro, which is so natural that at select times it's almost drab... like a cloudy day outside actually is.
 
The lighting is just wrong in the image.

Well, Outriders doesn't support ray tracing, only DLSS.

I feel like I read somewhere there's going to be a 42" 4K LG OLED this year with all the juicy specs we want... I have an HDMI 2.0 4K60 TV now, but without HDMI 2.1 and VRR it's just that much harder to tune the experience.

4K 60 Hz LED TVs have such horrible input delay that your UW 1440p screen should be the better option.
Yeah, I hope LG starts making 42in OLED TVs soon :)
 
Somebody I know picked up the 55" version of that LG TV; he has a 3090 attached to it and says it's just amazing. He doesn't even use DLSS: everything maxed and still around 120 FPS in Warzone. You do need to stay at least a good meter away from those TVs to make the experience worthwhile, though. To me, however, it's still too massive. 42" is generally reserved for lower-end TVs, so I wouldn't expect OLED to come there too soon. They did make a 32-inch OLED monitor, but it's overly expensive and aimed at design work.
 
Fair enough, so it's completely decoupled from DLSS image quality then... It does look unnatural to me, I'll admit.
It's just the way my eyes see; I spot odd or out-of-place lighting.
The best use of pro-level global illumination was the endgame suits. It fully tricked my eyes.
 
I'm going in blind and have barely gotten to the meat of the game, so I hope I'm not being premature... but the things I'm seeing in Metro Enhanced... am I crazy for calling it a transformation? I gotta see what went into this. Not only is the performance absurdly better (I laughed out loud when the game started and I saw 134 FPS out of my 2060 as the camera panned down on the Moscow wastes), but visually it looks totally different. You almost don't notice at first, but after a few minutes of looking at stuff, you realize EVERYTHING is different, and much cleaner. Very cool stuff. Really liking it so far.

Spoke too soon. CTDs are back in the same fashion as in the past. This one seems to have eaten my save, because it forgot two hours of progress. It's either that or the save-switcheroo phenomenon is back. I don't feel like sifting.

It does look absolutely fantastic though, heh. ALL of the RTX effects are significantly expanded and higher in quality. There's full GI on the lighting now, and it makes a huge difference, especially for interiors. It does make, say, that one light in a dark space appear brighter, but it's not distracting because the colors are now more natural, even in the shadows cast around it. All in all, the visual quality shoots up pretty much a full generation. This looks like a different era. I always found Metro visually impressive, with some nice tricks, but graphically I thought it was just well done... not standout, but good. Of its time, still. Now it really looks like the future. I feel comfortable saying that. If they can just get it stable, that would be nice! :laugh:
 
I do not have a lot of games with DLSS support, but that will change over time. Sadly, the rip-off Bitcoin saga is still stuffing corporations galore.
 
Woop woop, got my 5900X today; it should go in tonight. There are areas of Metro Exodus EE that seem very CPU-limited where I drop below 60 FPS (as GPU usage plummets to 40-50%). Any ideas what sort of improvement I can expect?
 

I would expect about 15% higher 1% low FPS, which is more important than average FPS anyway.
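For anyone curious how that metric is computed, here's a minimal sketch, not tied to any particular tool; a frametime log from e.g. CapFrameX or PresentMon works as input. Note that some tools report the 99th-percentile frametime instead of this average-of-the-worst-1% variant:

```python
def one_percent_low(frametimes_ms: list[float]) -> float:
    """Average FPS over the slowest 1% of frames in a frametime log (ms)."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # size of the worst 1%
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Made-up log: mostly 60 FPS frames with a few 30 FPS stutters.
sample = [16.7] * 990 + [33.3] * 10
print(f"1% low: {one_percent_low(sample):.1f} FPS")   # ~30, while avg is ~59
```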

Also, I just tried out Amid Evil: 4K DLSS Balanced brings about a 100% perf improvement (~50 FPS with maxed RT, ~100 FPS with DLSS Balanced). The game looks quite modern with the inclusion of ray tracing.
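That ~2x lines up with simple pixel math (a back-of-the-envelope sketch; the 0.58 Balanced factor is the commonly cited one, and real scaling won't be perfectly linear):

```python
out_w, out_h = 3840, 2160
scale = 0.58                                  # DLSS Balanced, per axis
in_w, in_h = round(out_w * scale), round(out_h * scale)
ratio = (in_w * in_h) / (out_w * out_h)
print(f"internal res {in_w}x{in_h} = {ratio:.0%} of the output pixels")
# ~34% of the pixels to shade and ray-trace; doubling FPS is plausible
# once fixed per-frame costs and the DLSS pass itself are factored in.
```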
 
I use a 1080p display, and there is a slight reduction even in Quality mode in Cyberpunk vs. no DLSS, but the performance bump makes it worth it. I think DLSS 1.0 is garbage (BFV and SOTTR were a disgrace), but the semi-2.0 mode in Control and the full 2.0 in Cyberpunk are impressive, though at 1440p or above it makes more sense than at 1080p.
 

You can try Image Sharpening in NVCP to make the image sharper, with DLSS or not. With a 24in 1080p screen I would use around 30% sharpening without DLSS and 50% with DLSS in CP2077.
This is due to the nature of the TAA implementation in CP2077, which makes moving characters look soft even without DLSS.
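NVIDIA doesn't publish the exact filter NVCP uses, but the general idea behind a percentage sharpening knob is unsharp masking: add back a scaled copy of (original minus blurred). A minimal grayscale sketch:

```python
import numpy as np

def unsharp(img: np.ndarray, strength: float) -> np.ndarray:
    """img: 2D grayscale array in [0, 1]; strength: 0.3 for a '30%' setting."""
    p = np.pad(img, 1, mode="edge")           # pad so a 3x3 blur fits at edges
    h, w = img.shape
    # cheap 3x3 box blur: average the nine shifted copies of the image
    blur = sum(p[y:y + h, x:x + w] for y in range(3) for x in range(3)) / 9.0
    return np.clip(img + strength * (img - blur), 0.0, 1.0)

img = np.full((8, 8), 0.2)
img[:, 4:] = 0.8                              # a soft vertical edge
print(unsharp(img, 0.5)[0])                   # contrast at the edge is boosted
```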
 
Yeah, I do that, but I still feel it looks better without DLSS, even at Quality. The difference is small while the performance bump is large, though, so it's still worth it :)
 
After more farting about, I've decided on my final opinion on DLSS, which is law and must be obeyed at all times, or I feed you to my fancy crab with his cute hat:

Run the settings you want. Happy with the FPS? Leave it.
Unhappy with the FPS? Try turning DLSS on before lowering other settings. You may like it.

Oh phew, that was hard.
 

Hell nawh, screw your law :roll: I just set everything to Ultra + DLSS Balanced and game.
 
It's interesting how much the 3700X to 5900X upgrade has increased my FPS across the board, but especially in DLSS games. I guess the lower input resolution puts you more into CPU-limited territory. Every single game I play atm plays better with this CPU; it's a much better match for a current-gen high-end card.
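A toy model of why that happens (the numbers below are made up for illustration): the frame rate is gated by whichever of the CPU or GPU takes longer per frame, and DLSS mostly shrinks the GPU side.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when CPU and GPU work overlap; the slower one gates it."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_slow, cpu_fast = 9.0, 7.0     # hypothetical per-frame CPU cost (ms)
gpu_native, gpu_dlss = 14.0, 8.0  # hypothetical GPU cost, native vs DLSS

print(f"native, either CPU: {fps(cpu_slow, gpu_native):.0f} FPS (GPU-bound)")
print(f"DLSS, slower CPU:   {fps(cpu_slow, gpu_dlss):.0f} FPS (now CPU-bound)")
print(f"DLSS, faster CPU:   {fps(cpu_fast, gpu_dlss):.0f} FPS (upgrade pays off)")
```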
 
Interesting. I could see that. 2600 to 3900X was quite an uplift playing at 1080p, especially for high refresh; there was a real bottleneck with CPU utilization. I haven't watched it much running these new DLSS titles, but there are some minor bottlenecks that will knock 30% of the frames off, usually areas with a lot of actors, places you'd expect to have a lot of scripts running, or central elevation points. I can probably count them on one hand. Both Metro and Control had many more of them before all of this, but a few lesser stragglers remain.

However, I do generally see the GPU bottleneck coming out on top. Of course, this is a 2060 we are talking about in my case: less straight raster cojones, less RT moxie. Any frame-rate limitations I hit can be alleviated by lowering the base resolution, and utilization points to that as well. So we're probably both just in our own sweet spots. Of course, utilization doesn't tell the whole story; I could have a 'secondary' bottleneck within CPU utilization itself. There could still be more gains with a faster CPU.

Though come to think of it... if I shut off RT and just leave Control running at 720p DLSS, it redlines at 165 FPS. Tried that down in the quarry; other areas may not fare as well. Not to mention, the 3900X was a higher-bin, generally higher-boosting chip, IIRC. That might just be enough. Probably big things I'm missing there.

Honestly though, it's hard for me to gauge those things with this card at this point... namely because it only has 6 GB of VRAM.
 