
NVIDIA DLSS 2.5.1

NVIDIA has released an update to its Super Resolution technology (DLSS 2.5) that brings improved fine-detail stability, reduced ghosting, improved edge smoothness in high-contrast scenarios, and overall image quality improvements. In this mini-review we compare the image quality of DLSS 2.4 and DLSS 2.5.1 in Cyberpunk 2077 and The Witcher 3: Wild Hunt.

 
Good to see they are still improving features that do not require the newest generation of GPUs.
 
The tech has improved a lot, but until they can do something about that glaring loss of texture quality and clarity, I can't see myself ever using it.
That's the one thing that maybe matters more to me than FPS.
Realistic lighting and shadows? Don't care much for them. Realism or pixel graphics? It's all good. But that feeling that somebody rubbed Vaseline over your monitor? Can't shake it.

I also hope we one day get to see the tech adopted widely as a native-resolution anti-aliasing option, so we can leave the blur filters behind us.
 
GPU power consumption during Witcher 3 RT is significantly lower than in CP2077. On my GPU, Witcher 3 RT draws 320-350 W, whereas CP2077 draws 420-450 W.

Even in these videos you can clearly see the difference in GPU power consumption: CP2077 consuming 230-250 W, and Witcher 3 RT consuming 180-200 W.

The lower performance in Witcher 3 RT is not from being "more GPU demanding", but from being a badly optimized mess that never fully utilizes the GPU and runs horribly because of it.
 
Not a fan of RT but a big fan of DLSS 2.x, this is nice.
 
Good to see they are still improving features that do not require the newest generation of GPUs.

A friend just downloaded the RT patch for A Plague Tale, and Frame Generation is in it... sorry, 4000 series only, so it's a nope.avi from my peasant 3000-series friend.
 
Yeah, very pleased with the results of 2.5.1.0. I've been sticking the DLL in every game folder I can find that supports DLSS, which is easy thanks to GPU-Z... thanks @W1zzard :toast:
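
For anyone who wants to script the swap, here's a minimal sketch in Python. The nvngx_dlss.dll filename is the standard one DLSS titles ship with, but the paths and folder layout below are hypothetical placeholders:

```python
# Minimal sketch: swap every nvngx_dlss.dll found under a games folder
# for a newer one. Paths are placeholders; adjust for your system.
import shutil
from pathlib import Path

NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")  # the 2.5.1.0 DLL you grabbed
GAMES_ROOT = Path(r"C:\Games")                  # hypothetical install root

for old_dll in GAMES_ROOT.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_name(old_dll.name + ".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the version the game shipped with
    shutil.copy2(NEW_DLL, old_dll)     # drop in the new DLL
    print(f"Updated {old_dll}")
```

Restoring the .bak copy reverts any game that misbehaves with the new DLL.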
 
A friend just downloaded the RT patch for A Plague Tale, and Frame Generation is in it... sorry, 4000 series only, so it's a nope.avi from my peasant 3000-series friend.

Frame Generation is a DLSS 3.0 feature... if you do not own an RTX 4000-series GPU, it is not supported. You and your friend should have known that already, and it has nothing to do with DLSS 2.5.1 either.
 
Frame Generation is a DLSS 3.0 feature... if you do not own an RTX 4000-series GPU, it is not supported. You and your friend should have known that already, and it has nothing to do with DLSS 2.5.1 either.

Yeah... thanks... I know this already. It was more a comment on tech that does not support previous-generation hardware, since that was what the comment I directly responded to was about.
 
Nvidia is not stopping for AMD. They won't make the same mistake Intel did on the CPU side.

It will be way harder for AMD to grab significant market share outside of tech enthusiasts on the GPU front, unless they find a way to make a huge step forward.
Irrelevant to the topic

The best use case for these latest DLSS improvements (Ultra Performance, a.k.a. the lowest quality possible) is clearly weaker hardware, which fits perfectly with the new Switch / Switch 2 or whatever it's called.

Nintendo will surely release weak hardware by today's standards, but with this level of DLSS upscaling, it should make 4K 60 actually playable and pretty enough on a TV.
 
Great. Good to see support for a useful technology.
 
I only compared Quality mode, and the main difference I could see is that 2.5.1 does a better job of rendering fine detail in plant foliage.

Can you manually update games that use FSR to a newer version too?
 
Irrelevant to the topic

The best use case for these latest DLSS improvements (Ultra Performance, a.k.a. the lowest quality possible) is clearly weaker hardware, which fits perfectly with the new Switch / Switch 2 or whatever it's called.

Nintendo will surely release weak hardware by today's standards, but with this level of DLSS upscaling, it should make 4K 60 actually playable and pretty enough on a TV.
Umm, not really. The Switch is pushing something like 6-10x fewer pixels per second, with bad framerates, bad textures, and low polygon counts (you can count the lines on far-away rocks, like someone sanded them flat)... go watch Digital Foundry's video on the newest Switch Pokémon game, or watch Bayonetta on a big screen with its PS3-era trees and textures; in Fortnite the Switch looked and ran worse than a 2018 iPhone... Nintendo will probably HAVE to keep supporting the 2017 Switch, which is weaker than the 2013 consoles, meaning they would need a ton more dev time to maintain two massively different levels of textures... and this video is likely not using low-end settings, probably medium-high.

Don't expect anything much bigger if they stick with a portable device: even something above the Steam Deck, which runs games at 720p on close-to-PS4-level tech (still meh now), is a tier developers aren't really making games for much anymore...
 
DLSS 2.5 just destroyed that lightboard on the first Cyberpunk image. Whilst you can tell it clears up thin line stuff pretty well, it looks too aggressive.
 
Confirms how I've felt about Ultra Performance mode at a 4K output using the latest DLL: it looks amazing for what it is.
 
Nvidia is not stopping for AMD. They won't make the same mistake Intel did on the CPU side.

It will be way harder for AMD to grab significant market share outside of tech enthusiasts on the GPU front, unless they find a way to make a huge step forward.
I feel Intel was not sitting around; rather, their failure to deliver 10 nm on time led them to this point. In fact, I feel Nvidia ran into the same problem, though it took them a generation to correct it: Nvidia basically cheaped out by going with a lesser Samsung node to keep costs low, while AMD pressed on with TSMC 7 nm. AMD no doubt did not capture a lot of market share, but it allowed them to close the performance gap significantly between RDNA2 and Ampere.
 
IMO the most obvious improvement is in Performance mode; now it seems to be somewhat usable.

Quality mode seems about the same, maybe a bit better at 1080p, but I refuse to imagine myself using DLSS at 1080p.
 
Frame generation is a DLSS 3.0 feature... if you do not own an RTX 4000 series GPU it is not supported. You and your friend should have known that already and it has nothing to do with DLSS 2.5.1 either.

Surely you aren't one of those... barbarians using a mainstream gamer's GPU with classic DLSS 2, non? Ti > Tie, the more you buy, the more you save, you know the drill...

It's not supported because they don't want it to be, and that's about it. It's blatant artificial segmentation, and everyone knows it.

DLSS 2.5 just destroyed that lightboard on the first Cyberpunk image. Whilst you can tell it clears up thin line stuff pretty well, it looks too aggressive.

Agreed, I was about to mention that... but shadow detail seems to be improved, at least in The Witcher. It'd be cool if NVIDIA stopped holding out on existing customers, though. I hope AMD's FSR 3.0 ships soon so their hand is forced, but it looks like the red team has too much on their plate already...
 
Good to see them fixing the lower-end DLSS modes, because I've found the tech super helpful, but anything other than Quality mode looked disgusting in many titles.
 
Take a look at native 4K and compare to DLSS 2.5.1 Quality, in Cyberpunk, the very first image in this review. It's pathetic.

I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder, She Wrote at 720p on my 4K OLED.

That's nothing like 4K. So why not just play at 1080p instead?

(attached comparison screenshots)
 
DLSS 2.5 just destroyed that lightboard on the first Cyberpunk image. Whilst you can tell it clears up thin line stuff pretty well, it looks too aggressive.

Yeah it fixes some things and destroys others. Very annoying.
 
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder, She Wrote at 720p on my 4K OLED.

That's nothing like 4K. So why not just play at 1080p instead?

(attached comparison screenshots)
It somehow still looks better than stretching, even with all the artifacts and blurriness. Yes, it's definitely nowhere near native 4K, but even FSR 1.0 was better than stretching 1080p to a 4K screen. Plus, with DLSS you keep the benefit of sharper UI and text rendered at 4K, which you lose if you drop the overall resolution. That was already possible in games with render-resolution scaling, though.
 
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder, She Wrote at 720p on my 4K OLED.

That's nothing like 4K. So why not just play at 1080p instead?

(attached comparison screenshots)

Actually, you're not too far off. Ultra Performance DLSS targets 33% of the output resolution per axis, which at 4K (3840 × 2160) works out to 1280 × 720, i.e. 720p, or around 11% of the output resolution in raw pixel count (8.3 MP / 8,294,400 px down to 0.9 MP / 921,600 px). That image is then run through the AI upscaler and output as a 4K signal. This is why it looks like some sort of glorified 720p: that is what it is.

The same proportions apply if you use Ultra Performance DLSS at 1080p, with... needless to say, catastrophic results. A quick cheat sheet for DLSS resolution scaling (worked through in the sketch below):

Quality: 66.7% per axis, ~44% of the pixels
Balanced: 58% per axis, ~34% of the pixels
Performance: 50% per axis, 25% of the pixels
Ultra Performance: 33.3% per axis, ~11% of the pixels
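
Here's a minimal sketch of that math in Python, using the approximate per-axis factors from the cheat sheet (not NVIDIA's exact internals):

```python
# Render resolution and share of output pixels for each DLSS mode at 4K.
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output resolution

for mode, scale in MODES.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = scale * scale * 100  # fraction of output pixels actually rendered
    print(f"{mode:>17}: {w} x {h} ({share:.0f}% of output pixels)")
```

At 4K this gives 2560 × 1440 for Quality, 2227 × 1253 for Balanced, 1920 × 1080 for Performance, and 1280 × 720 for Ultra Performance.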

Personally, I think no one should be using DLSS below the Quality setting, and Balanced should be reserved for situations where your graphics card realistically just cannot cope with the workload. The Performance settings are intended strictly for ultra-high-resolution gaming, IMO.
 