
NVIDIA DLSS 2.5.1

I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder She Wrote at 720p on my 4k OLED.

That's nothing like 4k. So why not just play at 1080 instead?

It seems like the term "Ultra Performance" is not clear here, nor what these settings are configured to do.

"Ultra Performance" targets high frame rates, but at a huge quality loss. That's what the stills here show, and it's being overlooked.

This is by design, and the same goes for the equivalent settings in AMD's (much, much worse) and Intel's upscaling solutions.

Now, if these stills had been taken at a "Quality" or higher setting, the argument that 2.5.1 improved scaling would carry more weight.
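To make the mode names concrete, here is a rough sketch of the internal render resolutions for a 4K output. The per-axis scale factors below are the commonly cited defaults and are my assumption; individual games can override them.

```python
# Rough sketch: internal render resolution per DLSS mode for a 4K output.
# Scale factors are the commonly cited per-axis defaults (games may override them).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
# Ultra Performance at 4K lands around 1280x720, i.e. the "720p option"
# that the stills in this comparison are actually showing.
```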
 
Surely you aren't one of those... barbarians using a mainstream gamer's GPU with classic DLSS 2, non? Ti > Tie, the more you buy, the more you save, you know the drill...

No, I am the kind of user who makes use of upscaling technologies whenever possible, not only because it is an easy way to keep GPUs (even, or maybe I should say especially, high-end ones) cooler and quieter and to keep my energy bill at bay, but also because it improves FPS. I am not very susceptible to blurry images, but I am to low frametimes. In gaming I would have to concentrate to notice blurriness, while stuttering leads to nausea quite fast.
 
No, I am the kind of user who makes use of upscaling technologies whenever possible, not only because it is an easy way to keep GPUs (even, or maybe I should say especially, high-end ones) cooler and quieter and to keep my energy bill at bay, but also because it improves FPS. I am not very susceptible to blurry images, but I am to low frametimes. In gaming I would have to concentrate to notice blurriness, while stuttering leads to nausea quite fast.

I get what you mean; Quality should be enough to lower the power footprint quite a bit without sacrificing much in the way of image quality.

I was mostly taking a jab at Nvidia's extremely distasteful and downright deceptive marketing around Ada and DLSS 3 as a whole :)
 
I was mostly taking a jab at Nvidia's extremely distasteful and downright deceptive marketing around Ada and DLSS 3 as a whole :)

This I can and will support in full. It's also the reason why I chose to go red after approx. 15 years of using nVIDIA GPUs only. I still consider the RTX 4000 series the superior product, but the only way to make a company change its behavior is to stop giving them your hard-earned money.

And it's another reason why I like seeing them improve the "old" DLSS variant. It makes it easier to skip another generation when your RTX 2000 or RTX 3000 series card can still provide the necessary performance.
 
I only compared Quality mode, and the main difference I could see is that 2.5.1 does a better job of rendering fine detail in plant foliage.

Can you manually update games that use FSR to a newer version too?
Theoretically yes, but AFAIK no FSR 2 game offers DLL files to update.
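For DLSS itself, the manual update people are doing is simply dropping a newer nvngx_dlss.dll over the one the game shipped (e.g. one downloaded from TPU's DLSS library). A minimal sketch, with hypothetical paths; keep a backup, and note that games with file-integrity or anti-cheat checks may refuse a swapped DLL.

```python
# Minimal sketch of a manual DLSS DLL swap: back up the game's copy of
# nvngx_dlss.dll and replace it with a newer one.
# Paths below are examples only; the DLL sometimes sits in a subfolder.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install folder
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you downloaded

for old_dll in game_dir.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_suffix(".dll.bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)            # keep the original around
    shutil.copy2(new_dll, old_dll)               # drop in the newer version
    print(f"Replaced {old_dll}")
```

FSR 2, by contrast, is open source and usually compiled straight into the game, which is why there is normally no standalone DLL to swap.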
 
As an RTX 3070 user, I am very pleased with the improvement of DLSS 2.
 
As an RTX 3070 user, I am very pleased with the improvement of DLSS 2.

As an RTX 3090 user, I am absolutely miffed at their ridiculous marketing (especially the Ampere = mainstream gamer's GPU with classic DLSS 2 bait claim which makes me fume every time) and intentional withholding of DLSS 3 because they don't want to bother optimizing for an earlier generation optical flow acceleration engine.
 
It seems like the term "Ultra Performance" is not clear here, nor what these settings are configured to do.

"Ultra Performance" targets high frame rates, but at a huge quality loss. That's what the stills here show, and it's being overlooked.

This is by design, and the same goes for the equivalent settings in AMD's (much, much worse) and Intel's upscaling solutions.

Now, if these stills had been taken at a "Quality" or higher setting, the argument that 2.5.1 improved scaling would carry more weight.
The thing is, quality mode doesn't look much better either. Then again, even at native resolution that looks like a horrible aliased mess, so no wonder the AI doesn't know what to do with it.
A textbook case of garbage-in, garbage-out.
 
Hey TechPowerUp, could you please stop using such compressed JPEG images for these comparisons? Some of these images are under 500 KB in size and the compression artifacts significantly reduce the usefulness of these comparisons. You cannot use compressed JPEGs here to provide like-for-like comparisons.

Please use PNGs or something that isn't using lossy compression. Your comparison slider tool is very good and well-implemented, but your source images are hideously bad with those compression artifacts and they will show artifacts that don't actually exist when using DLSS.
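To see why lossy sources muddy a like-for-like comparison, here is a quick sketch (using Pillow and NumPy, with a hypothetical lossless screenshot "capture.png") that measures how much a JPEG round-trip perturbs the pixels compared with a PNG round-trip.

```python
# Sketch: measure how much a JPEG round-trip changes pixels vs. a PNG round-trip.
# "capture.png" is a hypothetical lossless screenshot; requires Pillow and NumPy.
import io
import numpy as np
from PIL import Image

src = np.asarray(Image.open("capture.png").convert("RGB"), dtype=np.float64)

def roundtrip_psnr(fmt, **save_args):
    buf = io.BytesIO()
    Image.fromarray(src.astype(np.uint8)).save(buf, format=fmt, **save_args)
    buf.seek(0)
    out = np.asarray(Image.open(buf).convert("RGB"), dtype=np.float64)
    mse = np.mean((src - out) ** 2)
    return 10 * np.log10(255**2 / mse) if mse else float("inf")  # PSNR in dB

print("PNG  PSNR:", roundtrip_psnr("PNG"))               # lossless: infinite PSNR
print("JPEG PSNR:", roundtrip_psnr("JPEG", quality=75))  # finite: real artifacts
```

Any artifacts the JPEG step introduces end up attributed to DLSS in the slider comparison, which is the problem.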
 
It would be interesting to see a Frame Generation image comparison for the latest revision too (TPU also hosts version 1.0.4 of the Frame Generation DLL).
 
As an RTX 3090 user, I am absolutely miffed at their ridiculous marketing (especially the Ampere = mainstream gamer's GPU with classic DLSS 2 bait claim which makes me fume every time) and intentional withholding of DLSS 3 because they don't want to bother optimizing for an earlier generation optical flow acceleration engine.
While theoretically possible, it isn't easy, simply because of the tensor power gulf between Ada (4090/4080) and even the 3090 Ti. The current lowest Ada is ~641 TOPs (4070 Ti) and the highest is ~1321 TOPs (4090). Ampere's best is around 285-300 TOPs (ballpark for both the 3090 and 3090 Ti).

To illustrate the real-world impact: frame interpolation and generation has been going on in SmoothVideo Project for years now, and since the RTX 20 series it has actually used NV Optical Flow, with mixed quality results (and still a heavy shader load). TensorRT was added fairly recently, and while interpolated frame quality saw a massive improvement, most GPUs below the 4090 had difficulty going past 2-3x interpolation of 24 fps source material above 1080p (4K was often off the table). And that's with the majority of shader and tensor power going into frame interpolation and generation. Not to mention compute latency, where there are seconds of delay when seeking through the video.

DLSS FG has to contend with sharing compute power with DLSS SR, plus the actual game engine rendering requirements, while attempting to maintain quality interpolation at manageable latencies.
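As a back-of-the-envelope check of that gulf: if frame generation costs some fixed tensor workload per generated frame, the time to run it scales roughly with the inverse of tensor throughput. The TOPs figures below are the ballpark numbers quoted above; the per-frame cost on a 4090 is a purely hypothetical placeholder, not a measured value.

```python
# Back-of-the-envelope: how a fixed frame-generation tensor workload scales
# from Ada down to Ampere. TOPs figures are the ballpark numbers from the post;
# the 1.5 ms cost on a 4090 is a hypothetical placeholder, not a measurement.
tops = {"RTX 4090": 1321, "RTX 4070 Ti": 641, "RTX 3090 Ti": 300, "RTX 3090": 285}

fg_cost_on_4090_ms = 1.5                          # hypothetical per-frame cost
work = fg_cost_on_4090_ms * tops["RTX 4090"]      # same workload in TOPs * ms

for gpu, t in tops.items():
    cost_ms = work / t
    print(f"{gpu:>12}: ~{cost_ms:4.1f} ms per generated frame "
          f"(frame budget at 120 fps is ~8.3 ms)")
# On Ampere the same workload eats several times the Ada cost, before the game's
# own rendering and DLSS SR have had their share of the frame budget.
```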
 
While theoretically possible, it isn't easy, simply because of the tensor power gulf between Ada (4090/4080) and even the 3090 Ti. The current lowest Ada is ~641 TOPs (4070 Ti) and the highest is ~1321 TOPs (4090). Ampere's best is around 285-300 TOPs (ballpark for both the 3090 and 3090 Ti).

To illustrate the real-world impact: frame interpolation and generation has been going on in SmoothVideo Project for years now, and since the RTX 20 series it has actually used NV Optical Flow, with mixed quality results (and still a heavy shader load). TensorRT was added fairly recently, and while interpolated frame quality saw a massive improvement, most GPUs below the 4090 had difficulty going past 2-3x interpolation of 24 fps source material above 1080p (4K was often off the table). And that's with the majority of shader and tensor power going into frame interpolation and generation. Not to mention compute latency, where there are seconds of delay when seeking through the video.

DLSS FG has to contend with sharing compute power with DLSS SR, plus the actual game engine rendering requirements, while attempting to maintain quality interpolation at manageable latencies.

Call me an old man, but if Crytek put Crysis 3 on the PS3 with its gimped 7900-class 256 MB GPU and about 224 MB of developer usable RAM, they can give us some form of basic frame generation too.

It could be work, sure, I'll agree with you on that. Work the company doesn't want to put in. But in that case you have your marketing department brand the feature as something other than a major version of your flagship marketable feature, rather than implying that everyone else is lame for not having it.

If anything it's a biblical marketing failure.
 
Call me an old man, but if Crytek put Crysis 3 on the PS3 with its gimped 7900-class 256 MB GPU and about 224 MB of developer usable RAM, they can give us some form of basic frame generation too.

It could be work, sure, I'll agree with you on that. Work the company doesn't want to put in. But in that case you have your marketing department brand the feature as something other than a major version of your flagship marketable feature, rather than implying that everyone else is lame for not having it.

If anything it's a biblical marketing failure.

Lots of compromises were made, even for Crytek. And the point was to get it to run and look good enough, despite those compromises. It worked, but the fact is that it doesn't hold a candle to how good the full Crysis 3 was. And when someone did try to hack FG onto older GPUs, they found instability and frame drops (which makes perfect sense if you imagine the algorithm expecting the workload to be done within a specific interval).

As it is, DLSS FG already has compromises even with Ada's compute power, which show up in interpolation errors that so many critics are already using as ammunition that FG is 'useless', 'terrible', etc. I'd say they should focus on perfecting it for Ada first before even thinking of trying to scale it down to require less compute to fit within at least Ampere's capacity.

In the example I provided for SVP, the folks developing RIFE are optimizing their models and algorithms in the hope of one day being able to do real-time High Quality Interpolation and FG for higher frame rates, and they're running into many obstacles squeezing it out of limited headroom.

I'm not sure where you're getting that everyone else without DLSS FG is 'lame'. After all, game support is still low, and it was marketed as a new and awesome feature. If you take that as "you're lame for not having it", I think that's a 'you' problem. Everyone else will likely wait for the tech to mature before diving into it, as most have with both RT and DLSS.
 
2.5.1 Ultra Performance is highly comparable to 2.4.3 Quality in terms of image quality. That's super impressive, considering how much more FPS you get with Ultra Performance vs. Quality.
 
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder She Wrote at 720p on my 4k OLED.

That's nothing like 4k. So why not just play at 1080 instead?

Exactly. It's a way Nvidia has found to create the illusion that you have enough performance to run games with RT (at an irrelevant level).

I don't understand what is being praised. Before it was horrible and now it's a little less horrible?
 
@maxus24 any chance of a similar analysis for FSR versions? Same goes for the DLSS vs. FSR image quality reviews.
 
It looks to me like it has always looked, a bit sharper than 1080p, but nowhere close to native 4K.
I have not seen a real-life comparison in action, but I wouldn't expect quality to improve when in motion.

At first I thought it was cool for reaching very high framerates in competitive FPS, but then I saw that there's additional latency associated with it.
This also rules out VR, since you need absolute minimum input latency for it.

Outside of RT, its only real use seems to be running modern games on weak hardware on a 4K TV.

The underlying hardware though, that's something worth talking about.
As always, someone will probably find some special effects that benefit from it, or some other, more interesting use for it.
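On the latency point above, a simplified model of why frame generation adds delay: to interpolate a frame between two rendered ones, the newest rendered frame has to be held back by roughly one native frame interval, plus the interpolation cost itself. The numbers below are illustrative assumptions, not measurements.

```python
# Simplified model of frame-generation latency: the newest rendered frame is
# held back by roughly one native frame interval while the interpolated frame
# is shown, plus the cost of generating it. Illustrative numbers only.
def added_latency_ms(native_fps, interp_cost_ms=3.0):
    native_frame_ms = 1000.0 / native_fps
    return native_frame_ms + interp_cost_ms

for fps in (30, 60, 120):
    print(f"{fps:3d} fps native -> roughly +{added_latency_ms(fps):.1f} ms of added latency")
# The penalty is largest exactly where people want FG most (low native fps),
# which is why it is a poor fit for latency-critical uses like competitive FPS or VR.
```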
 
Lots of compromises were made, even for Crytek. And the point was to get it to run and look good enough, despite those compromises. It worked, but the fact is that it doesn't hold a candle to how good the full Crysis 3 was. And when someone did try to hack FG onto older GPUs, they found instability and frame drops (which makes perfect sense if you imagine the algorithm expecting the workload to be done within a specific interval).

As it is, DLSS FG already has compromises even with Ada's compute power, which show up in interpolation errors that so many critics are already using as ammunition that FG is 'useless', 'terrible', etc. I'd say they should focus on perfecting it for Ada first before even thinking of trying to scale it down to require less compute to fit within at least Ampere's capacity.

In the example I provided for SVP, the folks developing RIFE are optimizing their models and algorithms in the hope of one day being able to do real-time High Quality Interpolation and FG for higher frame rates, and they're running into many obstacles squeezing it out of limited headroom.

I'm not sure where you're getting that everyone else without DLSS FG is 'lame'. After all, game support is still low, and it was marketed as a new and awesome feature. If you take that as "you're lame for not having it", I think that's a 'you' problem. Everyone else will likely wait for the tech to mature before diving into it, as most have with both RT and DLSS.

Just me and my mainstream gamer's GPU with classic DLSS 2 being salty at the sky-high prices of this GPU generation, I guess.

Marketing rarely ever has that effect of making me viciously hate a company, which is the exact opposite of what it's supposed to do...

No doubt the tech is cool, but they just ensured they aren't selling me one on that one line of marketing claim alone. Not like they care... I, alongside everyone complaining, will be buying Blackwell anyway...
 
Just me and my mainstream gamer's GPU with classic DLSS 2 being salty at the sky-high prices of this GPU generation, I guess.

Marketing rarely ever has that effect of making me viciously hate a company, which is the exact opposite of what it's supposed to do...

No doubt the tech is cool, but they just ensured they aren't selling me one on that one line of marketing claim alone. Not like they care... I, alongside everyone complaining, will be buying Blackwell anyway...
What 'line' is this, may I ask?
 
What 'line' is this, may I ask?

The CES 2023 video, where they kept pushing DLSS 3 with brutal dishonesty, claiming that the 4070 Ti was 3x faster than the 3090 Ti. Around the 14-minute mark the presenter says that the 30 series "remain the best option for mainstream gamers, starting at $329", and then proceeds to obliterate it with the most disgusting and filthiest lies against a company's own products I've ever seen in a marketing presentation.

No wonder every single YouTuber got mad at the vitriol and called it a ripoff (which it is)
 
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder She Wrote at 720p on my 4k OLED.

That's nothing like 4k. So why not just play at 1080 instead?

Ultra performance IS the 720p option

It's the lowest of the low, so at least they've improved upon it


When your 3050 laptop GPU is 5 years old, you'll be grateful it has DLSS Ultra performance
 
When your 3050 laptop GPU is 5 years old, you'll be grateful it has DLSS Ultra performance
Any RTX card, really. I'd love to pick up a cheap 2060 or above to replace the ageing GTX 1080 in my lanbox purely for the extra feature set, but at this rate it seems more likely I'll just upgrade in the next year or two and the 3080 will become the lanbox card.
 
Ultra performance IS the 720p option

It's the lowest of the low, so at least they've improved upon it


When your 3050 laptop GPU is 5 years old, you'll be grateful it has DLSS Ultra performance

DLSS Ultra Finewine :D

Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf

W3 4K DLSS UP vs 1080p TAA
 
DLSS Ultra Finewine :D

Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf

W3 4K DLSS UP vs 1080p TAA

Yeah, I agree The Witcher 3 looks better, but it's game dependent. It's still nothing like a true 4K image, though. When I played CP2077 the first time around, I think the DLSS implementation was a super early one. I never really noticed it that much, but I also had RT on lower settings (or maybe it was on anyway because of DLSS?). Anyway, I had bought a G-Sync monitor, so playing a single-player FPS didn't need high framerates; 40-45 fps was smooth enough for me.
 