
NVIDIA RTX owners only - your opinion on DLSS 2.0 Image quality

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
6,139 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X 5ghz
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Arctic P12s
Much the same as THIS post, this is STRICTLY ADDRESSED TO NVIDIA RTX OWNERS AND THEIR PERSONAL EXPERIENCES WITH DLSS 2.0. No, I'm not interested in your non-owner opinion from watching YouTube comparisons, nitpicking stills, regurgitating reviewer thoughts, counting how many games it features in, etc.

In the spirit of the post I've mentioned and linked, I'll be reporting posts that do not follow the request of this thread. Don't like it? Start your own thread.

This is just about image quality and performance, from the people who have actually played extensively with it enabled. I've seen my fair share of comments across the web from people who don't own an RTX card and have never actually played a DLSS 2.0 game and/or seen it with their own eyes; they perhaps dislike how it works, and/or formed their opinion from the things I mentioned above. If that sounds like you, I don't want to hear from you.

Here are some super generalized statements you can feel free to base your response on, or just start from scratch and tell me everything you want.
  • DLSS 2.0 generally improves the visual quality of the game
  • I can't tell the difference between DLSS 2.0 on and off
  • Some parts of the DLSS 2.0 image are as good as or better than native, but other parts are worse
  • DLSS 2.0 generally reduces the visual quality of the game
Some other things for owners to comment on if you feel so inclined.
  • Even if there are IQ limitations, would you enable it anyway for the performance gain? If so, which mode?
  • Are you hopeful for ongoing adoption and support?
  • Which is your preferred mode and for what native output resolution?
  • Do you also use Image sharpening / adjust the negative LOD bias in conjunction with DLSS 2.0 to further improve results?
  • If a game featured RTX and DLSS 2.0, would you use both, just one, neither?
  • Would you like to see it adopted in more games that do not feature Ray Tracing purely for increased performance?
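For anyone weighing up the mode question above, here is a rough Python sketch of the internal render resolution each DLSS 2.0 mode uses. The per-axis scale factors are the commonly reported ones, not official figures, and may vary by game and driver version:

```python
# Commonly reported per-axis render-scale factors for DLSS 2.0 modes.
# These are approximations; individual games/drivers may differ.
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the game actually renders at before DLSS upscales."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# e.g. at 4K output, Quality mode renders at 2560x1440
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```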
 
Joined
Dec 26, 2016
Messages
266 (0.15/day)
Processor Ryzen 3900x
Motherboard B550M Steel Legend
Cooling XPX (custom loop)
Memory 32GB 3200MHz cl16
Video Card(s) 3080 with Bykski block (custom loop)
Storage 980 Pro
Case Fractal 804
Power Supply Focus Plus Gold 750FX
Mouse G603
Keyboard G610 brown
Software yes, lots!
I think DLSS is a good thing to boost performance when using ray tracing, because without DLSS, RT performance is just too bad even on RTX 3000. Some games look really good with RT, so the decline in overall image quality from adding DLSS is acceptable.

On the other hand, I still think DLSS looks shitty, and since RTX 3000 is very fast without RT, there is no need to smear the image by turning on DLSS in these (non-RT) scenarios.

In the end, DLSS is just a gap filler until RT finally gets the performance it needs. DLSS is nice to have in some scenarios, but whenever the framerate without it is high enough to satisfy the viewer's expectations, I would turn it off. Hopefully one day GPUs will be powerful enough to drop rasterisation completely and generate pictures by ray tracing alone.
 

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
51,298 (8.24/day)
Location
Oystralia
System Name Rainbow Sparkles
Processor Ryzen R7 5800X (PBO tweaked, 4.4-5.05GHz)
Motherboard Asus x570 Gaming-F
Cooling EK Quantum Velocity AM4 + EK Quantum ARGB 3090 w/ active backplate. Dual rad.
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @4000 C18 (1.4V, SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Often underclocked to 1500Mhz 0.737v
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Gigabyte G32QC (4k80Hz, 1440p 165Hz) + Phillips 328m6fjrmb (4K 60Hz, 1440p 144Hz)
Case Fractal Design R6
Audio Device(s) Logitech G560 |Razer Leviathan | Corsair Void pro RGB |Blue Yeti mic
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE (custom white and steel keycaps)
VR HMD Oculus Rift S
Software Windows 11 pro x64 (Yes, it's genuinely a good OS)
Benchmark Scores I don't quite know how i managed to get such a top tier PC, I am not rich.
DLSS Quality looked like some of the attempts at anti-aliasing over the years, softening things up.
I could run with it on and forget it was there, apart from the odd shimmering texture or visual glitch.

I would definitely choose DLSS over dropping the res, but I'm not sure I'd choose DLSS over disabling other features (like RTX).
I see it being super helpful on older cards as time goes by... especially in the laptop world.
 
Joined
Nov 11, 2016
Messages
1,903 (1.05/day)
System Name The de-ploughminator
Processor I7 9900K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 3090 + Bitspower WB
Storage Plextor 512GB nvme SSD
Display(s) LG OLED CX48"
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software Win10
I have tried DLSS 2.0 at 1080p, 1440p and 4K in at least 10 games, and my conclusion is that DLSS is a better AA option than TAA; maybe not as good as MSAA 2x or 4x, but that is to be expected.
If, and only if, a game supports MSAA 2x/4x and you are getting a playable framerate with MSAA, then I would use MSAA instead of DLSS. For example Shadow of the Tomb Raider, where RT is useless and DLSS 1.0 sucks ass.

Otherwise DLSS would be the default AA option in supported games. If the performance is excessive for single-player games with DLSS Quality (like Death Stranding, Nioh 2), then I just cap the FPS to 120 and save on power consumption (100W less is possible) while getting better IQ than native + TAA.

And there is no need to reserve DLSS for DXR: in competitive games like Fortnite and Warzone that support it, turning on DLSS will give you a major advantage (the higher the FPS, the better).

As for DLSS making the image blurry, just use Image Sharpening, it's not that hard.
 

wolf

Performance Enthusiast
apart from the odd shimmering
Funny you mention it. I regularly play two DLSS 2.0 games, Control and CP2077 (slowly making my way through both, dad life...), and my finding is that DLSS reduces or even virtually eliminates shimmering, which is at the top of my list of negative visual artefacts, as shimmering makes the whole image appear unstable in motion; that makes DLSS even more appealing. In which games would you say shimmering is worsened by using DLSS?
As for DLSS making image blurry, just use Image Sharpening
I definitely agree here; on a per-game basis I have tweaked sharpening to offset image softness in every game I play. And this solution is 100% not confined to DLSS: virtually any TAA game stands to benefit, as do many others. I'd highly encourage anyone, DLSS or not, to give sharpening a try and use the control panel or in-game settings to get that 'goldilocks' amount; it can make a drastic difference.
 

Mussels

Moderprator
Staff member
Funny you mention it. I regularly play two DLSS 2.0 games, Control and CP2077 (slowly making my way through both, dad life...), and my finding is that DLSS reduces or even virtually eliminates shimmering, which is at the top of my list of negative visual artefacts, as shimmering makes the whole image appear unstable in motion; that makes DLSS even more appealing. In which games would you say shimmering is worsened by using DLSS?

I definitely agree here; on a per-game basis I have tweaked sharpening to offset image softness in every game I play. And this solution is 100% not confined to DLSS: virtually any TAA game stands to benefit, as do many others. I'd highly encourage anyone, DLSS or not, to give sharpening a try and use the control panel or in-game settings to get that 'goldilocks' amount; it can make a drastic difference.

Different shimmer.

The shimmering around, say, a fence would go away, but some other textures would go weird in exchange. Someone had a GIF of it in the other thread; it looked like the sort of thing a patch or driver would fix.
 
Joined
Nov 11, 2016
Messages
1,903 (1.05/day)
I definitely agree here; on a per-game basis I have tweaked sharpening to offset image softness in every game I play. And this solution is 100% not confined to DLSS: virtually any TAA game stands to benefit, as do many others. I'd highly encourage anyone, DLSS or not, to give sharpening a try and use the control panel or in-game settings to get that 'goldilocks' amount; it can make a drastic difference.

Yeah, with a 48in 4K TV hanging <1m from my face, every game needs at least 0.25 Image Sharpening before it can look good; with DLSS Balanced I just increase the sharpening to 0.5 and it's perfect. Nvidia Image Sharpening is such a versatile tool that not many people know of.
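Since sharpening keeps coming up: conceptually, filters like this boost each pixel's difference from its local average (the classic unsharp mask). The little Python sketch below is just an illustration of that idea, not NVIDIA's actual algorithm:

```python
def unsharp_mask(img, amount=0.5):
    # img: 2D list of floats in [0, 1]. Sketch of what a sharpening
    # filter does: push each pixel away from its 3x3 local average.
    h, w = len(img), len(img[0])

    def px(y, x):  # clamp-to-edge sampling at the borders
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            blur = sum(px(y + dy, x + dx)
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
            # boost the detail (pixel minus blur), then clamp to [0, 1]
            row.append(min(1.0, max(0.0, px(y, x) + amount * (px(y, x) - blur))))
        out.append(row)
    return out
```

On a soft edge (0.2 next to 0.8) this pushes the values apart to roughly 0.1 and 0.9, which increases perceived edge contrast, and is also why over-sharpening creates visible halos.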
 
Joined
Dec 16, 2012
Messages
462 (0.14/day)
Processor AMD Ryzen R7 5800x
Motherboard B550i Aorus Pro AX
Cooling Custom Cooling
Memory 32Gb Patriot Viper 3600 RGB
Video Card(s) MSI RTX 3080 Ventus Trio OC
Storage Samsung 960 EVO
Display(s) Specterpro 34uw100
Case SSUPD Meshlicious
Power Supply Cooler Master V750 Gold SFX
VR HMD Rift S
Software Windows 10 64bit
The DLSS implementation is different from game to game. I'm currently using a 3440 x 1440 100Hz monitor and DLSS is really a must to get decent frame rates on heavy games.

Cyberpunk 2077 - With the Digital Foundry settings I get around 45-55 fps in the city using DLSS Quality. For some reason, DLSS Quality looks better than native when standing still or moving slowly.

Avengers - DLSS Quality is better than native when standing still. Since this is a fast-paced game, there is a barely noticeable blur/smear/aberration on the edges of moving things.

Outriders - DLSS is better than native. Everything is vibrant and much more detailed. Blur is not as noticeable.



For some reason games today have hidden TAA settings. Cyberpunk has it, and if you remove it, it breaks the game. I just changed the negative LOD setting in Nvidia Inspector for some games to make them sharper.

Here are some anecdotal observations of mine (not thoroughly tested): DLSS consumes more power.

I'm already capable of playing Outriders at max settings with the 95 fps cap I had. I turn on DLSS Quality (because it's better than native) and GPU board power increases by 10 watts. Since you are rendering at a lower resolution, CPU power would also increase. Just my experience, and I'm not sure if you all noticed this as well.
 

wolf

Performance Enthusiast
different shimmer

the shimmering around say - a fence would go away, but some other textures would go weird in exchange. someone had a GIF of it in the other thread, looked the sort of thing a patch or driver would fix.
Very interesting, would you be able to dig it up? I haven't noticed that at all, but I'd be keen to check it out. Hopefully it is the sort of thing that can be patched.
Nvidia Image Sharpening is such a versatile tool that not too many people know of.
Yeah, I do wonder how many people use it, in virtually all circumstances I've found it to boost IQ, to my eyes, when tweaked on a per-game basis.
 

Mussels

Moderprator
Staff member
Very interesting, would you be able to dig it up? I haven't noticed that at all, but I'd be keen to check it out. Hopefully it is the sort of thing that can be patched.

Yeah, I do wonder how many people use it, in virtually all circumstances I've found it to boost IQ, to my eyes, when tweaked on a per-game basis.
Can't find it; it was in one of the many 2077 threads here on TPU, but not the 'OFFICIAL' thread.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
3,357 (1.45/day)
Location
Western Canada
System Name Austere Box R1.4
Processor 5900X
Motherboard ROG X570 Impact (3601)
Cooling NH-C14S
Memory 2x16GB 3800 14-15-15 1.52V
Video Card(s) RTX 2060 Super FE 0.987V
Case Cerberus
I only play two games that have it, both DLSS 2.0:

War Thunder - I haven't tested it with tanks yet, but in Air RB DLSS fucking sucks. Everything in sight is blurry as hell even on the highest quality setting possible when flying at >1000ft, and DLSS has a terrible time dealing with the distant texture of open water. The performance improvement is tangible, sure (+30ish fps @ 1440p), but the quality is so horrible I haven't used it since testing it for about a day. There are a lot of good, high quality historical skins for the F-4C and F-4E and they straight up look like ass. Especially this livery that I love: WT Live // Camouflage by Danny74_ (warthunder.com). But then again, given Gaijin, it really wasn't a surprise - they're the type to burn water.

MW 2019 - added just yesterday I think? I don't play Warzone but it works in the rest of MW2019 too. The implementation is pretty damn good. Compared to 1440p near-maxed settings, it's a ~25-30fps uplift all the time with no noticeable loss in quality aside from some minor shimmering on some textures (usually weapon skin specific, the deag skin with the union jack on the side). No artifacts, no extra stuttering. Frankly I think it's better on image quality than DLSS off, because the AA modes in MW2019 are kinda jank - you get noticeable jaggies if you turn Filmic off, but if you use Filmic it really degrades the sharpness and visibility of certain parts of the image even if it looks cinematically "good". So DLSS definitely offers a good middle-ground AA mode, and honestly I think it's the way to go.

But performance might be influencing what I think is "acceptable" as well, War Thunder is an easy 120fps @ 1440p locked all the time without DLSS at near-max settings. MW2019 on the other hand hovers in the 90-110fps range @ 1440p without DLSS, but DLSS takes that up to a constant 120fps. But on pure performance gain alone, DLSS is a great thing.

I'm not sure if DLSS needs time to "settle in"? In any case, MW works a bit better today than on day one: smoother, with none of the occasional slowdowns where DLSS previously seemed uncertain of itself.
 
Joined
Nov 11, 2016
Messages
1,903 (1.05/day)
Here are some anecdotal observations of mine (not thoroughly tested): DLSS consumes more power.

I'm already capable of playing Outriders at max settings with the 95 fps cap I had. I turn on DLSS Quality (because it's better than native) and GPU board power increases by 10 watts. Since you are rendering at a lower resolution, CPU power would also increase. Just my experience, and I'm not sure if you all noticed this as well.

In NVCP, under Power Management Mode, did you set it to "Prefer Maximum Performance" in the Global tab? That mode will keep the clocks as high as possible even when you have a max FPS cap, leading to higher power consumption. Usually I only set Prefer Maximum Performance in profiles for online competitive games, in order to reduce input latency.

You can add an Outriders profile and set it to "Optimal Power"; the clock speeds will then vary as needed to keep the 95 FPS cap, which can drastically reduce power consumption. DLSS Balanced mode can reduce power consumption further if you can't distinguish any image quality degradation associated with it.
 
Joined
Dec 16, 2012
Messages
462 (0.14/day)
NVCP is at default except for G-Sync and V-Sync. It's just 10 watts and nothing to cry about; I think it was from 170 to 180 watts. The card is also deshrouded, so it can't be the fans. (Running at 0.818 V at 1830 MHz.)

I'll try to test it out over the weekend and check if there really is a difference in power consumption.
 

Mussels

Moderprator
Staff member
DLSS lowers the GPU load, which can allow higher FPS and increase CPU load.

It's going to mess with the wattage the system uses; it could be up or down depending on settings (especially V-Sync/FPS caps).
 

wolf

Performance Enthusiast
Indeed in every single case where I've used DLSS, GPU power consumption as reported by MSI AB has either been equal or lower, so if the system is pulling 10w more from the wall using DLSS, I'd wager it's shifting some load elsewhere.
 
Joined
Mar 28, 2020
Messages
990 (1.72/day)
Based on my experience, you should never run RT without DLSS enabled. While I am not using the latest and greatest card from Nvidia, I don't expect even a top-end card like the RTX 3090/3080 to avoid sub-100 FPS with RT enabled and DLSS disabled. I can stomach some blurriness, as long as it's not that bad. Generally, as long as I don't run DLSS 2.0 in Performance mode, it will look fine until you do a side-by-side comparison with trained eyes. You gain some but lose some in this case, i.e. you lose the jagged edges but introduce some minimal level of visual downgrade.
 
Joined
Feb 3, 2017
Messages
3,169 (1.84/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) Geforce RTX 3070 FE
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
DLSS 2.0 generally reduces the visual quality of the game.

I still enable it in low-FPS scenarios - like with RT enabled - because the resulting performance boost is just too big to ignore.
Quality mode only; the rest have too visible an impact on image quality.

Adoption seems to be well on its way for a wide selection of games already. DLSS now being included in Unreal Engine as well as Unity will go a long way towards that goal.

Here's some anecdotal observations of mine. (did not thoroughly test) DLSS consumes more power.
The only case where that is true is when you are running with an FPS cap that does not fully utilize the GPU. Lower native resolution means lower load on the GPU (the added load from DLSS is relatively small), and less load means less power.
 
Joined
Mar 28, 2020
Messages
990 (1.72/day)
Indeed in every single case where I've used DLSS, GPU power consumption as reported by MSI AB has either been equal or lower, so if the system is pulling 10w more from the wall using DLSS, I'd wager it's shifting some load elsewhere.
As the load gets taken off the GPU, it naturally shifts towards the CPU. The GPU works less because it is pumping out far fewer pixels, i.e. 1440p instead of 2160p if you are using a 4K monitor, but it quickly becomes bottlenecked by the CPU because of the higher CPU utilization of Nvidia's driver. This is why the gains from enabling DLSS at lower resolutions hit a limit pretty quickly. I suspect this is especially so if RT is not enabled.
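That load shift can be illustrated with a toy frame-time model (all numbers below are hypothetical): assume GPU frame time scales linearly with the fraction of native pixels rendered while CPU frame time stays fixed, and the frame rate is limited by whichever side is slower.

```python
def fps(cpu_ms, gpu_ms_native, pixel_scale):
    # Toy model: GPU frame time scales linearly with pixels rendered;
    # CPU frame time is unaffected by DLSS. The slower side sets the fps.
    gpu_ms = gpu_ms_native * pixel_scale
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical 4K numbers: 8 ms of CPU work, 16 ms of GPU work per frame.
print(fps(8.0, 16.0, 1.0))         # native: GPU-bound at 62.5 fps
print(fps(8.0, 16.0, (2/3) ** 2))  # Quality (~44% of pixels): CPU-bound at 125.0 fps
print(fps(8.0, 16.0, 0.25))        # Performance: still 125.0 fps - the CPU is the wall
```

Once the render resolution drops far enough that the CPU becomes the limit, cheaper DLSS modes buy no extra frames, which matches the "gains hit a limit pretty quickly" observation.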
 
Joined
Mar 26, 2012
Messages
99 (0.03/day)
System Name Mixed Bag of OC
Processor AMD Ryzen 5900X
Motherboard MSI B550 Tomahawk
Cooling CPU+GPU on Water with 3 X 420 Rad´s
Memory 32GB Patriot Viper RGB @ 3800 Mhz CL16
Video Card(s) AMD RX 6900 XTXH flashed with AMD LC Bios
Storage 2X 1TB PCIe 3 NVME @ Raid0 + 2X 256GB Sata SSD @ Raid0
Display(s) Sony KD-55AG8 OLED
Case Selfmade Huuuuuge *Case* :)
Audio Device(s) ifi Zen DAC + Monoprice M1060C + Burmester Replica AMP + Selfmade Huuuuuge Speakers :)
Power Supply EVGA SuperNOVA 1300 G2
Mouse Gameball Trackball (main device) + Razer Lancehead Tournament Edition (for FPS)
Keyboard Sharkoon PureWriter RGB, Kailh Blue switches
VR HMD None
Software Windows 11
Benchmark Scores do not matter, my PC is fast :)
I just want to say the following about DLSS 2.0: in Watchdogs 2, I can say it boosts performance like it should and looks better than resolution scaling alone, BUT on a 55" 4K OLED TV it looks like a 2.5-3K image in Quality mode when compared to native 4K.
 
Joined
Mar 28, 2020
Messages
990 (1.72/day)
I just want to say the following about DLSS 2.0: in Watchdogs 2, I can say it boosts performance like it should and looks better than resolution scaling alone, BUT on a 55" 4K OLED TV it looks like a 2.5-3K image in Quality mode when compared to native 4K.
I feel there is no way to mask the reduction in resolution when you use such a huge display and the image is stretched. For conventional monitor users, i.e. up to 32 inches, the issue may not be as pronounced because the PPI is dense enough to mask the problem. And if you think it looks like 2.5K, that is exactly the resolution Quality-mode DLSS 2.0 should be upscaling from on a 4K display.
 
Joined
Dec 24, 2012
Messages
87 (0.03/day)
DLSS 2 saved the day for me with Cyberpunk 2077 @ 4K.

BUT... only if you take the time to experiment with it to arrive at the best IQ. For me that involves DLSS Quality + image sharpening 0.8 + 16x AF. The resulting IQ was indistinguishable from native 4K, but with a massive 30-40% performance jump. Very impressed.
 
Joined
Dec 16, 2012
Messages
462 (0.14/day)
Okay, I did the test; I had it backwards. DLSS saves around 20 watts of GPU board power. I don't have a means of checking it from the plug, though.

DLSS off: [screenshot: dlss off.PNG]

DLSS on: [screenshot: dlss on.PNG]

tabascosauz

Okay, I did the test; I had it backwards. DLSS saves around 20 watts of GPU board power. I don't have a means of checking it from the plug, though.

DLSS off

DLSS on

My UPS has a data cable going to my computer so I can read a rough output wattage at all times in HWInfo. It's obviously not 100% precise and is subject to PSU efficiency, but it's good enough to corroborate the GPU Power and GPU Total Board Input Power readings. In MW2019 specifically, capped at 120fps, I save anywhere between 10-30W of power with DLSS on. About 380-390W total system power draw with DLSS off, about 360-380W total system power draw with DLSS on. With 120fps capped DLSS my 2060S still runs at pretty high (70-95%) utilization so it doesn't affect temps more than 2C.

Obviously it's a best-case scenario, as I was unable to hit even 110fps most of the time with DLSS off in that game, while DLSS on holds a constant 120fps at all times. Also, MW2019 pretty much always maxes out two CPU cores even when it's not CPU-bound, so there is no difference in CPU power draw between uncapped, 60fps capped, or 120fps capped.
 
Joined
Mar 20, 2019
Messages
366 (0.39/day)
System Name Hermes
Processor 9600k
Motherboard MSI Z390I Gaming EDGE AC
Cooling Scythe Mugen 5
Memory G.Skill Ripjaws V 3600MHz CL16
Video Card(s) MSI 3080 Ventus OC
Storage 2x Intel 660p 1TB
Display(s) Acer CG437KP
Case Streacom BC1 mini
Audio Device(s) Topping MX3
Power Supply Corsair RM750
Mouse R.A.T. 6
Keyboard HAVIT KB487L / AKKO 3098 / Logitech G19
VR HMD HTC Vive
Benchmark Scores Shameful
I have a 3080 and I have the most experience, as far as RTX and DLSS, in Cyberpunk 2077. I bought the card mostly because of DLSS and, to a lesser extent, general performance improvement over the 1080ti. So, here goes:

As for image quality, I can see occasional artifacts such as ghosting on object edges in high-contrast situations, like a brightly colored car moving in front of a dark wall or around other fast-moving objects (not the pixel response's fault). I can see them only because I saw some static a-b comparisons in reviews. I can also see situations where DLSS actually improves fine detail on distant objects, but again, only because I know what to look for and only when I actively look for it. None of the artifacts are visible to me when playing the game.

I personally don't care for ray tracing at all - yes, some areas of the game look slightly nicer, but even though my playing style is mostly of a relaxed sightseeing type, I consider the minuscule improvement in image quality not worth the performance hit.

To be frank, even the 3080 can't run this game in native 3840x2160, so DLSS is a godsend. I tried playing CP2077 on the 1080ti, but the rasterized scaling from 2560x1440 looked awful to the point of actually being distracting, DLSS is a night and day improvement in quality. So yes, I hope this feature will be available in more games.
In short, I don't care for RTX and consider DLSS an infinitely more important feature. Also, it's technologically much more impressive if you look at it.
 
Joined
Dec 16, 2012
Messages
462 (0.14/day)
DLSS would work wonders in VR. I haven't tested a game that has it yet, though. The only thing I'm worried about with it in VR is whether the ghosting on edges is more visible.
 