
NVIDIA RTX Ada owners only - your opinion on DLSS 3.0 Frame Generation

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,863 (1.33/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30 tuned
Video Card(s) Palit Gamerock RTX 5080 oc
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12s
Benchmark Scores 4k120 OLED Gsync bliss
Much the same as THIS post, this is STRICTLY ADDRESSED TO NVIDIA RTX ADA OWNERS AND THEIR PERSONAL EXPERIENCES WITH DLSS 3.0 Frame Generation. No, I'm not interested in non-owner opinions formed from watching YouTube videos, written articles, nitpicked stills, regurgitated reviewer takes, how many games it features in, etc.

In the spirit of the post I've mentioned and linked, I'll be reporting posts that don't follow the request of this thread. Don't like it? Start your own thread.

This is just about image quality, performance, and input lag/controls latency feel from people who have actually played extensively with it enabled. I've seen my fair share of comments across the web from people who don't own an Ada RTX card, have never actually played a DLSS FG game or seen it with their own eyes, and have formed their opinion from some of the things I mentioned above. If that sounds like you, I don't want to hear from you.

Feel free to be as casually subjective about the experience, or as technically deep with testing numbers, as you choose; I'd love to hear anything you've got to say about it.
 
I've played Spider-Man Remastered for 20 hours now with DLAA + FG and can't tell the input lag difference; same in A Plague Tale: Requiem.

Reviewers now default their testing to FPS only, which isn't the correct way to test FG/DLSS/XeSS/FSR, because we need normalized image quality before comparing FPS. What if we instead normalize FPS and test which higher settings can be enabled with FG, and how much those higher settings improve image quality? The answer is: quite a lot. HardOCP reviewed GPUs this way for a long time before its founder moved to Intel.

So I aim for a normalized 120 FPS with FG enabled. That lets me use DLAA, which offers much higher visual fidelity than native and, more importantly, full-resolution RT (whereas DLSS/XeSS/FSR reduce RT reflection resolution).
Spider-Man Remastered, 4K, RT maxed: DLAA+FG vs DLSS Q vs native TAA

So yeah, with FG I can enable the highest visual quality the game offers, and then some, and still get a fluid gaming experience. As for the artifacts: very hard to notice at 120 FPS.

The current flaws of FG are broken V-Sync and additional input delay, which may restrict its use in some games, but for slow single-player RPGs, FG is a very good option to boost visual quality while keeping a playable framerate (and input latency).
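For intuition on why FG adds input delay: frame generation interpolates between two rendered frames, so it has to hold back a rendered frame before presenting, which costs roughly one base frame-time of latency. A back-of-the-envelope sketch (my own simplified model and function names, not NVIDIA's published pipeline):

```python
def fg_estimate(base_fps: float) -> dict:
    """Rough model: FG doubles presented frames but holds back one
    rendered frame for interpolation, adding about one base
    frame-time of input delay. Assumed model, not measured data."""
    base_frame_ms = 1000.0 / base_fps
    return {
        "presented_fps": base_fps * 2,    # one generated frame per rendered frame
        "added_delay_ms": base_frame_ms,  # ~one held-back frame
    }

# At a 60 FPS base, FG presents ~120 FPS but adds ~16.7 ms of delay,
# which is why it feels fine in slow RPGs and worse at low base framerates.
print(fg_estimate(60))
```

Note how the model predicts the experience reported above: the higher your base framerate, the smaller the added delay, so FG on top of an already-decent framerate is hard to feel.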
 
Cheers @nguyen, I've read there's a workaround for the FPS cap/V-Sync issue: limit FPS in NVCP to just under half your refresh rate, so 57-59, then enable FG, and you should get a 114-118 FPS presentation on your LG OLED. Have you tried it?
 
Wow, awesome, it works: capping FPS at 59 keeps the in-game FPS at 118 in Spider-Man. Thanks for the info.
 
"normalize" FSR or DLSS is just max quality settings. I cannot tell it apart unless I'm not moving in game and looking for it. Free 25% FPS boost.
 
"normalize" FSR or DLSS is just max quality settings. I cannot tell it apart unless I'm not moving in game and looking for it. Free 25% FPS boost.

When you exceed 120 FPS (or even 100 FPS), another 25% FPS is just a number; you can't tell without an FPS counter.

Try DLAA vs DLSS Q in Spider-Man; you can see the visual difference even when moving.
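Part of why the DLAA vs DLSS Quality difference is visible: DLSS Quality renders internally at a reduced resolution and upscales, while DLAA renders at full native resolution, so RT and texture detail feed in at full res. A sketch using the commonly reported per-axis scale factors (my own function; treat the Balanced factor especially as approximate):

```python
# Commonly reported per-axis render scale factors; DLAA is native res.
DLSS_SCALE = {
    "DLAA": 1.0,
    "Quality": 1 / 1.5,    # ~67% per axis
    "Balanced": 0.58,      # approximate
    "Performance": 0.5,
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal render resolution before the upscaler runs."""
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Quality"))  # 1440p internally at 4K output
print(internal_res(3840, 2160, "DLAA"))     # full 4K: nothing is upscaled
```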
 
I have a 165 Hz G-Sync monitor, so I try to get close to that. No point in surpassing it, because the extra frames won't show.
 
Well, I have quite a story with DLSS 3:
I upgraded from a VA panel (AORUS FV43U) to an OLED (LG C2), and now it completely sucks.
Basically, on the VA it was flawless: twice the FPS for free, yay! All smooth, no visible artifacts, no laggy input feel.
But after swapping to the OLED it plainly sucks big time: micro-stuttering, tearing, and artifacts, all fairly noticeable despite a 100+ framerate. Disabling DLSS Frame Generation fixes everything back to normal.
It's a night-and-day difference, and nothing changed except the display. I've been playing A Plague Tale, in case that matters.

Bottom line: your mileage may vary.
 