
NVIDIA DLSS 3.5 Ray Reconstruction

People are so used to fake image techniques and inaccurate lighting/shadows, etc., that when they see something more accurate it throws them off.
It's still not that accurate, just a better "version" of what Nvidia's algorithm determines.
I get it though; we've been conditioned by 30+ years of developers faking lighting/reflections, etc., so something that is more accurate doesn't always look better.
And that's because it's computationally expensive to get real reflections, shadows, et al., depending on the environment, and near impossible for them to be "real life" accurate in games. Think Andy Serkis/LOTR, but probably at least 1000x more complicated.
Also, this is a free extra toggle for anyone with an RTX GPU that in most scenarios seems to improve both image quality and performance.
Only for 4xxx owners, and that's a really, really small minority! Of course AMD would probably do the same if they were in JHH's canvas shoes. I guess we'll have to wait for something similar (FG?) from FSR 3.5, or wait till Nvidia decides to grace previous-gen RTX owners with their largesse :shadedshu:
 
Only for 4xxx owners, and that's a really, really small minority! Of course AMD would probably do the same if they were in JHH's canvas shoes. I guess we'll have to wait for something similar (FG?) from FSR 3.5, or wait till Nvidia decides to grace previous-gen RTX owners with their largesse :shadedshu:

Yeah, definitely only currently usable on 40 series for anything more than screenshots, but at least they are working on it for non-path-traced games. It still seems like it's something the developer is going to have to implement, though, so game support may be limited. I'm really digging it so far over not using it, but it'll likely be a generation or two before it starts to really matter.

Hopefully AMD comes up with something by then, but with them seemingly ditching the high end again, maybe not.
 
Apart from DLSS reflections, which improved a lot with the tech enabled, the overall picture quality to me seems more like:

- better than native
- blurrier than native
 
It's nice that they managed to make it work on all RTX cards, instead of just limiting it to RTX 40X0, as with frame generation.

"Oh, you have RTX 3090 Ti? But it does not have the discombobulator, even the RTX 4050 has the discombobulator, you can't use that old product..."

But the end result is kind of the same. An RTX 3080 gets less than 30 FPS in Cyberpunk 2077 at 1080p native with path tracing. Sure, you can almost double that by cranking up the DLSS and lowering the graphics settings down from Ultra, but then we're upscaling to 1080p, and we're still not really at comfortably playable levels. With a card that's one bar below the highest end in the last generation.
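To put "upscaling to 1080p" in numbers: DLSS renders internally at a fraction of the output resolution before upscaling. A minimal Python sketch using the commonly cited per-axis scale factors for the standard quality modes (exact fractions can vary per title, so treat the values as approximate):

```python
# Per-axis render scales for the standard DLSS quality modes
# (commonly cited values; exact fractions can vary per title).
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    """Approximate resolution DLSS renders at internally
    before upscaling to the output resolution."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

# "Upscaling to 1080p" in Performance mode means rendering at roughly 960x540:
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So "1080p with DLSS Performance" is really a ~540p internal render, which is why the FPS nearly doubles.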
 
Apart from DLSS reflections, which improved a lot with the tech enabled, the overall picture quality to me seems more like:

- better than native
- blurrier than native

Well, if you watch both Hardware Unboxed's and Gamers Nexus's deeper dives, they nail down why that is. There are a lot of elements of the RR image that are better, but there are also ones that are not any better, and, further still, notably some that are worse. So the end result is a new compromise, where most things look better, but not everything. Which is basically DLSS in a hand basket at this point. You might not notice what the new compromises are till you directly compare. And even then... you might not care.

Case in point, the new shiny reflections are eye-catching... so much so you might miss that texturing in other spots of the same image has blurred and lost detail.
 
Poor 2080ti and 3070 getting 2 fps. They are within margin of error from 0 fps.

Well, it shouldn't be that slow; normally a 2080 Ti isn't 20% of a 4090 in native, but 40%. That's performing like a 980 Ti.
So there must be some setting tanking the performance even at 1080p. I don't get it.
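That sanity check is simple arithmetic. A sketch with illustrative numbers only (the ~40% relative-performance figure is from the post above; the 36 FPS reference value is a hypothetical stand-in, not a measured result):

```python
def expected_fps(reference_fps, relative_perf):
    """FPS you would expect from a card that normally runs at a given
    fraction of a reference card's performance."""
    return reference_fps * relative_perf

# Illustrative: if a 4090 managed 36 FPS in a path-traced scene, a
# 2080 Ti at its usual ~40% relative performance should land near
# 14 FPS, not at ~20% of the 4090 (let alone ~2 FPS).
print(expected_fps(36, 0.40))  # 14.4
```

When the observed number falls far below that expectation, something other than raw shading throughput (a setting, a VRAM limit, a code path) is the likely culprit.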
 
So far, in actual gameplay, none of them are noticeable, at least not more so than terrible screen-space reflections, or RT reflections/lighting being generally more grainy and taking a few frames to update.

Disclaimer, though: that is with a 4090 running around 100 FPS with frame generation on a 65-inch OLED. If I were sitting 2 feet from a 1440p monitor, it may or may not be more noticeable.

Definitely not a reason to purchase an RTX 4080/4090, but definitely a really nice bonus.
Well, I have never played Cyberpunk. I have played games with RT, and what I see is that, even if the lighting is getting more realistic than ever, there is a lot of work to be done to make it as realistic as they claim.
And the only limitation is the power of the GPUs.
Even the "native" mode needs tricks to be able to work on current-generation cards.
Games will be glorious within 3; I cannot wait for the next Alien Isolation with full RT or PT lighting.
 
Honestly, I am a bit disappointed; I expected more because the demos looked so nice. While many things are better, such as responsiveness, many details within the textures are just neutered, fine detail tends to be lost, and it even introduces more ghosting within textures. Seems like a sidegrade rather than an upgrade.
Though some fine shadowing is improved, which leads to some nicer grounding of objects, which does look better.

The upscaler also looks weaker than DLSS usually is; for whatever reason, when I zoom in, it reminds me of early offline image-upscaling tech, a bit blobby and not that impressive or pleasant. I wonder if it's a problem with the DLSS DLL rather than DLSS just becoming worse.
 
Well, it shouldn't be that slow; normally a 2080 Ti isn't 20% of a 4090 in native, but 40%. That's performing like a 980 Ti.
So there must be some setting tanking the performance even at 1080p. I don't get it.

It's basically the same artificial limitation on what works with the old generation and what doesn't as with DLSS frame generation, only a bit less obvious. With the lackluster performance gains of the Ada generation, this seems to be the new plan for making gamers switch more often.

"Oh, you'd like to upgrade just every two or three generations? We don't think so! Last generation top model gets you just 36 FPS at 1080p!"

And if that means falling sales - who cares about gamers! AI is where it's at, and I'm sure they're getting ready for new cryptowave too...
 
Yeah, but this is brutal; even without FG, the 2080 Ti lost a lot more ground compared to a 3070.
 
Well, I have never played Cyberpunk. I have played games with RT, and what I see is that, even if the lighting is getting more realistic than ever, there is a lot of work to be done to make it as realistic as they claim.
And the only limitation is the power of the GPUs.
Even the "native" mode needs tricks to be able to work on current-generation cards.
Games will be glorious within 3; I cannot wait for the next Alien Isolation with full RT or PT lighting.

For sure, the hardware has to come a long way before developers actually start making games with PT in mind and are not limited to 2 bounces; we will likely need to see something 10x-15x a 4090 before that day happens. Hopefully I am still alive and young enough to enjoy it, lol.
 
So vector adaptive multiframe RT, and it doesn't need all the extra hardware.

Interesting.
 
So the conclusion is… no need for a 4090… you just need to get the 4060 Ti, turn on DLSS 3.5, and it looks the same and has the same frame rate as the 4060 Ti (damn the measurements).

Thank you, Nvidia, for saving me 1800 dollars… because it is the same, right? /sarcasm…

(When has Nvidia been known to save people's money?)
 
In the Image Quality Comparison, Ray Reconstruction specifically destroyed the marble grain texture in the 3rd and 5th pictures, yet in the 1st picture RR defines the concrete floor texture better than native!
 
In the Image Quality Comparison, Ray Reconstruction specifically destroyed the marble grain texture in the 3rd and 5th pictures, yet in the 1st picture RR defines the concrete floor texture better than native!

Yeah, he missed a lot of stuff like that. Which is fair, as the elements that RR improves are going to draw your eye more than the things that are wrong. But that's the point of AI methods. DLSS does the same thing: the vast majority of the eye-catching elements have been refined to look good enough that you're less likely to notice the points it got wrong.

The issue I would take is that nVidia's marketing keeps throwing out there that DLSS is better than native. Yet, as soon as you apply a critical analysis, that statement comes out as a lie. And nVidia knows it; there is no way nVidia's engineers don't see the faults, as they often gradually improve the elements that aren't working right. So the marketing teams just start handing out the Kool-Aid and let the fanboys do the rest.
 
Another in-depth comparison.

 
Another in-depth comparison.

Credit to him for highlighting the reality. I also reeeally like him pointing out that it will likely improve, just like DLSS 2/Super Resolution has with time, with a lot of the mistakes RR makes feeling an awful lot like those in early-generation DLSS 2 itself.
 
Credit to him for highlighting the reality. I also reeeally like him pointing out that it will likely improve, just like DLSS 2/Super Resolution has with time, with a lot of the mistakes RR makes feeling an awful lot like those in early-generation DLSS 2 itself.

I think all of the media outlets did a pretty good job, but I still feel people need to actually use it to make a final decision; I'm personally enjoying it quite a bit. I hope they improve it and get it working well with standard RT so it is more usable on lesser GPUs than a 4090.

It's not perfect, but the overall image is much better, and I definitely won't be disabling it in any game that supports it. Looking forward to Alan Wake 2's implementation.
 
Just noticed that for some reason TPU is getting higher FPS numbers compared to Tom's Hardware and HUB, though Tom's is a better match as they are using a 13900K.

And also, both Tom's and HUB show that having Ray Reconstruction on lowers performance.

Checked GN, and they have Ray Reconstruction on giving better performance; they use a 13900K too on a 4090 and show lower performance numbers. Not much, but at the level they are at, it's more than a 5% difference. Though the performance graphs from them aren't a full benchmark, as they said they will be doing a proper benchmark of the update soon.
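A "more than 5%" gap between outlets is easy to quantify. A minimal sketch with hypothetical FPS readings (the numbers are illustrative, not the outlets' actual figures):

```python
def percent_diff(a, b):
    """Gap between two benchmark readings, as a percentage
    of the smaller one."""
    lo, hi = sorted((a, b))
    return (hi - lo) / lo * 100

# Hypothetical readings from two outlets on the same 13900K + 4090 setup:
print(round(percent_diff(60, 63.5), 1))  # 5.8 -> more than a 5% gap
```

Anything beyond typical run-to-run variance (a percent or two) points at a real difference in test scene, settings, or driver rather than noise.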
 
Depends on the complexity of the scene.
And also, both Tom's and HUB show that having Ray Reconstruction on lowers performance.
And no, it doesn't. Just in one or two instances out of how many; it's probably a statistical error, or they swapped the results.
More importantly, they have to show consistency; for example, the 4070 is 1/2 of a 4090, which matches the result here and is equal to a 3080, but there, oddly enough, it's the equal of a 3090 instead, but whatever.
 
Depends on the complexity of the scene.

And no, it doesn't. Just in one or two instances out of how many; it's probably a statistical error, or they swapped the results.
More importantly, they have to show consistency; for example, the 4070 is 1/2 of a 4090, which matches the result here and is equal to a 3080, but there, oddly enough, it's the equal of a 3090 instead, but whatever.

So far, vs 1.6, I have noticed better performance across the board, anywhere from 5-15%; some of that is just the patch in general, some of that is ray reconstruction. CPU usage is much higher, though, 10-15% on average on a 16-core CPU; my guess is that on a 6-8 core it would be 20-30% higher.

The new area is not available yet and the geometry looks like it's more complex so I will have to wait till it releases to judge that portion of the game.
 
It's best to watch videos with movement: some things look way better, some about the same, some worse. It's no different from any other tradeoff.

After watching a few videos, I'd definitely say that ray reconstruction has more benefits than downsides. Much better than static comparisons.

And it's just the first release. It'll probably get a lot better over time, looking at how much DLSS upscaling has improved, even just since version 2.0.
 
And here I am, a lowly peasant with my 3080 Ti, enjoying games without all the gimmicks.
Even if I did have a fancy new 4090 that could run these options... what the hell do you pick, given that Nvidia keeps adding new DLSS options every 6 months?

DLSS
DLAA
DLSS Frame Generation
DLSS Ray Reconstruction
DLSSLGBTQ+
....one, two or all of the above?

Then, let's all be honest. Those of you that do utilize some sort of DL(whatever letters you want to insert here): how many of you stand around gushing over the fact that a reflection in a puddle shows a glowing sign correctly, or stop to look into a window so you can see the reflection of stuff behind you? Or decide to do some stargazing off the reflection of a pond?

Maybe I just don't understand it because I haven't dropped $1000+ on a GPU to experience all the goodness of standing around in a video game just staring at stuff instead of playing the game. The couple of games I've tried DLSS and RT in, I didn't give a rip about any of it. It didn't make the game more fun, and it didn't make me want to take screenshots with the settings vs. without them and then stare at them for hours trying to justify using one over the other. Do people actually do this? Do you not play games anymore, or are you so bored with games that this whole concept is just a new trend for you to spend your time on?
 
Not that bad and not that good. Some drawbacks, but there are improvements as well. You decide if it is worth it or not to spend a lot of money on it. As for me, I think it's too expensive, and in order to have all the RT stuff on, you need a 4090, and you will still have some problems with performance here and there.
DLSS
DLAA
DLSS Frame Generation
DLSS Ray Reconstruction
DLSSLGBTQ+
....one, two or all of the above?
Yep, a lot of them. Next year: new cards, new DLSS-exclusive version. At least that is my bet. We will see how it goes.
 
After watching a few videos, I'd definitely say that ray reconstruction has more benefits than downsides. Much better than static comparisons.

And it's just the first release. It'll probably get a lot better over time, looking at how much DLSS upscaling has improved, even just since version 2.0.
By the time it's better, you're going to need a 5090 to run that iteration.
 