Wednesday, February 3rd 2021

NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

When NVIDIA launched the second iteration of its Deep Learning Super Sampling (DLSS) technique, which uses deep learning to upscale lower-resolution renders, everyone was impressed by the quality of the images it put out. However, have you ever wondered how it all looks from the developer's side of things? Games usually need millions of lines of code, and even small features are not always easy to implement. Today, thanks to Tom Looman, a game developer working with Unreal Engine, we have found out what the integration process for DLSS 2.0 looks like, and how big the performance benefits coming from it are.

In the blog post, you can take a look at the example game shown by the developer. Integration with Unreal Engine 4.26 is easy: you just compile your project against a special UE4 RTX branch and apply for an AppID on NVIDIA's website. Right now you are probably wondering how performance looks. The baseline for the results was the TAA sampling technique used in the game demo. DLSS 2.0 managed to bring anywhere from a 60% to a 180% increase in frame rate, depending on the scene. These are rather impressive numbers, and they go to show just how well NVIDIA has built its DLSS 2.0 technology. For a full overview, please refer to the blog post.
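To give a sense of how little project-side wiring is involved once you are on the RTX branch, enabling the upscaler largely comes down to flipping the plugin on in the project descriptor. The plugin name "DLSS" and the console variable mentioned below are assumptions based on NVIDIA's UE4 DLSS plugin and may differ between branch versions; this is a sketch, not the blog's exact setup:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "4.26",
  "Plugins": [
    { "Name": "DLSS", "Enabled": true }
  ]
}
```

After a rebuild with the AppID in place, the upscaler can then typically be toggled at runtime from the console (e.g. `r.NGX.DLSS.Enable 1`); treat the exact cvar name as an assumption and check the branch's own documentation.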
Source: Tom Looman Blog

74 Comments on NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

#1
ZoneDymo
"everyone was impressed by the quality of the rendering it is putting out."

and those who weren't impressed enough were shunned from future Founders Editions
#2
medi01
Running games at lower resolution might increase your framerates!

News at 11:00!

Oh, and blur? Loss of details? You are not sitting far enough from your screen!
#3
Bwaze
Yeah. Is the word "upscaling" forbidden in an article about DLSS? I imagine it is, since it's rarely used even by the critics.
#4
Vayra86
Make it non proprietary or you can stick it right back up in that scalper's anus.
#5
londiste
medi01: Oh, and blur? Loss of details? You are not sitting far enough from your screen!
Have you tried games with DLSS?
#6
spnidel
rendering things at a lower resolution increases FPS?! wowie! that's a first! I had no idea!
#7
TumbleGeorge
Relax, Nvidia Lovelace may come with more and better RT cores and maybe will handle 1440p without DLSS.
#8
Dredi
Long live 360p gaming!
#9
watzupken
I think there are a lot of missing details here to conclude how well DLSS 2.0 works in this game engine. There are different quality settings in DLSS, so is the 60% derived from the quality or the performance setting? The compromises will be smaller on the quality setting, but quite severe on the performance setting. I look forward to seeing some real numbers rather than random percentages.
medi01: Running games at lower resolution might increase your framerates!

News at 11:00!

Oh, and blur? Loss of details? You are not sitting far enough from your screen!
I feel DLSS 2.0 has proven that it works well upscaling from a lower resolution. But of course, depending on the settings, blurring and flickering objects are likely problems, especially with the performance setting. If DLSS works in every game released, I think Nvidia will have a winner here, since AMD has not introduced an alternative.

The only alternative for AMD is Sapphire Trixx Boost, which Sapphire claims works in every game. It may not yield the same quality as DLSS upscaling, but I feel the benefit is that you can select how low a resolution you want to go via a slider, and, if I take Sapphire's word for it, it works in every game.
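For reference, the DLSS 2.0 presets correspond to fixed per-axis render-scale factors, so the internal resolution behind each mode is easy to work out. A quick sketch; the scale factors are the commonly reported ones, not official NVIDIA figures:

```python
# Per-axis render-scale factors commonly reported for DLSS 2.0 presets
# (illustrative values, not official NVIDIA figures).
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution DLSS renders at internally before upscaling to out_w x out_h."""
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    share = w * h / (3840 * 2160)
    print(f"{mode}: renders {w}x{h} ({share:.0%} of native 4K pixels)")
```

Rendering a quarter to a ninth of the native pixels is where the frame-rate headroom comes from; the upscale pass itself adds a fixed cost, which is part of why the gains vary per scene.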
#10
Vayra86
It's really simple: magic does not happen. It gets precooked on Nvidia's DGX farms, and if they didn't do that for you, you're simply out of luck; in that case the best you get is a small update to what AMD's FidelityFX also has on offer, NON-proprietary: a simple blurry upscale with sharpening overlays.

Let's stop beating around the bush. Without Nvidia's extended support for DLSS it's nothing special at all and never will be, no matter how much misdirecting marketing they pour over it.
#11
TumbleGeorge
Somewhere I read the system requirements for gaming at 1080p @ 60 fps with full ray-tracing effects enabled (if the game supports full ray tracing): at least 32 GB of VRAM (64 GB recommended) and around 60 teraflops of FP32 performance. If that is true, 4K-or-higher gaming with full RT will never happen without fakes like DLSS or FidelityFX or similar hacks. Hardware is too weak now and will not be enough in the future. Never!
#12
medi01
londiste: Have you tried games with DLSS?
Have you seen Titanfall?
Dear god...
TumbleGeorge: "full ray tracing"
Name a single "full ray tracing" effect that you cannot find in this game, running on 8 Jaguar cores with a 7870 slapped on top:

#13
Dredi
TumbleGeorge: Somewhere I read the system requirements for gaming at 1080p @ 60 fps with full ray-tracing effects enabled (if the game supports full ray tracing): at least 32 GB of VRAM (64 GB recommended) and around 60 teraflops of FP32 performance. If that is true, 4K-or-higher gaming with full RT will never happen without fakes like DLSS or FidelityFX or similar hacks. Hardware is too weak now and will not be enough in the future. Never!
Never?!? Surely you are joking and mean only in the next two years or something meaningless in the grand timescale.
#14
W1zzard
Vayra86: It's really simple, magic does not happen, it gets precooked on Nvidia's DGX farms and if they didn't do that for you, you're simply out of luck
That's not how DLSS works since v2.0. It's game agnostic now, doesn't require per-title training
Bwaze: Yeah. Is the word "upscaling" forbidden in an article about DLSS? I imagine it is, since it's rarely used even by the critics.
The forbidden word is literally there in the first sentence?
#15
TumbleGeorge
Dredi: Never?!? Surely you are joking and mean only in the next two years or something meaningless in the grand timescale.
I'm sure lithography processes will soon come to an end for desktop parts. In theory, if AMD or Nvidia has much better GPU architectures hidden up its sleeve like a trump card... maybe they'll be able to accomplish something. Their current architectures are not efficient enough.
#16
KainXS
watzupken: I think there are a lot of missing details here to conclude how well DLSS 2.0 works in this game engine. There are different quality settings in DLSS, so is the 60% derived from the quality or the performance setting? The compromises will be smaller on the quality setting, but quite severe on the performance setting. I look forward to seeing some real numbers rather than random percentages.

I feel DLSS 2.0 has proven that it works well upscaling from a lower resolution. But of course, depending on the settings, blurring and flickering objects are likely problems, especially with the performance setting. If DLSS works in every game released, I think Nvidia will have a winner here, since AMD has not introduced an alternative.

The only alternative for AMD is Sapphire Trixx Boost, which Sapphire claims works in every game. It may not yield the same quality as DLSS upscaling, but I feel the benefit is that you can select how low a resolution you want to go via a slider, and, if I take Sapphire's word for it, it works in every game.
If Nvidia can prebake DLSS (with no per-title AI) now, then they have a winner at this point, but it does say at the bottom to contact Nvidia for your DLL, so maybe, maybe not.

Since 2.0 it's really not as bad as people say it is; people just like to complain. If you don't like it, don't use it; if you do, then use it. Problem solved. 1.0 was pretty bad, though.
#17
Dredi
TumbleGeorge: I'm sure lithography processes will soon come to an end for desktop parts. In theory, if AMD or Nvidia has much better GPU architectures hidden up its sleeve like a trump card... maybe they'll be able to accomplish something. Their current architectures are not efficient enough.
No shit the current ones are not powerful enough, but based on that I would not say they will NEVER be enough. Just compare something from ten or twenty years ago and see how much faster ray tracing is now. Then extrapolate twenty years and you still think it will NEVER be viable? Come on.
#18
Vayra86
W1zzard: That's not how DLSS works since v2.0. It's game agnostic now, doesn't require per-title training
But then the gains are hardly as great, right?
#19
W1zzard
Vayra86: But then the gains are hardly as great, right?
Actually DLSS 2.0 works much better. Surprised you missed that.

What makes a huge difference is that the game now feeds motion vectors to the algorithm, which solves the TAA movement problem
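The point above can be sketched in a few lines: temporal methods like TAA and DLSS 2.0 blend each frame with an accumulated history buffer, and the motion vectors tell the accumulator where each pixel came from in the previous frame. A toy 1-D sketch, illustrative only and nothing like NVIDIA's actual network:

```python
def temporal_accumulate(history, current, motion, alpha=0.1):
    """Toy 1-D temporal accumulation.

    For each pixel, reproject the history buffer along the per-pixel
    motion vector, then blend it with the current frame. Without the
    motion offset, moving objects would blend against stale history
    and smear (the classic TAA ghosting problem)."""
    out = []
    for x, cur in enumerate(current):
        src = x - motion[x]  # where this pixel was last frame
        if 0 <= src < len(history):
            out.append((1 - alpha) * history[src] + alpha * cur)
        else:
            out.append(cur)  # disocclusion: no valid history, start fresh
    return out
```

DLSS 2.0 replaces the fixed blend weight with a learned heuristic, but it consumes the same per-pixel motion vectors the engine already produces for TAA, which is why the integration cost for a UE4 title is low.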
#20
Vayra86
W1zzard: Actually DLSS 2.0 works much better, are you not following the news? What makes a huge difference is that the game now feeds motion vectors to the algorithm, which solves the TAA movement problem
I am, and the results vary wildly depending on how much support Nvidia gave each title.

You still need a per-title implementation, so you still need Nvidia's support going forward, and we haven't got a complete overview of what the performance gain truly is without any DGX pre render support. Nvidia is doing a fine job muddying those waters.

I'm not denying the technology is nice, but this is not quite as fire and forget as it looks.
#21
ViperXTR
TumbleGeorge: Relax, Nvidia Lovelace may come with more and better RT cores and maybe will handle 1440p without DLSS.
RTX 4080 at 1500 USD!!!! (3000+USD scalper price) can't wait!!!!
#22
TumbleGeorge
Dredi: Just compare something from ten or twenty years ago and see how many percent faster raytracing is now
How do you compare something without RT to something with (partial) RT? Before 2018, all lighting effects were predefined, not calculated in real time.
#23
rutra80
watzupken: The only alternative for AMD is Sapphire Trixx Boost, which Sapphire claims works in every game.
There is Radeon Boost now in the drivers - it reduces rendering resolution of dynamic scenes by a given percent when you move, look around etc.

As for DLSS, have you people seen prototypes of lossy video encoders based on AI/deep learning? It's hard to believe what they cook up from a tiny amount of data. These technologies are effective and here to stay.
That doesn't change the fact that on a local PC I want raw power and losslessness, not compressed textures, reduced shaders, upscaled images, a lossily compressed monitor connection, etc. I might as well use streaming instead. I prefer a lossless lower resolution to a baked-up pseudo-high resolution.
#24
Dredi
TumbleGeorge: How do you compare something without RT to something with (partial) RT? Before 2018, all lighting effects were predefined, not calculated in real time.
RT has been used in the film industry for a long, long time already. The performance gain per watt in ten years or so is measured in tens of thousands of percent.
#25
TumbleGeorge
Dredi: RT has been used in the film industry for a long long time already. The performance gain per watt in ten years or so is measured in tens of thousands of percents.
Aren't cinema and movies off-topic to this discussion?