
NVIDIA Launches DLSS 4 Plugin for Unreal Engine 5.6

AleksandarK

News Editor
NVIDIA has released its DLSS 4 plugin for Unreal Engine 5.6, allowing developers to incorporate the company's most advanced upscaling and AI-driven frame generation tools into their projects. With this update, performance improvements introduced in UE 5.6, such as optimized memory handling and refined rendering pipelines, receive the benefits of Transformer-based upscaling, which significantly lowers VRAM usage while generating extra frames to improve motion smoothness. Developers using Unreal Engine 5.6 can now integrate multi-frame generation, ray reconstruction, deep learning anti-aliasing, and super resolution in one convenient package that requires fewer resources than earlier DLSS versions.

The DLSS 4 plugin relies on a Transformer neural network that evaluates the relationships among all pixels in a frame rather than using localized convolutional filters. This technique reduces video memory requirements by approximately 15-20% compared with the prior DLSS 3 SDK, freeing up capacity for higher-resolution textures and more detailed environments. Multi-frame generation predicts intermediate frames based on recent frame history, effectively boosting perceived frame rates without additional GPU draw calls. Ray reconstruction enhances reflections and global illumination by learning from high‑quality offline renders, delivering realistic lighting with minimal performance loss. Early feedback indicates a 30-50% uplift in ray-traced effects quality in GPU-bound scenes, as well as noticeably sharper visuals under dynamic lighting. The plugin supports Unreal Engine versions 5.2 through 5.6; however, only projects running on version 5.6 can access the full suite of improvements.
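As a rough illustration of the numbers above - the 15-20% VRAM reduction and the frame multiplication from multi-frame generation - here is a back-of-envelope Python sketch. The function names and the 4x MFG factor are illustrative assumptions, not part of the SDK:

```python
# Back-of-envelope math for the figures quoted in the article (not measurements).

def transformer_vram(dlss3_vram_gb, savings=0.15):
    """Apply the reported ~15-20% VRAM reduction of the Transformer model."""
    return dlss3_vram_gb * (1 - savings)

def mfg_presented_fps(rendered_fps, generated_per_rendered=3):
    """Multi-frame generation presents extra AI frames per rendered frame."""
    return rendered_fps * (1 + generated_per_rendered)

print(f"{transformer_vram(8.0):.2f} GB")   # an 8 GB workload shrinks to 6.80 GB at 15%
print(mfg_presented_fps(60))               # 60 rendered fps presented as 240 fps
```

Note the asymmetry: the VRAM saving comes from the upscaler's model, while the fps multiplier counts presented frames, not rendered ones - the GPU still draws only the base frames.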



View at TechPowerUp Main Site | Source
 
Meanwhile, the updated Steam FPS counter shows DLSS 3/4 costs you 10%+ of your original real frames to produce the fake frames.
 
I'll be leaving this whole frame generation bullshit for generations after me. Can't stand it, and at high fps, don't need it.
 
In a pure upscaling scenario with no framegen, DLSS 4 Transformer increases VRAM usage compared to DLSS 3 CNN. I have to assume the OP is referring to framegen VRAM consumption (which I cannot test), otherwise the statement there is not in line with my testing (or the results posted elsewhere).
 
This is where AMD is currently faltering, FSR4 is practically MIA by comparison, despite working well.

In a pure upscaling scenario with no framegen, DLSS 4 Transformer increases VRAM usage compared to DLSS 3 CNN. I have to assume the OP is referring to framegen VRAM consumption (which I cannot test), otherwise the statement there is not in line with my testing (or the results posted elsewhere).
Frame gen doesn't reduce VRAM either.

The only way DLSS reduces VRAM is by reducing the native resolution then upscaling. It's a desperate ploy by nvidia to justify their 8GB trash cans.
I'll be leaving this whole frame generation bullshit for generations after me. Can't stand it, and at high fps, don't need it.
There's a lot more to DLSS4 than frame gen. You can turn off frame gen and still benefit from the DLSS4 transformer for better performance.
 
Meanwhile, the updated Steam FPS counter shows DLSS 3/4 costs you 10%+ of your original real frames to produce the fake frames.
Repeat after me: All frames are fake.
 
If everything now is fake frames and AI generation, then why do we need video cards? We'd just need a high-speed network connection to an AI fake-frame machine on the net to generate the frames. Wish some people would quit Nvidia, form a real video card company again, and show them all we don't need any of this fake-frames AI BS.
 
Can't stand it, and at high fps, don't need it.
If you're playing @1080p or 2K, ok. Try running the most demanding games maxed out in 4K, though, and report back. Bloody Oblivion falters even on 5090 with DLSS upscaling.

The fake frames let me play the former, or CP2077 with PT in 4K on my 5070Ti, so yes, I'll take them. There are some trade-offs, but they are vastly overblown in typical internet fashion. Some amount of lag is acceptable (depending on the game too, of course), and the gfx artifacts are mostly visible only to those who look for them.
 
Repeat after me: All frames are fake.
You are not insinuating that a computer game, played with a mouse and keyboard on a piece of glass sitting in front of your eyes is fake, are you? Nono, it's all real. Just the nvidia part is fake, the rest is real.

I'll try to explain it once, but I'm sure it will fly over people's heads anyway.

Using upscaling - which basically means the GPU works hard to render, e.g., 500k pixels and "guesses" the remaining ~5M - leads to much, much better image quality, for the simple fact that the GPU is free to run more demanding graphics instead of spending 100% of its horsepower brute-forcing every pixel natively. It's just common sense, really.
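The pixel counts in that claim are easy to sanity-check. Assuming the commonly cited DLSS per-axis render scales (roughly 2/3 for Quality, 1/2 for Performance, 1/3 for Ultra Performance; exact ratios can vary per title), a quick Python sketch:

```python
# How many pixels the GPU actually renders before upscaling to the target output.
# Per-axis scale factors are the commonly cited DLSS presets (may vary per title).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_pixels(out_w, out_h, mode):
    s = SCALES[mode]
    return round(out_w * s) * round(out_h * s)

target = 3840 * 2160                                       # ~8.3M pixels at 4K
rendered = internal_pixels(3840, 2160, "Ultra Performance")
print(rendered, f"({rendered / target:.1%} of native)")    # 921600 (11.1% of native)
```

So at 4K Ultra Performance the GPU renders roughly a ninth of the output pixels, which is the headroom the post is describing.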

And because I believe a picture is worth a thousand words - this is what my GPU achieves at native:

[Screenshot: native low.JPG]

So 4k native with everything low, 125 fps.

Now activating DLSS

[Screenshot: dlls ultra performance max.JPG]


This isn't even DLSS 4, and we still get more performance and vastly better image quality. If you'd rather play the former, then go play Pac-Man; I don't know what else to say.
 
A new video from TI continues to show that there's no help for UE5's terrible performance:

 
If you're playing @1080p or 2K, ok. Try running the most demanding games maxed out in 4K, though, and report back. Bloody Oblivion falters even on 5090 with DLSS upscaling.

The fake frames let me play the former, or CP2077 with PT in 4K on my 5070Ti, so yes, I'll take them. There are some trade-offs, but they are vastly overblown in typical internet fashion. Some amount of lag is acceptable (depending on the game too, of course), and the gfx artifacts are mostly visible only to those who look for them.
Yeah, to each their own. I don't see a reason to chase the most demanding settings, invented specifically so you're forced to overspend on your GPU to run them WITH a bunch of crutches, and still suffer from an image with issues.

It's pretty strange, though, to present such high graphics demands and then say "the gfx artifacts (which are there!) are visible if you look for them". Pray tell, if you have to have 4K with maxed detail, how are you NOT looking at that max detail and finding its faults? That's one hell of a special kind of cognitive dissonance if you ask me ;) Image stability always matters. Simple as that.

You could also just not care at all and save a shitload of money and not get tied to proprietary/game specific featuresets. After all, the extra detail is only there if you look for it, right?

So no... sorry, your story doesn't check out. You just want the maximum achievable thing and are invested, so you gloss over the problems it has. That's fine. But call it what it is, instead of downplaying said issues. The bottom line is NOT an overblown internet response. The bottom line is people venturing into ridiculousness just to use some PT and defending it as the next best thing ;) It's even better when they say this is "progress" in gaming... while the actual game they're playing is old news and lacking in many ways. It's a graphics posterboy before anything else. That is one damn shallow view of what gaming is, imho.

CP2077... I've been hearing that example for half a decade now. I guess it's the only thing that exists that truly gives back some value on a way-overpriced GPU... It's a pretty sad affair when it comes to actually valuable RT content. It's rare. It doesn't proliferate much. And it never will, because this whole thing was invented to draw people like you into a silly purchase mode, tied to proprietary features. It's an Nvidia marketing story you've drowned yourself in.

And a last issue with this technology is how it enables even shittier engines / base performance on top of what became possible with just an upscale. Underneath that... you get ever less hardware for your money, so your GPU is realistically just getting more and more dependent on these crutches. Better watch what you wish for.
 
The fake frames let me play the former, or CP2077 with PT in 4K on my 5070Ti, so yes, I'll take them. There are some trade-offs, but they are vastly overblown in typical internet fashion. Some amount of lag is acceptable (depending on the game too, of course), and the gfx artifacts are mostly visible only to those who look for them.

Exactly. The FG haters are usually people who have zero practical experience with FG but simply ASS-ume stuff because they have superficially heard how FG increases lag. They conveniently ignore the fact that there are technologies like nVidia Reflex/Reflex 2 (+ Boost) that mitigate the latency increase to the point of irrelevance.

It is also very rare, in practice, to see actual artifacts in a game. It will depend on the individual game and some games may be more prone to it due to certain visual in-game features but it is generally a non-issue.

I have had an RTX 4090 until recently and now an RTX 5090 so I rarely have a need for (M)FG but every time I have used it so far it has been a very smooth experience with no noticeable issues of any kind. If possible, I keep it off, but I have zero qualms about enabling it if I need/want more fps in a certain title.
 
Exactly. The FG haters are usually people who have zero practical experience with FG but simply ASS-ume stuff because they have superficially heard how FG increases lag. They conveniently ignore the fact that there are technologies like nVidia Reflex/Reflex 2 (+ Boost) that mitigate the latency increase to the point of irrelevance.

It is also very rare, in practice, to see actual artifacts in a game. It will depend on the individual game and some games may be more prone to it due to certain visual in-game features but it is generally a non-issue.

I have had an RTX 4090 until recently and now an RTX 5090 so I rarely have a need for (M)FG but every time I have used it so far it has been a very smooth experience with no noticeable issues of any kind. If possible, I keep it off, but I have zero qualms about enabling it if I need/want more fps in a certain title.
For Doom: The Dark Ages, with MFG x2 enabled I am getting 120-180 FPS in 4K at all highest settings, and it's beautiful. I realize all these GPU hardware and financial geniuses claim Nvidia is forcing them to spend their money, but for me, my Aorus 5080 Master Ice is badass and fits in my budget. I guess I ignored the crybaby "experts". But I hope they enjoy FSR or XeSS, LOL I guess.
 
You are not insinuating that a computer game, played with a mouse and keyboard on a piece of glass sitting in front of your eyes is fake, are you? Nono, it's all real. Just the nvidia part is fake, the rest is real.

I'll try to explain it once, but I'm sure it will fly over people's heads anyway.

Using upscaling - which basically means the GPU works hard to render, e.g., 500k pixels and "guesses" the remaining ~5M - leads to much, much better image quality, for the simple fact that the GPU is free to run more demanding graphics instead of spending 100% of its horsepower brute-forcing every pixel natively. It's just common sense, really.

And because I believe a picture is worth a thousand words - this is what my GPU achieves at native:


So 4k native with everything low, 125 fps.

Now activating DLSS



This isn't even DLSS 4, and we still get more performance and vastly better image quality. If you'd rather play the former, then go play Pac-Man; I don't know what else to say.

If you're going to compare, the least you could do is turn off depth of field, because anything close to the camera is blurred to make you focus on the player instead of the landscape in the native one.

The main difference in those pictures is that one has proper depth-of-field rendering and blurring while the other (the DLSS one, on the bottom) does not.

Second, the frame time isn't correct on either one. :/
 
If you're going to compare, the least you could do is turn off depth of field, because anything close to the camera is blurred to make you focus on the player instead of the landscape in the native one.

The main difference in those pictures is that one has proper depth-of-field rendering and blurring while the other (the DLSS one, on the bottom) does not.

Second, the frame time isn't correct on either one. :/
I'm just using the graphics presets.

What do you mean the frame time isn't correct?
 
I personally don't mind framegen; I use AMD's AFMF in all games where I can. What I do despise with the full force of a nuclear blast is NVIDIA plastering their 300+ fps graphs everywhere, pretending these cards are now suddenly as fast as an RTX 4080 or RTX 4090 was a year or two ago.

Also, all the framegen stuff should be a bonus, not a requirement. Same with DLSS/FSR/XeSS. A game should be playable by itself, and all this stuff should be a bonus to make the experience really exceptional. Instead it's being used to make over-the-top, insanely demanding games barely playable on a €2,000 graphics card. I've always loved graphics progress, but what's happening now is just absolutely ridiculous.
 
Also, all the framegen stuff should be a bonus, not a requirement. Same with DLSS/FSR/XeSS. A game should be playable by itself, and all this stuff should be a bonus to make the experience really exceptional. Instead it's being used to make over-the-top, insanely demanding games barely playable on a €2,000 graphics card. I've always loved graphics progress, but what's happening now is just absolutely ridiculous.
If only there were a way to adjust performance, like settings that let you tailor graphics to your own preferences.
 
Frame generation/interpolation... What a great tool at people's disposal, to deploy if and when they choose. Nobody has a gun to anyone's head to use FG of any variety, so it's pretty simple: just don't use it if you don't want to. But if you don't ever want to use it, consider perhaps that some people do want to and do like it. Crazy, I know. I definitely look forward, in every thread it's mentioned, to hearing valiant and heroic tales of people ensuring it's off and avoiding it so they can enjoy natural, free-range frames only (phew!).

I can't overstate how excellent LSFG (and now NVIDIA Smooth Motion - so thanks, AMD, for putting a rocket up them, genuinely) is for framerate-locked content.

Emulation, video content and the odd game locked at 24/30/60 fps is practically begging to have FG applied to it. I'm yet to find the downside, or at least one even remotely bad enough to even threaten to negate the benefit.
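For that framerate-locked content, the basic idea of interpolation-based FG can be sketched in a few lines of Python. This is a deliberately naive blend between two frames; actual LSFG/Smooth Motion use motion estimation rather than plain averaging, so treat it as a toy illustration of the timing math only:

```python
# Crude frame interpolation: blend two "frames" (lists of pixel values) at
# evenly spaced timestamps between them. Real FG is motion-vector based.
def interpolate(frame_a, frame_b, n_generated=1):
    frames = []
    for i in range(1, n_generated + 1):
        t = i / (n_generated + 1)  # fractional position between the two real frames
        frames.append([round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)])
    return frames

# A 30 fps source with one generated frame per real frame presents at 60 fps:
mid, = interpolate([0, 50, 200], [100, 150, 200])
print(mid)  # [50, 100, 200] - each pixel halfway between the two source frames
```

Since both source frames already exist for locked 24/30/60 fps content, the interpolator never has to extrapolate, which is part of why this use case works so well.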

There is the odd game where I've found that properly implemented FG is also of great merit, but they are rarer, as VRR and traditional resolution upscaling in concert tend to get the results I'm chasing. Still, I'd rather it be available and not use it than want to use it and not have it available. Just another tool in the box to tweak the experience to a given individual's tastes. And given how heavy UE5 is, it's nice to see that option in the settings menu every time.
 
Frame generation/interpolation... What a great tool at people's disposal, to deploy if and when they choose. Nobody has a gun to anyone's head to use FG of any variety, so it's pretty simple: just don't use it if you don't want to. But if you don't ever want to use it, consider perhaps that some people do want to and do like it. Crazy, I know. I definitely look forward, in every thread it's mentioned, to hearing valiant and heroic tales of people ensuring it's off and avoiding it so they can enjoy natural, free-range frames only (phew!).

I can't overstate how excellent LSFG (and now NVIDIA Smooth Motion - so thanks, AMD, for putting a rocket up them, genuinely) is for framerate-locked content.

Emulation, video content and the odd game locked at 24/30/60 fps is practically begging to have FG applied to it. I'm yet to find the downside, or at least one even remotely bad enough to even threaten to negate the benefit.

There is the odd game where I've found that properly implemented FG is also of great merit, but they are rarer, as VRR and traditional resolution upscaling in concert tend to get the results I'm chasing. Still, I'd rather it be available and not use it than want to use it and not have it available. Just another tool in the box to tweak the experience to a given individual's tastes. And given how heavy UE5 is, it's nice to see that option in the settings menu every time.
FG is way more niche than DLSS/FSR though; it's not an "auto-enable wherever available" toggle like upscaling is. But it definitely has its use cases - especially for locked games or heavily CPU-bound games where you are stuck at 60 or below because your CPU is getting hammered. Personally, I could live without FG, but without DLSS/FSR? Hell no.
 