
NVIDIA DLSS 4 Transformer

Cyberpunk 2077 has been transformed by the new transformer model, pardon the pun. It has so much more detail and correct colours on objects and skin. I'm amazed Nvidia gave this to all RTX owners.
 
A famous quote from a TV series "Welcome to the death of the Age of Reason"
How can, let's say, a 10-megapixel picture be downscaled to 4 megapixels and then upscaled back to 10 megapixels and look better than the original?
Unless, and this sounds crazy, the original doesn't have contrast, sharpness, color and is somewhat gimped so the upscaled image can look better.
That's easily explained....
 
Sadly, Squad, which supports DLSS, will most likely not get the newest DLLs (they still haven't managed to fix the broken Reflex implementation), and since it uses EAC, DLL swapping is out of the question.
 
So a game that screws up rendering due to poor TAA implementation means that DLSS4 improves image quality in all titles?

Nah, DLSS4 improves image quality in games where the devs massively cock up. TAA is a scourge among GPU rendering and frankly shouldn't exist. In games that don't use it, DLSS4 does not improve image quality.

It is indeed great news for Nvidia owners, especially midrangers. I look forward to the extra boost out of my 4060 mini PC.

With AMD locking FSR4 to RDNA4 GPUs, that's going to be yet another uphill battle for Team Red.
There is always the performance argument, though, because saying native looks better than DLSS is as useful as saying 8K looks better than 4K. Sure it does, but is the performance you're losing worth the trade-off? Or, to put it better: at iso performance, does using DLSS give you better image quality? I think the answer was already a resounding YES with the CNN model. Transformer just made the gap larger.
 
There is always the performance argument, though, because saying native looks better than DLSS is as useful as saying 8K looks better than 4K. Sure it does, but is the performance you're losing worth the trade-off? Or, to put it better: at iso performance, does using DLSS give you better image quality? I think the answer was already a resounding YES with the CNN model. Transformer just made the gap larger.
That's still only valid under the assumption that you'd be playing at lower than your monitor's native resolution, which, I agree, isn't a very good idea. Upscaling can do better than that.
But then, I wouldn't want to lower my resolution anyhow, so I'd rather not compare "shit" to "just slightly bad". 1440p native will always look better than 1440p + DLSS/FSR/XeSS. The question is, by how much.
 
That's still only valid under the assumption that you'd be playing at lower than your monitor's native resolution, which, I agree, isn't a very good idea.
People buy monitors, man. When someone wants to buy one, it's better to target one resolution higher than what he would normally get.


1440p native will always look better than 1440p + DLSS/FSR/XeSS. The question is, by how much.
That's not true either but whatever.
 
People buy monitors, man. When someone wants to buy one, it's better to target one resolution higher than what he would normally get.
I disagree, but I don't intend to pursue this any further. I believe in living to one's means, not just with money, but with monitor resolution, too.

That's not true either but whatever.
Sure, if you throw some VSR/DLAA into the mix, which I didn't mention anywhere. But with pure 1440p native vs. upscaled 1440p with no GPU driver trickery, native always wins.
 
I've been preaching iso performance for a while now, and this is what you end up with: 4K DLSS Q vs 1440p native = 34 vs 36 fps. The image quality difference, though, is drastic. Native looks abhorrent. Funny, cause I thought "upscaling can't create missing information". Oh well.

[Screenshot: 4K DLSS Transformer]

[Screenshot: 1440p native]
 
So I looked at the picture differences before reading the article and the details, and I had never heard the terms CNN or Transformer before. Comparing the pictures, I could not find any where one was clearly better than the other. Alan Wake 2: I couldn't notice differences even when zooming, except maybe one that had pixels smoothed out ever so slightly more. The rest was just different shades (Cyberpunk) and different screen content, but one doesn't look better than the other either.

So I'm sure I'm missing it, but to me the benefits are not as clear as what's mentioned in the article. And it comes at a frames per second cost. I guess I'll have to see it for myself!
 
Just for information.

Igorslab (I checked the German site) has some pictures of the NVIDIA software.

Very disappointing, they have the same games: Hogwarts Legacy, CP2077, Alan Wake 2.

It seems there are no other games.
 
I've been preaching iso performance for a while now, and this is what you end up with: 4K DLSS Q vs 1440p native = 34 vs 36 fps. The image quality difference, though, is drastic. Native looks abhorrent. Funny, cause I thought "upscaling can't create missing information". Oh well.
Of course a higher resolution is better even with DLSS/FSR. That's not the comparison I was talking about. What I said was 1440p native vs 1440p DLSS/FSR.

When it comes to monitors, dpi matters a lot more than just resolution as a raw number, imo.
 
Of course a higher resolution is better even with DLSS/FSR. That's not the comparison I was talking about. What I said was 1440p native vs 1440p DLSS/FSR.

When it comes to monitors, dpi matters a lot more than just resolution as a raw number, imo.
Wasn't replying to you, my man, was just confirming that at the same performance, native is horrible. I'm not very interested in comparing image quality at different performance levels; it doesn't make sense to me.
 
Wasn't replying to you, my man, was just confirming that at the same performance, native is horrible. I'm not very interested in comparing image quality at different performance levels; it doesn't make sense to me.
If you're targeting performance, then sure. I always try to target visuals and see about performance after. Each to their own.
 
Looks like there's some kind of bug at 4K and 1080p.

Bug is just another word for broken feature.

As long as the buyer of the graphics card is happy.

I don't really see a difference or an improvement. Everything looks awful. To be precise: I don't see a noticeable improvement that would make me personally want only that one resolution with that rendering software. Some stuff has more light, some stuff is darker, some has more reflections. Still, it looks awful. This stuff is just tech demos, to show maybe what could be possible or what is being worked on.

I'd prefer certain graphics over such awful stuff.
 
I don't really see a difference or an improvement. Everything looks awful. To be precise: I don't see a noticeable improvement that would make me personally want only that one resolution with that rendering software. Some stuff has more light, some stuff is darker, some has more reflections. Still, it looks awful. This stuff is just tech demos, to show maybe what could be possible or what is being worked on.
Now that I'm home and looking at it on a computer screen, I kind of agree.

Looking at the 1440p quality and performance comparisons, the leaves on the palm trees look a bit smoother with transformer. I do not see any other difference whatsoever, no matter how hard I try to.
 
What's CNN? What's transformer? Can somebody shed light on some details? I feel like I've just read some marketing material, not a review, to be honest.
CNN is a type of neural network (that we've "stolen" from real life).
Transformer is... another type of neural network.

The latter has seen increasingly more use even in image generation (which at some point was exclusively CNNs).

These terms at least make sense, unlike "ray reconstruction" or "neural radiance".
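
In very rough terms, and purely as an illustration (this is generic PyTorch, nothing to do with NVIDIA's actual DLSS networks): a CNN mixes each pixel with a small local neighbourhood through convolutions, while a transformer chops the frame into patches (tokens) and lets every patch attend to every other patch in one step.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 64, 64)  # dummy 64x64 RGB frame

# CNN-style layer: each output pixel only "sees" a 3x3 local neighbourhood.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
local_features = conv(x)       # (1, 16, 64, 64)

# Transformer-style layer: split the frame into 8x8 patches and let
# self-attention mix information globally across all 64 patches.
patches = x.unfold(2, 8, 8).unfold(3, 8, 8)                      # (1, 3, 8, 8, 8, 8)
tokens = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, 64, 192)   # 64 tokens of 192 values
attn = nn.MultiheadAttention(embed_dim=192, num_heads=4, batch_first=True)
global_features, _ = attn(tokens, tokens, tokens)                # (1, 64, 192)
```

That global mixing is presumably why fine detail like foliage and thin geometry holds up better with the new model: the network can relate far-apart parts of the frame instead of only looking at a small window.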
 
Now that I'm home and looking at it on a computer screen, I kind of agree.

Looking at the 1440p quality and performance comparisons, the leaves on the palm trees look a bit smoother with transformer. I do not see any other difference whatsoever, no matter how hard I try to.
Fine details like foliage, and smearing, were really the biggest weak points of upscaling, so it makes sense that the differences are more visible in those areas. Native TAA also seemed to blotch a few details on the building that every upscaling method managed to get back. Good lord, TAA is trash. Also, RR seems to smooth out the details on the foliage. DLSS without RR looks sharper.



CNN is a type of neural network (that we've "stolen" from real life).
Transformer is... another type of neural network.

The latter has seen increasingly more use even in image generation (which at some point was exclusively CNNs).

These terms at least make sense, unlike "ray reconstruction" or "neural radiance".
Neural radiance for games just seems to re-use a few of the principles from neural radiance fields, which are used to construct a 3D scene from 2D images. Instead of using ML to extrapolate 2D images into 3D, you use ML to extrapolate GI from a lower number of rays. More bounces at a lower cost.

Ray reconstruction does basically "reconstruct" details that go missing with the default denoiser used for ray tracing.
As far as AMD is concerned, "reconstructing" rays makes sense :D (in offline 3D, each pixel is computed by a ray). They have their equivalent coming soon.

Reconstructing pixels in noisy rendering

Denoising is one of the techniques to address the problem of the high number of samples required in Monte Carlo path tracing. It reconstructs high-quality pixels from a noisy image rendered with low samples per pixel. Often, auxiliary buffers like albedo, normal, roughness, and depth, which are available in deferred rendering, are used as guiding information. By reconstructing high-quality pixels from a noisy image in a much shorter time than full path tracing takes, denoising becomes an indispensable component in real-time path tracing.
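
A toy sketch of that idea, just to make it concrete (plain NumPy; nothing to do with AMD's or NVIDIA's actual denoisers, and the buffer names are made up): a joint bilateral filter that averages the noisy ray-traced colour over a window, but only trusts neighbours whose normal and depth match the centre pixel, so G-buffer edges survive while the noise gets averaged out.

```python
import numpy as np

def guided_denoise(noisy, normal, depth, radius=3, sigma_n=0.3, sigma_d=0.05):
    """Toy joint bilateral filter guided by G-buffer data.
    noisy/normal: (H, W, 3) float arrays, depth: (H, W) float array."""
    h, w, _ = noisy.shape
    out = np.zeros_like(noisy)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            # How similar is each neighbour's geometry to the centre pixel?
            dn = np.sum((normal[y0:y1, x0:x1] - normal[y, x]) ** 2, axis=-1)
            dd = (depth[y0:y1, x0:x1] - depth[y, x]) ** 2
            weight = np.exp(-dn / (2 * sigma_n ** 2) - dd / (2 * sigma_d ** 2))
            # Weighted average of the noisy radiance: noise cancels out, while
            # geometric edges are preserved because the weights drop across them.
            out[y, x] = np.sum(noisy[y0:y1, x0:x1] * weight[..., None], axis=(0, 1)) / weight.sum()
    return out
```

ML denoisers (and ray reconstruction) effectively replace those hand-tuned weights with a learned network, which is why they can recover detail that a fixed filter would blur away.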

Bug is just another word for broken feature.

As long as the buyer of the graphics card is happy.

I don't really see a difference or an improvement. Everything looks awful. To be precise: I don't see a noticeable improvement that would make me personally want only that one resolution with that rendering software. Some stuff has more light, some stuff is darker, some has more reflections. Still, it looks awful. This stuff is just tech demos, to show maybe what could be possible or what is being worked on.

I'd prefer certain graphics over such awful stuff.
I mean, even the native rendering was bugged with those weird dark and sharp shadows. Is it DLSS that caused the game to bug out on the foliage, or is it the game that caused DLSS to bug out? :D
 

That's still only valid under the assumption that you'd be playing at lower than your monitor's native resolution, which, I agree, isn't a very good idea. Upscaling can do better than that.
But then, I wouldn't want to lower my resolution anyhow, so I'd rather not compare "shit" to "just slightly bad". 1440p native will always look better than 1440p + DLSS/FSR/XeSS. The question is, by how much.
A real AMD warrior never says DLSS looks better, or that extra FPS makes the game look better and smoother = better image quality when you move in games...

You don't want to lower your res because FSR is bad! Yes, I get that.
But DLSS is great; you just don't want to say it / see it because of brand loyalty and Nvidia hate.
 
A real AMD warrior never says DLSS looks better, or that extra FPS makes the game look better and smoother = better image quality when you move in games...

You don't want to lower your res because FSR is bad! Yes, I get that.
But DLSS is great; you just don't want to say it / see it because of brand loyalty and Nvidia hate.
Yeah, I had an RTX 2070 and would probably still have it if it hadn't died, and I played and finished Cyberpunk using DLSS because I'm such an AMD fan! :rolleyes:

Since you know so much about people, Mr Holmes, please tell me why I have Nvidia GPUs in my two HTPCs, and why I have spares of both AMD and Nvidia cards. Or why I've used lots of Nvidia, as well as AMD/ATi cards in my life. (Are you old enough to remember ATi?) There seems to be some kind of identity crisis here, don't you think? ;)

Fanboy calling other people a fanboy! Hmph.
 
How can, let's say, a 10-megapixel picture be downscaled to 4 megapixels and then upscaled back to 10 megapixels and look better than the original?
Add the previous 3 frames and now you have 16 megapixels of data. That's how temporal techniques increase resolution. As not all previous pixels are usable, you won't get a true 16 megapixels of information, but somewhere between 4 and 16 MP. And when some elements are missing motion vectors, ghosting artifacts occur.
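
Roughly what that looks like, as a toy sketch only (NumPy, hypothetical buffer layout, nowhere near a real TAA/DLSS pipeline): reproject the accumulated history with the motion vectors and blend it with the new frame; pixels without usable motion vectors fall back to the current frame, which is exactly where ghosting/disocclusion artifacts come from.

```python
import numpy as np

def temporal_accumulate(current, history, motion_vectors, alpha=0.1):
    """Toy temporal accumulation.
    current/history: (H, W, 3) colour buffers, motion_vectors: (H, W, 2) in pixels
    (NaN where no motion vector is available)."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Pixels with no usable motion vector (e.g. disocclusions) can't reuse history.
    valid = np.isfinite(motion_vectors).all(axis=-1)
    mv = np.where(valid[..., None], motion_vectors, 0.0)

    # Where was this pixel in the previous frame?
    prev_y = np.clip(np.round(ys + mv[..., 1]).astype(int), 0, h - 1)
    prev_x = np.clip(np.round(xs + mv[..., 0]).astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]

    # Blend new samples into the accumulated history; over several frames this
    # gathers far more effective samples than a single rendered frame contains.
    blended = alpha * current + (1 - alpha) * reprojected
    return np.where(valid[..., None], blended, current)
```

When the history can't be reused (or gets reprojected onto the wrong surface), you either accept ghosting or fall back to the current frame, which is the trade-off every temporal technique is fighting.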
 
A real AMD warrior never says DLSS looks better, or that extra FPS makes the game look better and smoother = better image quality when you move in games...

You don't want to lower your res because FSR is bad! Yes, I get that.
But DLSS is great; you just don't want to say it / see it because of brand loyalty and Nvidia hate.
 
Yeah, I had an RTX 2070 and would probably still have it if it hadn't died, and I played and finished Cyberpunk using DLSS because I'm such an AMD fan! :rolleyes:

Since you know so much about people, Mr Holmes, please tell me why I have Nvidia GPUs in my two HTPCs, and why I have spares of both AMD and Nvidia cards. Or why I've used lots of Nvidia, as well as AMD/ATi cards in my life. (Are you old enough to remember ATi?) There seems to be some kind of identity crisis here, don't you think? ;)

Fanboy calling other people a fanboy! Hmph.
I really don't get these fanatics. I also have several brands of hardware in my house, from Intel to AMD and several old Nvidia cards.
Why? Because they do the things I want them to do, but slowly I will have to switch them out for newer hardware.
So I don't care about a brand, but I do care about what they make and whether they actually care about their footprint on the planet, and on that front Nvidia scores nothing.
Intel and AMD make fantastic products for a reasonable price, and the people I help with PC problems aren't ignored if they have the wrong colour brand.
BTW, Nvidia should not have the green colour; that should go to AMD, and the red to the other one... if you look at the power-gulping cards they make.
 
@W1zzard

I have always found these articles rather silly: how can a single still frame in isolation give any sort of useful information about the experience of playing a game that is rendering upwards of 30 frames every second? Or to put it another way, why are we trying to evaluate an upscaler, which is designed to give consistent average quality over a series of rendered frames, on a single arbitrary outputted frame? By the same type of logic, you might as well rank GPUs not by their performance in games, but by how fast they finish rendering something in Photoshop!

So yeah, I'd just have the article contain a side-by-side comparison video, and discuss time stamps/ranges in that video that display the most obvious improvements or regressions. You can then embed stills from those points to help illustrate these observations; that also makes it more accessible to people who aren't gonna watch the video.
 
Posting static screens doesn't do DLSS 4 any justice. It's very noticeable in motion. Especially with RT + RR games, but even in games without it; for example, I put the DLL in FF7 Rebirth and it's easy to see the better image quality. Excited for the new drivers this week.
 