
NVIDIA DLSS 4 Transformer

Anyone notice that "XeSS Quality" is the most accurate in Cyberpunk 2077 when using RT?
There are some slightly off shadows & reflections in both DLSS implementations & FSR too.
 
Sorry, but even on static images I can see ZERO difference between the two modes.

Well you have to open your eyes. There's your problem.

Until yesterday DLSS 3.8 was impressive. Today we learn that DLSS 4.0 is impressive while DLSS 3.8 is not so impressive after all.

Waiting for the DLSS 5.0 review in a couple of years vs DLSS 4.x, to read the same thing: that DLSS 4.x isn't as impressive as we'll be reading for the next couple of years, but DLSS 5.0 is really impressive.

Isn't that how it has always been? Do I think Hardware Transform and Lighting is impressive now? No. But it was when it was released.
 
A famous quote from a TV series: "Welcome to the death of the Age of Reason."
How can, let's say, a 10-megapixel picture be downscaled to 4 megapixels, then upscaled back to 10 megapixels, and look better than the original?
Unless, and this sounds crazy, the original doesn't have contrast, sharpness or color and is somewhat gimped so the upscaled image can look better.
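(To make the question concrete, here is a rough sketch of that round trip with Pillow and NumPy. The file name and the bicubic filter are just placeholders, and a plain resize is far simpler than what DLSS-style upscalers actually do, but it shows why a naive downscale/upscale can only lose information:)

from PIL import Image
import numpy as np

# Hypothetical 10-megapixel source image (placeholder file name)
original = Image.open("source_10mp.png").convert("RGB")
w, h = original.size

# Downscale to roughly 4 MP, then back up to the original size.
# Linear scale factor for a 10 MP -> 4 MP area change is sqrt(4/10).
scale = (4 / 10) ** 0.5
small = original.resize((int(w * scale), int(h * scale)), Image.BICUBIC)
roundtrip = small.resize((w, h), Image.BICUBIC)

# Average per-channel error: a plain resize round trip always throws detail away.
diff = np.abs(np.asarray(original, dtype=np.int16) - np.asarray(roundtrip, dtype=np.int16))
print("mean absolute error per channel:", diff.mean())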


He has spoken blasphemy! Why do we need any more witnesses? Look, now you have heard the blasphemy.
Matthew 26:65
 
@W1zzard

I have always found these articles rather silly: how can a single still frame in isolation give any sort of useful information about the experience of playing a game that is rendering upwards of 30 frames every second? Or to put it another way, why are we trying to evaluate an upscaler, which is designed to give consistent average quality over a series of rendered frames, on a single arbitrary output frame? By the same type of logic, you might as well rank GPUs not by their performance in games, but by how fast they finish rendering something in Photoshop!

So yeah, I'd just have the article contain a side-by-side comparison video, and discuss time stamps/ranges in that video that display the most obvious improvements or regressions. You can then embed stills from those points to help illustrate these observations; that also makes it more accessible to people who aren't gonna watch the video.
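(If the worry is that one arbitrary frame is a noisy sample, the complement to a video is a per-frame metric aggregated over a whole clip. A minimal NumPy-only sketch, using synthetic frames and plain per-frame MSE as a stand-in for a proper perceptual metric, just to show the idea of reporting mean/worst/best instead of a single still:)

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data so the sketch runs without a video decoder:
# 120 "reference" frames and a noisy "upscaled" copy of each.
reference_frames = [rng.integers(0, 256, (270, 480, 3), dtype=np.uint8) for _ in range(120)]
test_frames = [np.clip(f.astype(np.int16) + rng.integers(-8, 9, f.shape), 0, 255).astype(np.uint8)
               for f in reference_frames]

def per_frame_mse(test, reference):
    """Mean squared error of each test frame against its reference frame."""
    return [np.mean((t.astype(np.float64) - r.astype(np.float64)) ** 2)
            for t, r in zip(test, reference)]

errors = per_frame_mse(test_frames, reference_frames)
print(f"mean {np.mean(errors):.2f}, worst frame {max(errors):.2f}, best frame {min(errors):.2f}")
# Any single frame can land anywhere between best and worst,
# which is why one arbitrary still says little about the average experience.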


I think the new DLSS 4 looks amazing, with a few minor defects that will either be fixed or aren't a make-or-break deal. I watched the video, I looked at the still frames.


Still frames show the detail of things like the palm tree leaves being crisper and more realistic (while a bit overblown underneath), the missing rear defroster wires, but in Stalker the power lines actually appearing... After all, the collection of still frames is where the illusion of motion comes from, so what better way to compare the products?

The only better way would be to scan each set of frames in the pipeline and compare the values actually rendered.
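(Something like that can be approximated offline: capture matched frames from each pipeline and diff the values that were actually rendered. A small sketch with Pillow/NumPy; the two file names are made up and the frames are assumed to be the same resolution:)

from PIL import Image
import numpy as np

# Hypothetical captures of the same frame from two different upscalers
frame_a = np.asarray(Image.open("frame_dlss.png").convert("RGB"), dtype=np.int16)
frame_b = np.asarray(Image.open("frame_xess.png").convert("RGB"), dtype=np.int16)

diff = np.abs(frame_a - frame_b)                      # per-pixel, per-channel difference
changed = np.count_nonzero(diff.max(axis=2))          # pixels that differ in any channel
print("pixels that differ:", changed, "of", diff.shape[0] * diff.shape[1])

# Amplify the difference and save it so the deviations are visible by eye.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("frame_diff.png")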
 
STEP 1 - various AA methods, leading up to TAA (temporal hooks etc)
STEP 2 - offline "AI" training to make lower resolution A look like higher resolution B (to oversimplify; see the sketch below)
STEP 3 - modify games to take advantage of what upscales WELL (line thickness, texture types, denoisers etc)
STEP 4 - improve AI algorithm (including optimizing for newer hardware)
STEP 5 - repeat steps 2 to 4 (train algorithm, optimize game engines for AI upscaling, improve AI algorithm)

*You can kind of lump training and improving together, I guess. Probably a constant train/improve loop until the final product, like the new Transformer model, sees consumer usage.
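For step 2, the core idea is plain supervised training: take high-resolution frames, downsample them, and teach a network to reconstruct the original. A toy PyTorch sketch on random tensors, nowhere near the real DLSS pipeline (which also consumes motion vectors, depth and previous frames), just to show the A-looks-like-B loop:

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy 2x upscaler: naive bilinear upsample plus a learned residual."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        up = F.interpolate(low_res, scale_factor=2, mode="bilinear", align_corners=False)
        return up + self.body(up)

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    high_res = torch.rand(8, 3, 64, 64)    # stand-in for "higher resolution B" frames
    low_res = F.interpolate(high_res, scale_factor=0.5,
                            mode="bilinear", align_corners=False)  # "lower resolution A"
    loss = F.mse_loss(model(low_res), high_res)   # make the upscaled A look like B
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()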
 
@W1zzard

I have always found these articles rather silly: how can a single still frame in isolation give any sort of useful information about the experience of playing a game that is rendering upwards of 30 frames every second? Or to put it another way, why are we trying to evaluate an upscaler, which is designed to give consistent average quality over a series of rendered frames, on a single arbitrary output frame? By the same type of logic, you might as well rank GPUs not by their performance in games, but by how fast they finish rendering something in Photoshop!

So yeah, I'd just have the article contain a side-by-side comparison video, and discuss time stamps/ranges in that video that display the most obvious improvements or regressions. You can then embed stills from those points to help illustrate these observations; that also makes it more accessible to people who aren't gonna watch the video.

The still-image comparisons are mostly useful for discerning texture clarity and general text crispness, which is basically fine sharpness and subpixel-level differences derived from the effective resolution. You glean a lot from an image comparison, but a video is more impactful and will show things a still image can't capture, like frame interpolation and its impact on animation detail.
 
Was it already released? I saw the Transformer option in my Cyberpunk settings. And at 4K I have some VRAM free thanks to it, on my 4070 Ti. And no lag thanks to it either, which IMHO was caused by that insufficient VRAM.
 
Isn't that how it has always been? Do I think Hardware Transform and Lighting is impressive now? No. But it was when it was released.
We are not talking about new features, but about the quality of the image we get from upscaling and Frame Generation. When for years most of the tech press can't see anything bad in a version of DLSS until they review the next version, that means we have a problem with objectivity in reporting the visual quality of DLSS.
 
We are not talking about new features, but about the quality of the image we get from upscaling and Frame Generation. When for years most of the tech press can't see anything bad in a version of DLSS until they review the next version, that means we have a problem with objectivity in reporting the visual quality of DLSS.
For years nobody complained about 1080p. Then we moved to 1440p and suddenly everyone said it's better. Then for years nobody complained about 1440p and yet we moved to 4k and everyone claimed it's better. I wonder if there is a grand conspiracy here :roll:

Btw, a lot of reviews have pointed out issues with DLSS before, I don't know what rock you live under.
 
Until yesterday DLSS 3.8 was impressive. Today we learn that DLSS 4.0 is impressive while DLSS 3.8 is not so impressive after all.
Is it really a surprise? One day something is a great product, tomorrow a better version comes out and now people want that one. It doesn't make the first one worse, it just makes the new one better. The bar has been raised.

Until 5th April 2023 the 5800X3D was impressive; on that day we learned that the 7800X3D is more impressive. The 5800X3D didn't get worse, something new came out that was better.
aha .. that's Maxus24's personal channel... also posted a few days before he even gave us access to the video.. that's a breach of my personal trust, I'll terminate the relationship.

Any volunteers to take over this position?
PM'd
You wouldn’t know it by the way AMD users are shitting all over this article.
I believe I've noticed a consistent trend, at least if the system specs tab is to be believed... The most negative comments in this thread (testing methodology comments notwithstanding) are coming from users who do not use/run/main RTX hardware and so can't try/aren't able to run DLSS CNN/Transformer for themselves.

Shocker I know right! who could have possibly predicted such an outcome :rolleyes:
 
For years nobody complained about 1080p. Then we moved to 1440p and suddenly everyone said it's better. Then for years nobody complained about 1440p and yet we moved to 4k and everyone claimed it's better. I wonder if there is a grand conspiracy here :roll:

Btw, a lot of reviews have pointed out issues with DLSS before, I don't know what rock you live under.

1080p is still good enough for most people (according to the Steam hardware survey) and I still use it for my desktop. Until video games literally look like Blu-ray movies in terms of realism, which they are light years from, the limits of 1080p haven't come close to being exhausted. It was GPU and monitor manufacturers who convinced people 1080p wasn't good enough. 1440p is better, but the difference is insignificant to me. 4K is completely pointless unless you're viewing on a large TV screen. Yet without 4K, how would GPU manufacturers get people to buy their absurdly expensive GPUs when their mid-range ones work perfectly well at 1080p and 1440p?

It is because of the move to 4K that game graphics have basically stalled for the last 10-15 years. We've got essentially the same graphics as we had years ago, just at higher resolutions. All of the advances in GPU technology and efficiency have been soaked up pushing more pixels. What a waste of time and energy.
 
I've been preaching iso-performance comparisons for a while now and this is what you end up with: 4K DLSS Q vs 1440p native = 34 vs 36 fps. The image quality difference though is drastic. Native looks abhorrent. Funny, because I thought "upscaling can't create missing information". Oh well
Dude, you can't compare apples to oranges. Compare at the same resolution, otherwise the image with the lower resolution will always look worse.
 
Dude, you can't compare apples to oranges. Compare at the same resolution, otherwise the image with the lower resolution will always look worse.
Dude, you can't compare apples to oranges. Compare at the same performance.
 
I think I may have looked at the bottom left instead of the bottom right.
I also didn't see the comment box at the bottom right at the beginning. Only the top right one. Not sure why, maybe it took some time to load or a review needs a comment for it to appear (just spitballing here)
 
1080p is still good enough for most people (according to the Steam hardware survey) and I still use it for my desktop. Until video games literally look like Blu-ray movies in terms of realism, which they are light years from, the limits of 1080p haven't come close to being exhausted. It was GPU and monitor manufacturers who convinced people 1080p wasn't good enough. 1440p is better, but the difference is insignificant to me. 4K is completely pointless unless you're viewing on a large TV screen. Yet without 4K, how would GPU manufacturers get people to buy their absurdly expensive GPUs when their mid-range ones work perfectly well at 1080p and 1440p?

It is because of the move to 4K that game graphics have basically stalled for the last 10-15 years. We've got essentially the same graphics as we had years ago, just at higher resolutions. All of the advances in GPU technology and efficiency have been soaked up pushing more pixels. What a waste of time and energy.
There is a significant difference between 1080p and 1440p. It's not as night and day as the difference between 720p and 1080p, but it's big. Same with 60 Hz vs 160 Hz, major difference. Once you see and feel it you can't unsee it.

Of course there is PPI, etc... but your average 27-inch 1440p monitor is way better than any 24/25/27-inch 1080p monitor.
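(For reference, the pixel density behind that comparison; a quick back-of-the-envelope calculation assuming standard 16:9 panels, not numbers from the post above:)

import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI
print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')   # ~82 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109 PPI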

The issue is that even in 2025 we are forced to play at 720p upscaled to 1080p just in order to be able to run games at barely 60fps.

I still find Wolfenstein II: The New Colossus graphics and AC: Black Flag graphics to be much better than most of the games today.

Heck, Crysis has better graphics in some areas than many modern games today!
 
Is it really a surprise? One day something is a great product, tomorrow a better version comes out and now people want that one. It doesn't make the first one worse, it just makes the new one better. The bar has been raised.
That's the difference here. With every new version people find problems that they couldn't spot before.
For years nobody complained about 1080p. Then we moved to 1440p and suddenly everyone said it's better. Then for years nobody complained about 1440p and yet we moved to 4k and everyone claimed it's better. I wonder if there is a grand conspiracy here :roll:
1080p remained the same before and after the introduction of 1440p, and 1440p remained the same before and after the introduction of 4K. 1080p didn't suddenly look worse than before.
Btw, a lot of reviews have pointed out issues with DLSS before, I don't know what rock you live under.
Most reviews analyze the errors of other upscaling techs thoroughly, while any issues with DLSS are described as small and unimportant. Then a new version of DLSS comes out and everyone sees huge differences. What was unimportant yesterday is a deal breaker today.
 
1080p remained the same before and after the introduction of 1440p, and 1440p remained the same before and after the introduction of 4K. 1080p didn't suddenly look worse than before.
Are you suggesting that DLSS CNN now looks worse than before? Huh?

Most reviews analyze the errors of other upscaling techs thoroughly, while any issues with DLSS are described as small and unimportant. Then a new version of DLSS comes out and everyone sees huge differences. What was unimportant yesterday is a deal breaker today.
Ah, the usual bs story. Yes, everyone and everything is overly critical about amd and gives a free pass to nvidia. Yes, absolutely man, have a good day.
 
That's the difference here. With every new version people find problems that they couldn't spot before.
I don't think it changes anything; the last version was compared against native and its rival upscalers, and had its image quality scrutinised and ranked as such. Now a new version improves on it and does some aspects better, highlighting some differences vs the old one. Again, that doesn't change the last version's quality and its comparative standing against everything but the new one; the new one just does some things even better. This is literally progress marching forward. I'm not sure I remember anyone claiming DLSS was perfect and unable to ever be improved upon, either, just that it was currently the best at tackling the same task.

So, I don't really see the point you're trying to make, but I appreciate that it resonates with people who don't use DLSS and enjoy any way to have a jab at it.
 
I don't think it changes anything; the last version was compared against native and its rival upscalers, and had its image quality scrutinised and ranked as such. Now a new version improves on it and does some aspects better, highlighting some differences vs the old one. Again, that doesn't change the last version's quality and its comparative standing against everything but the new one; the new one just does some things even better. This is literally progress marching forward. I'm not sure I remember anyone claiming DLSS was perfect and unable to ever be improved upon, either, just that it was currently the best at tackling the same task.

So, I don't really see the point you're trying to make, but I appreciate that it resonates with people who don't use DLSS and enjoy any way to have a jab at it.
When I see reviews, if all three upscalers throw a pixel in the wrong position on the screen, reviewers usually take for granted that DLSS is in fact putting that pixel in the correct position and all other upscalers are doing a worse job. When a new DLSS version comes out and changes the position of that pixel, even if that position is still wrong, the conclusion is that the new version fixes the position of that pixel and reviewers finally acknowledge that the pixel was in the wrong position.

Are you suggesting that DLSS CNN now looks worse than before? Huh?
No, it looks as bad as before. We only need a new version to acknowledge it.
Ah, the usual bs story. Yes, everyone and everything is overly critical about amd and gives a free pass to nvidia. Yes, absolutely man, have a good day.
You have your opinion, I have mine. And I really don't care about yours.
Have a good day.
 
I don't think it changes anything; the last version was compared against native and its rival upscalers, and had its image quality scrutinised and ranked as such. Now a new version improves on it and does some aspects better, highlighting some differences vs the old one. Again, that doesn't change the last version's quality and its comparative standing against everything but the new one; the new one just does some things even better. This is literally progress marching forward. I'm not sure I remember anyone claiming DLSS was perfect and unable to ever be improved upon, either, just that it was currently the best at tackling the same task.

So, I don't really see the point you're trying to make, but I appreciate that it resonates with people who don't use DLSS and enjoy any way to have a jab at it.

It's always been a bit Emperor's New Clothes, and I have a card that can use it, so I'm not saying that out of sour grapes. It amazes me that GPUs used to be about performance; now it's about obsessing over tiny details in competing upscaling methods. How did it come to this? Nvidia's hype machine, unfortunately. They brought us upscaling and RT, and convinced a lot of people that we need them. And predictably, developers pounced on upscaling and now use it as a crutch, so we're no further forward overall.
 
When I see reviews, if all three upscalers throw a pixel in the wrong position on the screen, reviewers usually take for granted that DLSS is in fact putting that pixel in the correct position and all other upscalers are doing a worse job. When a new DLSS version comes out and changes the position of that pixel, even if that position is still wrong, the conclusion is that the new version fixes the position of that pixel and reviewers finally acknowledge that the pixel was in the wrong position.
I find that a convoluted way to view things. Reviewers play the games to get a sense of what they're like viewed live, then record multiple solutions and spend time analysing the images and video side by side, looking for what is most stable in motion, what is less blurry, has fewer artefacts and so on. It's fairly straightforward to see what some get right and what they get wrong, and then articulate the differences and perhaps even rank them accordingly. Right pixels, pixels in the wrong place... this is gibberish-adjacent. None of these are perfect; there is no ground-truth reference image to compare them to for the 'right place' for any given pixel to be.
It's always been a bit Emperor's New Clothes, and I have a card that can use it, so I'm not saying that out of sour grapes. It amazes me that GPUs used to be about performance; now it's about obsessing over tiny details in competing upscaling methods. How did it come to this? Nvidia's hype machine, unfortunately. They brought us upscaling and RT, and convinced a lot of people that we need them. And predictably, developers pounced on upscaling and now use it as a crutch, so we're no further forward overall.
Be that as it may, this is the situation we find ourselves in, and as each one improves it brings qualitative improvements to image quality and perhaps performance to those who use them. These reviews and discussions genuinely help users to make informed decisions on which ones to use, or not, and perhaps even which products they may want to buy, or not.
 
Being honest, I think I prefer CNN, ignoring Alan Wake where I can't see a difference.

Transformer looks like it's too sharp, with edges that are too thin; you need a little blur effect so things like wires and branches stand out. I think some people consider SGSSAA a little blurry, which to me is the ultimate appearance, and CNN is closer to the SGSSAA look than Transformer.
 
Just to clarify, this is on desktop?
Yes on desktop. It's there now. So that's why I think I either didn't look at the correct spot or missed it some other way

I also didn't see the comment box at the bottom right at the beginning. Only the top right one. Not sure why, maybe it took some time to load or a review needs a comment for it to appear (just spitballing here)
Oh so maybe it wasn't just me missing it? @W1zzard
 