
3DMark Adds NVIDIA DLSS Feature Performance Test to Port Royal

With every comparison image I see, I'm trying to find some method to the madness in the differences. With DLSS, the whole look of the image seems to be all over the place. Some things look like they've been given a sharpening pass (with visible high-contrast edges, like the bushes in the lower half of the shot), while others look like they've been upscaled from a lower res, and others again look like a downscale. It's weird. At the same time, the TAA comparison next to it is rock solid in terms of consistency: it does its job visibly the same way across the whole frame.

http://images.nvidia.com/geforce-co...-dlss-interactive-comparison-clarity-001.html

When you look at the arm, and more specifically the little bag on the arm, you can clearly see that the DLSS shot is extremely low res (huge jaggies, as if it were upscaled), whereas the TAA shot shows a smooth edge on the bag.

DLSS: [screenshot attachment]

TAA: [screenshot attachment]


And then there is the whole contrast/color saturation part of it. The DLSS picture always looks out of balance in that respect, as if someone did some SweetFX tweaks on top; it's minor, but it's there. You can also see more detail in the TAA example above: the darker yellow hue is more pronounced. And look at the shadow in the dude's armpit, and take note of the difference in yellow saturation/brightness levels as well. TAA has a wider range.

DLSS: [screenshot attachment]

TAA: [screenshot attachment]


There is literally no consistency whatsoever, and it's very clearly done at a lower resolution. And that's even compared against a very imperfect AA like TAA...
 
@Vayra86 I think you're not seeing the forest for the trees ;)
DLSS is just like JPEG. It's supposed to look worse up close. The trick is to be (close to?) indistinguishable under normal circumstances.
And since it's a per-title affair, I fully expect some titles to look better with it and some to look worse.
 
DLSS gets compared to TAA for a reason. Both are temporal solutions, reusing past data to enhance current frames.
TAA is also a rather lean form of antialiasing when it comes to performance.
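To make the "temporal" part concrete, here's a minimal, purely illustrative sketch of the exponential history blend that this kind of accumulation is built on (the alpha value and the NumPy toy data are my own placeholders; a real TAA pass adds motion-vector reprojection and neighbourhood clamping, and DLSS swaps the hand-tuned heuristics for a trained network):

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current frame into the accumulated history buffer.

    history, current: float arrays of shape (H, W, 3), values in [0, 1].
    alpha: weight of the new frame; smaller alpha = more temporal smoothing.
    A real TAA pass would also reproject `history` with motion vectors and
    clamp it against the current frame's neighbourhood to limit ghosting.
    """
    return alpha * current + (1.0 - alpha) * history

# Toy usage: accumulate noisy frames of a constant mid-grey image.
rng = np.random.default_rng(0)
shape = (4, 4, 3)
history = np.full(shape, 0.5)
for _ in range(32):
    noisy = np.clip(0.5 + rng.normal(0.0, 0.1, shape), 0.0, 1.0)
    history = temporal_accumulate(history, noisy)
print(round(float(history.mean()), 3))  # stays close to 0.5 as the noise averages out
```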

By the way, in the image with the blue circle highlight there are stippling artefacts, fairly typical for TAA.

3DMark's DLSS feature test does seem to incorporate a good TAA implementation as opposed to the horrible one in Final Fantasy XV. If memory serves right, Final Fantasy XV had some other differences with DLSS as well, lack of DOF and motion blur?
 
@Vayra86 I think you're not seeing the forest for the trees ;)
DLSS is just like JPEG. It's supposed to look worse up close. The trick is to be (close to?) indistinguishable under normal circumstances.
And since it's a per-title affair, I fully expect some titles to look better with it and some to look worse.

I get that, but I'm not seeing the big advantage of it in that case versus, say, just a lower internal render res or other measures. And it also puts the idea of 'free' AA in quite a different perspective. It's not free at all.
 
I get that, but I'm not seeing the big advantage of it in that case versus, say, just a lower internal render res or other measures. And it also puts the idea of 'free' AA in quite a different perspective. It's not free at all.
Well, I won't argue with that (except that afaik DLSS really is about rendering at a lower res internally). Ever since we moved off SSAA, every alternative AA scheme has been a trade-off rather than "free".
As someone who has been looking to make the jump to 4K but has always been put off by the entry barrier, I am now actually considering an RTX 2060 that can almost handle 4K in games. With DLSS, it could push enough titles into playable territory. And please don't ask why I'm still fixating on games when I have pretty much given up on gaming (save for a couple of things I play on my tablet) ;)
 
Assuming the default 1440p test:
DLSS off = 1440p
DLSS on = 1080p + DLSS
Image quality comparisons are 1080p + TAA

Performance-wise, 1080p is usually 30-35% faster than 1440p. In this case the DXR effects probably benefit more than that, resulting in a better-than-expected 42% difference in FPS.
This is exactly why image quality comparisons are extremely important for DLSS.
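As a rough sanity check of those numbers (just arithmetic on pixel counts, nothing measured from the benchmark): if a frame's cost scaled purely with pixel count, dropping from 1440p to 1080p could buy up to roughly 78% more FPS, so both the usual 30-35% and the 42% seen here sit below that ceiling:

```python
# Pixel counts for the two render resolutions mentioned above.
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

ratio = pixels_1440p / pixels_1080p
print(f"1440p has {ratio:.2f}x the pixels of 1080p")
# If a frame's cost scaled purely with pixel count, dropping to 1080p
# would give up to this much more FPS; real games land lower because
# plenty of work does not scale with resolution.
print(f"Upper bound on the FPS gain: ~{(ratio - 1) * 100:.0f}%")
```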

There is supposed to be a DLSS variant, supposedly called DLSS 2x, that would apply the same processing at the actual rendering resolution, effectively just doing antialiasing. That quality-improving variant is so far nowhere to be seen.

By the way, more details on DLSS are in 3DMark's Technical Guide, pages 150-159:
https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf

Rendering resolutions with DLSS:
- 1440x810 for 1080p output
- 1080p for 1440p output
- 1440p for 2160p output
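Out of curiosity, the upscale factors implied by that list (simple arithmetic on the resolutions above, nothing beyond the numbers themselves):

```python
# (render width, render height) -> (output width, output height) per the list above.
dlss_modes = {
    "1080p output": ((1440, 810), (1920, 1080)),
    "1440p output": ((1920, 1080), (2560, 1440)),
    "2160p output": ((2560, 1440), (3840, 2160)),
}
for label, ((rw, rh), (ow, oh)) in dlss_modes.items():
    pixel_factor = (ow * oh) / (rw * rh)
    axis_factor = ow / rw
    print(f"{label}: {rw}x{rh} -> {ow}x{oh}, "
          f"{pixel_factor:.2f}x the pixels ({axis_factor:.2f}x per axis)")
```

So the 2160p mode reconstructs 2.25x the pixels it actually renders, while the lower two modes reconstruct about 1.78x.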
I think that is true for two reasons:
First, we can see in the render comparison that the RTX 2070 is 53% better than the GTX 1080 in this benchmark.
[screenshot: RTX 2070 vs GTX 1080 render comparison]

The second reason is that if Nvidia launches the 16 series without RT, the RTX cards won't sell. That's why they reserve the powerful DLSS for RTX cards even though it could be used on the 16 series. I think that with DLSS on, RTX cards are better off than the GTX 16XX would be: for example, a GTX 16XX would gain around 17-25% while the RTX 20XX series gains 40-50% FPS.

All in all, I think DLSS is more stable and gives more performance than TAA, but the quality looks slightly worse to me; sometimes I notice the difference. RTX 2060 with DLSS on = GTX 1080 Ti.
Resources
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2070-vs-Nvidia-GTX-1080/4029vs3603
 
First, we can see in the render comparison that the RTX 2070 is 53% better than the GTX 1080 in this benchmark.
This one is not quite correct. MRender is a single-feature benchmark that benefits directly from architectural changes from Pascal to Turing and its result cannot be extended to general performance.
The second reason is that if Nvidia launches the 16 series without RT, the RTX cards won't sell. That's why they reserve the powerful DLSS for RTX cards even though it could be used on the 16 series. I think that with DLSS on, RTX cards are better off than the GTX 16XX would be: for example, a GTX 16XX would gain around 17-25% while the RTX 20XX series gains 40-50% FPS.
All in all, I think DLSS is more stable and gives more performance than TAA, but the quality looks slightly worse to me; sometimes I notice the difference. RTX 2060 with DLSS on = GTX 1080 Ti.
The 16 series will be positioned below the RTX cards in performance and have no real effect on RTX cards. It is unclear whether Tensor cores are still there in the 16 series or not.
 
This one is not quite correct. MRender is a single-feature benchmark that benefits directly from architectural changes from Pascal to Turing and its result cannot be extended to general performance.
I know it isn't a general benchmark, as we already saw a page up, but the RTX 2070 is more powerful in MRender, and I think that helps in games with DLSS and other features.
The 16 series will be positioned below the RTX cards in performance and have no real effect on RTX cards. It is unclear whether Tensor cores are still there in the 16 series or not.
I believe Nvidia didn't create RT cores only for ray tracing.
 