
3DMark Adds NVIDIA DLSS Feature Performance Test to Port Royal

btarunr

Editor & Senior Moderator
Did you see the NVIDIA keynote presentation at CES this year? For us, one of the highlights was the DLSS demo based on our 3DMark Port Royal ray tracing benchmark. Today, we're thrilled to announce that we've added this exciting new graphics technology to 3DMark in the form of a new NVIDIA DLSS feature test. This new test is available now in 3DMark Advanced and Professional Editions.

3DMark feature tests are specialized tests for specific technologies. The NVIDIA DLSS feature test helps you compare performance and image quality with and without DLSS processing. The test is based on the 3DMark Port Royal ray tracing benchmark. Like many games, Port Royal uses Temporal Anti-Aliasing. TAA is a popular, state-of-the-art technique, but it can result in blurring and the loss of fine detail. DLSS (Deep Learning Super Sampling) is an NVIDIA RTX technology that uses deep learning and AI to improve game performance while maintaining visual quality.
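For readers unfamiliar with TAA, the heart of the technique is blending each new, slightly jittered frame into an accumulated history of previous frames. Here is a minimal sketch of that accumulation step in Python (NumPy only; real TAA also reprojects the history with motion vectors and clamps it to limit ghosting, which is skipped here):

```python
import numpy as np

def taa_accumulate(current: np.ndarray, history: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """Blend the current jittered frame into the accumulated history.

    Toy version: real TAA also reprojects `history` with motion
    vectors and clamps it against the current frame's neighborhood
    to limit ghosting; skipping that is what makes this a sketch.
    """
    return alpha * current + (1.0 - alpha) * history

# Average many noisy (jittered) renders of the same stand-in "scene".
rng = np.random.default_rng(0)
truth = rng.random((4, 4, 3))                        # ground-truth colors
history = truth + rng.normal(0.0, 0.1, truth.shape)  # first noisy frame
for _ in range(32):
    frame = truth + rng.normal(0.0, 0.1, truth.shape)
    history = taa_accumulate(frame, history)
# `history` converges toward `truth`; that same averaging is what
# softens fine detail once the camera or objects start moving.
```

The blurring and loss of fine detail mentioned above come from exactly that averaging when motion enters the picture.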



Comparing performance with the NVIDIA DLSS feature test
The NVIDIA DLSS feature test runs in two passes. The first pass renders Port Royal with DLSS disabled to measure baseline performance. The second pass renders Port Royal at a lower resolution, then uses DLSS processing to create frames at the output resolution. The result screen reports the frame rate for each run.
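As a rough mental model of those two passes, here is a sketch in Python. The function names and sleep timings are stand-ins invented for illustration; the actual test is a closed binary, so this only mirrors the shape of the comparison it reports:

```python
import time

def render_native(frame_index: int) -> None:
    """Placeholder for rendering one frame at the output resolution."""
    time.sleep(0.002)   # stand-in for real GPU work

def render_lowres_plus_dlss(frame_index: int) -> None:
    """Placeholder for low-res rendering plus DLSS reconstruction."""
    time.sleep(0.0014)  # fewer pixels shaded, plus upscale cost

def measure_fps(render_frame, frame_count: int = 500) -> float:
    """Time a fixed number of frames and return the average frame rate."""
    start = time.perf_counter()
    for i in range(frame_count):
        render_frame(i)
    return frame_count / (time.perf_counter() - start)

baseline_fps = measure_fps(render_native)        # pass 1: DLSS off
dlss_fps = measure_fps(render_lowres_plus_dlss)  # pass 2: DLSS on
print(f"DLSS off: {baseline_fps:.1f} FPS, DLSS on: {dlss_fps:.1f} FPS")
```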

DLSS is a proprietary NVIDIA technology, so naturally, you must have an NVIDIA graphics card that supports DLSS, such as a GeForce RTX series, Quadro RTX series or TITAN RTX, to run the test. You must also have the latest NVIDIA drivers for your graphics card. You can find more details in the 3DMark technical guide.

DLSS uses a pre-trained neural network to find jagged, aliased edges in an image and then adjust the colors of the affected pixels to create smoother edges and improved image quality. The result is a clear, crisp image with quality similar to traditional rendering but with higher performance.
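The "find aliased edges, then adjust the affected pixels" flow described above can be illustrated with a deliberately simple, non-learned stand-in. To be clear, DLSS replaces both steps with a pre-trained network and also upscales from a lower resolution; this toy filter only sketches the data flow:

```python
import numpy as np

def smooth_edges(img: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Toy anti-aliasing: find strong luminance steps, then blend only
    the affected pixels with their neighbors. DLSS does conceptually
    similar work, but with a trained network instead of a hand-written
    filter, and on a lower-resolution input that it also upscales."""
    luma = img.mean(axis=2)
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edge = (gx + gy) > threshold            # crude "aliased edge" mask
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                           img[1:-1, :-2] + img[1:-1, 2:]) / 4.0
    out = img.copy()
    out[edge] = 0.5 * img[edge] + 0.5 * blurred[edge]  # edge pixels only
    return out

frame = np.random.default_rng(1).random((8, 8, 3))  # stand-in frame
print(smooth_edges(frame).shape)                    # (8, 8, 3)
```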


3DMark is 85% off in the Steam Lunar Sale
We're celebrating Chinese New Year, and the sixth anniversary of 3DMark's original release, with a special week-long sale.

From now until February 11, 3DMark Advanced Edition is 85% off, only USD $4.49, from Steam and our website.

3DMark Advanced Edition owners who purchased 3DMark before January 8, 2019 will need to buy the Port Royal upgrade DLC to unlock the DLSS test. The upgrade costs USD $2.99. You can find out more about 3DMark updates and upgrades here.

3DMark Professional Edition
The NVIDIA DLSS feature test is available as a free update for 3DMark Professional Edition customers with a valid annual license. Customers with an older, perpetual Professional Edition license will need to purchase an annual license to unlock Port Royal.

 
Ahhh... Nothing can showcase a Deep Learning method of AA as well as a canned benchmark. Excellent choice, nVidia.
 
I have an ROG RTX 2080. I have to admit this benchmark looks very weird. The non-DLSS version looks blurry, with bad shadows, almost like a dull version of RTX. Then the DLSS version all of a sudden looks like it has good RTX on lol.

Have to admit the non-DLSS version looks way worse than the original bench. The original Port Royal looks nowhere near as bad as the non-DLSS version here. It's like NVIDIA made the non-DLSS version look worse than it should lol.
 
Ahhh... Nothing can showcase a Deep Learning method of AA as well as a canned benchmark. Excellent choice, nVidia.
Benchmarks are really worth nothing in general.
 
I normally don't consider 'tampering' that big a thing, but in those pictures the impossible appears to have happened. Even in Nvidia's own prior tech demos, DLSS is great but it does reduce detail when you zoom in. This benchmark appears to do the opposite, at least given what we're being shown.
Very dubious indeed.
 
I have yet to try it in action, but a lot depends on what the comparison is.
1440p DLSS vs 1440p TAA is pretty much what is shown in the screenshots. Previous tests and articles put 1440p DLSS roughly on par (including performance) with 1800p TAA, assuming proper TAA.

Edit:
Oh, Port Royal renders 1080p with DLSS for its "1440p" mode. That is going to make the difference more noticeable.
 
I normally don't consider 'tampering' that big a thing, but in those pictures the impossible appears to have happened. Even in Nvidia's own prior tech demos, DLSS is great but it does reduce detail when you zoom in. This benchmark appears to do the opposite, at least given what we're being shown.
Very dubious indeed.

They turned up every single detail from what I can see. It even looks like higher texture res. It's a joke.

It's amazing how much you can fudge results when stacks of cash appear on your doorstep.
 
Do they still expect people to pay for these benchmarks?
Well, nothing's free, so to speak. Heck, you even get spyware(?) via BIOS, à la ASUS. You get personal information literally mined by FB and, to a lesser extent, by Google, so even if you don't pay $ you will have to cough up something.
 
Another confirmation that I'm staying far away from RTX in general. The whole gen is one big abstract black box on a Pascal base. Not worth it at all.
 
Another confirmation that I'm staying far away from RTX in general. The whole gen is one big abstract black box on a Pascal base. Not worth it at all.
Volta base. If nothing else, Turing brings Nvidia cards to feature parity with Vega.
 
Ahhh... Nothing can showcase a Deep Learning method of AA as well as a canned benchmark. Excellent choice, nVidia.
DLSS must be trained for each title individually, so applying it to 3DMark is really nothing special.
Plus, it's a good benchmark of what DLSS can do ;)
 
Ahhh... Nothing can showcase a Deep Learning method of AA as well as a canned benchmark. Excellent choice, nVidia.
Nice troll. Clever, yet unmistakably negative.

3DMark has been a premier performance metric for almost two decades because it is a good showcase for the potential of a PC.
 
Nice troll. Clever, yet unmistakably negative.

3DMark has been a premier performance metric for almost two decades because it is a good showcase for the potential of a PC.

No, he's right. It's a very limited scene that's entirely scripted. All surfaces, textures, reflections, movement, etc. are precisely known and can be tuned to make it look good, which would NEVER happen in-game. It's a sham. It's not testing performance at all.
 
Rubbish! That happens in literally EVERY game. All devs tweak and tune their games to run as well as they can. GPU makers do it too, tweaking their drivers to run optimally with each app/game they can.

Every game has DLSS and AMD, too? wut?
 
Every game has DLSS and AMD, too? wut?
The exact setting is irrelevant to the point you were making. Every dev tweaks and optimizes every setting to run at its best. It's a universal rule that will never change. So your complaint above does not hold water.
 
The exact setting is irrelevant to the point you were making. Every dev tweaks and optimizes every setting to run at its best. It's a universal rule that will never change. So your complaint above does not hold water.
Actually, in the case of DLSS there is pretty good reasoning behind that complaint. The entire idea behind DLSS is applying machine learning to upscaling. Nvidia has stated this has to be done per-game on their large server farm. A scripted sequence is much easier to get good results out of than anything with arbitrary movement. The data set here, similarly to the Star Wars demo from launch, is static, as opposed to a game where the player can move and look wherever they want.

Whether DLSS can be applied to interactive games with the same level of success as scripted sequences remains to be seen.
 
i just ran it.. take a look at the frame rate improvement.. that seems to be the potential game changer..

[attached screenshot: dssl.jpg]


trog
 
i just ran it.. take a look at the frame rate improvement.. that seems to be the potential game changer..
Assuming default, 1440p test:
DLSS off = 1440p
DLSS on = 1080p + DLSS
Image quality comparisons are 1080p + TAA

Performance-wise, 1080p is usually 30-35% faster than 1440p. In this case DXR effects probably benefit more than that, resulting in a better-than-expected 42% difference in FPS (see the pixel-count sketch at the end of this post).
This is exactly why image quality comparisons are extremely important with DLSS.

There is supposed to be a DLSS variant, supposedly called DLSS 2x, that will apply the same effect at the actual rendering resolution, effectively doing antialiasing. That quality-improving variant is so far nowhere to be seen.

By the way, more details on DLSS are in 3DMark's Technical Guide, pages 150-159:
https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf

Rendering resolutions with DLSS:
- 1440x810 for 1080p output
- 1080p for 1440p output
- 1440p for 2160p output
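Putting pixel counts behind those numbers (the resolutions are from the technical guide linked above; the 42% figure is trog's result earlier in the thread):

```python
# Output resolution -> DLSS internal render resolution,
# per the 3DMark technical guide linked above.
DLSS_RENDER_RES = {
    (1920, 1080): (1440, 810),
    (2560, 1440): (1920, 1080),
    (3840, 2160): (2560, 1440),
}

for (ow, oh), (rw, rh) in DLSS_RENDER_RES.items():
    ratio = (rw * rh) / (ow * oh)
    print(f"{oh}p output: DLSS shades {ratio:.0%} of the output pixels")

# At 1440p output DLSS shades 1920*1080 / (2560*1440) = 56% of the
# pixels. Shading cost doesn't scale perfectly with pixel count
# (hence the usual 30-35% gain from dropping 1440p -> 1080p), but
# per-pixel-heavy DXR work scales closer to linearly, which is one
# plausible reading of the 42% FPS gain reported above.
```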
 
No, he's right. It's a very limited scene that's entirely scripted. All surfaces, textures, reflections, movement, etc. are precisely known and can be tuned to make it look good, which would NEVER happen in-game. It's a sham. It's not testing performance at all.
It's good for RELATIVE performance... which is all these synthetics are good for anyway... so, this isn't a new concept. :)
 
for many dlss will make the difference between ray tracing being usable and not usable.. i would accept a little trickery for such a situation..

trog
 
for many dlss will make the difference between ray tracing being usable and not usable.. i would accept a little trickery for such a situation..

trog
This I can understand and agree with, and it's a nice option, but the concern is transparency about how Nvidia has gone about it and what the compromises are. The end user deserves to know how it impacts image quality. In a lot of ways it looks better, but there are cases you can point to where it looks worse, or at least has negative trade-offs. One of the bigger issues is that most direct comparisons are against TAA, and not against a given resolution with no AA or a better form of AA that isn't dodgy on image quality to begin with. I guess it's better than Nvidia directly comparing it to FXAA, but at the same time it's not exactly doing them any favors in convincing the more image-critical among us.


I'm pondering if there is a bit of LOD bias alongside the upscaling with DLSS, based on Nvidia's Final Fantasy XV comparison, and LOD bias could impact performance and image quality a bit depending on how it's used (a toy mip-level calculation follows at the end of this post). In fact, another thing I spot in the comparison is that the rock up top in the middle area looks less jagged on the DLSS side, so I think the bump mapping is being negatively impacted by DLSS as well. If it were compared to no AA that would be more pronounced, and the added upscale sharpening of a blurry AA technique would be less transparent too.

The foliage in this comparison is radically different between the two AA techniques, jarringly so, and I've been trying to figure out what's causing it. I suppose it could be down to the nature of TAA itself, which is an image-quality-for-performance compromise relative to other AA techniques.
http://images.nvidia.com/geforce-co...-dlss-interactive-comparison-clarity-001.html
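On the LOD-bias guess above: a negative bias shifts mip-map selection toward sharper mip levels, which is a common companion to upscalers because the scene is shaded at a lower resolution than it is displayed. Here is a toy run of the standard mip-selection formula, with illustrative numbers rather than anything measured from the benchmark:

```python
import math

def mip_level(texels_per_pixel: float, lod_bias: float = 0.0,
              max_level: int = 10) -> float:
    """Standard mip selection: level = log2(texel footprint) + bias,
    clamped to the available mip chain. Lower level = sharper mip."""
    level = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(level, 0.0), float(max_level))

# Rendering 1080p for 1440p output stretches each pixel's texel
# footprint by 1440/1080 = 1.33x per axis, so a bias of roughly
# -log2(1440/1080) ~= -0.415 would compensate by picking sharper mips.
footprint = 4.0                       # illustrative texels per pixel
print(mip_level(footprint))           # 2.0 without bias
print(mip_level(footprint, -0.415))   # ~1.59 with an upscaler-style bias
```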
 
I am not sure what they've done with this, or if it's some crooked trick, but with DLSS the benchmark looks waaaaaay better. These are my results:

[attached screenshot: 1549397693914.png]
 