
NVIDIA Releases Comparison Benchmarks for DLSS-Accelerated 4K Rendering

HTC

Joined
Apr 1, 2008
Messages
4,604 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
To be implemented, DLSS needs an engine with temporal anti-aliasing support.
Temporal anti-aliasing is the most popular AA method nowadays.
DLSS implementation is relatively easy and doesn't need that much effort.

If it were, current games would already be benefiting from it and yet they're not.

Heck: Shadow of the Tomb Raider doesn't have it, and this game was one of the few showcased with it enabled when Turing cards were released, during that presentation by nVidia's CEO. How long ago was that, exactly? If it were easy, as you claim, it should already be enabled, no?
 

bug

Joined
May 22, 2015
Messages
13,222 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
If it were, current games would already be benefiting from it and yet they're not.
That doesn't automatically mean it's hard to implement. It could be because of the nonexistent installed base of hardware to run it. The eternal chicken-and-egg problem with every new addition.

SotTR devs didn't even have enough time to implement RT in all the scenes in the demo (the game will/did launch without RT, to be added in a later patch).
 

HTC

That doesn't automatically mean it's hard to implement. It could be because of the nonexistent installed base of hardware to run it. The eternal chicken-and-egg problem with every new addition.

SotTR devs didn't even have enough time to implement RT in all the scenes in the demo (the game will/did launch without RT, to be added in a later patch).

No way: as far as we've been told, only an nVidia RTX 2070 or better is required for it. What other hardware base are you talking about?

But but but it's supposed to be so easy to implement ... :rolleyes:
 

bug

No way: as far as we've been told, only an nVidia RTX 2070 or better is required for it. What other hardware base are you talking about?

But but but it's supposed to be so easy to implement ... :rolleyes:
How many users are currently running at least an RTX 2070, to make it worth the devs' effort?

And there are upcoming titles making use of DLSS, make no mistake about that: https://nvidianews.nvidia.com/news/...racing-and-ai-to-barrage-of-blockbuster-games
SotTR is one of them.
Already released titles won't do it because they've already got your money.
 

HTC

How many users are currently running at least an RTX 2070, to make it worth the devs' effort?

And there are upcoming titles making use of DLSS, make no mistake about that: https://nvidianews.nvidia.com/news/...racing-and-ai-to-barrage-of-blockbuster-games
SotTR is one of them.
Already released titles won't do it because they've already got your money.

Exactly why I highly doubt we'll be seeing ray tracing take off in this generation.

Mantle was in the exact same position and look how that turned out ...

We can talk all we want about upcoming titles that are supposed to hit the market with the tech enabled, but until they actually ship and we can find out whether it's worth it, from both the visual and the performance aspects, it's all talk.

nVidia has the advantage of a stronger market position than AMD had at the time it introduced Mantle, and that can have a leverage effect.

We shall see ...
 

bug

I'm not holding my breath for performance. It will suck. But the developers need to start somewhere, don't they?
 

HTC

I'm not holding my breath for performance. It will suck. But the developers need to start somewhere, don't they?

True, but I just think nVidia is trying too hard, in the sense that the difference is too pronounced, hence the powerful card required for it to work (2070+).

I clearly remember the presentation and the difference could be described as "night and day".

If they made "only" very noticeable improvements, the performance hit wouldn't be as big, so even lower-end 2000-series cards could handle it. Developers would then be catering not just to enthusiasts but to the mainstream as well, and would therefore be far more inclined to enable the technology in their games from the get-go, and even to patch already-released games.
 

bug

True, but I just think nVidia is trying too hard, in the sense that the difference is too pronounced, hence the powerful card required for it to work (2070+).

I clearly remember the presentation and the difference could be described as "night and day".

If they made "only" very noticeable improvements, the performance hit wouldn't be as big, so even lower-end 2000-series cards could handle it. Developers would then be catering not just to enthusiasts but to the mainstream as well, and would therefore be far more inclined to enable the technology in their games from the get-go, and even to patch already-released games.
My feeling is, with the huge die, their margins are probably already razor-thin, so besides getting the tech into the hands of enthusiasts, Nvidia doesn't actually want to sell many of these. But again, that's just my feeling.
 

HTC

My feeling is, with the huge die, their margins are probably already razor-thin, so besides getting the tech into the hands of enthusiasts, Nvidia doesn't actually want to sell many of these. But again, that's just my feeling.

100% false, and no: I don't have any sources for that.

What we know of nVidia is that they don't sell for low profit, at least not on huge-die chips.

But it's true that they are most likely having yield issues due to the die's size. That's the problem with huge dies: just ask Intel about it, regarding high-end server chips.

For example, the 2080 has a die size of 545 mm². If we take the square root, it's just under 23.35 mm per side, and we'll round that to 23.4 for this example.

If we use a die-per-wafer calculator for this, and assume a low defect density for a mature process (which I'm not entirely sure of), we get:

[Attached screenshot: die-per-wafer calculator result]

Less than 60% yield rate, and that's before binning, because there are "normal" 2080s, FE 2080s and AIB 2080s. This adds to the cost, as you said, but then there's "the nVidia tax", which inflates prices even more.
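To put rough numbers on the same idea, here's a minimal sketch of that kind of estimate, assuming a 300 mm wafer, the common gross-dies approximation, and a Poisson yield model with an assumed defect density of 0.1 defects/cm² (the calculator in the screenshot may well use a different model):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    # Gross candidates: wafer area over die area, minus a correction
    # for partial dies lost at the wafer edge.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    # Poisson yield model: Y = exp(-A * D0), with die area A in cm^2.
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

die_area = 545                    # RTX 2080 (TU104) die, in mm^2
gross = dies_per_wafer(300, die_area)
y = poisson_yield(die_area, 0.1)  # 0.1 defects/cm^2 is an assumption
print(f"{gross} gross dies, {y:.0%} yield, ~{gross * y:.0f} good dies")
# -> 101 gross dies, 58% yield, ~59 good dies
```

That lands in the same sub-60% ballpark, and it's before the binning losses mentioned above.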
 
Joined
Jan 13, 2018
Messages
157 (0.07/day)
System Name N/A
Processor Intel Core i5 3570
Motherboard Gigabyte B75
Cooling Coolermaster Hyper TX3
Memory 12 GB DDR3 1600
Video Card(s) MSI Gaming Z RTX 2060
Storage SSD
Display(s) Samsung 4K HDR 60 Hz TV
Case Eagle Warrior Gaming
Audio Device(s) N/A
Power Supply Coolermaster Elite 460W
Mouse Vorago KM500
Keyboard Vorago KM500
Software Windows 10
Benchmark Scores N/A
If it were, current games would already be benefiting from it and yet they're not.

Heck: Shadow of the Tomb Raider doesn't have it, and this game was one of the few showcased with it enabled when Turing cards were released, during that presentation by nVidia's CEO. How long ago was that, exactly? If it were easy, as you claim, it should already be enabled, no?

That doesn't automatically mean it's hard to implement. It could be because of the nonexistent installed base of hardware to run it. The eternal chicken-and-egg problem with every new addition.

SotTR devs didn't even have enough time to implement RT in all the scenes in the demo (the game will/did launch without RT, to be added in a later patch).

Since the first weeks after a new game's release are when most people buy and play it, I don't see any incentive for the developer to add RTX and/or DLSS to SotTR.

My feeling is, with the huge die, their margins are probably already razor-thin, so besides getting the tech into the hands of enthusiasts, Nvidia doesn't actually want to sell many of these. But again, that's just my feeling.

Actually, when margins are thin you increase production, since you need to sell more to achieve the projected profit: it's the low-margin, high-volume business model.
 
Joined
Feb 14, 2012
Messages
2,323 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
There was no public release of this 416.25 driver.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Allegedly DLSS doesn't look much better than 1440p with TAA. That would make sense: at the end of the day, this is still just a scaling algorithm that takes a fully rendered frame from a lower-resolution source and upscales it to a 4K output. I feel there is wasted potential here; using the Tensor Cores for smarter sparse-rendering techniques would have proved more useful. Nvidia invested so much into this AI field that they now try to shove it into consumer products in order to get something out of it, whether it makes sense or not.
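To make the "just a scaling algorithm" point concrete, here's a minimal Python/NumPy sketch of the conventional baseline: a plain bilinear 1440p-to-4K upscale. DLSS replaces this fixed interpolation with a trained network, but the low-resolution-in, 4K-out shape of the problem is the same:

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    # Classic bilinear interpolation: each output pixel is a weighted blend
    # of the four nearest source pixels. Assumes a float image in [0, 1].
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0).astype(img.dtype)[:, None, None]
    wx = (xs - x0).astype(img.dtype)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return ((1 - wy) * top + wy * bottom).astype(img.dtype)

# A synthetic 1440p "frame" upscaled to 4K -- the jump the comparison is about.
frame = np.random.default_rng(0).random((1440, 2560, 3), dtype=np.float32)
print(bilinear_upscale(frame, 2160, 3840).shape)  # (2160, 3840, 3)
```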
 

HTC

Allegedly DLSS doesn't look much better than 1440p with TAA. That would make sense: at the end of the day, this is still just a scaling algorithm that takes a fully rendered frame from a lower-resolution source and upscales it to a 4K output. I feel there is wasted potential here; using the Tensor Cores for smarter sparse-rendering techniques would have proved more useful. Nvidia invested so much into this AI field that they now try to shove it into consumer products in order to get something out of it, whether it makes sense or not.

That's another thing, and I really have to wonder: were it AMD that had come up with this method and implemented it first, would nVidia not cry foul?

You're not actually seeing images rendered at 4K: they're rendered at a lower resolution, have the enhancements applied at that resolution via DLSS, and are then upscaled to 4K. Is this not the same as watching a full-HD clip in fullscreen, but with a game instead of a video, minus the enhancements part?

Kudos to nVidia for coming up with a way to do it in real time, but it's still cheating, IMO.
 

bug

Since the first weeks after a new game's release are when most people buy and play it, I don't see any incentive for the developer to add RTX and/or DLSS to SotTR.
SotTR has been announced to receive these in a patch, whether you see it or not.

Actually, when margins are thin you increase production, since you need to sell more to achieve the projected profit: it's the low-margin, high-volume business model.
That's a blanket statement that doesn't fit here. The margins are thin precisely because, when the chip is big, you have to throw away a big part of the wafer. Between that and everybody else fighting for 14nm production, good luck increasing production.
 
Joined
Dec 21, 2005
Messages
480 (0.07/day)
Location
USA
System Name Eric's Battlestation
Processor Core i7 6700k
Motherboard GIGABYTE G1 Gaming GA-Z170X-Gaming 7
Cooling Fractal Design Celsius S24
Memory Patriot Viper Steel Series DDR4 32GB 3200MHz
Video Card(s) MSI Mech 6750 XT
Storage Samsung 850 EVO 1TB, Crucial MX500 1TB, Intel 660p 2TB
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply EVGA G2-XR 80 Plus Gold 750W
Mouse Steelseries Rival 3
Keyboard Logitech G810
Software Microsoft Windows 10 Home
DLSS and RTX are both promising, but until Nvidia can brag about how many games actually support them NOW, I don't think it really matters. No way can they convince me to break out the wallet on hardware that costs 50% more than it should for features that aren't benefiting me now.
 
Joined
Sep 15, 2011
Messages
6,467 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Looking at the current presentation slides, the quality is kind of sub-mediocre. Its quality is way, way worse than TSSAA x8, for example, while being only marginally faster. And don't even get me started on SMAA, which has zero impact on performance and can be used on any generation of card from both vendors.
 
Joined
Jan 13, 2018
Messages
157 (0.07/day)
System Name N/A
Processor Intel Core i5 3570
Motherboard Gigabyte B75
Cooling Coolermaster Hyper TX3
Memory 12 GB DDR3 1600
Video Card(s) MSI Gaming Z RTX 2060
Storage SSD
Display(s) Samsung 4K HDR 60 Hz TV
Case Eagle Warrior Gaming
Audio Device(s) N/A
Power Supply Coolermaster Elite 460W
Mouse Vorago KM500
Keyboard Vorago KM500
Software Windows 10
Benchmark Scores N/A
That's a blanket statement that doesn't fit here. The margins are thin precisely because, when the chip is big, you have to throw away a big part of the wafer. Between that and everybody else fighting for 14nm production, good luck increasing production.

That is why it is not good for your whole production to rely on a single foundry, especially if it is the same foundry others rely on too.

Actually, what determines whether the IQ of DLSS is equivalent to X resolution? The user's eyes? A mathematical algorithm? The developer's graphics settings? NVIDIA's marketing team?
 

bug

That is why it is not good for your whole production to rely on a single foundry, especially if it is the same foundry others rely on too.
Now you just sound like a teen noob (that's not derogatory; I used to be one myself). You think fabs and production capacity in general grow on trees? You think having to source your stuff from multiple suppliers lowers your costs?

Actually, what determines whether the IQ of DLSS is equivalent to X resolution? The user's eyes? A mathematical algorithm? The developer's graphics settings? NVIDIA's marketing team?
That would primarily be a diff between the original and the anti-aliased image. The more you manage to stick to actual edges in the image and alter nothing else, the better the IQ.
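For illustration, one common way to turn that diff into a number is PSNR; here's a minimal Python/NumPy sketch (the frames are synthetic stand-ins, and whatever metric is actually used internally isn't public):

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    # Peak signal-to-noise ratio for 8-bit images; a higher value means
    # the test image is closer to the reference.
    diff = reference.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Synthetic stand-ins for a native-4K reference and an upscaled test frame.
rng = np.random.default_rng(0)
native = rng.integers(0, 256, size=(2160, 3840, 3), dtype=np.uint8)
noise = rng.integers(-4, 5, size=native.shape, dtype=np.int16)
upscaled = np.clip(native.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(native, upscaled):.1f} dB")
```

An edge-aware comparison would additionally mask the diff to detected edges, which is closer to what the post above describes.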
 