
Editorial NVIDIA: Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates, and the Nature of the Beast

Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Don't know why TPU keeps using this image from Port Royal bench, when all images from real games show DLSS to be WAY more blurry.

To make it look better.
Why would they want it to look better than it is, is another question: perhaps stock? Or that wonderful NDA? Or basic green-brain syndrome?

Or, perhaps, just lazy copypasta from a green-infected place.
 
Joined
Feb 19, 2019
Messages
324 (0.17/day)
Hi all, I am new here :),

Metro Dev: Ray Tracing Is Doable via Compute Even on Next-Gen Consoles, RT Cores Aren’t the Only Way

It doesn’t really matter – be it dedicated hardware or just enough compute power to do it in shader units, I believe it would be viable. For the current generation – yes, multiple solutions is the way to go.
This is also a question of how long you support a parallel pipeline for legacy PC hardware. A GeForce GTX 1080 isn’t an out of date card as far as someone who bought one last year is concerned. So, these cards take a few years to phase out and for RT to become fully mainstream to the point where you can just assume it. And obviously on current generation consoles we need to have the voxel GI solution in the engine alongside the new ray tracing solution. Ray tracing is the future of gaming, so the main focus is now on RT either way.

In terms of the viability of ray tracing on next generation consoles, the hardware doesn’t have to be specifically RTX cores. Those cores aren’t the only thing that matters when it comes to ray tracing. They are fixed function hardware that speed up the calculations specifically relating to the BVH intersection tests. Those calculations can be done in standard compute if the compute cores are numerous and fast enough (which we believe they will be on the next gen consoles). In fact, any GPU that is running DX12 will be able to “run” DXR since DXR is just an extension of DX12.

Other things that really affect how quickly you can do ray tracing are a really fast BVH generation algorithm, which will be handled by the core APIs, and really fast memory. The nasty thing that ray tracing does, as opposed to something like, say, SSAO, is randomly access memory. SSAO will grab a load of texel data from a local area in texture space, and because of the way those textures are stored there is a reasonably good chance that those texels will be quite close (or adjacent) in memory. Also, the SSAO for the next pixel over will work with pretty much the same set of samples. So, you have to load far less from memory because you can cache an awful lot of data.

Working on data that is in cache speeds things up a ridiculous amount. Unfortunately, rays don’t really have this same level of coherence. They can randomly access just about any part of the set of geometry, and the ray for the next pixel could be grabbing data from an equally random location. So as much as specialised hardware to speed up the calculations of the ray intersections is important, fast compute cores and memory which lets you get at your bounding volume data quickly is also a viable path to doing real-time RT.
https://wccftech.com/metro-dev-ray-tracing-doable-compute/
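The dev's point that BVH intersection tests "can be done in standard compute" is easy to see concretely: the core operation RT cores accelerate is a ray vs axis-aligned bounding box test, which is just a handful of multiplies and comparisons. A minimal slab-method sketch (zero direction components are handled naively via infinities here, which a production kernel would treat more carefully):

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab-method ray vs axis-aligned bounding box intersection test."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv          # entry/exit distances along this axis
        t2 = (hi - o) * inv
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far           # intervals overlap -> the ray hits the box

# Ray from the origin along +x; boxes are illustrative
origin = (0.0, 0.5, 0.5)
direction = (1.0, 0.0, 0.0)
inv_dir = tuple(1.0 / d if d != 0 else float("inf") for d in direction)
print(ray_aabb_hit(origin, inv_dir, (2.0, 0.0, 0.0), (3.0, 1.0, 1.0)))  # True
print(ray_aabb_hit(origin, inv_dir, (2.0, 2.0, 2.0), (3.0, 3.0, 3.0)))  # False
```

A BVH traversal is essentially this test repeated down a tree of boxes, which is why it maps fine onto generic compute shaders, just slower than fixed-function units.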

So IMO it looks like NV knew the next-gen consoles could support DXR, so they tried to launch RTX cards this year to be first to market and gain sales before the consoles are out with DXR and take high-volume RTX GPU sales from NV.

P.S. If fast memory with low latency is so important to DXR, maybe next-gen consoles will use HBM2?
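The memory-coherence point in the quote above can be illustrated with a toy LRU cache simulation: SSAO-like neighbouring accesses hit cache almost every time, while ray-like scattered accesses almost never do. The cache size, line size, and access patterns below are arbitrary illustrative choices, not real GPU parameters:

```python
from collections import OrderedDict
import random

def hit_rate(addresses, cache_lines=64, line_size=16):
    """Simulate a tiny LRU cache; return the fraction of accesses that hit."""
    cache = OrderedDict()
    hits = 0
    for addr in addresses:
        line = addr // line_size          # accesses map to cache lines
        if line in cache:
            hits += 1
            cache.move_to_end(line)       # mark as most recently used
        else:
            cache[line] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False) # evict least-recently-used line
    return hits / len(addresses)

random.seed(0)
# SSAO-like: each "pixel" samples texels near the previous ones
coherent = [i // 4 + random.randint(0, 8) for i in range(10_000)]
# Ray-tracing-like: each ray touches an arbitrary part of the scene's BVH
scattered = [random.randrange(1_000_000) for _ in range(10_000)]

print(f"coherent access hit rate:  {hit_rate(coherent):.2f}")
print(f"scattered access hit rate: {hit_rate(scattered):.2f}")
```

The coherent pattern hits cache nearly always, the scattered one almost never, which is exactly why fast memory matters so much more for RT than for screen-space effects.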
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
If it increases those samples it would need more resources.
Technically nvidia could do it itself offline ("teach" neural network), then just ship weights as part of the driver.
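That offline-training split is easy to sketch: fit a model once, serialize just the parameters, and let the "driver" side load them and run inference only. A toy linear fit with made-up data, purely illustrative of the train-offline/ship-weights idea, not of anything Nvidia actually ships:

```python
import json

# "Offline training": closed-form least-squares fit of y = w*x + b
xs = [i / 10 for i in range(100)]
ys = [2.0 * x + 1.0 for x in xs]                   # toy ground-truth data
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - w * mx

weights_blob = json.dumps({"w": w, "b": b})        # what would ship in the driver

# "In the driver": load the pre-trained weights and run inference only
params = json.loads(weights_blob)
predict = lambda x: params["w"] * x + params["b"]
print(predict(5.0))                                # ~11.0 for the y = 2x + 1 data
```

The expensive part (training) happens once on Nvidia's side; the client only evaluates the shipped parameters, which is the whole appeal of the approach.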
 
Joined
Mar 10, 2014
Messages
1,793 (0.49/day)
Hi all, I am new here :),

Metro Dev: Ray Tracing Is Doable via Compute Even on Next-Gen Consoles, RT Cores Aren’t the Only Way


https://wccftech.com/metro-dev-ray-tracing-doable-compute/

So IMO it looks like NV knew the next-gen consoles could support DXR, so they tried to launch RTX cards this year to be first to market and gain sales before the consoles are out with DXR and take high-volume RTX GPU sales from NV.

P.S. If fast memory with low latency is so important to DXR, maybe next-gen consoles will use HBM2?

Problem with that is it needs an awful lot of compute power. Heck, an OC'd Titan V loses to the RTX 2060 in Port Royal, and you can't currently get more compute power than that. Will future consoles get some form of RT? Maybe they will, but I doubt it's built on DXR, which needs acceleration hardware for BVH.
 
Joined
Feb 18, 2017
Messages
688 (0.27/day)
"Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates"

I'm sure every buyer of a $1100-1200 2080 Ti was waiting to hear "future, further updates" five months in.

So, DLSS works best in demos or on-rails benchmarks, but fails epically in real games compared to a simple resolution-scaling method?

Yes, you are exactly right, sir. Nearly every feature NV announced with the RTX series is a failure almost half a year in. Two games available with RT, with pathetic performance; one RT game (FFXV) cancelled; one still waiting for its patch (Tomb Raider), which produced FHD 30-ish results at the NV launch event. DLSS makes graphics look worse compared to switching it off. Now add the minimal performance boost over the last gen and the astonishingly increased prices, and you get the result NV's financial reports indicated: a near 50% drop in gaming market sales.
 
Joined
Feb 15, 2019
Messages
1,525 (0.82/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
Then you get the result NV's financial reports indicated: a near 50% drop in gaming market sales.
And they blamed AMD for it.
 
Joined
May 15, 2007
Messages
773 (0.13/day)
System Name Daedalus | O'Neill | ZPM Hive |
Processor M3 Pro (11/14) | Epyc 7402p | i5 12400F |
Motherboard Apple M3 Pro | SM H11SSL-i | TUF B660M-E D4 |
Cooling Pure Silence | Noctua NH-U12S TR4 | Noctua NH-D14 |
Memory 18GB Unified | 128GB DDR4 | 16GB DDR4 |
Video Card(s) M3 Pro | RTX 4070FE (VM) | ARC A750 LE |
Storage 512GB NVME | ALOT of SSD's | 1TB NVME |
Display(s) 14" 3024x1964 | IPMI | 1440p UW |
Case Macbook Pro 14" | NZXT H510 Flow| BC1 Test Bench |
Audio Device(s) Onboard | None | Onboard |
Power Supply ~ 77w Magsafe | EVGA 750w G3 | HX1000i |
Mouse Razer Basilisk
Keyboard Logitech G915 TKL
Software MacOS Sonoma | Proxmox 8 | Win 11 x64 |
I'm not even sure what to make of this. We now have an "AA" (upscaling) method that requires "training" on some cluster somewhere, on a per-game, per-resolution basis. It seems overly complicated... like flying a spaceship to work when work is 3 blocks from your house.

Add to this that even once it's trained, it will probably only look on par with an otherwise-upscaled image (per the HWUB BFV video, where the image upscaled to 4K was very close to native 4K yet DLSS was miles off).

Checkerboarding (Horizon Zero Dawn on the PS4 Pro, for example) and upscaling already do a good job, so it's a wonder what the point of DLSS actually is. The actual software that supports it only reaffirms this position.
 
Joined
Oct 14, 2017
Messages
210 (0.09/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a

[Attached image: DLSS.jpeg]


how about DLBS: blursampling
 
Joined
Nov 4, 2005
Messages
11,654 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Technically nvidia could do it itself offline ("teach" neural network), then just ship weights as part of the driver.


Why not just include AI learning that can use one of the many CPU cores and reads a file with the information, kinda like they did precooking PhysX interactions? But then even that "couldn't" run on competitive hardware that rendered more accurately (FP32/24 math was broken on older Nvidia cards as a way to get a performance boost: https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/). It's a gimmick that doesn't work, and of course they have the ONLY real solution; the PR is "buy their crap".

Manufacture a problem, engineer a solution, try to profit.
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
Nvidia may see DLSS improving over time as a bonus, but I see it as proof that the feature is inconsistent and will never be good at a game's launch. At least SMAA, TAA, and MSAA are stable, meaning they look good from the beginning and you don't have to worry that you're playing with "beta" anti-aliasing.

They should have held off on any DLSS release until it was strictly better. Now we're just seeing how the sausage is made, and it's gross. DLSS is a pipe dream.
 
Joined
Mar 31, 2012
Messages
828 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X
Motherboard QUANTA | ASUS Crosshair VII Hero
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3200 3400(OC) 14-14-14-34 @1.38v
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo
Display(s) 15,5" / 27"
Case Black & Grey | Phanteks P400S
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint KDE |UBUNTU | Windows 10 PRO
Benchmark Scores i dont care about scores
Joined
Feb 19, 2019
Messages
324 (0.17/day)
What is the law’s definition of false advertising?
Generally, false advertising laws say that consumers have proved their case if they show: (a) that the advertising was false or misleading; (b) that the falsity was “material,” often meaning the company lied about something important; (c) the consumer saw the false advertisement; and (d) the consumer relied on the false advertising in purchasing the product or service. Consumers may show reliance by proving they wouldn’t have bought the product or service if not for the false advertising. They may also show they relied on a false advertisement if a false statement caused them to pay more for the company’s product or service than they otherwise would have.

A false advertisement may directly say something that is not true or is misleading. But an advertisement may also be “false” based on what it doesn’t say. If important information is omitted from an advertisement and the consumer wouldn’t have bought the product or service had they known the truth, the consumer may be able to sue the company for this failure to disclose.
https://www.classlawgroup.com/consumer-protection/false-advertising/laws/

I feel like RTX owners could unite and maybe sue Nvidia for false advertising.
The beautiful demos without the FPS hit with RTX ON at the RTX launch, no games to test when reviews went live, the unrealistic demo of DLSS in Port Royal, and some articles before launch saying "Just Buy It".

I think that if there is an AMD lawsuit over false Bulldozer chip marketing, then this is even more possible.
Of course I am not a lawyer, but these days everybody can sue for everything, lol. Who knows? :) Just posting my thoughts.
 
Joined
Feb 15, 2019
Messages
1,525 (0.82/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
I think that if there is an AMD lawsuit over false Bulldozer chip marketing, then this is even more possible.

Don't worry,
Nvidia's CUDA "core" falls into the same category as the Bulldozer "core" and may get sued too if that lawsuit actually goes through xD.
 
Joined
Sep 17, 2014
Messages
20,773 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
They also said with more time, it learns better... so not sure how to take that.

Of course it does, all machine learning works like that by nature, because the longer it is learning, the more variables have been tried and can be omitted from future runs. But that doesn't mean that results can be copied over between games. MAYBE between engines, but even those are never identical between games.

But it's also a very weak excuse to buy time, keep consumers in the dark about the technology, and extract sales from curiosity and the promise that it will improve. This sounds a whole lot like FineWine to me, and we know that it tastes sour.

Manufacture a problem, engineer a solution, try to profit.

This!!! is what Turing is by design. RTRT is the problem, and DLSS is supposed to be the (band-aid) fix until performance is acceptable at native res, which it probably never will be, seeing as we are only getting a very limited RT implementation now and it already slashes FPS in half.

Let. It. Die.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
This!!! is what Turing is by design. RTRT is the problem
By that statement, you clearly do not understand RTRT nor what it can do for gaming in the future.
and DLSS is supposed to be the (band-aid) fix until performance is acceptable at native res
Wrong. DLSS is supposed to be a replacement for Anti-aliasing, not a fix for anything else.
Let. It. Die.
Where is your head? In the sand? RTRT and DLSS are here to stay. No amount of pointless, meritless whining is going to change that. Let. It. Go.
 
Joined
Sep 17, 2014
Messages
20,773 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
By that statement, you clearly do not understand RTRT nor what it can do for gaming in the future.

Wrong. DLSS is supposed to be a replacement for Anti-aliasing, not a fix for anything else.

Where is your head? In the sand? RTRT and DLSS are here to stay. No amount of pointless, meritless whining is going to change that. Let. It. Go.

Its a case of agree to disagree isn't it ;)
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Why not then just include AI learning that can use one of the many CPU cores.
There must be a reason why nobody does machine learning on CPUs these days, don't ya think?

On the other hand, yeah, it looks like nothing but a PR stunt; it's a shame that TPU is helping them push the FUD by repeatedly posting misleading crops.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You're ridiculous.

There is nothing misleading about it. Clearly it's a best-case scenario... but come on, stop with your incessant anti-Nvidia toxicity. It's a joke.

Another meatball that goes on ignore here.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
7,955 (1.13/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 2
Software Win 10 Pro x64
Would have been nice if they had at least included HDR, you know, something everyone can use, rather than a performance-tanking "shiny thing" that isn't ready for prime time.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
They also said with more time, it learns better... so not sure how to take that.
That's their way of saying they needed to put the tensor cores to use to make the investment in an RTX card worth it, to be honest. All they did was use machine learning to "fill in the blanks" because that hardware is available for possible acceleration. This is really just nVidia trying to use all the extra cruft they added to these GPUs; at least that's the vibe I'm getting.
Why not just include AI learning that can use one of the many CPU cores and reads a file with the information, kinda like they did precooking PhysX interactions? But then even that "couldn't" run on competitive hardware that rendered more accurately (FP32/24 math was broken on older Nvidia cards as a way to get a performance boost: https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/). It's a gimmick that doesn't work, and of course they have the ONLY real solution; the PR is "buy their crap".
One word: latency. We can already see the latency hit from using the tensor cores (hence why the GPU needs to be under heavy load; otherwise the framerate will be too high to make it worth it). Offloading this stuff to the CPU would make the latency problem worse, at least that's my take on it.
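That "only worth it under heavy load" trade-off can be put in a toy frame-time model: DLSS renders fewer pixels but pays a roughly fixed upscale cost, so it only wins when the frame is expensive to begin with. The overhead constant is a made-up illustrative number; the pixel fraction is roughly 1440p pixels over 4K pixels:

```python
def dlss_worth_it(render_ms, upscale_overhead_ms=1.5, pixel_fraction=0.44):
    """Toy model: render a fraction of the pixels, then pay a fixed upscale cost.

    upscale_overhead_ms is an invented constant for the tensor-core pass;
    pixel_fraction ~ (2560*1440) / (3840*2160).
    """
    dlss_ms = render_ms * pixel_fraction + upscale_overhead_ms
    return dlss_ms < render_ms        # True when DLSS saves frame time overall

print(dlss_worth_it(30.0))  # heavy frame: 30*0.44 + 1.5 = 14.7 ms < 30 ms -> True
print(dlss_worth_it(2.0))   # light frame: 2*0.44 + 1.5 = 2.38 ms > 2 ms  -> False
```

The same arithmetic explains why moving the inference to a slower device (the CPU) just grows the fixed overhead term until the feature never pays off.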
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
That's their way of saying they needed to put the tensor cores to use to make the investment in an RTX card worth it, to be honest. All they did was use machine learning to "fill in the blanks" because that hardware is available for possible acceleration. This is really just nVidia trying to use all the extra cruft they added to these GPUs; at least that's the vibe I'm getting.
Right, yep. I get it. Makes sense to use the hardware they put on the card.

I was simply trying to convey that, over time, things should improve. 3DMark is clearly a best case scenario due to its static and limited FPS in the benchmark, but I don't feel it is misleading/FUD/PR stunt. Time will tell how much improvement we will see.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
I was simply trying to convey that, over time, things should improve. 3DMark is clearly a best case scenario due to its static and limited FPS in the benchmark, but I don't feel it is misleading. Time will tell how much improvement we will see.
The problem is that the nature of machine learning is that when stuff is done incorrectly, the machine adjusts. It relies on being wrong some portion of the time; otherwise there would be no purpose to "learning", because a static algorithm could be applied ahead of time without trying to figure out whether it got it right. ML is good at filling in the blanks when there isn't a good way to definitively get the right answer, but the reality is that it's going to get some of those blanks wrong; that's the nature of ML.

Let me put it another way, ML is a lot more like lossy compression. You lose some level of accuracy by using it.
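The lossy-compression analogy can be made concrete with plain quantization: reconstruct values from a smaller set of representable levels and some accuracy is necessarily lost, though the error stays bounded. A minimal stdlib-only sketch with arbitrary sample data:

```python
def quantize(values, levels=16):
    """Lossy 'compression': snap floats in [0, 1] to one of a few levels."""
    step = 1.0 / (levels - 1)
    return [round(v / step) * step for v in values]

data = [0.137, 0.502, 0.861, 0.333]      # arbitrary illustrative values
restored = quantize(data)
error = max(abs(a - b) for a, b in zip(data, restored))

print(restored)
print(f"max reconstruction error: {error:.3f}")   # nonzero, but <= step/2
```

Like an ML upscaler, the reconstruction is close and cheap to store, but it is never guaranteed to be exact; the loss is baked into the representation.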
 