
Microsoft Releases DirectX Raytracing - NVIDIA Volta-based RTX Adds Real-Time Capability

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
30,443 (6.46/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Just to abandon it like PhysX/SLI...
 
Joined
Feb 3, 2017
Messages
2,360 (1.96/day)
Processor i5-8400
Motherboard ASUS ROG STRIX Z370-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-3200 CL16
Video Card(s) Gainward GeForce RTX 2080 Phoenix
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Logitech G700
Keyboard Corsair K60
Denoise? Sounds like they are trying to optimize what is essentially the raytracing equivalent of upscaling or antialiasing.
Nvidia surely does not mind; raytracing will eat up all the hardware you can throw at it, and they are more than happy to help provide that for you.

AMD probably intends to leverage Vega's 16-bit compute capabilities for the same purpose as Nvidia does with Tensor cores.

Having an API, even more so in DX, is a very, VERY important milestone for raytracing, even if no significant game/application takes advantage of it this time around.
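For the curious, the variance-reduction idea behind denoising can be sketched in a few lines of Python. This is a toy stand-in, not Nvidia's actual filter: each pixel's 1-sample-per-pixel Monte Carlo estimate is unbiased but noisy, and even a dumb box filter trades most of that noise for blur, which is exactly the trade a real denoiser makes more intelligently:

```python
import random

random.seed(42)

W = H = 64
TRUE_VALUE = 0.5  # the radiance every pixel "should" converge to

# 1 sample/pixel Monte Carlo estimate: unbiased but noisy (variance = 1/12)
noisy = [[random.random() for _ in range(W)] for _ in range(H)]

def box_filter(img, radius=2):
    """Average each pixel with its (2r+1)x(2r+1) neighborhood (clamped at edges)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

def mse(img):
    """Mean squared error against the known true value."""
    return sum((p - TRUE_VALUE) ** 2 for row in img for p in row) / (W * H)

denoised = box_filter(noisy)
print(f"MSE before: {mse(noisy):.4f}, after: {mse(denoised):.4f}")
```

A 5×5 box average cuts the per-pixel variance by roughly 25×, which is why so few samples plus a good filter can beat many more raw samples per frame.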
 
Joined
Mar 10, 2014
Messages
1,753 (0.77/day)
Denoise? Sounds like they are trying to optimize what is essentially the raytracing equivalent of upscaling or antialiasing.
Nvidia surely does not mind; raytracing will eat up all the hardware you can throw at it, and they are more than happy to help provide that for you.

AMD probably intends to leverage Vega's 16-bit compute capabilities for the same purpose as Nvidia does with Tensor cores.

Having an API, even more so in DX, is a very, VERY important milestone for raytracing, even if no significant game/application takes advantage of it this time around.
There's another AI-ish difference between Pascal and Volta SMs: Volta can do FP32 and INT32 at the same time, while Pascal can't. I just remembered Nvidia's research paper on their AI denoising filter, so maybe their denoising filter really needs Tensor cores to make it real-time:
We implemented the inference (i.e. runtime reconstruction) using fused CUDA kernels and cuDNN 5.11 convolution routines with Winograd optimization. We were able to achieve highly interactive performance on the latest GPUs. For a 720p image (1280×720 pixels), the reconstruction time was 54.9ms on NVIDIA (Pascal) Titan X. The execution time scales linearly with the number of pixels.
The performance of the comparison methods varies considerably. EAW (10.3ms) is fast, while SBF (74.2ms), AAF (211ms), and LBF (1550ms) are slower than our method (54.9ms). The NFOR method has a runtime of 107–121s on Intel i7-7700HQ CPU. Our comparisons are based on the image quality obtainable from a fixed number of input samples, disregarding the performance differences. That said, the performance of our OptiX-based path tracer varies from 70ms in SponzaGlossy to 260ms in SanMiguel for 1 sample/pixel. Thus in this context, until the path tracer becomes substantially faster, it would be more expensive to take another sample/pixel than it is to reconstruct the image using our method.
Furthermore, our method is a convolutional network, and there is strong evidence that the inference of such networks can be accelerated considerably by building custom reduced-precision hardware units for it, e.g., over 100× [Han et al. 2016]. In such a scenario, our method would move from highly interactive speeds to the real-time domain.
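The 100× figure in that quote comes from dedicated reduced-precision hardware; the underlying premise, that convolution weights tolerate very low precision, is easy to demonstrate with a toy sketch (my own illustration, not the paper's network). Quantize a small Gaussian blur kernel to 8-bit integers and the output barely changes:

```python
# Toy demonstration: an 8-bit quantized convolution kernel gives nearly
# the same result as the full-precision one -- the premise behind
# reduced-precision inference hardware.
kernel = [0.0625, 0.25, 0.375, 0.25, 0.0625]  # 1D Gaussian blur weights

# Quantize weights to unsigned 8-bit with a shared scale factor
scale = max(kernel) / 255
qkernel = [round(w / scale) for w in kernel]  # integers in [0, 255]

signal = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.5, 0.5]

def conv(sig, k):
    """1D convolution with zero padding at the borders."""
    r = len(k) // 2
    out = []
    for i in range(len(sig)):
        acc = 0.0
        for j, w in enumerate(k):
            idx = i + j - r
            if 0 <= idx < len(sig):
                acc += w * sig[idx]
        out.append(acc)
    return out

full = conv(signal, kernel)
quant = [v * scale for v in conv(signal, qkernel)]  # rescale integer result

max_err = max(abs(a - b) for a, b in zip(full, quant))
print(f"max abs error from 8-bit weights: {max_err:.5f}")
```

The integer path does all its multiply-accumulates on small ints, which is what cheap, dense hardware units are built around; the final rescale recovers an answer within a fraction of a percent of full precision.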
 

bug

Joined
May 22, 2015
Messages
7,533 (4.12/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
My guess is that even if this works on Volta, it will still bring it to its knees. Historically, none of the newly introduced features ever worked well on the first generation of hardware that supported them. Tessellation, PS 3.0, PS 2.0, vertex shaders, TnL, even 8 bits per channel was too much for the hardware of its time.

So everybody just take it easy; for the time being this is aimed at developers so they can get their feet wet. We'll get this in a usable form in the next generation of GPUs. Or the one after that.
 
Joined
Mar 10, 2014
Messages
1,753 (0.77/day)
My guess is that even if this works on Volta, it will still bring it to its knees. Historically, none of the newly introduced features ever worked well on the first generation of hardware that supported them. Tessellation, PS 3.0, PS 2.0, vertex shaders, TnL, even 8 bits per channel was too much for the hardware of its time.

So everybody just take it easy; for the time being this is aimed at developers so they can get their feet wet. We'll get this in a usable form in the next generation of GPUs. Or the one after that.
Sure, GDC is the Game Developers Conference after all. The rumor, though, is that there will be games with DXR features coming this year; obviously they will be optional, but I don't think we are that far away on the hardware side either.
 

bug

Joined
May 22, 2015
Messages
7,533 (4.12/day)
Sure, GDC is the Game Developers Conference after all. The rumor, though, is that there will be games with DXR features coming this year; obviously they will be optional, but I don't think we are that far away on the hardware side either.
Yeah, enable DXR and enjoy the game at 12 fps. However, this is still invaluable for developers, who can validate the rendering, the drivers and whatnot. For consumers, I think we're looking at a two-year wait at the minimum. Maybe less if you're willing to buy high-end and SLI/Crossfire.

However, ray tracing is worth any wait ;)
 
Joined
Feb 18, 2005
Messages
2,585 (0.46/day)
Location
United Kingdom
My guess is that even if this works on Volta, it will still bring it to its knees. Historically, none of the newly introduced features ever worked well on the first generation of hardware that supported them. Tessellation, PS 3.0, PS 2.0, vertex shaders, TnL, even 8 bits per channel was too much for the hardware of its time.

So everybody just take it easy; for the time being this is aimed at developers so they can get their feet wet. We'll get this in a usable form in the next generation of GPUs. Or the one after that.
What's your basis for saying that? Why would NVIDIA create hype, and why would game devs embrace this, if it can't produce playable framerates?
 
Joined
Oct 30, 2013
Messages
25 (0.01/day)
What's your basis for saying that? Why would NVIDIA create hype, and why would game devs embrace this, if it can't produce playable framerates?
Every first-generation piece of hardware that supports some next-gen feature ends up only nominally playable with that feature enabled.
Anyone old enough may remember T&L (Transform and Lighting) on the first GeForce ever made.
By the time game designers finally implemented T&L, the GeForce 1 was already too weak to run any game that had the feature.
And I am sure there are more recent examples along the same lines.
 

bug

Joined
May 22, 2015
Messages
7,533 (4.12/day)
What's your basis for saying that? Why would NVIDIA create hype, and why would game devs embrace this, if it can't produce playable framerates?
I just told you in my previous post: to lay the groundwork for what's to come.
And I also told you my basis: that's how every new technology has been introduced for as long as I can remember.
 
Joined
Jan 8, 2017
Messages
5,011 (4.06/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
Why would NVIDIA create hype and game devs embrace this if it can't create playable framerates?
Come on, don't look so surprised. Remember the frame-rate-crippling tessellation of the early days?
 
Joined
Feb 3, 2017
Messages
2,360 (1.96/day)
Come on, don't look so surprised. Remember the frame-rate-crippling tessellation of the early days?
You mean TruForm? :)
But tessellation is actually a fairly good example of a long adoption period. The tech had been in hardware since 2001, yet it only gradually gained traction in the mid-to-late 2000s. Getting tessellation into an API (primarily Direct3D) was a huge step in making that happen.
 
Joined
Mar 10, 2014
Messages
1,753 (0.77/day)
Yeah, enable DXR and enjoy the game at 12 fps. However, this is still invaluable for developers, who can validate the rendering, the drivers and whatnot. For consumers, I think we're looking at a two-year wait at the minimum. Maybe less if you're willing to buy high-end and SLI/Crossfire.

However, ray tracing is worth any wait ;)
Which parts of it to enable, though? Why go full monty if one does not have to? E.g. there are already other good methods for AO and shadows, so maybe use raytracing just for reflections if the performance penalty makes it feasible (wonder what Mirror's Edge would look like with raytraced reflections :D)... It's PC after all; the set of settings for adding/removing graphical fidelity is already very large.
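Reflections-only raytracing in an otherwise rasterized scene really just means firing a single mirror ray per pixel for the reflective surfaces. A toy sketch of that core reflect-and-shade step (my own illustration in plain Python, not DXR code; the materials and sky function are made up):

```python
# Toy hybrid shading: diffuse surfaces use a cheap constant "rasterized"
# color; only mirror surfaces fire a secondary ray (here, into a fake sky
# gradient). This is the reflections-only use of raytracing.

def reflect(d, n):
    """Mirror direction d about unit surface normal n (both 3-vectors)."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

def sky(direction):
    """Fake environment: brightness ramps with the ray's upward component."""
    up = max(0.0, direction[1])
    return (0.4 + 0.6 * up, 0.5 + 0.5 * up, 1.0)

def shade(view_dir, normal, material):
    if material == "diffuse":
        return (0.6, 0.3, 0.2)                 # cheap rasterized-lighting stand-in
    elif material == "mirror":
        return sky(reflect(view_dir, normal))  # one traced bounce

# A view ray looking straight down at a floor mirror (normal pointing up):
color = shade((0.0, -1.0, 0.0), (0.0, 1.0, 0.0), "mirror")
print(color)
```

Only the `mirror` branch pays the tracing cost; everything else keeps its existing raster path, which is why a reflections-only toggle is a plausible first DXR setting.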
 

bug

Joined
May 22, 2015
Messages
7,533 (4.12/day)
Which parts of it to enable, though? Why go full monty if one does not have to? E.g. there are already other good methods for AO and shadows, so maybe use raytracing just for reflections if the performance penalty makes it feasible (wonder what Mirror's Edge would look like with raytraced reflections :D)... It's PC after all; the set of settings for adding/removing graphical fidelity is already very large.
Yeah, I don't think ray tracing works like that. I don't know the specifics of this implementation, but when I learnt about it, ray tracing was pretty much a scene-wide affair.
 
Joined
Feb 3, 2017
Messages
2,360 (1.96/day)

(3:35-5:50 looks especially impressive)
 