
NVIDIA Announces OptiX 5.0 SDK - AI-Enhanced Ray Tracing

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.33/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
At SIGGRAPH 2017, NVIDIA introduced the latest version of OptiX, its GPU-accelerated, AI-enhanced ray-tracing API. The company has been at the forefront of GPU-powered AI work in a number of areas, including facial animation, anti-aliasing, denoising, and light transport. OptiX 5.0 brings a renewed focus on AI-based denoising.

The AI training itself is still brute force with finesse applied at the end: NVIDIA took tens of thousands of image pairs, each consisting of a render at one sample per pixel and a companion image of the same render at 4,000 rays per pixel, and used them to train the AI to predict what a denoised image looks like. In theory (picking up the numbers NVIDIA used for its training), this means users deploying OptiX 5.0 only need to render one sample per pixel of a given image, instead of the 4,000 rays per pixel that would be needed for its final presentation; based on its learning, the AI fills in the blanks to finalize the image, saving the need to render all that extra data. NVIDIA quotes a 157x improvement in render time using a DGX Station with OptiX 5.0 deployed against the same render on a CPU-based platform (2x Xeon E5-2699 v4 @ 2.20 GHz).

The OptiX 5.0 release also includes provisions for GPU-accelerated motion blur, which should do away with the need to render a frame multiple times and then apply a blur filter across a collage of the different frames. NVIDIA said OptiX 5.0 will be available in November. Check the press release after the break.
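The supervised setup behind such a denoiser can be sketched in miniature. This is purely an illustration, not NVIDIA's pipeline: the "renders" here are synthetic, and the "network" is just a linear filter over 3x3 patches fit by least squares — but the training idea (noisy input paired with a clean target) is the same one described above:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pair(size=32, noise=0.2):
    """A fake 'render' pair: smooth signal = clean target,
    signal + noise = stand-in for a 1-sample-per-pixel render."""
    x = np.linspace(0, 2 * np.pi, size)
    clean = np.outer(np.sin(x), np.cos(x)) * 0.5 + 0.5
    noisy = clean + rng.normal(0, noise, clean.shape)
    return noisy, clean

def patches(img, k=3):
    """All k x k patches of img, flattened into rows of a design matrix."""
    h, w = img.shape
    rows = []
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            rows.append(img[i:i + k, j:j + k].ravel())
    return np.array(rows)

# Build a training set from several noisy/clean pairs.
X, y = [], []
for _ in range(8):
    noisy, clean = make_pair()
    X.append(patches(noisy))
    y.append(clean[1:-1, 1:-1].ravel())  # target = clean center pixel
X = np.vstack(X)
y = np.concatenate(y)

# "Training": least-squares fit of the filter weights.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Apply the learned filter to a fresh noisy image.
noisy, clean = make_pair()
denoised = patches(noisy) @ w
target = clean[1:-1, 1:-1].ravel()
mse_noisy = np.mean((noisy[1:-1, 1:-1].ravel() - target) ** 2)
mse_denoised = np.mean((denoised - target) ** 2)
print(f"MSE noisy: {mse_noisy:.4f}, denoised: {mse_denoised:.4f}")
```

The real denoiser replaces the linear filter with a deep network and feeds it far richer inputs (albedo, normals, and so on), but the economics are the same: once trained, inference on a 1-spp image is vastly cheaper than rendering the remaining samples.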

Running OptiX 5.0 on the NVIDIA DGX Station -- the company's recently introduced deskside AI workstation -- will give designers, artists and other content-creation professionals the rendering capability of 150 standard CPU-based servers. This access to GPU-powered accelerated computing will provide extraordinary ability to iterate and innovate with speed and performance, at a fraction of the cost.

"Developers using our platform can enable millions of artists and designers to access the capabilities of a render farm right at their desk," said Bob Pette, Vice President, Professional Visualization, NVIDIA. "By creating OptiX-based applications, they can bring the extraordinary power of AI to their customers, enhancing their creativity and dramatically improving productivity."

OptiX 5.0's new ray tracing capabilities will speed up the process required to visualize designs or characters, dramatically increasing a creative professional's ability to interact with their content. It features new AI denoising capability to accelerate the removal of graininess from images, and brings GPU-accelerated motion blur for realistic animation effects.

OptiX 5.0 will be available at no cost to registered developers in November.

Rendering Appliance Powers AI Workflows
By running NVIDIA OptiX 5.0 on a DGX Station, content creators can significantly accelerate training, inference and rendering. A whisper-quiet system that fits under a desk, NVIDIA DGX Station uses the latest NVIDIA Volta-generation GPUs, making it the most powerful AI rendering system available.

To achieve equivalent rendering performance of a DGX Station, content creators would need access to a render farm with more than 150 servers that require some 200 kilowatts of power, compared with 1.5 kilowatts for a DGX Station. The cost for purchasing and operating that render farm would reach $4 million over three years compared with less than $75,000 for a DGX Station.

Industry Support for AI-based Graphics
NVIDIA is working with many of the world's most important technology companies and creative visionaries from Hollywood studios to set the course for the use of AI for rendering, design, character generation and the creation of virtual worlds. They voiced broad support for the company's latest innovations:

  • "AI is transforming industries everywhere. We're excited to see how NVIDIA's new AI technologies will improve the filmmaking process." -- Steve May, Vice President and CTO, Pixar
  • "We're big fans of NVIDIA OptiX. It greatly reduced our development cost while porting the ray tracing core of our Clarisse renderer to NVIDIA GPUs and offers extremely fast performance. With the potential to significantly decrease rendering times with AI-accelerated denoising, OptiX 5 is very promising as it can become a game changer in production workflows." -- Nicolas Guiard, Principal Engineer, Isotropix
  • "AI has the potential to turbocharge the creative process. We see a future where our artists' creativity is unleashed with AI -- a future where paintbrushes can truly 'think' and empower artists to create images and experiences we could hardly imagine just a few years ago. At Technicolor, we share NVIDIA's vision to chart a path that enhances the toolset for creatives to deepen audience experiences." -- Sutha Kamal, Vice President, Technology Strategy, Technicolor.

Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
I love it how everything is VR this and AI that these days. It's like returning to the 1980s... The AI thing is total bullshit to sell this stuff to impressed idiots. Algorithms don't make an AI by themselves. AI can be a very complex set of algorithms. Bouncing some light around and detecting basic shapes doesn't make it an AI. It's just an algorithm. This AI claim is about as silly as the one about Ryzen having a "neural network" built in. It has an ARM secure coprocessor that can potentially adapt to different cache usage scenarios (if they even use it for this), but it doesn't have a "neural network" in it.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,463 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850w Gold (ATX3.0)
Software W10
I love it how everything is VR this and AI that these days. It's like returning to the 1980s... The AI thing is total bullshit to sell this stuff to impressed idiots. Algorithms don't make an AI by themselves. AI can be a very complex set of algorithms. Bouncing some light around and detecting basic shapes doesn't make it an AI. It's just an algorithm. This AI claim is about as silly as the one about Ryzen having a "neural network" built in. It has an ARM secure coprocessor that can potentially adapt to different cache usage scenarios (if they even use it for this), but it doesn't have a "neural network" in it.

What is a neural network but a learning system? It doesn't matter if it's in Ryzen or Nvidia's VR: if a system of algorithms is designed to learn from observations and create novel or 'expected' outcomes from stimuli, it is a basic 'neural network'. Don't worry man, when quantum computing comes to fruition the neural networks will properly enslave us all for our ill-mannered, planet-destroying ways. Hail Skynet!
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,894 (3.79/day)
Location
Alabama
System Name Rocinante
Processor I9 14900KS
Motherboard EVGA z690 Dark KINGPIN (modded BIOS)
Cooling EK-AIO Elite 360 D-RGB
Memory 64GB Gskill Trident Z5 DDR5 6000 @6400
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 500GB 980 Pro | 1x 1TB 980 Pro | 1x 8TB Corsair MP400
Display(s) Odyssey OLED G9 G95SC
Case Lian Li o11 Evo Dynamic White
Audio Device(s) Moondrop S8's on Schiit Hel 2e
Power Supply Bequiet! Power Pro 12 1500w
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11
Benchmark Scores I dont have time for that.
What is a neural network but a learning system? It doesn't matter if it's in Ryzen or Nvidia's VR: if a system of algorithms is designed to learn from observations and create novel or 'expected' outcomes from stimuli, it is a basic 'neural network'. Don't worry man, when quantum computing comes to fruition the neural networks will properly enslave us all for our ill-mannered, planet-destroying ways. Hail Skynet!

But can skynet play crysis?
 

Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Barracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
silly as the one about Ryzen having a "neural network" built in
You mean using perceptrons in a branch predictor? A perceptron being the smallest possible neural network was too good for the marketing team to pass up, I suppose ... you should wait until Ryzen's branch predictor passes a Turing test.
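For what it's worth, the idea is simple enough to sketch. Here's a toy version along the lines of the published perceptron-predictor design (an illustration only, not AMD's actual implementation; the training threshold is the commonly cited 1.93·h + 14):

```python
HIST = 8                    # global history length
THETA = 1.93 * HIST + 14    # training threshold from the perceptron-predictor literature

class PerceptronPredictor:
    def __init__(self):
        self.w = [0] * (HIST + 1)   # w[0] is the bias weight
        self.hist = [1] * HIST      # +1 = taken, -1 = not taken

    def predict(self):
        """Dot product of weights and history; non-negative means 'taken'."""
        y = self.w[0] + sum(wi * hi for wi, hi in zip(self.w[1:], self.hist))
        return y, (y >= 0)

    def update(self, taken):
        """Train only on a mispredict or a low-confidence (|y| <= THETA) hit."""
        y, pred = self.predict()
        t = 1 if taken else -1
        if pred != taken or abs(y) <= THETA:
            self.w[0] += t
            for i, hi in enumerate(self.hist):
                self.w[i + 1] += t * hi
        self.hist = self.hist[1:] + [t]   # shift outcome into history

# An alternating taken/not-taken branch is linearly separable in its
# history, so the perceptron locks onto it after a short warm-up.
p = PerceptronPredictor()
correct = 0
for n in range(200):
    _, pred = p.predict()
    taken = (n % 2 == 0)
    correct += (pred == taken)
    p.update(taken)
print(correct)
```

The appeal for hardware is that prediction is one dot product over small integer weights, and (unlike a two-bit counter table) accuracy keeps improving with longer histories.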
 