
Intel Launches Free Open Image Denoise Library for Ray-tracing

btarunr

Editor & Senior Moderator
De-noising is a vital post-processing component of ray-traced images, as it eliminates visual noise generated by too few rays intersecting pixels that make up an image. Ideally, enough rays would be traced through every pixel on the screen, but real-world computing hasn't advanced enough to do that in reasonable time, let alone in real time. Denoising attempts to correct and reconstruct such images. Intel today launched its free Open Image Denoise (OIDN) library for ray-tracing.
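To put the "too few rays" point in numbers (a standard Monte Carlo relation, not something stated in the article): a path tracer estimates each pixel's colour by averaging N random ray samples, and the noise in that estimate only shrinks with the square root of N, so halving the noise costs four times as many rays per pixel.

\[
\hat{I}_N = \frac{1}{N}\sum_{i=1}^{N} f(x_i),
\qquad
\sigma\!\big(\hat{I}_N\big) = \frac{\sigma(f)}{\sqrt{N}}
\]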

Governed by the Apache 2.0 license, OIDN is part of the Intel Rendering Framework. From the looks of it, the library is CPU-based and leverages 64-bit x86 processors (scaling with core counts and advanced instruction sets) to de-noise images. Intel says OIDN works on any device with a 64-bit x86 processor (with at least the SSE4.2 instruction set), although it can take advantage of AVX2 and AVX-512 to speed things up by an order of magnitude. The closest (and closed) alternative to OIDN would be NVIDIA's AI denoiser: "Turing" GPUs use a combination of dedicated deep-learning neural networks and GPU compute to de-noise. You can freely access OIDN on Intel's GitHub.
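For those wondering what using the library actually looks like, here is a minimal sketch built around OIDN's documented C++ API and its generic "RT" filter. The image size and buffer setup are made up for illustration, and exact parameter names can differ between releases, so treat it as a sketch rather than a definitive recipe:

// Minimal sketch: denoise one HDR frame with Open Image Denoise.
#include <OpenImageDenoise/oidn.hpp>
#include <vector>
#include <iostream>

int main()
{
    const size_t width  = 1920;   // hypothetical frame size
    const size_t height = 1080;

    // Noisy path-traced colour buffer (3 floats per pixel), filled elsewhere.
    std::vector<float> color(width * height * 3);
    std::vector<float> output(width * height * 3);

    // Create and commit a device; OIDN selects the best CPU code path
    // (SSE4.2, AVX2 or AVX-512) available on the host processor.
    oidn::DeviceRef device = oidn::newDevice();
    device.commit();

    // The generic ray-tracing denoising filter.
    oidn::FilterRef filter = device.newFilter("RT");
    filter.setImage("color",  color.data(),  oidn::Format::Float3, width, height);
    filter.setImage("output", output.data(), oidn::Format::Float3, width, height);
    filter.set("hdr", true);   // input is high dynamic range
    filter.commit();

    filter.execute();

    // Report any error raised by the device.
    const char* errorMessage;
    if (device.getError(errorMessage) != oidn::Error::None)
        std::cerr << "OIDN error: " << errorMessage << std::endl;

    return 0;
}

The device picks the instruction-set path automatically, which is where the SSE4.2/AVX2/AVX-512 scaling mentioned above comes into play.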



View at TechPowerUp Main Site
 
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news coming from the Intel Graphics Twitter account is insane, as is the openness to ideas from the community.
 
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news coming from the Intel Graphics Twitter account is insane, as is the openness to ideas from the community.

I think having a big budget, and seemingly being left alone to do with it what he wants, is a big boon for him and any ideas he and his team have. Good move for him, for sure.
 
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news coming from the Intel Graphics Twitter account is insane, as is the openness to ideas from the community.


Maybe he is building with Intel money what AMD needs to succeed.
 
Maybe he is building with Intel money what AMD needs to succeed.
I think IP puts that idea in doubt; he's working with what he can, so be realistic.
 
Maybe he is building with Intel money what AMD needs to succeed.

Heh, strangely with this announcement, you may very well have a point.

Also nice how Intel found a way to market AVX to gamers... possibly.
 
There are around a hundred tools already available on GitHub and on university and research-company servers, free to use for over three years now, covering denoising, super-resolution, segmentation, and all the other deep-learning stuff. Nothing new, nothing innovative from Chipzilla.
 
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of the Maxwell equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might only take 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. A shame to see ray tracing still being pushed in 2019. We need the BTX30xx series.
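For readers unfamiliar with the "Gaussian beam" claim: the fundamental Gaussian beam is the standard solution of the paraxial wave equation derived from Maxwell's equations, which is why properties like diffraction and wavelength dependence fall out of it automatically. The textbook form (not from the post above; w_0 is the beam waist, R(z) the wavefront curvature, psi(z) the Gouy phase) is:

\[
E(r,z) = E_0\,\frac{w_0}{w(z)}
\exp\!\left(-\frac{r^2}{w(z)^2}\right)
\exp\!\left(-i\left[kz + \frac{k r^2}{2R(z)} - \psi(z)\right]\right),
\qquad
w(z) = w_0\sqrt{1 + \left(\frac{z}{z_R}\right)^2},\;
z_R = \frac{\pi w_0^2}{\lambda}
\]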
 
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of the Maxwell equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might only take 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. A shame to see ray tracing still being pushed in 2019. We need the BTX30xx series.

There are several thousand scientific advancements going on at the same time; some other company will implement beam tracing and try to sell it to us.
 
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of the Maxwell equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might only take 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. A shame to see ray tracing still being pushed in 2019. We need the BTX30xx series.
Very interesting. This is the first time I've heard about this. Why hasn't any tech site written an article to make your point? Perhaps you could suggest that to some of them. If you're correct then it's very important for gamers to know that ray tracing isn't what they're supposed to think it is — the ideal goal of current graphics innovation.
There are several thousand scientific advancements going on at the same time; some other company will implement beam tracing and try to sell it to us.
So, you're saying it's a good thing for people to be sold something on false pretenses so industry players can milk them for more money than they should have to pay?
 
Not sure why graphics companies, and most likely the movie industry, have eschewed beam tracing. Some lens-design software, such as Code V, uses Gaussian beams, as opposed to ray-based tools like Zemax. We used to run our beam-tracing code in times comparable to ray tracing, and the hardware today is much better. Maybe it would be harder to accelerate beams, but better FP performance would be a start. Each beam might be more complex and take more compute power initially, but needing so few of them to match the noise performance of tens of millions of rays compensates for it.
 
So, you're saying it's a good thing for people to be sold something on false pretenses so industry players can milk them for more money than they should have to pay?

Where did I say anything even close to that? If you don't like ray tracing, you don't have to buy it; wait for beam tracing. Nvidia isn't pushing you to buy RTX, Intel isn't pushing you to buy overpriced Chipzilla shit, and Toshiba never pushed people to buy crap-quality notebooks. It's a free market, driven by innovation; there is always something around the corner.
 
De-noising is a vital post-processing component of ray-traced images, as it eliminates visual noise generated by too few rays intersecting pixels

I love how this wording hides that "noise" is actually "gaps in the picture" and "denoising" is actually "making things up to fill the gaps".
 
Nice work Intel, up yours nVidia!
 