Tuesday, January 29th 2019

Intel Launches Free Open Image Denoise Library for Ray-tracing

Denoising is a vital post-processing step for ray-traced images: it removes the visual noise that results when too few rays are traced through each pixel of the image. Ideally, many rays would be traced through every pixel, but real-world hardware can't do that in reasonable (let alone real) time. Denoising attempts to correct and reconstruct such images. Intel today launched its free Open Image Denoise (OIDN) library for ray-tracing.
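The sampling problem described above can be sketched numerically: a Monte Carlo estimate of a pixel's brightness gets less noisy as more rays (samples) are averaged. The following is a hypothetical, purely illustrative Python sketch (it has nothing to do with OIDN itself; the functions and the noise model are invented for illustration):

```python
import random
import statistics

def render_pixel(num_rays, rng):
    """Estimate a pixel's brightness by averaging random ray samples.

    Each 'ray' returns the true brightness (0.5) plus random error,
    standing in for rays that randomly hit or miss bright parts of
    the scene.
    """
    samples = [0.5 + rng.uniform(-0.5, 0.5) for _ in range(num_rays)]
    return sum(samples) / num_rays

def pixel_noise(num_rays, trials=200, seed=42):
    """Standard deviation of the pixel estimate across repeated renders."""
    rng = random.Random(seed)
    estimates = [render_pixel(num_rays, rng) for _ in range(trials)]
    return statistics.pstdev(estimates)

# Noise shrinks roughly as 1/sqrt(num_rays): 4x the rays, half the noise.
noisy = pixel_noise(4)      # few rays per pixel -> grainy image
clean = pixel_noise(256)    # many rays per pixel -> smooth image
```

Because cutting the noise in half costs four times the rays, brute-force sampling quickly becomes impractical, which is why denoisers that reconstruct a clean image from a noisy low-sample render are so attractive.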

Governed by the Apache 2.0 license, OIDN is part of the Intel Rendering Framework. The library is CPU-based, and leverages 64-bit x86 processors (scaling with core counts and advanced instruction sets) to denoise images. Intel says OIDN works on any device with a 64-bit x86 processor supporting at least SSE4.2, although it can take advantage of AVX2 and AVX-512 to speed things up by an order of magnitude. The closest (and closed-source) alternative to OIDN is NVIDIA's AI denoiser: NVIDIA "Turing" GPUs use a combination of purpose-built deep-learning neural networks and GPU compute to denoise. You can freely access OIDN on Intel's GitHub.
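OIDN's actual filter is a trained deep-learning model, but the basic idea of an image-space denoiser can be sketched with a plain averaging filter: each pixel is replaced by the mean of its neighborhood, trading a little blur for much less noise. A toy Python sketch (purely illustrative; this is not OIDN's algorithm or API):

```python
import random

def box_denoise(img, radius=1):
    """Replace each pixel with the average of its (2r+1)x(2r+1) neighborhood.

    img is a 2D list of grayscale floats; coordinates past the image
    edge are clamped to the nearest valid pixel.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx]
                    n += 1
            out[y][x] = acc / n
    return out

def variance(img):
    """Pixel variance, used here as a crude measure of noise."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return sum((p - mean) ** 2 for p in flat) / len(flat)

# A flat gray image corrupted by per-pixel sampling noise...
rng = random.Random(0)
noisy = [[0.5 + rng.uniform(-0.2, 0.2) for _ in range(32)] for _ in range(32)]
# ...gets measurably smoother (lower variance) after filtering.
smooth = box_denoise(noisy, radius=2)
```

A real denoiser like OIDN is far more sophisticated: rather than blurring indiscriminately, it preserves edges and detail by learning what clean renders look like, which is how it can reconstruct plausible images from very few rays per pixel.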
Source: Intel Open Image Denoise Library (Github)

15 Comments on Intel Launches Free Open Image Denoise Library for Ray-tracing

#1
dj-electric
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news from the Intel Graphics Twitter is insane, as is the openness to ideas from the community.
#2
Slizzo
dj-electric said:
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news from the Intel Graphics Twitter is insane, as is the openness to ideas from the community.
I think having a big budget, and seemingly being left alone to do with it what he wants, is a big boon for him and any ideas he and his team have. Good move for him, for sure.
#3
Steevo
dj-electric said:
Since Raja joined Intel he has been firing on all cylinders. The amount of ideas and news from the Intel Graphics Twitter is insane, as is the openness to ideas from the community.
Maybe he is building with Intel money what AMD needs to succeed.
#4
theoneandonlymrk
Steevo said:
Maybe he is building with Intel money what AMD needs to succeed.
I think IP puts that idea in doubt; he's working with what he can, so be realistic.
#5
Vayra86
Steevo said:
Maybe he is building with Intel money what AMD needs to succeed.
Heh, strangely with this announcement, you may very well have a point.

Also nice how Intel found a way to market AVX to gamers... possibly.
#6
XXL_AI
There are around a hundred tools already available on GitHub and on university and research company servers, free to use for over three years, covering denoising, super-resolution, segmentation and all the other deep-learning stuff. Nothing new, nothing innovative from chipzilla.
#7
Minus Infinity
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of Maxwell's equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might take only 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. Shame to see ray-tracing still being pushed in 2019. We need the BTX30xx series.
#8
Xzibit
Siggraph 2018

https://www.youtube.com/watch?v=lg44qMh1QTc
#9
theoneandonlymrk
Xzibit said:
Siggraph 2018

https://www.youtube.com/watch?v=lg44qMh1QTc
Thanks, missed that, very interesting.
#10
XXL_AI
Minus Infinity said:
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of Maxwell's equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might take only 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. Shame to see ray-tracing still being pushed in 2019. We need the BTX30xx series.
There are several thousand scientific advancements going on at the same time; some other company will implement beam tracing and try to sell it to us.
#11
RichF
Minus Infinity said:
Ray tracing sucks, no really. You can get infinitely better results using beam tracing, and since the beams are actual solutions of Maxwell's equations, all the physics is built in. Unlike a ray, which is not a physical entity and where you have to fudge things like refraction and colour, all of that comes for free from a Gaussian beam. It might take only 500 beams to produce the same results as 40,000,000 rays, and you get funky physics like negative refractive index, diffraction, colour, etc. Shame to see ray-tracing still being pushed in 2019. We need the BTX30xx series.
Very interesting. This is the first time I've heard about this. Why hasn't any tech site written an article to make your point? Perhaps you could suggest that to some of them. If you're correct then it's very important for gamers to know that ray tracing isn't what they're supposed to think it is — the ideal goal of current graphics innovation.
XXL_AI said:
There are several thousand scientific advancements going on at the same time; some other company will implement beam tracing and try to sell it to us.
So, you're saying it's a good thing for people to be sold something on false pretenses so industry players can milk them for more money than they should have to pay?
#12
Minus Infinity
Not sure why graphics companies, and most likely the movie industry, have eschewed beam tracing. Some lens-design software, like Code V, uses Gaussian beams, as opposed to ray-based tools like Zemax. We used to run our beam-tracing code in times similar to ray tracing, and the hardware today is much better. Maybe it would be harder to accelerate beams, but better FP performance would be a start. Each beam might be more complex and take more compute power initially, but requiring so few of them to get the same noise performance as tens of millions of rays compensates.
#13
XXL_AI
RichF said:
So, you're saying it's a good thing for people to be sold something on false pretenses so industry players can milk them for more money than they should have to pay?
Where did I say anything even close to that? If you don't like ray tracing, you don't have to buy it; wait for beam tracing. NVIDIA isn't pushing you to buy RTX, Intel isn't pushing you to buy overpriced chipzilla shit, and Toshiba never pushed people to buy crap-quality notebooks. It's a free market driven by innovation; there is always something around the corner.
#14
medi01
btarunr said:
De-noising is a vital post-processing component of ray-traced images, as it eliminates visual noise generated by too few rays intersecting pixels
I love how this wording hides that "noise" is actually "gaps in the picture" and "denoising" is actually "making things up to fill the gaps".
#15
stimpy88
Nice work, Intel. Up yours, NVIDIA!