Thursday, May 2nd 2019

Intel Xe GPUs to Support Raytracing Hardware Acceleration

Intel's upcoming Xe discrete GPUs will feature hardware acceleration for real-time raytracing, similar to NVIDIA's "Turing" RTX chips, according to a company blog detailing how Intel's Rendering Framework will work with the upcoming Xe architecture. The blog only mentions that the company's data-center GPUs support the feature, not whether its client-segment ones do. The data-center Xe GPUs are targeted at cloud-based gaming services and cloud-computing providers, as well as those building large rendering farms.

"I'm pleased to share today that the Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API's and libraries," said Jim Jeffers, Sr. Principal Engineer and Sr. Director of Intel's Advanced Rendering and Visualization team. Intel did not go into technical details of the hardware itself. NVIDIA demonstrated that you need two major components on a modern GPU to achieve real-time raytracing: 1. a fixed-function hardware that computes intersection of rays with triangles or surfaces (which in NVIDIA's case are the RT cores), and 2. an "inexpensive" de-noiser. NVIDIA took the AI route to achieve the latter, by deploying tensor cores (matrix-multiplication units), which accelerate AI DNN building and training. Both these tasks are achievable without fixed-function hardware, using programmable unified shaders, but at great performance cost. Intel developed a CPU-based de-noiser that can leverage AVX-512.
Source: Intel

59 Comments on Intel Xe GPUs to Support Raytracing Hardware Acceleration

#26
jabbadap
bug: In other words, Pascal can't do ray tracing in real time.
1080p@30 FPS or even 720p@30 FPS is considered real time. Of course it can't do that all the time, but the same can be said for the most powerful RTX cards too. Too much RT will bring those down as well, which is the reason we have hybrid rendering in the first place.
notb: "de-noising" (Nvidia DLSS) is one thing, ray tracing is another.

The phenomenon you're describing is called "programming". It makes it possible to build universal hardware that can do different things. Great stuff!

...

Yes, "hardware raytracing" means a dedicated chip that does exactly what's supposed to be doing (just like hardware encoding/decoding, hardware compressing, hardware random number generating).
There are no games that use Tensor cores for RT de-noising yet. DLSS is the only feature that currently uses tensor cores for gaming.
Posted on Reply
#27
BorgOvermind
Intel does not have GPUs. It has GEUs (Graphics Emulator Units). Adding that feature would be just the equivalent of a check-box on paper, just like it was on older nV cards.
Posted on Reply
#28
notb
BorgOvermind: Intel does not have GPUs. It has GEUs (Graphics Emulator Units). Adding that feature would be just the equivalent of a check-box on paper, just like it was on older nV cards.
Whaaaat...?
Posted on Reply
#29
bug
notb: Whaaaat...?
He works at Intel, he knows stuff we don't :P
Posted on Reply
#30
M2B
BorgOvermind: Intel does not have GPUs. It has GEUs (Graphics Emulator Units). Adding that feature would be just the equivalent of a check-box on paper, just like it was on older nV cards.
What's the difference between a GPU and a GEU?
Posted on Reply
#32
medi01
SoNic67: Nobody knows if the Xe de-noising will be fast enough to be real time, but if it will be... yeah, no need to waste the GPU silicon on RT.
I love how denoising is "RT" nowadays.
GJ Huang.
bug: In other words, Pascal can't do ray tracing in real time.
Neither can Turing. All it does is some basic gimmicks with heavy denoising.
Posted on Reply
#33
bug
medi01: I love how denoising is "RT" nowadays.
GJ Huang.
What's Huang got to do with anything? Computer graphics have been about faking reality from day 1.
Posted on Reply
#34
notb
bug: Actually if you look here: www.anandtech.com/show/14289/intel-to-support-hardware-ray-tracing-acceleration-on-data-center-xe-gpus
you can see it's aimed at everything from data center to integrated graphics. Doesn't mean everything will be released at once though.
Exactly. Intel is not targeting any particular scenario. They're entering the GPU market as broadly as they can.
Of course, there are segments where they have been active already: integrated chips and datacenters. These will come naturally to them. But since those don't need much advertising, we won't know much before the launch.

Gaming GPUs are heavily advertised already. They will happen for sure.
Posted on Reply
#35
eidairaman1
The Exiled Airman
Vya Domus: Just so you people know, "support for hardware raytracing" may not necessarily mean actual dedicated hardware inside the GPU but rather it can simply mean they adapted existing hardware for some new functions that they now expose at the ISA level.
Yup, discrete level may not even be a part of this.

Tbf, Intel wants a piece of the AI pie.
Posted on Reply
#37
bug
I don't know about you guys, but it's so fun being able to tell who owns an AMD GPU. Their comments are usually sour when it comes to RTRT :D
Posted on Reply
#38
M2B
bug: I don't know about you guys, but it's so fun being able to tell who owns an AMD GPU. Their comments are usually sour when it comes to RTRT :D
Even their leader AdoredTV believes that Ray/PathTracing is the future.
Posted on Reply
#39
notb
M2B: Even their leader AdoredTV believes that Ray/PathTracing is the future.
Navi or Navi+ will get hardware RTRT as well, and they'll all praise it then. Even the extreme specimen who just gave you a -1 :P
Posted on Reply
#40
B-Real
Tsukiyomi91: What type of ray-tracing method is it going to be using? Real-time or not? If it's real-time then Nvidia has a competitor while AMD doesn't (yet again), but if it's rendered after rasterization, then Nvidia has no competition & will continue to milk the market.
1. AMD Navi is supposed to use HW RT. So what?
2. "milk the market" Have you seen the GPU sales of NV? Dropped by 50%. "milk" :D
Posted on Reply
#41
eidairaman1
The Exiled Airman
RTRT is not anything special, like AA was in the early 2000s.
Posted on Reply
#42
bug
notb: Navi or Navi+ will get hardware RTRT as well, and they'll all praise it then. Even the extreme specimen who just gave you a -1 :p
In a way, it's like watching the Apple mob. They also dismissed 3G and even copy/paste when Apple's phones didn't have that.
Me, I just take that as a measure of (lack of) objectivity.

And yes, RTRT is a huge transformation. It will simplify the graphics pipeline and in turn graphics APIs while yielding results closer to reality. But it will take the better part of a decade or more to get there. That is why I root for speedy adoption, not because I automatically like everything Nvidia does.
eidairaman1: RTRT is not anything special, like AA was in the early 2000s.
Yes, that. That's why all Hollywood blockbusters make heavy use of rasterization. Oh, wait...
Posted on Reply
#43
HwGeek
NV is going to have a really hard time justifying its stock price. AMD and Intel will offer RTRT GPUs, Tesla proved that other companies can develop better ASICs for autonomous cars, and thus NV is left with only gaming GPUs.
NV needs other new innovations to keep its value.
Posted on Reply
#44
FreedomEclipse
~Technological Technocrat~
That cooler blows through.

heehee
Posted on Reply
#45
notb
eidairaman1: RTRT is not anything special, like AA was in the early 2000s.
The move to ray tracing fundamentally changes how game graphics are generated.
It's not just an additional block in the pipeline (like AA). It's a major development, like when we moved from 2D to 3D.

If you don't understand the difference, I'd suggest some reading. Otherwise you'll have a very hard time understanding the changes gaming will undergo in the next few years.
HwGeek: NV is going to have a really hard time justifying its stock price. AMD and Intel will offer RTRT GPUs, Tesla proved that other companies can develop better ASICs for autonomous cars.
As of today Intel's RT officially exists as a mention on a blog and AMD's RT as a PS5-related rumor. Nvidia's product is on a shelf in the PC store near you. That's the difference.

And Tesla's "better AI chip" is a render (nomen omen). As of today all Teslas leaving the factory are still equipped with the "dumped" Nvidia chip.
HwGeek: thus NV is left with only gaming GPUs
Gaming is slightly under 60% of Nvidia's revenue. Automotive is 5%. The rest is Datacenters, pro cards and OEM.
HwGeek: NV needs other new innovations to keep its value.
That's the whole point of being an innovative company. You have to keep making new stuff. And don't worry. Nvidia will be fine.
Posted on Reply
#46
Vayra86
bug: And yes, RTRT is a huge transformation. It will simplify the graphics pipeline and in turn graphics APIs while yielding results closer to reality.
That is an assumption. You speak of a decade (which I think is very realistic, if not longer), but I think as long as games need to use some form of rasterization, RT adoption is going to have a hard time growing beyond gimmick status. Things like Metro's global illumination. And that mix most certainly isn't 'simplified'; it's additional complexity, because for the optimal experience the RT part and the rasterized part are developed in tandem so we don't get weird-looking scenes in either situation.

And there is a cost aspect as well, keeping adoption in the problematic area for quite a while. Those first-gen Intel GPUs won't do much for that problem, and neither will Navi. The fact that Intel right here is announcing a focus on professional RT use cases is telling: it obviously doesn't see a market in the consumer segment yet.
Posted on Reply
#47
bug
Vayra86: That is an assumption. You speak of a decade (which I think is very realistic, if not longer), but I think as long as games need to use some form of rasterization, RT adoption is going to have a hard time growing beyond gimmick status. Things like Metro's global illumination. And that mix most certainly isn't 'simplified'; it's additional complexity, because for the optimal experience the RT part and the rasterized part are developed in tandem so we don't get weird-looking scenes in either situation.

And there is a cost aspect as well, keeping adoption in the problematic area for quite a while. Those first-gen Intel GPUs won't do much for that problem, and neither will Navi. The fact that Intel right here is announcing a focus on professional RT use cases is telling: it obviously doesn't see a market in the consumer segment yet.
I believe we've been over this before. The more heads we get pushing for this transition, the faster we get there ;)
And about the simplified rendering, there's no assumption. RT gives you shadows (AO included) and reflections in a single step, for example.
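That "single step" point can be sketched in a few lines: once a primary ray has hit a surface, a shadow test and a reflection ray both fall out of the same intersection machinery, with no separate shadow-map or reflection-probe passes. A hypothetical illustration; the `intersect_any` scene helper is an assumed stand-in, not a real renderer API:

```python
def reflect(d, n):
    """Mirror direction d about unit normal n: r = d - 2*(d.n)*n."""
    dn = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dn * ni for di, ni in zip(d, n))

def trace_secondary(hit, normal, in_dir, light, intersect_any):
    """From one primary-ray hit, derive both secondary queries.

    intersect_any(origin, direction) is an assumed scene helper that
    reports whether any geometry blocks the ray.
    """
    to_light = tuple(l - h for l, h in zip(light, hit))
    in_shadow = intersect_any(hit, to_light)   # shadow ray: is the light visible?
    refl_dir = reflect(in_dir, normal)         # reflection ray: mirror bounce
    return in_shadow, refl_dir
```

In a rasterizer each of those effects is its own pipeline stage and set of approximations; here they are just more rays.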
Posted on Reply
#48
ZoneDymo
dear lord the comments going on here...
Posted on Reply
#49
Steevo
I will believe any and all of it when I see our own W1zzard post glowing reviews. Until then it's all speculation and vaporware; Intel has promised a lot and delivered little in the way of graphics, and yet we are to believe they are using parts of CPUs (with security issues) to make graphics better.

I wish for the best, but won't hold my breath.
Posted on Reply
#50
TheGuruStud
Steevo: I will believe any and all of it when I see our own W1zzard post glowing reviews. Until then it's all speculation and vaporware; Intel has promised a lot and delivered little in the way of graphics, and yet we are to believe they are using parts of CPUs (with security issues) to make graphics better.

I wish for the best, but won't hold my breath.
They're 0 and how many, now? No 10nm, no 5G modem, no mobile x86, failed at security software, sold the ARM division (just lol), and we all remember Larrabee and Itanic. They've spent multiple tens of billions ($50B at least, rough guess, not counting normal investment in 10nm) just on this stuff and failed miserably with zero return.

It's no wonder they're desperate to try GPU. If they lose CPUs, then they're toast lol

Next time they get a bright idea, they should just give it to AMD to develop and buy their stock lolz
Posted on Reply