Thursday, May 2nd 2019

Intel Xe GPUs to Support Raytracing Hardware Acceleration

Intel's upcoming Xe discrete GPUs will feature hardware acceleration for real-time raytracing, similar to NVIDIA's "Turing" RTX chips, according to a company blog detailing how Intel's Rendering Framework will work with the upcoming Xe architecture. The blog only mentions that the company's data-center GPUs support the feature, not whether its client-segment ones do. The data-center Xe GPUs are targeted at cloud-based gaming and cloud-computing providers, as well as at those building large render farms.

"I'm pleased to share today that the Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API's and libraries," said Jim Jeffers, Sr. Principal Engineer and Sr. Director of Intel's Advanced Rendering and Visualization team. Intel did not go into technical details of the hardware itself. NVIDIA demonstrated that you need two major components on a modern GPU to achieve real-time raytracing: 1. a fixed-function hardware that computes intersection of rays with triangles or surfaces (which in NVIDIA's case are the RT cores), and 2. an "inexpensive" de-noiser. NVIDIA took the AI route to achieve the latter, by deploying tensor cores (matrix-multiplication units), which accelerate AI DNN building and training. Both these tasks are achievable without fixed-function hardware, using programmable unified shaders, but at great performance cost. Intel developed a CPU-based de-noiser that can leverage AVX-512.
Source: Intel

59 Comments on Intel Xe GPUs to Support Raytracing Hardware Acceleration

#51
danbert2000
Clearly Nvidia has the technology lead over Intel in graphics right now, but you have to wonder if they're getting a bit nervous now that AMD has decent CPUs again and Intel is coming out with GPUs. They're going to be sitting there without the ability to come out with APUs beyond crappy ARM versions. I'm looking forward to the Intel cards; more competition is good for the market, and AMD has not been enough of a competitor for 5+ years.

I don't think that this news article is any sort of proof that the Intel GPUs will be that good at DXR though.
#52
moproblems99
bug: The more heads we get pushing for this transition, the faster we get there.
Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be determined by when DXR hits 60 FPS+ at 1440p on sub-$250 cards. The event horizon will show itself then.
#53
bug
moproblems99: Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be determined by when DXR hits 60 FPS+ at 1440p on sub-$250 cards. The event horizon will show itself then.
I won't dispute that. But still the first step to getting there is getting enough manufacturers and creators/programmers involved.
#54
eidairaman1
The Exiled Airman
Steevo: I will believe any and all of it when I see our own W1zzard post glowing reviews. Until then it's all speculation and vaporware. Intel has promised a lot and delivered little in the way of graphics, and yet we are to believe they are using parts of CPUs (with security issues) to make graphics better.

I wish for the best, but won't hold my breath.
Larrabee comes to mind.
#55
Totally
moproblems99: Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be determined by when DXR hits 60 FPS+ at 1440p on sub-$250 cards. The event horizon will show itself then.
I'm in awe of your optimism. We've had 4K for nearly 5 years now and are only just breaching 60+ FPS with $700-1,000 flagship cards. Gonna be a long decade.
#56
moproblems99
Totally: I'm in awe of your optimism. We've had 4K for nearly 5 years now and are only just breaching 60+ FPS with $700-1,000 flagship cards. Gonna be a long decade.
I think the 2080 can do it now. Next gen, the 3070 should do it. The gen after that, the 4060 should do it. If AMD/Intel rock NV's boat, price declines or performance increases should accelerate. We have what we have now because NV doesn't have to work. Complacency has set in.
#57
Totally
Still, that's $700 for the 2080, which is nowhere near $250, and the xx60 hasn't been a $250 card for two gens now. I'll hope for it but not expect it. RTX 4000 should coincide with 4K's 10th birthday.
#58
moproblems99
Totally: Still, that's $700 for the 2080, which is nowhere near $250, and the xx60 hasn't been a $250 card for two gens now. I'll hope for it but not expect it. RTX 4000 should coincide with 4K's 10th birthday.
Btw, I am not saying it will happen, just saying that until we hit those points it ain't happening within 10 years. I honestly don't care either way, as I would prefer developers spend time on writing rather than on trying to get ray tracing to work in their games. If NV wants to write the code for them and finance it, so be it, but I would much rather have stories that didn't suck. Or engines like Creation.
#59
Midland Dog
bug: In other words, Pascal can't do ray tracing in real time.
Real time has to be 30 FPS+, so Pascal can do RTRT (the 1080 Ti barely does it). I would actually like to see how the P100 does at it, being more compute-oriented. Considering the V100 sits a fair bit below the RTX 2060, I'm not sure it would make a difference, though. The fact that Volta doesn't cop a pipeline stall when an INT instruction is issued is probably its only saving grace; maybe the P100 has that same capability. Not sure, haven't read the whitepaper yet.