
Intel Xe GPUs to Support Raytracing Hardware Acceleration

btarunr

Editor & Senior Moderator
Intel's upcoming Xe discrete GPUs will feature hardware acceleration for real-time raytracing, similar to NVIDIA's "Turing" RTX chips, according to a company blog detailing how Intel's Rendering Framework will work with the upcoming Xe architecture. The blog only mentions that the company's data-center GPUs support the feature, not whether its client-segment ones will. The data-center Xe GPUs are targeted at cloud-based gaming services and cloud-computing providers, as well as those building large rendering farms.

"I'm pleased to share today that the Intel Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel Rendering Framework family of API's and libraries," said Jim Jeffers, Sr. Principal Engineer and Sr. Director of Intel's Advanced Rendering and Visualization team. Intel did not go into technical details of the hardware itself. NVIDIA demonstrated that you need two major components on a modern GPU to achieve real-time raytracing: 1. a fixed-function hardware that computes intersection of rays with triangles or surfaces (which in NVIDIA's case are the RT cores), and 2. an "inexpensive" de-noiser. NVIDIA took the AI route to achieve the latter, by deploying tensor cores (matrix-multiplication units), which accelerate AI DNN building and training. Both these tasks are achievable without fixed-function hardware, using programmable unified shaders, but at great performance cost. Intel developed a CPU-based de-noiser that can leverage AVX-512.



 
I'll believe it when I see it. Will Xe be manufactured on Intel's mystic 10 nm node?
 
I'll believe it when I see it. Will Xe be manufactured on Intel's mystic 10 nm node?

No, they'll use Samsung 5 nm EUV.

They don't want to miss the bus on market relevance at launch and end up with another Larrabee this time.
 
What type of ray-tracing method is it going to be using? Real-time or not? If it's real-time, then Nvidia has a competitor while AMD doesn't (yet again), but if it's rendered after rasterization, then Nvidia has no competition and will continue to milk the market.
 
What type of ray-tracing method is it going to be using? Real-time or not? If it's real-time, then Nvidia has a competitor while AMD doesn't (yet again), but if it's rendered after rasterization, then Nvidia has no competition and will continue to milk the market.

Probably not the gaming-related kind if it uses CPU-based AVX-512 for denoising. If you want any sort of real-time RT, then all of the "magic" needs to happen on the graphics card, so you can get at least semi-decent framerates and frame pacing.
 
I mean, it's obviously not gaming-related, but a graphics-intensive workflow like model rendering, casting rays on objects, etc.
 
Notable that they are talking about studio workflows, not focusing on real-time, and talking about the Intel Rendering Framework. They are not saying anything about DXR or Vulkan.
 
I'll believe it when I see it. Will Xe be manufactured on Intel's mystic 10 nm node?
Outsourced to Samsung or TSMC (likely the former). They've invested so much in making 10 nm capable of high clocks; no point in wasting it on GPUs.
As for the hardware ray tracing - Intel is one of the biggest FPGA players. This is definitely within their reach even at this moment. How it fares against RTX is another story.
What type of ray-tracing method is it going to be using? Real-time or not? If it's real-time, then Nvidia has a competitor while AMD doesn't (yet again), but if it's rendered after rasterization, then Nvidia has no competition and will continue to milk the market.
Ray tracing always works the same way. You either push frames fast enough to be considered "real time" or you don't.
What do you mean by "rendered after rasterization"?
Probably not the gaming-related kind if it uses CPU-based AVX-512 for denoising. If you want any sort of real-time RT, then all of the "magic" needs to happen on the graphics card, so you can get at least semi-decent framerates and frame pacing.
Cloud gaming platform.
The signal is encoded and decoded, goes through multiple routers and switches, and travels hundreds of km over wire or wireless.
And you worry about denoising on a die 20 cm away.
Notable that they are talking about studio workflows, not focusing on real-time, and talking about the Intel Rendering Framework. They are not saying anything about DXR or Vulkan.
The first part of the article is about their existing products. Rendering movies and complex static 3D models happens on CPUs.
GPUs are too slow (ray tracing is sequential per ray, i.e. heavily single-threaded). Also, complicated models are way too big for the RAM available on GPUs.
The article mentions a few CPU features and libraries that were created to accelerate ray tracing.

The second part is about future GPUs. It doesn't mention cloud gaming explicitly, but that's the best use case.
 
The first part of the article is about their existing products. Rendering movies and complex static 3D models happens on CPUs.
GPUs are too slow (ray tracing is sequential per ray, i.e. heavily single-threaded). Also, complicated models are way too big for the RAM available on GPUs.
The article mentions a few CPU features and libraries that were created to accelerate ray tracing.

The second part is about future GPUs. It doesn't mention cloud gaming explicitly, but that's the best use case.
The bolded part is what it seems to be about. Reading between the lines, Intel will implement the same solutions on GPU instead of CPU. This does not necessarily mean explicit hardware support for ray tracing (similar to Nvidia's RT Cores or otherwise).
 
When are they coming out?? I want to spend money already, geez...
 
What type of ray-tracing method is it going to be using? Real-time or not? If it's real-time, then Nvidia has a competitor while AMD doesn't (yet again), but if it's rendered after rasterization, then Nvidia has no competition and will continue to milk the market.
What kind of question is that? If you don't need real-time, you can do ray tracing on pretty much any GPU. Hell, you can do it in software, using no GPU at all. Of course this is about RTRT.
 
It should be called Xe740, like its granddad.
 
The bolded part is what it seems to be about. Reading between the lines, Intel will implement the same solutions on GPU instead of CPU. This does not necessarily mean explicit hardware support for ray tracing (similar to Nvidia's RT Cores or otherwise).
I don't see how that could happen.
Intel Embree, the rendering environment described in the text, is a CPU solution that heavily utilizes the AVX instruction set (see the minimal usage sketch below). You can't just move it to GPUs.
And really... GPUs are awful at ray tracing. That's why Nvidia made an ASIC and sacrificed some GPGPU potential in RTX cards.

Intel may be planning to make new PCIe RT accelerators to speed up ray-tracing nodes, but those would not be GPUs, rather a new generation of x86 coprocessors. In fact, Xeon Phi is currently used for rendering. But that would just be improving products in a segment Intel already dominates.
RTRT for cloud gaming would mean entering a new market - something Intel must do to grow.
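For reference, this is roughly what driving Embree from a CPU renderer looks like; a minimal sketch against the Embree 3 C API (the one-triangle scene, the sample ray, and the omitted error handling are illustrative assumptions, not something from the article):

```cpp
#include <embree3/rtcore.h>
#include <cstdio>
#include <limits>

int main()
{
    // Device and scene; Embree dispatches to the widest ISA available (SSE/AVX/AVX-512).
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene  scene  = rtcNewScene(device);

    // A single triangle in the z = 1 plane.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* verts = (float*)rtcSetNewGeometryBuffer(geom, RTC_BUFFER_TYPE_VERTEX, 0,
                                                   RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    unsigned* tri = (unsigned*)rtcSetNewGeometryBuffer(geom, RTC_BUFFER_TYPE_INDEX, 0,
                                                       RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    const float v[9] = {0, 0, 1,  1, 0, 1,  0, 1, 1};
    for (int i = 0; i < 9; ++i) verts[i] = v[i];
    tri[0] = 0; tri[1] = 1; tri[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);                 // builds the BVH on the CPU

    // Trace one ray straight at the triangle.
    RTCRayHit rh{};
    rh.ray.org_x = 0.25f; rh.ray.org_y = 0.25f; rh.ray.org_z = 0.0f;
    rh.ray.dir_x = 0.0f;  rh.ray.dir_y = 0.0f;  rh.ray.dir_z = 1.0f;
    rh.ray.tnear = 0.0f;
    rh.ray.tfar  = std::numeric_limits<float>::infinity();
    rh.ray.mask  = 0xFFFFFFFFu;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;

    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    rtcIntersect1(scene, &ctx, &rh);

    if (rh.hit.geomID != RTC_INVALID_GEOMETRY_ID)
        std::printf("hit: t = %.3f, u = %.3f, v = %.3f\n", rh.ray.tfar, rh.hit.u, rh.hit.v);
    else
        std::printf("miss\n");

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
}
```

Whether Xe gets fixed-function units for the traversal and intersection side of this, or just wider programmable hardware, is exactly what the blog leaves open.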
 
Oh yes, a CPU-based de-noiser.
RIP Tensor cores
 
Just so you people know, "support for hardware raytracing" may not necessarily mean actual dedicated hardware inside the GPU; it can simply mean they adapted existing hardware for some new functions that they now expose at the ISA level.
 
It should be called Xe740, like its granddad.
Would be a nice touch if they did that. But this time around Intel is talking data center SKUs, so their plans must be much bigger than before.
I just wish their parts were as disruptive as the i740 was.
 
Just so you people know, "support for hardware raytracing" may not necessarily mean actual dedicated hardware inside the GPU; it can simply mean they adapted existing hardware for some new functions that they now expose at the ISA level.
The de-noiser is essential for real time hardware RT. So that might be what they mean.
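The CPU-side de-noiser Intel ships as part of its rendering framework is Open Image Denoise; below is a minimal sketch of its C API (OIDN 1.x style), assuming that library is the AVX-512 de-noiser the blog alludes to, with placeholder buffers and resolution:

```cpp
#include <OpenImageDenoise/oidn.h>
#include <cstdio>
#include <vector>

int main()
{
    const int width = 1920, height = 1080;
    std::vector<float> color(width * height * 3);   // noisy path-traced RGB (placeholder)
    std::vector<float> output(width * height * 3);  // denoised RGB result

    // CPU device; OIDN dispatches to the widest ISA available (SSE4 / AVX2 / AVX-512).
    OIDNDevice device = oidnNewDevice(OIDN_DEVICE_TYPE_DEFAULT);
    oidnCommitDevice(device);

    // The generic "RT" filter denoises ray-traced images.
    OIDNFilter filter = oidnNewFilter(device, "RT");
    oidnSetSharedFilterImage(filter, "color",  color.data(),  OIDN_FORMAT_FLOAT3,
                             width, height, 0, 0, 0);
    oidnSetSharedFilterImage(filter, "output", output.data(), OIDN_FORMAT_FLOAT3,
                             width, height, 0, 0, 0);
    oidnCommitFilter(filter);
    oidnExecuteFilter(filter);

    const char* msg;
    if (oidnGetDeviceError(device, &msg) != OIDN_ERROR_NONE)
        std::printf("oidn error: %s\n", msg);

    oidnReleaseFilter(filter);
    oidnReleaseDevice(device);
}
```

Whether that kind of filter can run inside a per-frame budget is the real-time question the thread keeps circling back to.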
 
Feels like another Larrabee, I don't think Xe is meant for gaming.
 
What kind of question is that? If you don't need real-time, you can do ray tracing on pretty much any GPU. Hell, you can do it in software, using no GPU at all. Of course this is about RTRT.

Pascal can do DXR in real time, albeit the performance is another question... So until Intel says it supports fully HW-accelerated DXR/Vulkan, this has nothing for gamers. I see this as a direct competitor to AMD's RadeonRays or Nvidia's OptiX.
 
Feels like another Larrabee, I don't think Xe is meant for gaming.
Makes sense; otherwise Intel Graphics would be talking about gaming, pricing of cards for gamers, game engines, and other gaming aspects on social media and at events.


oh wait.
 
Nobody knows if the Xe de-noising will be fast enough to be real time, but if it is... yeah, no need to waste GPU silicon on RT.
?
"de-noising" (Nvidia DLSS) is one thing, ray tracing is another.
Just so you people know, "support for hardware raytracing" may not necessarily mean actual dedicated hardware inside the GPU; it can simply mean they adapted existing hardware for some new functions that they now expose at the ISA level.
The phenomenon you're describing is called "programming". It makes it possible to build universal hardware that can do different things. Great stuff!

...

Yes, "hardware raytracing" means a dedicated chip that does exactly what's supposed to be doing (just like hardware encoding/decoding, hardware compressing, hardware random number generating).
 