
Intel's Raja Koduri Teases Xe HPG Mesh Shading in Action, A Hint at DirectX 12 Ultimate Readiness?

btarunr

Editor & Senior Moderator
Intel's head of architecture, software, and graphics, Raja Koduri, on Wednesday (10/02) teased an Xe HPG discrete GPU prototype running the upcoming 3DMark DirectX 12 Mesh Shaders feature test. Mesh shaders are one of the four key features a graphics solution must support to earn the DirectX 12 Ultimate logo, the other three being real-time raytracing, variable rate shading (VRS), and sampler feedback. Intel has supported VRS tier 1 since Gen11, and the new Gen12 Xe LP carries that support forward.

The Xe HPG architecture is being pushed by Intel as the company's first high-performance gaming discrete graphics architecture. The company earlier released entry-level dGPUs based on the same Xe LP architecture as the Gen12 iGPU found in its "Tiger Lake" processors. The presence of VRS and Mesh Shader support, along with foundational work Intel has done in the area of real-time raytracing, hints at the likelihood of Intel gunning for DirectX 12 Ultimate readiness for the Xe HPG.



 
good good, now HURRY up!

imagine for a sec if Intel can produce a GPU roughly equal to an RX 5700, but built on 14 nm in their own fabs (aka availability) and priced at about $250, now that would be pretty sweet.
 
It's now or never. Even if it's slow and underwhelming, it's still gonna sell like hotcakes. In my region you can basically only buy a GT 1030 or an old GT 730. Everything else either got rounded up by scalpers, or dishonest retailers bumped prices to ridiculous levels on their own. Heck, even some shitty GTX 1660 or RX 5500 XT costs more than my RTX 2060S did at launch, and beat-up, abused RX 580s exploited by several waves of mining now fetch upwards of $400 each on the used market.
 
good good, now HURRY up!

imagine for a sec if Intel can produce a GPU roughly equal to an RX 5700, but built on 14 nm in their own fabs (aka availability) and priced at about $250, now that would be pretty sweet.
That would be a great achievement. It would be so damn great, I would consider purchasing it for mid-range rigs or thereabouts.
I'm pretty excited about Intel getting into the GPU market. They need time, but who knows; maybe in a few years Intel's GPUs will be on par with NV's and AMD's. It would be awesome to have three players in the game :)
 
good good, now HURRY up!

imagine for a sec if Intel can produce a GPU roughly equal to an RX 5700, but built on 14 nm in their own fabs (aka availability) and priced at about $250, now that would be pretty sweet.

Hurry up? For what, may I ask? Chances are you ain't getting one even if they release it tomorrow.

This is just Intel saying HEY! We're still here. Meanwhile, AMD has had to deal with Sony and MS for the consoles, which has fucked up supply for everything else.
 
The Xe HPG architecture is being pushed by Intel as the company's first high-performance gaming discrete graphics architecture.

It's gonna take some time before this truly happens. We still won't see a working consumer-grade card 'til 2023 (give or take several months).
 
Miners: I will take your entire stock
Think of how much face Intel could save if they prevented mining on their dGPUs...pinch me, I'm dreaming
 
So far, smoke and mirrors...

Hard to take anything they keep putting out with Xe seriously: with LPDDR4, only to certain customers, won't operate without a specific CPU and mobo, and the benchmarks look meh...
 
Getting tweets of something that may or may not be a real product, with no price, specs, or release date, instead of real video cards on shop shelves at adequate prices.
We live in a society.
 
UL have updated 3DMark with the Mesh Shader test, if you have a DX12 Ultimate card knock yourself out...

 
So much hype around this Koduri person, but he has been a disappointment relative to his reputation in his previous jobs.
 
So much hype around this Koduri person, but he has been a disappointment relative to his reputation in his previous jobs.

Plus 1 for you. While most media people who interview him say he is fantastic and knows what he's doing, based on his past accomplishments, I think he's overhyped.
 
UL have updated 3DMark with the Mesh Shader test, if you have a DX12 Ultimate card knock yourself out...


Up to 865% performance increase will be ignored by game devs.
 
Think of how much face Intel could save if they prevented mining on their dGPUs...pinch me, I'm dreaming
Think how much money Intel would make if they optimized Xe HPG discrete GPUs to mine!
As they are in the business of making money, I would expect that would put smiles on investors' faces.

Meanwhile, Raja Koduri is a media engineer working to influence the internet personally, more than doing actual engineering.
 
Think how much money Intel would make if they optimized Xe HPG discrete GPUs to mine!
As they are in the business of making money, I would expect that would put smiles on investors' faces.

Meanwhile, Raja Koduri is a media engineer working to influence the internet personally, more than doing actual engineering.
While that's certainly true and likely why neither AMD nor Nvidia have developed anti-mining hardware designs, I see it as a potential way for Intel to win back favor in the DIY space amidst process node delays, unnecessary product segmentation, etc. Do I think it'll happen? Not a chance in hell. But it's uplifting to hope that some Intel strategist somewhere might be looking at this GPU availability hell that can be partially attributed to the mining craze and planning a way to combat that.

I can dream can't I?
 
I'm starting to believe that Raja was "fired" from AMD because he's all smoke and zero execution.
When he was at AMD, the products under his wing were all underwhelming "value" turds, plus all he does is "teasers" and "reveals", never anything concrete.

Now at Intel, all he does is tease and tease, smoke and mirrors. He's been teasing this Xe stuff for years instead of releasing the consumer high-end GPU that should be the priority and is long overdue (in fact, by the time they release a dGPU, it will be two years out of date), not this "irrelevant" HPC crap.

My bet? This will end up dying like Larrabee: too delayed and too slow for the consumer market, it will languish in HPC and then get cancelled due to lack of industry interest.

Xe could have been competitive with Pascal/Turing, but it cannot hold a candle to Ampere, and even less to future architectures like RDNA3 if they keep delaying.
 
I'm starting to believe that Raja was "fired" from AMD because he's all smoke and zero execution.
When he was at AMD, the products under his wing were all underwhelming "value" turds, plus all he does is "teasers" and "reveals", never anything concrete.

Now at Intel, all he does is tease and tease, smoke and mirrors. He's been teasing this Xe stuff for years instead of releasing the consumer high-end GPU that should be the priority and is long overdue (in fact, by the time they release a dGPU, it will be two years out of date), not this "irrelevant" HPC crap.

My bet? This will end up dying like Larrabee: too delayed and too slow for the consumer market, it will languish in HPC and then get cancelled due to lack of industry interest.

Xe could have been competitive with Pascal/Turing, but it cannot hold a candle to Ampere, and even less to future architectures like RDNA3 if they keep delaying.

I think Raja's problem is he doesn't seem to value efficiency; he wants his chips to do everything and have unlimited power budgets. I don't know, that might just be a hot take.
 
I don't know... the guy seems to be a snake oil salesman. I think his capacity is over-hyped, and he always fizzles and under-delivers by a ton.

Look at Nvidia: they don't go teasing useless chips 3+ years before they're available; they practically launch them by surprise.

Worst of all, Intel doesn't even have the capacity to build those Xe GPUs in their broken fabs, and good luck getting allocation for a consumer GPU from TSMC nowadays.
 