
Intel Xe GPUs to Support Raytracing Hardware Acceleration

I will believe any and all of it when I see our own W1zzard post glowing reviews. Until then it's all speculation and vaporware; Intel has promised a lot and delivered little in the way of graphics, and yet we are to believe they are using parts of CPUs (with security issues) to make graphics better.


I wish for the best, but won't hold my breath.

They're 0 and how many, now? No 10 nm, no 5G modem, no mobile x86, failed at security software, sold their ARM division (just lol), and we all remember Larrabee and the Itanic. They've spent multiple tens of billions (roughly $50 billion at least, not counting normal investment in 10 nm) just on this stuff and failed miserably, with zero return.

It's no wonder they're desperate to try GPU. If they lose CPUs, then they're toast lol

Next time they get a bright idea, they should just give it to AMD to develop and buy their stock lolz
 
Clearly Nvidia has the technology lead over Intel in graphics right now, but you have to wonder if they're getting a bit nervous now that AMD has decent CPUs again and Intel is coming out with GPUs. They're going to be sitting there without the ability to come out with APUs beyond crappy ARM versions. I'm looking forward to the Intel cards, more competition is good for the market and AMD has not been enough of a competitor for 5+ years.

I don't think that this news article is any sort of proof that the Intel GPUs will be that good at DXR though.
 
The more heads we get pushing for this transition, the faster we get there

Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be decided when DXR runs at 60 fps+ at 1440p for sub-$250. The event horizon will show itself then.
 
Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be decided when DXR runs at 60 fps+ at 1440p for sub-$250. The event horizon will show itself then.
I won't dispute that. But still, the first step to getting there is getting enough manufacturers and creators/programmers involved.
 
I will believe any and all of it when I see our own W1zzard post glowing reviews. Until then it's all speculation and vaporware; Intel has promised a lot and delivered little in the way of graphics, and yet we are to believe they are using parts of CPUs (with security issues) to make graphics better.


I wish for the best, but won't hold my breath.

Larrabee comes to mind
 
Who's gonna upgrade GPUs to play all those Epic exclusives?

In any case, the race for the decade mark will be decided when DXR runs at 60 fps+ at 1440p for sub-$250. The event horizon will show itself then.

I'm in awe at your optimism. We've had 4K for nearly 5 years now and are only now breaching 60+ fps with $700-$1,000 flagship cards. Gonna be a long decade.
 
I'm in awe at your optimism. We've had 4K for nearly 5 years now and are only now breaching 60+ fps with $700-$1,000 flagship cards. Gonna be a long decade.

I think the 2080 can do it now. Next gen, the 3070 should do it. Gen after that, 4060 should do it. If AMD/Intel rocks NV's boat, pricing decline or performance increases should accelerate. We have what we have now because NV doesn't have to work. Complacency has set in.
 
Still, that's $700 for the 2080, nowhere near $250, and the xx60 hasn't been a $250 card for two generations now. I'll hope for it but not expect it. RTX 4000 should coincide with 4K's 10th birthday.
 
Still, that's $700 for the 2080, nowhere near $250, and the xx60 hasn't been a $250 card for two generations now. I'll hope for it but not expect it. RTX 4000 should coincide with 4K's 10th birthday.

Btw, I am not saying it will happen, just saying that until we hit those points, it ain't happening in 10 years. I honestly don't care either way, as I would prefer developers spend time on writing rather than on trying to get ray tracing to work in their games. If NV wants to write the code for them and finance it, so be it, but I would much rather have stories that didn't suck. Or engines like Creation.
 
In other words, Pascal can't do ray tracing in real time.
Real time has to be 30 fps+, so Pascal can do RTRT (the 1080 Ti barely manages it). I would actually like to see how the P100 does at it, being more compute-oriented. Considering the V100 sits a fair bit below the RTX 2060, I'm not sure it would make a difference, though. The fact that Volta doesn't take a pipeline stall when an INT instruction is issued is probably its only saving grace; maybe the P100 has that same capability, but I'm not sure, I haven't read the whitepaper yet.
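For reference, the "30 fps+" floor above translates directly into a per-frame time budget that all work (raster plus ray tracing) has to fit inside. A quick back-of-the-envelope sketch in plain Python; the fps values are just the common targets mentioned in this thread, not benchmark data:

```python
# Per-frame time budget for common "real-time" framerate targets.
# Everything a GPU does in a frame (raster, RT, post) must fit in this window.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given framerate."""
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
# 30 fps leaves ~33.33 ms per frame, 60 fps only ~16.67 ms,
# which is why a card that "barely does it" at 30 fps is nowhere near 60.
```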
 