
XeSS a "Second Generation" Upscaling Technology at Par with DLSS 2.0 and FSR 2.0: Digital Foundry

btarunr

Editor & Senior Moderator
Intel XeSS is a second-generation upscaling technology that's on par with DLSS 2.0 and FSR 2.0, says Digital Foundry, which got exclusive early access to an Arc A770 graphics card along with XeSS and the latest version of "Shadow of the Tomb Raider," which features XeSS in addition to DLSS. The publication compared the A770 + XeSS with a GeForce RTX 3070 + DLSS, and in its testing found XeSS to offer comparable or better image quality at a low frame-time cost. XeSS uses an AI/ML algorithm to reconstruct detail, which is accelerated by the XMX cores on Xe-HPG "Alchemist" GPUs. XeSS hence currently supports only Arc GPUs; however, the company is working on a DP4a (programming model) version so XeSS can work on other GPU architectures, across brands.
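For readers unfamiliar with DP4a: it is a GPU instruction that computes a four-element dot product of packed 8-bit integers and adds the result to a 32-bit accumulator, the kind of primitive an int8 inference kernel can fall back on when XMX matrix units aren't present. Here is a minimal Python sketch of what a single DP4a step computes (the function and values are purely illustrative, not Intel's implementation):
[CODE=python]
def dp4a(a, b, acc=0):
    """Illustrative emulation of one DP4a step: dot product of two
    4-element int8 vectors, accumulated into a 32-bit integer."""
    assert len(a) == len(b) == 4
    return acc + sum(x * y for x, y in zip(a, b))

# One tiny step of the kind of int8 math an inference kernel is built from:
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # prints 4
[/CODE]
On real GPUs this maps to a single hardware instruction per four-element group, which is why a DP4a fallback is broadly compatible across architectures but not expected to match dedicated XMX throughput.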

An upscaling algorithm by design adds to frame-times (the time taken to render a frame for display), and in Digital Foundry's testing, the most aggressive XeSS preset, Performance, which upscales 720p to 1440p, adds 2 ms to the frame-time. Upscaling 1080p to 4K in the same mode adds 3.4 ms, taking the frame-time from 8.8 ms to 12.2 ms. The increase in frame-times is a good trade-off when you consider the performance gained: a staggering 88 percent increase in frame-rates for 1080p to 4K upscaling, and a 52 percent increase for 720p to 1440p upscaling. Frame-times rise as you move up the presets toward the Quality mode, which renders the game at resolutions closer to native, so the performance gained is smaller. XeSS also offers a preset it calls "Ultra Quality," which renders the game at a resolution closest to native while still yielding a 16-23 percent frame-rate gain, with an output that's practically indistinguishable from native resolution.
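To make that trade-off concrete, here is a small arithmetic sketch in Python. Only the 8.8 ms internal render time and the 3.4 ms XeSS cost are figures from the article; the native 4K frame-time is a hypothetical value back-calculated from the quoted ~88 percent gain:
[CODE=python]
def fps(frame_time_ms):
    """Convert a frame-time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

internal_1080p_ms = 8.8   # 1080p internal render time quoted above
xess_cost_ms = 3.4        # XeSS Performance-mode cost for 4K output, quoted above
native_4k_ms = 22.9       # assumption: back-calculated from the ~88% quoted gain

upscaled_ms = internal_1080p_ms + xess_cost_ms  # 12.2 ms, as in the article
gain = fps(upscaled_ms) / fps(native_4k_ms) - 1.0
print(f"{upscaled_ms:.1f} ms per upscaled frame, {gain:.0%} faster than assumed native 4K")
[/CODE]
The point is that the fixed per-frame cost of the upscaler is paid back many times over by rendering fewer pixels, and the pay-back shrinks as the internal resolution approaches native, which is exactly the pattern Digital Foundry reports across the presets.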



View at TechPowerUp Main Site | Source
 
All this upscaling mumbo jumbo with no hardware to run it on.
 
The Intel A770 is nowhere to be seen on planet Earth.

 
Very interesting video, and a pretty darn good first showing for XeSS imo. The moire patterns were about the worst thing to my eye, and it was interesting to see so many shots where XeSS and DLSS were clearly exceeding Native+TAA.
 
The battle of the four 2nd-gen upscalers begins :D

DLSS - XeSS - FSR - TSR
 
IMO XeSS doesn't need to compete with DLSS at all, just beat or compete with FSR and people will seriously consider it a valuable feature set. A huge win if Intel figures out how to implement some form of XeSS driver-side with little to no dev input (don't think it's likely, but hey, Intel's engineers are talented).
 
It's great to see. Now Intel should include this in their iGPUs; it would be a great selling point for their CPUs.
 
IMO XeSS doesn't need to compete with DLSS at all, just beat or compete with FSR and people will seriously consider it a valuable feature set. A huge win if Intel figures out how to implement some form of XeSS driver-side with little to no dev input (don't think it's likely, but hey, Intel's engineers are talented).
IMO Intel shouldn't have even tried with XeSS, just worked on FSR support for their GPUs. They tried to do too many things at once.
 
IMO Intel shouldn't have even tried with XeSS, just worked on FSR support for their GPUs. They tried to do too many things at once.
Allegedly XeSS benefits from some kind of hardware acceleration on Intel GPUs, and FSR would just work out of the box anyway, so they might as well invest a bit to differentiate themselves.
 
IMO Intel shouldn't have even tried with XeSS, just worked on FSR support for their GPUs. They tried to do too many things at once.
They are planning "everything in one go"; if this works it's a banger for team blue, or else nothing.
 
They must have spent so much time on this, and yet had some pathetic drivers to show for the card. Absurd.
 
No doubt! It's quite promising, but we need to wait and wait and wait...
Still waiting; not sure when these products will actually be out, or whether we'll only keep seeing leaks.
 
This is really where Intel needs to have success with these cards, too; this technology could easily trickle down into its integrated graphics.
 
IMO XeSS doesn't need to compete with DLSS at all, just beat or compete with FSR and people will seriously consider it a valuable feature set. A huge win if Intel figures out how to implement some form of XeSS driver-side with little to no dev input (don't think it's likely, but hey, Intel's engineers are talented).
First they need to make XeSS run on other cards. With their nonexistent volume, low user base and driver issues, I can't see many people buying Arc to use XeSS.
 
What does "At Par" even mean? If you have watched the entire video you'd soon realise that whilst XeSS is a good first effort from intel, it still isn't as good as an old version of DLSS in a lot of cases.
 
First they need to make XeSS run on other cards. With their nonexistent volume, low user base and driver issues, I can't see many people buying Arc to use XeSS.

Why would they care to make XeSS run on other cards? They make money selling their GPUs, not the competition's GPUs.
 
Why would they care to make XeSS run on other cards? They make money selling their GPUs, not the competition's GPUs.
Adoption. By limiting it to their own cards, they limit XeSS adoption. And unlike Nvidia, which has ~80% dGPU market share, Intel is a newcomer in the dGPU space where its market share is <1%. Aside from accepting cash from Intel, what incentive do game developers have to dedicate man-hours to integrating XeSS into their games?
 
First they need to make XeSS run on other cards. With their nonexistent volume, low user base and driver issues, I can't see many people buying Arc to use XeSS.
XeSS already runs on other cards. It was even briefly tested in the very video this article is about, which apparently nobody watched before rushing to offer their hot take on it. Including the person who wrote the article.
 
Adoption. By limiting it to their own cards, they limit XeSS adoption. And unlike Nvidia, which has ~80% dGPU market share, Intel is a newcomer in the dGPU space where its market share is <1%. Aside from accepting cash from Intel, what incentive do game developers have to dedicate man-hours to integrating XeSS into their games?

The product that makes money isn't XeSS, it's the GPU. The problem is if no one adopts Arc, not if no one adopts XeSS.

What would Intel prefer: selling lots of Arc cards with no one caring about XeSS, or the opposite? Easy. XeSS is a feature, not a product.
 
Well, this is positive. Now we need to see the complete package. A GPU that an everyday person can actually buy, perhaps.
 
Meh, being as good as DLSS 2.0 isn't a selling point.
It's basically proclaiming "hey, we aren't entirely useless" in this market.
 