Friday, April 21st 2023

Intel XeSS Provides 71% FPS Uplift in Cyberpunk 2077

CD Projekt RED, the developer of Cyberpunk 2077, has added support for various super sampling technologies to the game, including NVIDIA DLSS, AMD FSR, and now Intel XeSS. With the inclusion of XeSS version 1.1, Intel's Arc Alchemist graphics cards can see a significant performance uplift. Thanks to figures from Intel's gaming blog, we can compare XeSS enabled versus XeSS disabled in Cyberpunk 2077 at 1080p Ultra settings with medium ray tracing. The FPS comparison was conducted on an Intel Arc A750 Limited Edition GPU paired with an Intel Core i9-13900K and 32 GB of RAM.

With XeSS off, the A750 struggled, reaching only 39 FPS. With XeSS set to Performance mode, however, the frame rate jumped to 67 FPS, making for noticeably smoother gameplay. That is roughly a 71% performance uplift, enabled by the new game update. Interestingly, Intel XeSS runs on the Arc cards' dedicated XMX units, whereas AMD FSR is computed on general-purpose shader units and NVIDIA DLSS relies on its own dedicated Tensor cores.
Source: Intel
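For reference, a quick back-of-the-envelope check of the quoted percentage (a minimal sketch using the FPS figures above; the small difference from the 71% headline presumably comes from rounding of the underlying frame rates):

```python
# Relative performance uplift: (new - old) / old, using the figures from Intel's blog.
xess_off_fps = 39           # 1080p Ultra, medium ray tracing, XeSS disabled
xess_performance_fps = 67   # same settings, XeSS 1.1 Performance mode

uplift = (xess_performance_fps - xess_off_fps) / xess_off_fps
print(f"Uplift: {uplift:.1%}")  # Uplift: 71.8%
```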

16 Comments on Intel XeSS Provides 71% FPS Uplift in Cyberpunk 2077

#1
john_
XeSS Performance, so the worst possible quality from that upscaling tech. How about XeSS Quality mode? And no comparison with an RTX 3060 and DLSS Performance? I wonder why. Intel loves comparing with the RTX 3060.

But seriously, I hope they keep improving and persist in trying to become a third option in the market.
#2
Kyan
Cool, but what's the uplift with a CPU that doesn't cost as much as two A750s?
#3
Owen1982
While I appreciate companies trying to improve their products, all of this XeSS, DLSS, blah blah xyz is just a workaround for slow GPUs or badly coded software. They are not actually addressing the issue, it's just a bandaid - they are just trying to cover up the problem. Some users may not notice (or care) that there are artifacts and other weird side effects that degrade the picture quality... but I do. I wish they would just release faster graphics cards.
#4
Vya Domus
I tested XeSS in Cyberpunk myself, and at 1440p "Performance" barely looks acceptable, comparable with FSR in "Performance". At 1080p I bet it looks pretty horrendous, just like FSR would.
#5
Daven
ReBAR mandatory for Xe and soon XeSS mandatory for Xe.
#6
TheinsanegamerN
Kyan: Cool, but what's the uplift with a CPU that doesn't cost as much as two A750s?
Uhh..... still the same, since it doesn't take much CPU power to run CP2077 and with an A750 you will always be GPU limited.
Owen1982: While I appreciate companies trying to improve their products, all of this XeSS, DLSS, blah blah xyz is just a workaround for slow GPUs or badly coded software. They are not actually addressing the issue, it's just a bandaid - they are just trying to cover up the problem. Some users may not notice (or care) that there are artifacts and other weird side effects that degrade the picture quality... but I do. I wish they would just release faster graphics cards.
Well, they try, then people whine about "muh $200 GPU" and such.

Your real issue here is that developers do not optimize their code well, but that issue isn't new and will never stop existing.
#7
Kyan
TheinsanegamerN: Uhh..... still the same, since it doesn't take much CPU power to run CP2077 and with an A750 you will always be GPU limited.
It's just plain stupid not to try to make a more or less realistic test if there's almost no difference.
#8
Aquinus
Resident Wat-man
You know, I'm hearing that the experience with these Intel GPUs is gradually improving as they harden their drivers, which isn't totally unexpected for a first gen of discrete GPU. I'm interested to see how Intel's GPU lineup evolves over time because I'm all for a 3rd option. This is also why I don't tend to be an early adopter of tech. Let other people be the guinea pigs.
#9
Daven
Aquinus: You know, I'm hearing that the experience with these Intel GPUs is gradually improving as they harden their drivers, which isn't totally unexpected for a first gen of discrete GPU. I'm interested to see how Intel's GPU lineup evolves over time because I'm all for a 3rd option. This is also why I don't tend to be an early adopter of tech. Let other people be the guinea pigs.
From all the news I'm reading, Intel is heading in the opposite direction from releasing future improved generations of GPUs. If it's indeed going in that direction, Intel will need to implement a different strategy. IDM 2.0 or whatever it's called seems to be just trying to stop the bleeding, but not much else.
#10
TheinsanegamerN
Kyan: It's just plain stupid not to try to make a more or less realistic test if there's almost no difference.
This has been hammered to death already: in a GPU test you don't introduce other aspects of the test that could change the results. If you think these should be tested with low-end CPUs, you are welcome to deal with the hordes of opinions of people discussing what is an "appropriate" CPU to test with each GPU and why none of the results are comparable.

The uplift is not CPU dependent. The game is GPU limited. We don't need to test with multiple CPUs to see if it will change (it won't). If you want to know how the game will work with another CPU, go look up a CPU scaling review; they are out there and will answer your question. We don't need to do that for every GPU result. It's not "stupid" to avoid redoing testing to confirm something we already know.
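To illustrate the point, here is a toy bottleneck model (made-up CPU figures, not benchmark data) that assumes the delivered frame rate is capped by whichever of the CPU or GPU is slower. As long as the CPU can prepare frames faster than the GPU can render them, the measured uplift is the same no matter which CPU is installed:

```python
# Toy bottleneck model: delivered FPS is limited by the slower of CPU and GPU.
def delivered_fps(cpu_cap: float, gpu_fps: float) -> float:
    return min(cpu_cap, gpu_fps)

# A750 figures from the article; the CPU-side caps are hypothetical.
for cpu_cap in (110.0, 200.0):
    native = delivered_fps(cpu_cap, 39.0)
    xess = delivered_fps(cpu_cap, 67.0)
    print(f"CPU cap {cpu_cap:.0f} FPS: {native:.0f} -> {xess:.0f} FPS, "
          f"{(xess - native) / native:.0%} uplift")
```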
#11
PapaTaipei
Intel could very well kill nvidia by releasing an APU that would combine a beast GPU and CPU.
#12
RandallFlagg
PapaTaipei: Intel could very well kill nvidia by releasing an APU that would combine a beast GPU and CPU.
I suspect that's what the plan is (or was) with the tiled GPUs. I would definitely go for a small form factor, say Mac Studio size, with ~4050 levels of performance, versus an ATX tower and a big GPU.

So far it doesn't look like they are able to get to that level though, even with discrete. We'll see once they get Intel 4 rolling.
#13
Easo
PapaTaipei: Intel could very well kill nvidia by releasing an APU that would combine a beast GPU and CPU.
Doubtful, and the same goes for AMD. How are you going to deal with heat, not to mention the technical challenges of making it small enough to fit into a CPU socket?
#14
RandallFlagg
Easo: Doubtful, and the same goes for AMD. How are you going to deal with heat, not to mention the technical challenges of making it small enough to fit into a CPU socket?
Apple is able to do this with a 15W chip.

#15
N3utro
Owen1982: While I appreciate companies trying to improve their products, all of this XeSS, DLSS, blah blah xyz is just a workaround for slow GPUs or badly coded software. They are not actually addressing the issue, it's just a bandaid - they are just trying to cover up the problem. Some users may not notice (or care) that there are artifacts and other weird side effects that degrade the picture quality... but I do. I wish they would just release faster graphics cards.
That's how things are now, whether you like it or not.
#16
Aquinus
Resident Wat-man
PapaTaipei: Intel could very well kill nvidia by releasing an APU that would combine a beast GPU and CPU.
I just want to see more mid-level GPU offerings to improve competition, because GPU prices are still pretty insane. I miss the days when I could buy a one-generation-old Vega 64 for $300 USD, and we're nowhere near that.