
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,758 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Star Wars Jedi: Survivor is EA's new AAA title from the epic franchise. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a selection of modern graphics cards, with ray tracing enabled and disabled.

 
Does the review include the day-1 patch?
 
Yessir. As mentioned twice in the text, I tested the public release version, not the press build. There's no way to download the game without that patch.
 
Since Jedi Survivor is an AMD-sponsored title, I suspect the low-fi RT decision was made so that AMD hardware doesn't get hit too hard by the RT penalty. I guess AMD's sponsorship contract for smaller studios also includes "no support for DLSS/XeSS," as the only upscaling technology implemented is AMD FSR. Maybe AMD's contract should include "no stutters on the PC platform" instead, as that would actually make a meaningful difference in stopping the exodus of gamers to consoles.
Our savior AMD: low-fi RT and only their upscaler included. Not that it's their fault, but the game launching in such a horrible state is no boon either.

Multiple issues here, and EA basically coming out and saying it's barely their fault doesn't help.

This game released before it was ready, simple. And only including FSR is becoming hilariously inexcusable. AMD, if you're reading this: purposely not including DLSS/XeSS in sponsored games isn't your route to beat them.
As expected, VRAM requirements are high. With 9 GB in 1080p, older hardware probably won't have an easy time running the new Star Wars adventure. At higher resolutions like 1440p and 4K we've measured 9.6 GB and 10.5 GB, which should be fine, given the base GPU requirements.
What was interesting to me was that despite the high VRAM usage, the 8/10 GB cards didn't seem to fall apart in average or minimum FPS, at least as the graphs show it. I suspect that once the VRAM buffer is saturated, what ensues varies on a per-game basis.
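As a back-of-the-envelope illustration of why the measured numbers (9 GB at 1080p vs. 10.5 GB at 4K) grow so little with resolution, here's a sketch of the raw render-target footprint at each resolution. The render-target list below is a hypothetical deferred-renderer layout for illustration, not Jedi: Survivor's actual pipeline.

```python
# Rough per-frame render-target footprint at each tested resolution.
# The target list is a hypothetical deferred-renderer layout, not the
# game's actual one; real VRAM use is dominated by streamed textures
# and geometry, which don't scale with render resolution.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

# (name, bytes per pixel) -- e.g. an RGBA16F color target is 8 bytes/px
RENDER_TARGETS = [("color_hdr", 8), ("normals", 8), ("albedo", 4),
                  ("material", 4), ("depth_stencil", 4), ("motion_vectors", 4)]

def gbuffer_mib(width: int, height: int) -> float:
    """Total size of all render targets in MiB at the given resolution."""
    total_bytes = sum(bpp for _, bpp in RENDER_TARGETS) * width * height
    return total_bytes / (1024 ** 2)

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {gbuffer_mib(w, h):.0f} MiB")
```

Even at 4K, the resolution-dependent targets in this sketch add up to only a few hundred MiB, so the multi-gigabyte baseline has to come from assets that are resident regardless of resolution — which is consistent with the small 1080p-to-4K delta in the measurements.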
 
What about the Games that only support DLSS?
 
Lots of weird assumptions being made here, but shall we not gloss over that FSR works for... ya know... literally everyone? Including hardware old enough that Nvidia doesn't even care to give it DLSS support?

So really there's no reason to include anything else; everyone can use FSR.
I get that you might prefer DLSS because FSR still suffers from some shimmering bugs, but hey, ask the good people over at Nvidia to make their solution open to everyone, or to share how they solved it, and all problems are gone.
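For reference, FSR 2's quality modes each map to a fixed per-axis render scale, so the internal resolution the game actually renders at can be computed directly from the output resolution. The mode-to-scale mapping below follows AMD's published FidelityFX Super Resolution 2 values:

```python
# Per-axis render-scale factor for each FSR 2 quality mode,
# per AMD's FidelityFX Super Resolution 2 documentation.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution FSR 2 upscales from, for a given output size."""
    scale = FSR2_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

# e.g. 4K output in Quality mode renders internally at 2560x1440
print(render_resolution(3840, 2160, "Quality"))
```

This is also why upscaler image quality matters more at lower output resolutions: a 1080p target in Performance mode is reconstructed from just 960x540 pixels.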
 
Well, JHH either didn't pay them enough or paid them nothing to include DLSS. Though with RT & DLSS being their bread & butter, I'm almost sure they'll come around soon enough. What we need is an open standard, kinda like FreeSync or Vulkan, so that we don't have to be at the mercy of corporations buying their way into a game ~ but this is the year 2023 on Earth-616, and I don't see it happening any time soon!
 
In fact, it makes more sense to use an option that works on everything than to implement three technologies with the same (horrible) result. Saturating the VRAM won't always cause a loss of performance; sometimes you'll see textures disappearing and rendering issues instead.

Anyway, that's the last thing to point out as a problem with this game. :P
 
We've learned what we already knew:
1. News of pre-release issues is way overblown, and
2. Never buy a game on day one.

Thanks for the test. :)
 
"We're now paying $70 to beta-test an unpolished turd that they call AAA game—not the first time this year. I'm starting to wonder if these companies aren't slowly eroding their customer base by delivering broken products over and over again."

Well said W1zzard and there are a lot of gamers fed up with this trend.

As always, I would recommend waiting for the game to be properly patched and polished before buying and playing. If they never patch the problems effectively, and there is always that possibility, then don't buy until you can pick it up for $20 on a big Steam Sale or just pass on it altogether. Don't reward incompetency by paying full price.
 
The reason to include DLSS is that FSR is a piece of shit. It's not really a good thing that it works on everything when it's so bad that its only real use is at 4K and max quality, and even then it's filled with errors top to bottom. Use it at lower resolutions and you might as well put every detail on the lowest possible, because the image is destroyed completely. FSR only sounds good on paper; I'd rather not play a game if it doesn't run well enough and only has FSR.
 
I don't see any problem with the FSR-only implementation.

Look at RE4RE; it is also a recent FSR-only game by default, and it runs super great.

This poor performance is just EA being EA.

Blaming the FSR approach for the poor launch performance is purely emotional and just nonsense.
 
Wonder if a 7900 XTX version like the Nitro or XFX will score significantly better than shown in this test?
The reference 7900 XTX has a tiny cooler compared to all the 4080 and 4090 cards.
 
What is the way to beat them though? Because AMD making everything open source and trying to be the "good guy" hasn't worked for them either.
 
The Windows drivers are not open source, so we can't fix issues ourselves. Well, except by playing on Linux under Wine, but that opens another can of worms.
 
@W1zzard

Great overview!

One question: would it be possible to test medium & minimum graphics settings to see how VRAM usage drops?
 
"I guess AMD's sponsorship contract for smaller studios also includes "no support for DLSS/XeSS," as the only upscaling technology implemented is AMD FSR. Maybe AMD's contract should include "no stutters on the PC platform" instead, as that would actually make a meaningful difference to stop the exodus of gamers to consoles."

Maybe that's the plan, given the performance of AMD-tagged games in recent years. I guess "runs like crap on anything else but our cards" wasn't enough.
 
I've had a great time with the game so far - no issues, and great visuals.

 
Come on, don't do that; don't use a 4090 to tell us what VRAM the game uses at 1080p, this makes no sense. If you tested all the cards at different resolutions, why not tell us what VRAM each card used (reported allocation) per GPU?
 
What's all this ruckus about the game supporting only FSR and not DLSS? It's not like using FSR with an Nvidia GPU was impossible or anything.

Thinking of all the people who didn't cry when games only had DLSS in them... :rolleyes:
 
What about the Games that only support DLSS?
Few and far between these days. Nvidia welcomes the comparison because it's favorable.
Lots of weird assumptions being made here, but shall we not gloss over that FSR works for...ya know...literally everyone? including hardware old enough that Nvidia does not even care enough to give it DLSS support?

So really there is no reason to include anything else, everyone can use FSR.
The reason is most people don't own Radeon cards, and when the work is done for one upscaler, it's almost entirely done for any. Every new game, if it's going to have one, should include them all.
In fact, it makes more sense to use an option that works for everything than to implement 3 technologies that have the same (horrible) result.
they don't all have the same result ;)
I don't see any problem on the FSR-only implementation.

Look at RE4RE, it is also a recent FSR-only game by default, and it runs super great.

These poor performance is just EA being EA.

Blaming the FSR approach for the poor launch performance is pure emotional and just non-sense.
FSR legitimately looks terrible in that game, and TAA ain't much better, yet the DLSS 'mod' looks good; it would have been nice not to have to mod it in.
What is the way to beat them though? Because AMD making everything open source and trying to be the "good guy" hasn't worked for them either.
The way is being open and not blocking competitor solutions: play to the strength of adoption, try to be in as many games as possible, and be as good and current as possible — but not while actively disallowing objectively good alternative image-scaling technologies from sponsored games.
 
Ended up refunding it myself. Will try again in a year when it costs $20 and they have patched it a dozen times.

EA service rep on the chat was actually rather nice about it. Totally understood.


Though I am rather disappointed by all this. I liked the first game and was excited to play the sequel, but it was a choppy mess. I have a 4080 and a 7950X3D, and I'm trying to play a game that looks visually worse than Battlefront 2 from 2017, yet I need an upscaler to stay above 60 fps, and the only option is FSR, which looks as bad as DLSS 1.0 did!

Meanwhile my machine easily runs Battlefront 2 at 144 fps flawlessly, and the graphics are more impressive! The quality of work being done by game developers has fallen off a cliff. We peaked in the late 2010s, and it has been downhill since.
 
My hypothesis is that Respawn did test the game plenty, and it ran very well on PC. Then the execs decided they wanted Denuvo, tacked it on with no testing, and here we are. That's why the message from EA on Twitter is so obviously made up; it came from the execs, not the developers who actually spent time on the game.
 