
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Alex is honestly a tool... you can clearly see in the OSD that the game is using the e-waste cores all the time... if he disabled the e-waste cores (which you should regardless, unless you're using a laptop), most of the stuttering would be gone...

I have no stuttering issues with my 7800x3d.

I'm sorry, but that is an incredibly uninformed remark. E-cores are always busy doing background stuff; they basically never get used for games. He also specifically showed the 3600, which doesn't even have E-cores, AND that has nothing to do with the shader compilation issues.

"I have no stuttering issues" is just a complete lie; it's impossible for you not to have them. It's a problem with Unreal Engine. At most, you just don't have the ability to see them because you're not sensitive enough to that sort of thing...

There is a tool here, yeah, but it's not Alex....
 

Yes, your post is incredibly uninformed - the fact that all the e-waste cores have 30%+ load in many instances in Alex's video, that's just background tasks... sure...

They aren't supposed to be used by games, but the fact of the matter is that they often are, hence why performance usually improves a bit by disabling them, and frametime consistency increases drastically.
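For anyone who wants to test the affinity side of this themselves without touching the BIOS, here's a minimal sketch (Python, using the Linux-only `os.sched_setaffinity` API; on Windows you'd use Task Manager > Details > Set affinity, or a tool like Process Lasso). The core numbering in the comment is hypothetical and varies by CPU and BIOS, so verify it on your own system first:

```python
import os

def pin_to_cores(pid: int, cores: set) -> None:
    """Restrict a process to the given logical CPUs (Linux-only API).

    Hypothetical example layout: on some hybrid Intel chips the
    hyperthreaded P-cores come first (e.g. logical CPUs 0-15) and the
    E-cores after (e.g. 16-31), but always check your own numbering
    (e.g. with lscpu) before pinning a game to a core set.
    """
    os.sched_setaffinity(pid, cores)

# Demo of the call shape: re-pin the current process to the cores it
# already has (a no-op). For a game, you'd pass its PID and the set of
# P-core IDs instead.
pin_to_cores(0, os.sched_getaffinity(0))
```

This only masks which cores the one process may run on; it doesn't stop Windows/Linux from scheduling other software on the E-cores, which is part of why results differ between a BIOS-level disable and per-process affinity.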

[Attached image: cyberpunk-2077-1280-720.png]


As for me just not being able to see the stutters: I'm using the AB frametime graph, which can also be seen in the screenshots I posted in this thread - so wrong again.

And as for your last remark: yeah, you just posted...
 
Most games run better with the e-waste cores on. You need to be a special kind of brainiac to turn them off, lol
 

Except most don't. Even if the average fps increases, most games will see a sharp decline in frametime consistency with them enabled.
 
I don't know, man - according to TPU, e-cores on got a 0.9% advantage, and I can lose up to 9% of performance with e-cores disabled in some games. Seems like a toss-up to me *shrug*
The biggest takeaway from this article is that you really shouldn't worry about the E-Cores. Even in worst case scenarios, the performance differences aren't huge—up to 10% is something you'd barely notice—even with the fastest graphics card available.
[Attached image: 1682818966600.png]
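For anyone wondering what the "1% / 0.1% low" figures being argued about here actually measure: they're commonly derived as the average fps over the slowest slice of frames, which is why a handful of frametime spikes can tank the lows while barely moving the average fps. A rough sketch of one common calculation (tools like CapFrameX, AB, and review sites differ in the exact method, so treat this as an approximation, not any specific tool's formula):

```python
def average_fps(frametimes_ms):
    """Plain average fps from a list of per-frame times in milliseconds."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def percentile_low_fps(frametimes_ms, pct):
    """Average fps over the slowest pct% of frames (one common way
    '1% low' / '0.1% low' figures are computed; tools differ in detail)."""
    n = max(1, int(len(frametimes_ms) * pct / 100.0))
    worst = sorted(frametimes_ms, reverse=True)[:n]  # slowest n frames
    return 1000.0 * n / sum(worst)

# 99 smooth 10 ms frames plus a single 50 ms stutter spike:
capture = [10.0] * 99 + [50.0]
print(round(average_fps(capture), 1))         # ~96.2 fps average
print(round(percentile_low_fps(capture, 1)))  # 20 fps "1% low"
```

One stutter in a hundred frames barely dents the average (96 fps) but drags the 1% low down to 20 fps, which is exactly the "average fps up, frametime consistency down" trade-off being debated.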
 
Thanks for the performance review. Won’t be buying this for a while
 
On my old 1070 Ti and overkill CPU it is "usually" playable on High at 1080p, but it still has the same random FPS drops and stuttering as others report - generally, though, it ran better than I thought it would, especially after all the rage posts (I tend to partially ignore those, as I generally have few issues with problematic games; going to guess that people simply don't update their drivers/firmware). Am I angry at EA? Yes, screw them. Screw Respawn? Yep, I refuse to believe it was all just because EA pushed them.

Technical issues aside, the game IS good without a doubt, and a great improvement over the first one, which was already a breath of fresh air. SteamDB shows patches are in the works; I just wonder how long it will take to make it run stably and use all the resources your PC can provide. The tone-deaf answer on Twitter was fascinating. They knew, we know they knew, they knew we would know they knew...

P.S.
At first glance, in screenshots and trailers, the graphics overall did not look that much better compared to Fallen Order. However, in-game the difference becomes obvious. Lighting and shadows are vastly improved and more complex, especially indoors. Environments have greater detail density, again especially indoors.
 
I don't see any problem with the FSR-only implementation.

Look at RE4RE: it's also a recent FSR-only game by default, and it runs super great.

This poor performance is just EA being EA.

Blaming the FSR approach for the poor launch performance is purely emotional and just nonsense.
Poor performance is mostly the result of it being an Unreal Engine 4 game. All Unreal Engine 4 games run like garbage. The only UE4 game I've ever seen run decently is Fortnite, made by the developers of UE4, and even it's no longer on UE4.
 
That UE4 runs pretty well, and looks good too. I think the issue is UE4 reaching its limits.
[Attached images: 1682826072427.png, 1682826139119.png, 1682826106838.png]
 
You know, I see a lot of blaming EA here, but this game is running the same engine as the first game, with likely some of the same assets. I've often wondered this - don't tell me I'm right - these devs are not rebuilding these entire games from scratch each time, right? Surely they'd be smart enough to reuse the same engine and all the work that went into it, and simply make new levels, rather than waste tons of time and resources rebuilding the engine implementations for each release.

Right? Surely, they are not THAT shortsighted, right? Right? This isn't another case of EA being too hands-off and the dev derailing like BioWare, right?

Oh who am I kidding......
 

It has a lot of the same issues the first game had at launch.
 
It's crazy to me that people are defending the decision to ship with only FSR when DLSS is barely more than a checkbox in a UE game. It's disgusting.

This notion that since FSR works on everything it's the only upscaler everyone will ever want or need is the height of hubris.
 

Personally, I just don't care that much whether it's included or not; this game has more important issues that need to be ironed out versus being fixated on which upscaling techniques it uses.

Some games only have DLSS, some games only have FSR, some games have both as well as XeSS; life goes on.

My preference is that they have all 3, but it isn't the end of the world if they have none either, as long as the game is properly optimized, without shader compilation issues and random zone stutter.

I care more about the game being good and in a good state.
 
Except most don't. Even if the average fps increases, most games will see a sharp decline in frametime consistency with them enabled.
Not true at all. Warzone 2, Spider-Man, The Last of Us, and Cyberpunk get a huge boost to 0.1% lows with e-waste cores on.
 
@W1zzard

Dunno if you are aware, but this game also has issues with ray tracing, quite similar to Hogwarts Legacy.

Dunno which video I saw this demonstrated in (using a 7800X3D + 4090), but in order to enable RT, you have to DISABLE IT FIRST and then enable it again; if not, the game "says" it's on, but it's not.

Just a heads up.

I heard of those reports but had no issues switching RT, I did check that FPS go down, maybe the report was based on the press build?
 
I saw the same video and the author (Daniel Owen) was wrong. If you start the game with RT on, it actually is on. But, if you toggle it off then back on then the frame rate tanks, but not because rt just got enabled (it was already enabled), but because of a bug. DF showed this issue in their video and I’ve verified it myself.
 
maybe the report was based on the press build?

No idea.

I saw the same video and the author (Daniel Owen) was wrong. If you start the game with RT on, it actually is on. But, if you toggle it off then back on then the frame rate tanks, but not because rt just got enabled (it was already enabled), but because of a bug. DF showed this issue in their video and I’ve verified it myself.

That's likely it, then: i think that's the video i saw.
 
I heard of those reports but had no issues switching RT, I did check that FPS go down, maybe the report was based on the press build?

No it still happens in the current live build. If you enable rt without restarting the game, the fps is much lower than it's supposed to be - but after a restart, it's where it should be :)
 
@W1zzard read the review, but shame about your RT comments. I think a dev making the effort to keep lighting good without RT is a good thing? They shouldn't be making players rely on RT just to support a vendor tech.

Interesting about the accessibility features of the game.

Watched streamers play the game; it seems quite buggy and stuttery, with some texture mud issues (on PC, not console) (possible VRAM starvation?).

Do you plan to test this with older-generation CPUs? Your comment on that I find really alarming - sounds like a 9900K would struggle to hit 60 fps in this game.

--

These new games make my 10 GB 3080 seem obsolete really fast; in older gens it didn't feel so "rapid".

But I do think only using RT lightly in games is the way to go.

Looking at current GPU prices it is a tough one. I was considering a 6950 XT, but in this game it's really only the bank-busting top-of-the-line GPUs that I consider to have reasonable performance.

As was said in the VRAM thread, devs will only optimise to the absolute minimum, so new hardware being released means less optimisation in games; the only way this rapid obsolescence can slow down is for consumer hardware releases to slow down. I personally wouldn't mind GPU generational gaps being as long as console generational gaps.

Also, with all these bad ports, UE4 seems a common factor.

-- Found a 9900K user on Reddit who says it's unplayable regardless of settings. o_O Also a 7800X user with CPU bottlenecking. O_o
 

I played the game a little; TAA looks quite bad (blurry, with lots of disocclusion artifacts). DLSS would certainly be the superior AA option (if it were included).
 
This pretty much sums it up. It's asinine to blame AMD or Nvidia when EA has a LOOOOOOOOOOOOOOOOOOOOOOOONG history of releasing incomplete garbage which will be fixed in patches later. The memes are more abundant than the bugs this situation causes.
Exactly this!

It's strange to see that whenever a well-done game gets released with DLSS in it, it's the game developer who's done the great work, but then, whenever a badly made port gets released, it's immediately AMD's fault that the studio didn't bother with proprietary Nvidia technologies, and if somebody genuinely likes the game, we have to spoil their fun by saying "you only like it because you're a sucker for Star Wars". Why?

Let's not forget that we're talking about EA here, guys. They're notoriously famous for rushing out half-baked console games, so why is this time any different? Just because it's Star Wars? C'mon! :shadedshu:

Also, it is possible for other people to like a game that you don't like, believe it or not. Just like many people pay good money for every iteration of CoD and BF even though they're exactly the same as the last one.
 

I'm not a huge fan of most TAA implementations, and I agree DLSS, assuming it's implemented properly, could be better; at the same time, I'm not so entitled that I think it should automatically be included with every game.

Personally, it annoys me more when games only support DLSS, since that locks out anyone who doesn't have an RTX GPU.
 
There's a lot to unpack here but let's go.

18 GB of VRAM for 1440p usage seems to be a bug/memory leak, maybe with some clickbaity sensationalism thrown in. W1zzard's testing is a bit more thorough than one site uploading a clip to YouTube without controlled testing.

Performance is horrible, and it cannot be blamed entirely on Denuvo. The PS5 is the least-bad release but still has plenty of issues, and the Series X is broken as hell, possibly even more broken than the PC version, given that the majority of the Series X issues are in cutscenes and this is a story-driven game.

FSR is the right answer. DLSS does, when it's given extra special attention and tuning from both Nvidia and the developers, sometimes look a lot better than FSR. But it only runs on some of one brand of GPUs, and it looks as bad as (or worse than) FSR in a few scenarios; so while it's a better solution when showcased and cherry-picked, it's often not great. All upscaling is a garbage crutch to hide the fact that RT is too demanding for the resolutions gamers expect, and if a developer has to put upscaling tech into their game, I'd rather it be a platform-agnostic scaler that doesn't further divide the PC gaming market.

Ended up refunding it myself. Will try again in a year when it costs $20 and they have patched it a dozen times.
This is the way.
 

The game is a turd right now. Maybe it will get fixed, or partially fixed; we'll see. Yes, the blame lies with EA, as always - EA, the developer killer. It is well known that Star Wars games sell pretty well even when many are subpar, IMO, just because a lot of people love Star Wars.

I really enjoyed KOTOR. Great game - but where did Star Wars games of that caliber go? Publishers/developers make half-assed attempts at Star Wars games now and count on the Star Wars theme to sell more units than they should.
 