• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Star Wars Jedi: Survivor has Major CPU and GPU issues

Well, we will see in a few hours, but this is... I don't know, I'm not even sad anymore.
 
Another unoptimized AAA console port. Are we surprised in the slightest that this is happening? RE4, TLoU, and now this.
 
That's not good, and considering that I played and finished the first game with a 1600X/GTX 1070 on max settings at 2560x1080 with no real issues to speak of, I'm wondering what the hell happened here, since I can't see how this game looks that much better than the first.
I'm interested in the game because I did like the first, but I won't be playing it anytime soon; maybe when it's added to Game Pass or discounted.

The first game doesn't have RT features, but this one does, so if someone enables RT, the game's VRAM and RAM usage skyrockets.
 
I've read it's working fine on AMD GPUs, so it could be an NV driver issue.
It's not that AMD's drivers have more CPU overhead.
I'm all for pushing hardware limits; that's why I got a 7950X3D, 64 GB of DDR5, and a 4090. But there's no reason you shouldn't be able to run 60 FPS maxed at 1440p with a 5900X and a 3080 Ti. I know NVIDIA should have moved past 8 GB on their GPUs a long time ago, but that doesn't change the fact that the bulk of PC gamers have 8 GB or less of VRAM. If you believe the Steam surveys, most people use cards like a 1650, a 1060, and so on.

It's sad, but I've bought more games on PS5 and Series X lately because of bad PC ports. I was impressed with Dead Island 2; it even ran well on the PS4 and Xbox One. Ray tracing isn't worth the performance hit for most gamers, and in most games you can't tell a difference. There are a couple of games where I use it, but it's few and far between.
 

I was talking about RT off. The game has the same issues even with RT off, and with it off the game doesn't exactly look better than the first one.
I watched a budget channel's performance stream last night, and the game couldn't even fully utilize a GTX 1650 with an i5-10400 at 1080p medium with low textures; it was getting 30-40 FPS at ~70% GPU utilization. That's how bad it is, and it has nothing to do with RT in this game.
 
I just got word from the IT folks that they will be adding more cowbells, so all should be resolved.
 
UE4/5 games are a curse to PC gaming.

Absolutely true. And not just PC gaming.

On the last console generation there were almost no Unreal Engine 4 games. And we didn't have many terrible PC ports. Of course we all remember one of the worst - Arkham Knight (which was actually UE3).

Why were there no UE4 games? Because that engine has abysmal multi-threading support, and it was almost impossible to run it well on the Jaguar CPUs, where multi-threading was crucial. Only at the end of that generation did UE4 games start to show up, once developers had gotten to know the hardware really well.

The new consoles have just enough CPU power to get this engine to 60 FPS in the hands of somewhat competent developers (definitely not those responsible for Gotham Knights or The Outer Worlds next-gen update). And this translates to PC, where CPU IPC is the only relevant thing if you want to go above 60 FPS in these games.

I've always hated Unreal Engine. It has always had traversal stutter and all kinds of technical issues. The only good thing is the visual quality, but it's not worth the price. I'm really dreading the prospect of UE5 becoming popular. So many games use UE4 these days, and they all run like trash (except for Dead Island 2, which doesn't really look next-gen). With UE5, 60 FPS might not even be possible.
 
Not really. The problem is that devs need to realize that and make use of it, not be lazy.
It's very much a skill problem; devs aren't lazy people by nature. The idea is absurd.
 