
Starfield discussion thread

CPU performance isn't fine lol, it's what's bogging the game down. It wastes over 80% of the CPU cycles it consumes just flushing the pipeline; it's all machine clears. Try the Intel VTune profiler on your setup and see if you can replicate my results earlier in the thread
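For context, VTune's Microarchitecture Exploration analysis derives that wasted-cycles figure from Intel's top-down method. Here is a minimal sketch of the Level-1 "Bad Speculation" metric (machine clears and branch mispredicts both land in this bucket), assuming the classic formula for a 4-wide core; the counter values below are made up purely for illustration, not real Starfield measurements:

```python
def bad_speculation_fraction(uops_issued, uops_retired,
                             recovery_cycles, clocks, width=4):
    """Level-1 top-down 'Bad Speculation': the fraction of issue
    slots wasted on uops that never retire, plus slots lost while
    the pipeline recovers from a machine clear or mispredict."""
    wasted_slots = (uops_issued - uops_retired
                    + width * recovery_cycles)
    total_slots = width * clocks
    return wasted_slots / total_slots

# Illustrative counter readings (not real measurements):
frac = bad_speculation_fraction(uops_issued=1000, uops_retired=600,
                                recovery_cycles=50, clocks=500)
print(f"{frac:.0%} of pipeline slots wasted on bad speculation")
```

A healthy game loop usually sits well under 10% here; the claim in the post is that Starfield's figure is dominated by machine clears specifically, which VTune breaks out as a sub-metric under Bad Speculation.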
Really? No matter what I do, I'm GPU bound. I'm currently at 720p with 50% resolution scale on top of that, and still 99% GPU usage. I don't know, man, it's extremely GPU heavy
 

Turn off resolution scale, run at native 100%, turn off FSR2 and switch it to CAS, drop your settings all the way down to medium, and report back

Sorry, didn't mean to derail...

Was my fault, no one else's. @Solaris17, will move forward and no more comments on it, I wasn't thinking :toast:
 
OK, now I went from 125 to 160 fps, but still 99% GPU usage :P
 


Well, it should look good, and you can't complain about 160 fps, so I guess my work here is done
 
Was in grabbing an espresso and ran into Todd. Dude behind him was pretty surprised too.

[attached photo: Todd.jpg]
 
Just tried the game, 4K native with no upscaling or anything, and a 4090 sometimes drops below even the 60 fps mark. CPU performance looks fine, no issues on a 12900K; if I drop the resolution to avoid being GPU bottlenecked, it gets 130-160 fps. I haven't gone far yet, though.
I'm using the DLSS3 frame generation mod. Pretty much doubles fps.
 
Just tested some more: if you have an Intel CPU with E-cores, turn off HT. It boosted performance in the big city by a crapton. So HT off, E-cores on gives me the best results.
 

@Super Firm Tofu care to confirm with your 13900k? I'm just curious. If you don't care that's fine too.
 
My results: after turning off HT I got around a 10-15 fps boost in Atlantis, or whatever the big city is called. Turning off both HT and E-cores gave me a similar boost, but all cores were at 100% utilization (lol!) and the 1% lows suffered as a result.
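If you'd rather not toggle HT in the BIOS just to test, you can roughly approximate it per-process by pinning the game to one logical CPU per physical core. A Linux-only sketch using the standard library; it assumes SMT siblings are numbered in adjacent pairs (0,1 on core 0, 2,3 on core 1, and so on), which is not true on every topology, and it is not a substitute for actually disabling HT:

```python
import os

def pin_to_one_thread_per_core(pid=0, smt_per_core=2):
    """Restrict a process to every other logical CPU it is allowed
    to run on, roughly emulating 'HT off' for that process only.
    pid=0 means the calling process. Assumes SMT siblings are
    numbered adjacently, which varies by topology."""
    allowed = sorted(os.sched_getaffinity(pid))
    primary = set(allowed[::smt_per_core])
    os.sched_setaffinity(pid, primary)  # Linux-only syscall wrapper
    return primary

# Example: pin this process, then confirm the mask took effect.
before = sorted(os.sched_getaffinity(0))
mask = pin_to_one_thread_per_core()
print(f"was allowed {len(before)} CPUs, now pinned to {len(mask)}")
```

On Windows the rough equivalent is Task Manager's affinity dialog or `start /affinity <hexmask>`; either way, the BIOS toggle the posts above used is the more reliable test, since it also frees up the shared execution resources rather than just steering the scheduler.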
 
@P4-630 pesky E-core monsters! :roll:

edit: on a more serious note, I am going to disable hyperthreading on my 5600X3D, see if that gives me any boost :roll:
 
Cyberpunk was just bad; its issues couldn't be chalked up to Nvidia or anyone but CD Projekt themselves. It launched in a horribly broken state and it left a terrible taste in my mouth. Not sure if you ever saw crowbcat's video on Cyberpunk; as comically awful as it sounds today, every bit of it was true when it first came out. I never finished or even returned to the game, even though I constantly hear it has improved plenty. On the other hand, Starfield has been everything I wanted out of Cyberpunk and a little more :)

As with every new player, Intel's GPUs will have some kinks. To their credit, however, they are constantly working on, improving, and iterating on their software, and building a solid base faster than anyone else. I think they deserve more than a fair chance and our support. People buying Arc today are largely aware of this; Intel has made no effort to hide that its first-generation product is under heavy development. I don't know about you, but if Intel delivers a good driver by the game's official launch in two days, as promised, then I'm more than happy to let it slide. It's not as though the Nvidia driver is super performant (despite working correctly), and AMD, unsurprisingly, has admitted its driver is broken at launch and listed open issues for the game from the get-go.



Agreed. But ultimately, not Intel's fault either.

No, cyberpunk was not and is not "just bad".

I played it from day 1 and had no issues bar a couple of visual glitches, and the game is immense.
 
As others pointed out, I barely see any CPU utilization... seems off.
 
My 4080 gets destroyed by last gen cards???
Who made this game!!!?
 
That console optimization finally paying off.
 
Does this game have aliens or other such beings, or is it only humans fighting it out there?
 
The 6800 beats the 3090. AMD at it again, bravo.
Looking forward to trying the game; the results are shocking, especially since the useless RT isn't used.
I wonder how the Series X fares ;)
 