
Fallout 4, 9700K @ 5.1 GHz, 32 GiB DDR4000: only a slight FPS diff. between 1080 Ti and 4090!

1080ti (2152/5670 @ 1.2V Vcore)
52-53 FPS looking down on city from skyscraper, 71-74% GPU load initially (yet increasing over time), one CPU core had higher usage than the rest, but none were pegged.

Gigabyte 4090 Gaming OC at stock
56 FPS and 26% GPU load at same exact spot as above

How is this possible?
 
What resolution and quality settings, and is the HD texture pack installed?

We need more info.
 
The Fallout series is notorious for running poorly. Aside from that, low FPS and (relatively) low GPU usage, as the figures you quoted, especially the 26% usage with the 4090, point to something else being the holdup. Just because you don't have 100% CPU usage, even on a single core, doesn't mean a faster CPU wouldn't help. It could be some part of the CPU internally waiting on something else that might be better in a newer model (not that the 9700k is a slouch by any means), and you won't see this through a CPU usage graph. In this case, there's no simple answer, other than the Fallout series just runs badly.
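To picture why usage graphs hide this: Task Manager averages each core over the sampling window, so if Windows bounces one saturated render thread across cores, no single core ever looks pegged. A toy calculation (the numbers are purely illustrative for an 8-core chip like the 9700K):

```python
def observed_per_core(thread_busy_pct, n_cores):
    """If the scheduler rotates one fully-busy thread evenly across
    all cores during the sampling window, each core reports only a
    fraction of that thread's real load."""
    return thread_busy_pct / n_cores

# One render thread at 100%, migrating across an 8-core i7-9700K:
print(observed_per_core(100, 8))  # 12.5 -> no core looks "pegged"
```

In reality the migration isn't perfectly even, so you see one core somewhat higher than the rest (exactly what the OP describes) rather than one core at 100%.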
 
Fallout 4 on an RTX 4090 - impressive. I guess the game engine inside is burning in hell while locked to 60 FPS :D
 
Fallout 4 settings used in tests:

in-game settings:
2804x1577
Antialiasing: TAA
AF: 16 samples
advanced - everything ultra except DOF {which is standard(low)} and Ambient Occlusion {which is HBAO+(ultra)}
advanced - every box ticked
View Distance - every slider maxed and everything set to ultra

nvidia control panel settings
i. image sharpening: on (this setting no longer exists in the Nvidia CP though)
ii. antialiasing gamma correction: on
iii. antialiasing transparency: 2x (supersample)
iv. low latency mode: Ultra
v. monitor technology: G-sync
vi. MFAA: on
vii. threaded optimization: on

At least I'm saving power, my 4090 is almost idling.
 
Fallout 4 is based on a terrible game engine that is very single-thread (ST) limited.

Pretty sure even if you ran 2.25x DLDSR you'd still be getting the same framerate with a 4090.
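That tracks with a simple max(CPU, GPU) frame-time model. The millisecond figures below are made up purely to illustrate why a faster GPU (or extra DLDSR load) can't move a CPU-bound framerate:

```python
def fps(cpu_ms, gpu_ms):
    """A frame can't finish faster than its slowest stage, so in a
    CPU-limited game the GPU time is irrelevant until it exceeds
    the CPU frame time."""
    return 1000 / max(cpu_ms, gpu_ms)

print(round(fps(18, 6), 1))    # 55.6 -> 1080 Ti-ish GPU time, CPU-bound
print(round(fps(18, 2.5), 1))  # 55.6 -> 4090 at the same spot, same FPS
print(round(fps(18, 13), 1))   # 55.6 -> 4090 at 2.25x DLDSR, still the same
```

Which matches the OP's numbers: ~52-56 FPS on both cards, with the 4090 just sitting at 26% load.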
 
I'm getting some bizarre issues in GTAV too, the FPS seems locked at 61-62FPS with the 4090 whereas with the 1080ti it would fluctuate from 45 FPS to > 100FPS.

Could this be because I'm using PCIe 3.0 and don't have resizable bar?
 
1080ti (2152/5670 @ 1.2V Vcore)
52-53 FPS looking down on city from skyscraper, 71-74% GPU load initially (yet increasing over time), one CPU core had higher usage than the rest, but none were pegged.

Gigabyte 4090 Gaming OC at stock
56 FPS and 26% GPU load at same exact spot as above

How is this possible?

Fallout 4 is heavily CPU-limited by the main render thread.
 
I'm getting some bizarre issues in GTAV too, the FPS seems locked at 61-62FPS with the 4090 whereas with the 1080ti it would fluctuate from 45 FPS to > 100FPS.

Could this be because I'm using PCIe 3.0 and don't have resizable bar?

GTA V is basically a CPU test at this point. Knowing how wildly FPS fluctuates depending on location/activity, all that tells me is that you didn't test comprehensively, apples to apples.

ReBAR on Nvidia doesn't just work everywhere the way it gives free performance on Radeon. Very few games are actually whitelisted by Nvidia - unless it's one of those games, or you manually force it on (just enabling it in the BIOS does nothing), nothing changes with ReBAR.
 
Your next goal should be updating your platform. While the 9700K was pretty decent, it's getting long in the tooth.

A 4090 at lower than 4K resolution is best paired with Ryzen 7000/X3D or 12th/13th-gen Intel.
 
I can test it with my 12900K and 4090 and see if that's better.

I remember there was a bug with shadow rendering where it rendered forever if set to max; try limiting shadow distance and quality to "Medium".
 
Bottleneck.

I doubt it; back in the day my 1080 Ti and 6700K pushed much more FPS in GTA V. For now it's unknown which settings are enabled in GTA V - the OP did not provide that information.
 
I had a 9700F; after the 3600xc-to-4700ti upgrade it was a mess, so I upgraded to a 7800X3D and now it's perfect, nothing like what games rendered with the 9700F... at 1440p, btw.
 
I'm getting some bizarre issues in GTAV too, the FPS seems locked at 61-62FPS with the 4090 whereas with the 1080ti it would fluctuate from 45 FPS to > 100FPS.

Could this be because I'm using PCIe 3.0 and don't have resizable bar?

I play GTA V at 120~158 FPS (low latency mode at Ultra), G-Sync @ 1440p (MSAA x2 + MFAA / FXAA).

[screenshot: 20230326173823_1.jpg]
 
I manually adjusted settings in the config file for GTA V (My Documents/Rockstar Games/GTA V/settings.xml) and then set it to read-only, so maybe that has something to do with the issues in GTA V. I don't have time to look into it ATM.
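For what it's worth, that kind of tweak can be scripted instead of hand-edited. A sketch: the path below is the usual location mentioned above, but the tag you pass in is whatever settings.xml entry you want to change - nothing here assumes specific GTA V element names.

```python
import os
import stat
import xml.etree.ElementTree as ET

# Usual location; adjust the Documents path for your system.
SETTINGS = os.path.expanduser(
    "~/Documents/Rockstar Games/GTA V/settings.xml")

def edit_setting(path, tag, value):
    """Clear the read-only flag, change one element's text, and save."""
    os.chmod(path, stat.S_IREAD | stat.S_IWRITE)  # undo read-only
    tree = ET.parse(path)
    node = tree.getroot().find(tag)  # first element with that tag
    if node is None:
        raise KeyError(tag)
    node.text = value
    tree.write(path)
```

Remember that if you re-apply read-only afterwards, the in-game settings menu silently fails to save - which is one way a read-only settings.xml can cause "bizarre" behavior.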
 
Here I adjusted the distance scaling and population density to max:

[screenshot: 20230417194757_1.jpg]
 
1080ti (2152/5670 @ 1.2V Vcore)
52-53 FPS looking down on city from skyscraper, 71-74% GPU load initially (yet increasing over time), one CPU core had higher usage than the rest, but none were pegged.

Gigabyte 4090 Gaming OC at stock
56 FPS and 26% GPU load at same exact spot as above

How is this possible?
You could try installing DXVK on Fallout 4. I had a substantial performance boost using it, but downtown Boston was still a lagfest.
 
You could try installing DXVK on Fallout 4. I had a substantial performance boost using it, but downtown Boston was still a lagfest.
That's exactly where I'm getting this issue. I'm on top of the skyscraper where you save somebody from super mutants, looking down, and it's a lagfest alright. Could it have anything to do with system memory speed? If Fallout 4 was designed for consoles that have super-fast GDDR5, maybe that could explain the lagfest? Because the CPU isn't getting anywhere near 100% utilization on any core.
 
Because the CPU isn't getting anywhere near 100% utilization on any core.
Just because it doesn't show as 100% utilized in Task Manager doesn't mean the CPU isn't a bottleneck.
 
That's exactly where I'm getting this issue. I'm on top of the skyscraper where you save somebody from supermutants looking down, it's a lagfest alright. Could it have anything to do with system memory speed? If Fallout 4 was designed for the consoles that have superfast GDDR5 maybe that could explain the lagfest? Because the CPU isn't getting anywhere near 100% utilization on any core.

On console it also runs like shite....
 
I'm getting some bizarre issues in GTAV too, the FPS seems locked at 61-62FPS with the 4090 whereas with the 1080ti it would fluctuate from 45 FPS to > 100FPS.

Could this be because I'm using PCIe 3.0 and don't have resizable bar?

No. ReBAR should be disabled on your setup.

That's exactly where I'm getting this issue. I'm on top of the skyscraper where you save somebody from supermutants looking down, it's a lagfest alright. Could it have anything to do with system memory speed? If Fallout 4 was designed for the consoles that have superfast GDDR5 maybe that could explain the lagfest? Because the CPU isn't getting anywhere near 100% utilization on any core.

Yes, RAM frequency and CAS latency matter a lot for performance.

But no, that isn't why you aren't getting 100% CPU utilization - I already told you why.
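Rough numbers on the "Hz and CL" point: first-word latency in nanoseconds is roughly CL × 2000 / transfer rate, since DDR does two transfers per clock. The kits below are just example bins, not anyone's actual memory:

```python
def first_word_latency_ns(transfer_rate_mts, cas_latency):
    """DDR transfers twice per clock, so one cycle lasts
    2000 / (MT/s) nanoseconds; CAS latency is counted in cycles."""
    return cas_latency * 2000 / transfer_rate_mts

print(first_word_latency_ns(4000, 18))  # 9.0 ns  (e.g. DDR4-4000 CL18)
print(first_word_latency_ns(3200, 16))  # 10.0 ns (e.g. DDR4-3200 CL16)
```

This is why tighter/faster RAM helps CPU-bound games like Fallout 4: the render thread spends less time stalled on memory, even though no usage graph will show it.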
 
Because the CPU isn't getting anywhere near 100% utilization on any core.
What is your CPU model? If you suspect a CPU bottleneck, you should be watching per thread rather than per core utilization.

EDIT: Never mind, just saw the title. For the i7-9700K, core = thread (no Hyper-Threading).

Could you provide a screenshot showing per thread usage in that particular spot?
 