
Fallout 4 Performance Issues

The build isn't for now; I was thinking of doing it around tax time. And I did figure it out: it's the power supply. I found the receipt from when I built it back in Dec 2011.
 
In my case FO4 with no mods at Ultra 1080p (no sun rays, it's buggy, waiting for a patch) does not use more than 2.5 GB of RAM and 1.6 GB of VRAM.

FO4 is not a demanding game; you can easily play it at Ultra 1080p with a system like:

i3-6100
8 GB RAM
GTX 960 or R7 370
500 W PSU

And this is a cheap system; if you can spend more, I can give you a more detailed recommendation.


I think the game gets "dumbed down" with insufficient CPU cores. After all, it's designed primarily for current-gen consoles with 8-core AMD APUs. My 6-core processor has all cores working above 50% in this game @ 1080p & 60 Hz, and this is all in the vanilla game.
 
Fallout 4 loves system RAM and runs noticeably better when it does as little as possible with the pagefile. So yes, 16 GB is no luxury here for maximum performance. But then again, why would you try to squeeze max performance out of a game like this, when optimization is utter shit across the board (frame drops in cities etc.) :). For normal/good performance, 8 GB will get you there easily.

I would also refrain from getting more than 8 GB, because you will be spending more money on DDR3 while any future upgrade of your system will most likely use DDR4 RAM.

As for the OP, stick to the discipline of finding the cause of your crash issue before buying new components, or this may turn into a very long-winded and expensive little issue. I am also leaning towards the PSU. When you replace this unit, double-check whether you can grab a 5-year warranty on it; it never hurts to ask, and a PSU should reasonably be in working order within 5 years, which is what you as a customer should be able to expect from it. Make sure you replace it with a solid, future/upgrade-proof unit: don't skimp on a few bucks, and grab a gold-rated, well-reviewed one with slightly overkill wattage so you have headroom for overclocks and for long-term degradation of the caps.
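To put a rough number on "slightly overkill", here's a minimal back-of-envelope sketch in Python using purely illustrative component figures (not the OP's actual parts) of how you might size the replacement unit with headroom for overclocks and cap aging:

# Rough PSU-sizing sketch; all wattages below are illustrative assumptions.
cpu_oc_w = 130   # CPU package power with an overclock (assumed)
gpu_w    = 180   # mid-range GPU under full game load (assumed)
rest_w   = 50    # motherboard, RAM, drives, fans (assumed)

worst_case  = cpu_oc_w + gpu_w + rest_w   # ~360 W DC load
recommended = worst_case * 1.4            # ~40% headroom for OC, spikes, cap aging

print(f"worst-case load ~{worst_case} W, shop for a ~{recommended:.0f} W gold-rated unit")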
 
The game is very stressful on the CPU's single-core/single-thread performance because it's stuck with a DX11 configuration. All draw calls get flogged on one CPU core/thread, plus it's a proper 64-bit game, so all the registers in the CPU will be working mighty hard.

This article proves my point, though it's explaining DX12 compared to DX11... heck, DX12 games can't come soon enough!
 
The game is very stressful on the CPU's single-core/single-thread performance because it's stuck with a DX11 configuration. All draw calls get flogged on one CPU core/thread, plus it's a proper 64-bit game, so all the registers in the CPU will be working mighty hard.

This article proves my point, though it's explaining DX12 compared to DX11... heck, DX12 games can't come soon enough!

64-bit DX11 games hammer my system so much harder than DX9 games do, with much higher wall power usage and more heat/noise.

DX12 being able to actually use the hardware we have has me worried about what next-gen gaming will truly be like - how many systems will overheat, or trip PSUs that "work fine" in DX9?
 
FO4 is running great on my rig with all settings at Ultra at 60 FPS, and I'm only using 4 GB of memory. The only thing I turned off/down was blur, because it was making me dizzy. Other than that... the game runs smooth as butter at 1080p on a P2414H.
 
64-bit DX11 games hammer my system so much harder than DX9 games do, with much higher wall power usage and more heat/noise.

DX12 being able to actually use the hardware we have has me worried about what next-gen gaming will truly be like - how many systems will overheat, or trip PSUs that "work fine" in DX9?

Yes, I've noticed with my power consumption meter that I'm pulling 100+ watts extra with DX11 compared to DX9 gaming (at least comparing Skyrim to FO4),
even with the same hardware setup as when I was gaming in Skyrim; it's a big step up in power consumption, no doubt.
Heck, even a game like Far Cry 4 actually drinks a little more than FO4 on my system (about 20 watts more on average), and both games are proper 64-bit DX11 titles.
A little off topic, but Far Cry 4 uses even more CPU resources, around 90% on all 6 cores of my CPU.

I also think CPU usage depends on what part of the FO4 game world your character is in. I've seen it as high as 93% on one core, but lately as low as 70-80% on all cores, according to HWMonitor.

Using GPU-Z yesterday, I saw 3.2 GB of VRAM used at its highest reading during that session, and that's only @ 1080p + 60 Hz, Ultra settings but god rays on 'low'.
 
I'm fairly confident that FO4 automatically scales textures to adapt VRAM usage, which is why performance can be odd - someone with way less VRAM could have it running smoother because it's using half the texture size. I've definitely spawned into combat areas leaving buildings and seen the HD textures pop in a few seconds later.
 
I'm fairly confident that FO4 automatically scales textures to adapt VRAM usage, which is why performance can be odd - someone with way less VRAM could have it running smoother because it's using half the texture size. I've definitely spawned into combat areas leaving buildings and seen the HD textures pop in a few seconds later.

Yes, I recall only very recently when I had the minimum-spec AMD card, an HD 7870 (GHz edition, 5% OC), running this game @ 1200p, 60 Hz & medium settings. It chewed up nearly all of its paltry 2 GB of VRAM, no doubt, but averaged 40 FPS according to FRAPS, across a mix of outdoor and indoor areas, still with the same CPU clock speed.
 
I'm fairly confident that FO4 automatically scales textures to adapt VRAM usage, which is why performance can be odd - someone with way less VRAM could have it running smoother because it's using half the texture size. I've definitely spawned into combat areas leaving buildings and seen the HD textures pop in a few seconds later.

That's mipmaps at work, and it's not optimized per situation; it just loads the lowest mipmap and works up from there. So say a texture is 2048x2048: it will load 128x128, then 256x256, 512x512, 1024x1024, and so on until it gets to the full size. Distance to an object also impacts this. Since it takes time to load/stream textures, the game will usually work its way up to max quality as it's loading, i.e. the minimum load necessary to work, then scale up the quality as soon as possible.

Another good example of a mipmap screwup would be Garrus in Mass Effect: to this day the game still loads the lowest-resolution texture for his face even at Ultra quality settings, because the mipmap setting for him is reversed, i.e. the game loads the smallest size up close and the biggest, clearest size when he's far away. Fallout 4 is doing nothing different than Skyrim other than the DX11 aspect.
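For anyone curious what that "start small and work up" streaming looks like, here's a tiny Python sketch of the idea described above; it's purely illustrative, not Bethesda's actual streaming code, and the distance thresholds are made-up assumptions.

# Toy mip-streaming sketch: keep a low mip resident so something draws immediately,
# then stream progressively larger mips; distance caps how far up the chain we go.
def mip_chain(full_size=2048, lowest=128):
    sizes, s = [], lowest
    while s <= full_size:
        sizes.append(s)
        s *= 2
    return sizes                           # [128, 256, 512, 1024, 2048]

def target_mip(full_size, distance_m):
    # Crude distance cap (thresholds are assumptions, just for illustration).
    if distance_m > 100:
        return full_size // 8
    if distance_m > 30:
        return full_size // 2
    return full_size

for size in mip_chain():
    if size > target_mip(2048, distance_m=10):
        break
    print(f"streamed in {size}x{size}")    # the steps you see as textures 'popping in'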
 
Yeah, I was staring at a wall console for half a minute before it showed higher res. In some places it loads fast or isn't noticeable, but in other places it's very obvious and varies from a few seconds to much longer.
 
DX12 being able to actually use the hardware we have has me worried about what next-gen gaming will truly be like - how many systems will overheat, or trip PSUs that "work fine" in DX9?
So you think DX12 will use the CPU more and therefore need more energy from the wall. Makes me think of Crysis - compared to that game, nearly nothing will change, because the CPU isn't the main energy user in the system; it's the GPU by far. What are most people using anyway? An i5-4460 with an 84 W TDP? That's not much, and 84 W is only the theoretical maximum; I guess it never reaches that. Only old CPUs or FX processors and i7-E CPUs need a lot of energy, but those people usually know that and have a suitable PSU for it.
 
So you think DX12 will use the CPU more and therefore need more energy from the wall. Makes me think of Crysis - compared to that game, nearly nothing will change, because the CPU isn't the main energy user in the system; it's the GPU by far. What are most people using anyway? An i5-4460 with an 84 W TDP? That's not much, and 84 W is only the theoretical maximum; I guess it never reaches that. Only old CPUs or FX processors and i7-E CPUs need a lot of energy, but those people usually know that and have a suitable PSU for it.

DX11 certainly does over DX9, and DX12 enables better CPU threading. I see well over a 100 W difference between DX9 and DX11 titles on my system, due to it using more of my hardware.

Think about it - if we're going from DX9 (mostly single-threaded) to DX11 (mostly dual-threaded) to DX12 (ALL THE THREADS!), then it's going to kick up the power consumption to get that performance. Obviously you could just run a DX11 GPU benchmark (Heaven 4.0) and a CPU stress test and get your synthetic maximums, which DX12 will come closer to than any existing game engine.
 
DX11 certainly does, and DX12 enables better CPU threading. I see well over a 100 W difference between DX9 and DX11 titles on my system, due to it using more of my hardware.

I bet someone will come out with a new Batman to clog up all your CPU threads with rubbish while producing... cinematic 30 FPS :D

BTW, now the CATS, I mean bloody... ahem... Crimson drivers are out with the Elite booster fix.
 
Yes, you see a big difference, but in which games? Please try it with Crysis or something similar (if it exists), but if you mean an average DX9 game, well yeah, that would be true.

Well, I could also say: just play BF4 or some other heavy game like Far Cry 4, The Witcher 3 or something. Compared to these resource-heavy games, I doubt a DX12 game will increase power consumption much. If your CPU is chilling at 50%, you have what, 40-50 W of power consumption on the CPU alone? In a resource-heavy game it's a bit more, I'd say 20-30 W more at most. The same applies to a heavy DX12 game. I don't think DX12 has any real downsides, but it has upsides for people with more than 4 cores or 8-thread CPUs, because it could tap into unused power then. Or do you mean the GPU will use more power too, because DX12 will tap into unused GPU power? Could be right, but I see that more in the red camp; I don't think GeForce GPUs have problems with usage, at least not if you use them right (a 980 Ti at 1440p+, for example). Radeon cards do have problems with being really used to the maximum, but if you increase the resolution or play something like Crysis 3, that disappears.

I understand what you're saying, but it depends on which games you are comparing - if you compare heavy DX11 with heavy DX12, I don't see a real difference (AMD Hawaii/Fiji GPUs at 1440p+, so they have high usage). But average DX9 to heavy DX11/DX12 is a big jump, yes.
 
It could be that in a heavy single-threaded workload the CPU eats way less power than under a medium load across all cores, especially if the CPU is an uber-high-TDP AMD FX.
 
I don't have any of the Crysis games installed to compare, sorry. Grab a Kill A Watt, or whatever the local alternative is in your country, and see what wattages your system pulls.


The only reason I brought it up here is that games like Fallout 4 definitely draw more power than other, older titles on the same hardware - and that can result in overheating, performance issues, and even hard reboots from PSU over-current protection on otherwise 'stable' hardware.
 
Well, no problem; so far I work with data I see on the internet, I don't care much about measuring it myself.

It may well be that Fallout 4 scales well, but people should have PSUs suited to their hardware anyway, so I don't see a problem - maybe the problem will be their rising electric bills, yeah. ;) But that's it.

It could be that in a heavy single-threaded workload the CPU eats way less power than under a medium load across all cores, especially if the CPU is an uber-high-TDP AMD FX.
That's a fact in DX9 games; in modern DX11 games it's more likely that 2-4 cores get used. I see medium-to-high usage on my i7-3820 clocked at 4.3 GHz in GTA 5 on all 8 threads (that would be 100-130 W), plus maximum usage on the GPU (225-250 W), btw. So it really depends (overall power consumption on my system is about 400-500 W in heavy games; DX12 wouldn't raise it much more). Resource-heavy DX11 games use almost all the hardware they can get; they only stop scaling if you have more than 6 cores, and some don't like Hyper-Threading, so they ignore it or run worse with it (but that means decreased power consumption then).
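Those figures are roughly self-consistent once you add the rest of the system and PSU efficiency. Here's a minimal Python sanity check; the "rest of system" wattage and the efficiency figure are my own assumptions, not measurements.

# Back-of-envelope check of the numbers above (estimates, not measurements).
cpu_w  = 130    # i7-3820 @ 4.3 GHz under heavy game load (upper figure quoted above)
gpu_w  = 250    # GPU at maximum usage (upper figure quoted above)
rest_w = 60     # motherboard, RAM, drives, fans (assumed)
eff    = 0.88   # typical PSU efficiency at this load (assumed)

dc_load = cpu_w + gpu_w + rest_w     # ~440 W drawn by the components
wall    = dc_load / eff              # ~500 W measured at the socket
print(f"DC load ~{dc_load} W, at the wall ~{wall:.0f} W")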
 
DX11 certainly does over DX9, and DX12 enables better CPU threading. I see well over a 100 W difference between DX9 and DX11 titles on my system, due to it using more of my hardware.

Think about it - if we're going from DX9 (mostly single-threaded) to DX11 (mostly dual-threaded) to DX12 (ALL THE THREADS!), then it's going to kick up the power consumption to get that performance. Obviously you could just run a DX11 GPU benchmark (Heaven 4.0) and a CPU stress test and get your synthetic maximums, which DX12 will come closer to than any existing game engine.

It's the only logical conclusion; I doubt you'd even need a Kill A Watt to determine that this phenomenon exists.

If you increase the load on components, you increase power consumption. Simple. One thread versus multiple threads also gives the GPU more wiggle room. Whichever way you twist it, it will increase. But you could also counter that with the fact that until DX12, games were the applications that never actually reached the maximum, expected power usage they should have always had.
 
That's a fact in DX9 games; in modern DX11 games it's more likely that 2-4 cores get used

There are Batman-level bad DX11 games too... anyway, it depends on the package leakage and how advanced the power gating is.
 
Well, there are lots of variables to consider; it's not as easy as saying DX12 will need more power than any DX11 game - it depends. And we will see soon, I hope.
 
I'm fairly confident that FO4 automatically scales textures to adapt VRAM usage, which is why performance can be odd - someone with way less VRAM could have it running smoother because it's using half the texture size. I've definitely spawned into combat areas leaving buildings and seen the HD textures pop in a few seconds later.

Really? I"ve never noticed that in FO4 so far, your version of the game is patched up and using latest drivers for your gear? I use to see that phenomenom a bit in Skyrim.
 
Well, there are lots of variables to consider; it's not as easy as saying DX12 will need more power than any DX11 game - it depends. And we will see soon, I hope.

Exactly. I love how folks on enthusiast forums speculate or make "educated guesses" about future tech based on historical or current data... lol... It's far more complex than most think it is...
 