
Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough

Windows has a lot to do with that. I have 8 GB of RAM, and when I'm only using 3 GB or 4 GB I keep seeing hard faults, which is kind of weird; with that much RAM available, this shouldn't be happening.

Maybe Windows is also guilty for this. A while back I was constantly getting "Close programs to prevent information loss" after leaving Firefox or some other program open for a few hours. I was getting this on Windows 8.1, but it never happened to me on Windows 7, well, except when I formatted an external USB drive and Explorer started eating all the memory because of the memory leak (thank you Microsucks for not fixing that, idiots).
A hard fault is any time the page file is accessed. So you can be idling and Windows will decide to move stuff out of memory to make more room. You can disable the page file and still get hard faults; the difference is that the hard faults aren't hitting a real drive but essentially a RAM disk instead of a real disk.

You don't need to fill system memory in Windows for it to start hard faulting; it just does it a lot more often when you start running out of space. Another reason why I run without a page file.
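For anyone who wants to watch this happening on their own machine, here's a minimal sketch of a working-set/fault watcher using Python's psutil, assuming a Windows box (where memory_info() exposes a num_page_faults field; note that counter lumps soft and hard faults together, so Resource Monitor is still the better tool for hard faults specifically):

```python
import sys
import time
import psutil  # pip install psutil

def watch(pid, interval=1.0):
    """Print working set, commit, and page-fault delta once per interval (Windows)."""
    proc = psutil.Process(pid)
    last_faults = proc.memory_info().num_page_faults  # Windows-only field; soft + hard faults
    while True:
        time.sleep(interval)
        info = proc.memory_info()
        delta = info.num_page_faults - last_faults
        last_faults = info.num_page_faults
        print(f"working set {info.rss / 2**20:8.1f} MiB | "
              f"commit {info.vms / 2**20:8.1f} MiB | "
              f"faults since last sample: {delta}")

if __name__ == "__main__":
    # Pass the game's PID on the command line, e.g.  python watch_faults.py 4321
    watch(int(sys.argv[1]))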
 
Huh?

On a side note, anyone fancy a BF4 Conquest game? I just put up this new 60Hz server on PC:

-=THE EAGLES=- 60Hz |No Limits|Conquest 24/7|4FUN|Free 4 All

Anyone from the EU (and surrounding areas) is very welcome to join!

cheers lads! :)
 
It's awesome! Let's have some uncompressed textures, put them inside the game and be done with it. No wonder the game is more than 50GB....
Ridiculous!!
 
Must be the shitty old engine code they're still using; they should have just paid Unreal 5% and used Unreal.
 
@Xzibit Will do, on my main rig (Rig 1), and also to see if there's any impact on Rig 2. I'll run without those anti-aliasing methods to gauge the performance impact.
 
Latest game stresses latest hardware on top-shelf "ultra" benchmark settings. News at 11.
 
Well, we'd better see some killer number-crunching graphics cards from NVIDIA or AMD. It seems most of the latest game titles are very demanding. I'm sure Fallout 4 and Deus Ex: Mankind Divided will be too.
 
You couldn't be more wrong with that statement. "Ultra" doesn't necessarily mean 8x MSAA, which is what taxes a game's performance the most when you pick the Ultra preset. I'm pretty sure sample quality isn't part of a developer's goal when targeting the final visual result in a game; anything above 2x MSAA, heck, even any post-processing anti-aliasing technique, is pretty much just a complement for extra sharpness in the image.
MSAA didn't exist in 2005. It has nothing to do with an Ultra preset.

Ultra = the most hardware-taxing visual display a game can muster, optimization be damned. And a $1,000 machine shouldn't be able to handle it, at least not based on the precedents set by games like Doom 3, Quake 4, Crysis, heavily modded Skyrim, etc. You want eye candy? It's going to cost you.
 
That's a load of bollocks @yogurt_21 and you know it.

Firstly, MSAA did exist long before 2005. I didn't bother doing an extensive search, but the GeForce 3 back in 2001 had MSAA support. Not to mention that anti-aliasing in general had been in use long before that; I know my GeForce 2 MX and GeForce 2 Pro had anti-aliasing, which I used extensively even back then, probably around the year 2000.

Secondly, chasing superior visual fidelity by not using ANY optimizations is just a stupendous waste of resources if you can barely tell the difference in the end, or even worse, if you can't tell the difference at all. That's why Far Cry used poly-bump mapping, which gives the player the illusion of a 3-million-polygon model when in reality you're only using 3,000 polygons. Sure, some took it a bit too far, which resulted in square-ish heads in stills, but frankly, in-game you rarely noticed it even at such extremes.

And this isn't the only optimization. Megatextures, texture streaming, LOD, tessellation, etc.: all this means you can spend resources on things that matter and cleverly obscure the ones that are less important. It's why we can have vast worlds: the engine is cleverly balancing the graphics card's capabilities between things that matter and things that don't. If you spend all your resources on a character you can't even admire properly, you've just wasted resources that could have gone to 500 trees in the distance. Sure, some people say faking something is not the same, but when the faked effect is nearly impossible to distinguish from the reference one, does it even matter at that point?

Besides, the Ultra setting often doesn't mean you're disabling optimizations; it usually just means items start tessellating sooner and distance LOD kicks in later, which is logically more demanding...
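To make the LOD point concrete, here's a toy sketch of distance-based LOD selection; the distance thresholds and triangle counts are invented for illustration and don't come from any real engine:

```python
# Toy illustration of distance-based LOD: spend triangles on what the player
# can actually inspect up close, hand far-away objects a cheap stand-in.
# Distances and triangle counts are made up for the example.
LOD_LEVELS = [
    (10.0,  "full detail + tessellation", 30_000),
    (50.0,  "normal-mapped low-poly",      3_000),
    (200.0, "coarse distant mesh",           200),
]

def pick_lod(distance_m):
    for max_dist, name, tris in LOD_LEVELS:
        if distance_m <= max_dist:
            return name, tris
    return "impostor/culled", 2

# Four objects at different distances cost ~33k triangles in total,
# instead of 4 x 30k if everything were rendered at full detail.
total = sum(pick_lod(d)[1] for d in (3, 25, 80, 500))
print(total)
```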
 
And why are you comparing these two, exactly?
The game may be ridiculously unoptimized; that doesn't mean it has 2007 graphics.
You are right, UT3 actually looks better than Bloopers 3, all those mushy textures, yuck. I shouldn't have compared it to an '07 game; next time I might do a comparison between blondes&blacks3 and UT2003.

Sounds to me like an Nvidia based game.....
Oh, it's based alright, higher RAM usage than a kite.
 
That's a load of bollocks @yogurt_21 and you know it.

Firstly, MSAA did exist long before 2005. I didn't bother doing an extensive search, but the GeForce 3 back in 2001 had MSAA support. Not to mention that anti-aliasing in general had been in use long before that; I know my GeForce 2 MX and GeForce 2 Pro had anti-aliasing, which I used extensively even back then, probably around the year 2000.

http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868.html

Quincunx Anti-Aliasing is not MSAA

It's a precursor, or rather something NV tried in order to get better AA performance when the hardware couldn't handle it. I do see MSAA as part of the OpenGL 1.5 standard from late 2003, though, as well as DirectX 9.0c from August 2004. So you are correct that it was around in 2005, but SSAA (game settings just listed it as AA) was still the standard then. Either way, the Ultra setting wasn't just about AA; it also covered draw distance/FOV, shadows, HDR, water effects, and other animations like arrow trails, bullet effects, etc.

So thanks for zeroing in on some random points in my post. The point was that Ultra was not, as he described, locked in with an 8x MSAA boost.

Secondly, who says it has to look the same? You? Are you ranting against yourself? As described above, Ultra typically adds a ton of different effects to a game, many of which are quite noticeable. Ideally, though, the game will still look and play fine with them off for the majority of people.

Some people are just fine adding an area rug and calling it a day. Some people pay millions to interior decorators to make their homes look like a palace or the set of their fav sci-fi show.

Ultra mode is for the latter group.

The rest will do fine on Medium or High or better yet their own custom preset of the effects they care about and none that they don't.

But chewing up 12 GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI Titan Zs).
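To illustrate the custom-preset point above: a quality preset is really just a bundle of independent knobs, with MSAA being only one of them. A hypothetical settings table might look like this (none of these names or values come from Black Ops III):

```python
# Hypothetical preset table; every name and value here is a placeholder.
PRESETS = {
    "Medium": {"textures": "medium", "shadows": "low",    "draw_distance": 0.6,
               "tessellation": False, "msaa": 0},
    "High":   {"textures": "high",   "shadows": "medium", "draw_distance": 0.8,
               "tessellation": True,  "msaa": 2},
    "Ultra":  {"textures": "ultra",  "shadows": "high",   "draw_distance": 1.0,
               "tessellation": True,  "msaa": 4},
}

# A custom preset: keep the effects you care about, drop the MSAA you
# barely notice, and the preset label stops mattering.
custom = dict(PRESETS["Ultra"], msaa=0)
print(custom)
```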
 
But chewing up 12 GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI Titan Zs).
Absolutely, but it seems that it is also hungry enough for VRAM that the 4 GB versus 6/8 GB argument starts becoming more of a thing.
Once we installed 16 GB RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.
So tell me, how do the 390 and 390X behave with it? This article seems to be 100% "omg, NVIDIA cards can barely run it," which means nothing if it's not pitted against the competition. :confused:
 
Blame MS, not the developers. Any rig with a 4 GB GPU will need 16 GB when running a DX11 title... it's the way WDDM works (or doesn't work, lol).

Drivers map system memory into the GPU address space, but in W7/8.1 it's a bit broken, as there's no dynamic resource or reclaim support.
This means the maximum available is mapped to the GPU, usually equivalent to the VRAM on your card.


It was supposed to be fixed in 8.1 but MS postponed it till W10... no biggie, as Wiz said, with 16 GB it's all good... :)
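A rough back-of-the-envelope based on that description, with assumed (not measured) numbers, shows why 12 GB comes up short while 16 GB is fine:

```python
# Illustrative only: every number below is an assumption, not a measurement.
vram_gb        = 6.0   # GTX 980 Ti VRAM
game_ram_gb    = 7.0   # game's own working set (assumed)
os_and_apps_gb = 2.5   # Windows, drivers, background apps (assumed)

# Under the description above, the driver may commit system-RAM backing
# roughly equal to the card's VRAM on top of everything else.
expected_commit_gb = game_ram_gb + vram_gb + os_and_apps_gb
print(f"~{expected_commit_gb:.1f} GB committed")  # ~15.5 GB -> paging on 12 GB, fine on 16 GB
```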
 
Comparison

Beta
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III_Beta/test/BlackOps3_vram.jpg
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III_Beta/test/BlackOps3_ram2.jpg

Final
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III/test/blackops3_vram.jpg
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Black_Ops_III/test/blackops3_ram2.jpg
 
Perhaps it's just an oversight/bug? They probably didn't think about 12 GB systems and simply treat them as 16 GB ones. As I wrote earlier, the rest is fine: any AAA game should use as many resources as it can; it's actually a good thing to fill up system and video memory as much as possible.
 
Any AAA game should use as many resources as it can; it's actually a good thing to fill up system and video memory as much as possible.

Any game should use as many resources as it needs.
Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards should always be capped by what the game can offer; going higher is a case of poor optimization.

Drivers map system memory into the GPU address space, but in W7/8.1 it's a bit broken, as there's no dynamic resource or reclaim support.
This basically means the maximum available is mapped into the GPU address space, usually equivalent to the VRAM on your card.

I thought this was fixed with Windows 7 (and D3D10/11)?
 
Any game should use as many resources as it needs.
Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards should always be capped by what the game can offer; going higher is a case of poor optimization.



I thought this was fixed with Windows 7 (and D3D10/11)?
They're talking about DWM and zero-copy between RAM and VRAM. I'm referring to unified addressing, where RAM is mapped into GPU address space...

But it's not fixed, and never will be in W7... kernel issue.



Here's some info on DX if you want to read ....

https://msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx
 
Crysis came out in 2007 and it was unique in many ways. It's 2015, and BLOPS3 isn't breaking any new ground, aside from sales.

I do think it looks cool, however I also liked it when it was Call of Duty 2, dunno why :p
I thought I read that it stutters on the GTX 980 Ti but the stutter disappears with the Titan X?

Oh, I missed that bottom comment about 16 GB of RAM and the GTX 980 Ti working fine... OK, so we now have six cards that can run it efficiently on Ultra.


Eh yeah, I actually like the Black Ops/WaW games just for the zombies personally, as the multiplayer got too boring and pretty easy if you know the right ways to play it (meaning the right guns, positions, etc.).
I did too but... Wish
Let's hope Bethesda's Fallout 4 does better

Hope so, just pre-ordered :p
 
I do think it looks cool, however I also liked it when it was Call of Duty 2, dunno why :p

Probably because COD 2 was better. COD 1, 2 and MW1 were the most fun to me but I haven't given up on the series. I'm way behind though. I've still got half of Black Ops 1 to finish and then on to MW3 later on. I'm in no hurry. Newer COD games aren't a priority for me. I still like to go back and play 1, 2 and 4 every couple of years.

And welcome to TPU, fellow Tennessean. :toast:
 
Any game should use as many resources as it needs.
Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards should always be capped by what the game can offer; going higher is a case of poor optimization.
Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.
 
Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.
Yep, for sure... :)




Exception being SuperFetch, imo. :P Never did like it, lol
 
Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.

IIRC Windows has been doing similar stuff since XP and Vista (Prefetcher and SuperFetch, respectively), and I can't count the times I had to disable the latter because it was screwing up the system.

You are right, running a game from RAM is better, but that still wouldn't justify caching data for segments that won't be needed for minutes or hours to come, or that aren't needed any more. What matters is what's being displayed now and what will be needed in the very near future, in other words, what the game "needs". Then it's simply a matter of balancing when to cache newer data and when to scrub older data.



I'll take your word for it for now. My experience with programming hasn't reached D3D yet >_>
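That "cache newer data / scrub older data" balance mentioned above boils down to a budgeted cache. A minimal sketch, with hypothetical asset names and sizes and none of the distance/priority logic a real streaming system would add:

```python
from collections import OrderedDict

class AssetCache:
    """Toy LRU asset cache with a fixed memory budget."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()              # name -> (data, size)

    def get(self, name, loader):
        if name in self.assets:
            self.assets.move_to_end(name)        # recently used -> kept longest
            return self.assets[name][0]
        data = loader(name)                      # cache miss: hit storage
        size = len(data)
        # Scrub the oldest entries until the new asset fits the budget.
        while self.used + size > self.budget and self.assets:
            _, (_, old_size) = self.assets.popitem(last=False)
            self.used -= old_size
        self.assets[name] = (data, size)
        self.used += size
        return data

# Usage with a placeholder loader; a real one would stream from disk.
cache = AssetCache(budget_bytes=2 * 2**30)       # pretend we budget 2 GiB
texture = cache.get("rock_diffuse.dds", lambda name: b"\x00" * 4096)
```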
 
And this is exactly why I bought a Titan X over the 980 Ti... so I won't be forced to wait through months of patches before they fix the memory leaks, in case I want to play one of these crappy ports (not that I do; COD is the last game I'd consider buying)...
 
I have to laugh... several people have talked about patches to fix the amount of RAM used... why do you assume there will be patches for that? Why do you assume Activision thinks it's a bug? The game uses that much system RAM because it was DESIGNED that way.
 
I have to laugh... several people have talked about patches to fix the amount of RAM used... why do you assume there will be patches for that? Why do you assume Activision thinks it's a bug? The game uses that much system RAM because it was DESIGNED that way.

1. Because the game looks like crap...
2. Because Batman: AK used over 7 GB at launch and now uses around 5.5 GB max. GTA V Online also had memory leak issues, which have since been fixed.
 