Thursday, November 5th 2015

Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough

This year's installment in the Call of Duty franchise, Black Ops III, has just hit stores and is predictably flying off shelves. As with every annual release, Black Ops III raises the visual presentation standards for the franchise. There is, however, one hitch in the way the game deals with system memory amounts as high as 12 GB and video memory amounts as high as 8 GB. This hitch could be the reason behind the stuttering issues many users are reporting.

In our first play-through of the game at its highest possible settings on our personal gaming machine - equipped with a 2560 x 1600 display, a Core i7 "Haswell" quad-core CPU, 12 GB of RAM, a GeForce GTX 980 Ti graphics card, NVIDIA's latest Black Ops III Game Ready driver 358.87, and Windows 7 64-bit to top it all off - we noticed that the game was running out of memory. A peek at Task Manager revealed that at "Ultra" settings (and 2560 x 1600 resolution), the game was maxing out what remained of our 12 GB after the 1.5-2 GB consumed by the OS and essential lightweight tasks (such as antivirus).
We also noticed game crashes as early as 10 seconds into gameplay on a machine with 8 GB of system memory and a GTX 980 Ti.

What's even more interesting is the game's video memory behavior. The GTX 980 Ti, with its 6 GB of video memory, developed a noticeable stutter. The stutter disappeared on the GTX TITAN X, with its 12 GB of video memory, on which video memory load shot up from the maxed-out 6 GB of the GTX 980 Ti to 8.4 GB. What's more, system memory usage dropped to 8.3 GB with the GTX TITAN X.
On the Steam Forums, users report performance issues that point not so much at low FPS (frames per second) as at stuttering, especially at high settings. Perhaps the game needs better memory management. Once we installed 16 GB of RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.

168 Comments on Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough

#101
wickedcricket
Huh?

On a side note, anyone fancy a BF4 Conquest game? I just put this new 60Hz server up on PC:

-=THE EAGLES=- 60Hz |No Limits|Conquest 24/7|4FUN|Free 4 All

Anyone from the EU (and surrounding areas) is very welcome to join!

cheers lads! :)
#102
Prima.Vera
It's awesome! Let's have some uncompressed textures, put them inside the game and be done with it. No wonder the game is more than 50GB....
Ridiculous!!
#103
LiveOrDie
Must be the shitty old engine code they're still using. They should have just paid Unreal 5% and used Unreal Engine.
#104
Tsukiyomi91
@Xzibit will do on my main rig (Rig 1) & also to see if there's any impact on Rig 2. Will run without those anti-aliasing methods to gauge performance impact.
#105
xorbe
Latest game stresses latest hardware on top-shelf "ultra" benchmark settings. News at 11.
#106
Legacy-ZA
Well, we had better see some killer number-crunching graphics cards from NVIDIA or AMD. It would seem most of the latest game titles are very demanding. I am sure Fallout 4 and Deus Ex: Mankind Divided will be so too.
#107
yogurt_21
iSkylaker: You couldn't be more wrong with that statement. Ultra doesn't necessarily mean 8x MSAA, which is what taxes a game's performance the most when you pick the "Ultra" preset. I'm pretty sure sample quality isn't part of a developer's goal when targeting the final visual results in a video game; anything above 2x MSAA, heck, even any post-processing anti-aliasing technique, is pretty much a complement for extra sharpness in the image.
MSAA didn't exist in 2005. It has nothing to do with an Ultra preset.

Ultra = the most hardware-taxing visual display a game can muster, optimization be damned. And a $1000 machine shouldn't be able to handle it, at least not based on the precedents set by games like Doom 3, Quake 4, Crysis, heavily modded Skyrim, etc. You want eye candy? It's going to cost you.
#108
RejZoR
That's a load of bollocks @yogurt_21 and you know it.

Firstly, MSAA did exist long before 2005. I didn't bother with an extensive search, but the GeForce 3 back in 2001 had MSAA support. Not to mention that "anti-aliasing" had been in use long before that. I know my GeForce 2 MX and GeForce 2 Pro had anti-aliasing, which I used extensively even back then - probably in the year 2000 or so.

Secondly, providing superior visual fidelity by not using ANY optimizations is just a stupendous waste of resources if you can barely tell the difference in the end or, even worse, can't tell the difference at all! That's why Far Cry used poly-bump mapping, which gives the player the illusion of a 3-million-polygon model when in reality you're only using 3,000. Sure, some took it a bit too far, which resulted in square-ish heads in stills, but frankly, in-game you rarely noticed it even at such extremes.

And this isn't the only optimization. Megatextures, texture streaming, LOD, tessellation, etc. - all this stuff means you can spend resources on things that matter and cleverly obscure those that are less important. It's why we can have vast worlds: the engine is cleverly balancing graphics card capabilities between things that matter and things that don't. If you spend all your resources on a character you can't even admire properly, you've wasted resources that could have been spent on 500 trees in the distance. Sure, some people say faking something is not the same, but when the faked effect is nearly impossible to distinguish from the reference one, does it even matter at that point?

Besides, an Ultra setting often doesn't mean you're disabling optimizations; it just means you're pushing the settings to start tessellating items sooner and switching to distance LOD later. Which is logically more demanding...
#109
truth teller
Shihabyooo: And why are you comparing these two, exactly?
The game may be ridiculously unoptimized; that doesn't mean it has 2007 graphics.
you are right, UT3 actually looks better than bloopers3. all those mushy textures, yuck. i shouldn't have compared it to an '07 game; next time i might do a comparison between blondes&blacks3 and UT2003
Filip Georgievski: Sounds to me like an NVIDIA-based game.....
oh it's based alright, higher RAM usage than a kite
#110
yogurt_21
RejZoR: That's a load of bollocks @yogurt_21 and you know it. [...]
www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868.html

Quincunx Anti-Aliasing is not MSAA

It's a precursor, or rather something NVIDIA tried in order to get better AA performance when the hardware couldn't handle it. I do see MSAA as part of the OpenGL 1.5 standard from late 2003, though, as well as of DirectX 9.0c from August 2004. So you are correct that it was around in 2005, but SSAA (game settings just listed it as AA) was still the standard then. Either way, the Ultra setting wasn't just about AA: draw distance/FOV, shadows, HDR, water effects, and other animations like arrow trails, bullet effects, etc.

So thanks for zeroing in on some random points in my post. The point was that Ultra was not, as he described, locked in with an 8x MSAA boost.

Secondly, who says it has to look the same? You? Are you ranting against yourself? As described above, Ultra typically adds a ton of different effects to a game, many of which are quite noticeable. Though ideally the game will still look and play fine with them off for the majority of people.

Some people are just fine adding an area rug and calling it a day. Some people pay millions to interior decorators to make their homes look like a palace or the set of their fav sci-fi show.

Ultra mode is for the latter group.

The rest will do fine on Medium or High, or better yet, their own custom preset with the effects they care about and none that they don't.

But chewing up 12GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI Titan-Zs).
#111
Aquinus
Resident Wat-man
yogurt_21: But chewing up 12GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI Titan-Zs).
Absolutely, but it seems it's also hungry enough for VRAM that the 4GB versus 6/8GB argument starts becoming more of a thing.
btarunr: Once we installed 16 GB RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.
So tell me, how do the 390 and 390X behave with it? This article seems to be 100% "omg, NVIDIA cards can barely run it," which means nothing if not pitted against the competition. :confused:
#112
Pill Monster
Blame MS, not the developers. Any rig with a 4GB GPU will need 16GB to run a DX11 title..... it's the way WDDM works... (or doesn't work lol)

Drivers map system memory into GPU address space, but in W7/8.1 it's a bit broken, as there's no Dynamic Resources or reclaim support. This means the maximum available is mapped to the GPU... usually equivalent to the VRAM on your card.

It was supposed to be fixed in 8.1 but MS postponed it till W10... no biggie; as Wiz said, with 16GB it's all good... :)
#114
Ikaruga
Perhaps it's just an oversight/bug; they probably didn't think about 12GB systems and manually treat them as 16GB ones. As I wrote earlier, the rest is good: any AAA game should use as many resources as it can. It's actually a good thing to fill up system and video memory as much as possible.
#115
Shihab
Ikaruga: any AAA game should use as many resources as it can; it's actually a good thing to fill up system and video memory as much as possible.
Any game should use as many resources as it needs.
Scaling in resource requirements should be downward, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upward should always be capped by what the game can offer. Go higher, and it's a case of poor optimization.
Pill Monster: Drivers map system memory into GPU address space, but in W7/8.1 it's a bit broken, as there's no Dynamic Resources or reclaim.
This basically means the maximum available is mapped into the GPU address space... usually equivalent to the VRAM on your card.
I thought this was fixed with Windows 7 (and D3D10/11)?
#116
Pill Monster
Shihabyooo: Any game should use as many resources as it needs. Scaling in resource requirements should be downward, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upward should always be capped by what the game can offer. Go higher, and it's a case of poor optimization.

I thought this was fixed with Windows 7 (and D3D10/11)?
They're talking about DWM and zero-copy between RAM and VRAM. I'm referring to unified addressing, where RAM is mapped into GPU space...

But it's not fixed, and never will be in W7... kernel issue.

Here's some info on DX if you want to read:

msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx
#117
Sephil Slyfox
ZeppMan217: Crysis came out in 2007, and it was unique in many ways. It's 2015; BLOPS3 is not breaking any ground, aside from sales.
I do think it looks cool, however. I also liked it when it was Call of Duty 2, dunno why :p
GhostRyder: I thought I read it stutters on the GTX 980 Ti but that disappears with the Titan X?

Oh, I missed that bottom comment about 16GB of RAM and the GTX 980 Ti working fine... OK, so we now have 6 cards that can run it efficiently on Ultra.


Eh yea, I actually like the Black Ops/WaW games just for the zombies personally, as the multiplayer got too boring and pretty easy if you know the right ways to play it (meaning right guns, positions, etc).
I do too but... Wish
P4-630: Let's hope Bethesda's Fallout 4 does better
Hope so, just pre-ordered :p
#118
64K
Sephil Slyfox: I do think it looks cool, however. I also liked it when it was Call of Duty 2, dunno why :p
Probably because COD 2 was better. COD 1, 2, and MW1 were the most fun to me, but I haven't given up on the series. I'm way behind, though; I've still got half of Black Ops 1 to finish and then it's on to MW3 later on. I'm in no hurry. Newer COD games aren't a priority for me. I still like to go back and play 1, 2, and 4 every couple of years.

And welcome to TPU, fellow Tennessean. :toast:
#119
Ikaruga
Shihabyooo: Any game should use as many resources as it needs. Scaling in resource requirements should be downward, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upward should always be capped by what the game can offer. Go higher, and it's a case of poor optimization.
Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff up from storage vs. having it in RAM or VRAM is an easy choice.
#120
Pill Monster
Ikaruga: Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff up from storage vs. having it in RAM or VRAM is an easy choice.
Yep, for sure.....:)

Exception being Superfetch imo. :P Never did like it lol
#121
Shihab
Ikaruga: Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff up from storage vs. having it in RAM or VRAM is an easy choice.
IIRC Windows has been doing similar stuff since XP and Vista (Prefetcher and Superfetch, respectively), and I can't count the times I had to disable the latter because it was screwing up the system.

You are right: running a game from RAM is better, but that still wouldn't justify caching data for segments that won't be needed for minutes/hours to come, or ones that aren't needed anymore. What matters is what's being displayed now and what will be in the very near future - in other words, what the game "needs". Then it's simply a matter of balancing when to cache newer data and when to scrub older data.
Pill Monster: Here's some info on DX if you want to read:

msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx
I'll take your word for it for now. My experience with programming hasn't reached D3D yet >_>
#122
Am*
And this is exactly why I bought a Titan X over the 980 Ti... so I won't be forced to wait through months of patches before they fix the memory leaks, in case I ever want to play one of these crappy ports (not that I do - COD is the last game I'd consider buying)...
#123
rtwjunkie
PC Gaming Enthusiast
I have to laugh...several people have talked of patches to fix the amount of RAM used....why do you assume there will be patches for that? Why do you assume Activision thinks it is a bug? You use up that much system RAM with a game because it was DESIGNED that way.
#124
Am*
rtwjunkie: I have to laugh...several people have talked of patches to fix the amount of RAM used....why do you assume there will be patches for that? Why do you assume Activision thinks it is a bug? You use up that much system RAM with a game because it was DESIGNED that way.
1. Because the game looks like crap...
2. Because Batman: Arkham Knight used over 7GB at launch and now uses around 5.5GB max. GTA V Online also had memory-leak issues, which have since been fixed.
#125
rtwjunkie
PC Gaming Enthusiast
You are ignoring that games progress. Nothing stands still in the gaming world. Gaming, more than anything, has consistently forced the advancement of computer parts to bigger, better, and faster as requirements increase.

It's a constantly moving finish line, and the natural evolution of things. It's unrealistic to think or hope that requirements will stand still.