
Call of Duty FPS Drop

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,675 (1.56/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700X
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) |AMD R9 290
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
I'm not sure if this is the right forum to post this in, but I just completed my new rig and I've been noticing FPS drops in both Call of Duty: WWII and Call of Duty: Black Ops 4. They are very random and inconsistent: FPS will sit at a constant 60, then randomly drop to 40 or lower, then go back to 60. Honestly, I'm not sure what could be causing the issue. I did notice that lowering the in-game settings made the drops less frequent. Has anyone else experienced this?


Specs:

AMD Ryzen 7 1700X @ 3.9GHz
Corsair H70
ASRock B350 Pro4
SK Hynix 32GB (4x8GB) @ 2666MHz
Gigabyte GTX 1060 3GB OC (Until my RTX 2070 Arrives)
Sandisk X300 512GB M.2 SSD
Seagate Constellation ES.3 3TB
Antec Quattro 850W


The resolution I'm currently running is 1920x1080.
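A drop from a steady 60 FPS to 40 FPS is a frame-time jump from roughly 16.7 ms to 25 ms per frame, which is why it reads as a visible hitch. A minimal sketch of flagging such drops in a frame-time capture (the values below are hypothetical, standing in for whatever logger you use):

# Minimal sketch: flag FPS drops in a frame-time log (values are hypothetical).
# A steady 60 FPS is ~16.7 ms per frame; a drop to 40 FPS is 25 ms per frame.
frame_times_ms = [16.7] * 8 + [25.0, 26.3, 24.8] + [16.7] * 5

def flag_drops(frame_times, target_fps=60, tolerance=0.25):
    """Report frames that run slower than target_fps by more than tolerance."""
    budget = 1000.0 / target_fps              # ~16.7 ms at 60 FPS
    limit = budget * (1 + tolerance)          # anything above this is a visible drop
    return [(i, t, round(1000.0 / t, 1))      # (frame index, ms, instantaneous FPS)
            for i, t in enumerate(frame_times) if t > limit]

for idx, ms, fps in flag_drops(frame_times_ms):
    print(f"frame {idx}: {ms:.1f} ms (~{fps} FPS)")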
 
Joined
Sep 17, 2014
Messages
20,692 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
3GB 1060 = possible cause
Ryzen 1 on slower RAM = possible cause

I have this feeling your 2070 will fix it, while introducing a new form of stutter unless you use some sort of VSync, because Ryzen won't keep up at high FPS. Not an optimal gaming build, this. Use software tweaks to get around it, and it should work out. You might want to use Adaptive VSync here.

What refresh rate are you going for? Never mind, I read 60. Use Adaptive VSync or a frame cap.

Come to think of it, CoD has always had an absurd memory footprint:

https://www.techpowerup.com/reviews/Performance_Analysis/Call_Of_Duty_WW_II/4.html

Quite sure the 3GB is killing you now.
 

Durvelle27

3GB 1060 = possible cause
Ryzen 1 on slower RAM = possible cause

I have this feeling your 2070 will fix it, while introducing a new form of stutter unless you use some sort of VSync, because Ryzen won't keep up at high FPS. Not an optimal gaming build, this. Use software tweaks to get around it, and it should work out. You might want to use Adaptive VSync here.

What refresh rate are you going for? Never mind, I read 60. Use Adaptive VSync or a frame cap.
1. That could be possible. Activision recommends a GTX 1060 6GB or RX 580, but I would think the 1060 is still a viable GPU at 1080p.
2. I do know Ryzen benefits from faster RAM, but it is at AMD's recommended speed. Even then, I can't run any higher than 2666MHz unless I lose 16GB of RAM, as running four sticks is much harder on the IMC than two.

I hope so, and I do use VSync, as the current monitor I'm using is only 60Hz. I mean, even in reviews at 1080p, Ryzen can still handle 100+ FPS; it may not be as high as Intel, but it can still deliver solid performance paired with a decent GPU. And doesn't Adaptive VSync need a compatible monitor to work?
 
Joined
Sep 17, 2014
1. That could be possible. Activision recommends a GTX 1060 6GB or RX 580, but I would think the 1060 is still a viable GPU at 1080p.
2. I do know Ryzen benefits from faster RAM, but it is at AMD's recommended speed. Even then, I can't run any higher than 2666MHz unless I lose 16GB of RAM, as running four sticks is much harder on the IMC than two.

I hope so, and I do use VSync, as the current monitor I'm using is only 60Hz. I mean, even in reviews at 1080p, Ryzen can still handle 100+ FPS; it may not be as high as Intel, but it can still deliver solid performance paired with a decent GPU. And doesn't Adaptive VSync need a compatible monitor to work?

You may want to ask yourself why you run 4x8GB in the first place. Cut it in half, get faster sticks, and find yourself 10-15% more performance out of that 2070 you're about to push in there. Up to you. Because you will find your current setup to be a limitation on your new GPU. And 32GB is... well, I'm sure you have reasons.

But if 60 FPS is all you need, you should be fine in that regard. Just wait out the 2070 :)

Adaptive VSync is available for all NVIDIA cards; I'm sure you can translate this to what it's called in English. It only applies VSync when you go over the monitor's refresh rate, so it's VSync minus the input-lag penalty (just a small one).

[screenshot: the Adaptive VSync option in the NVIDIA Control Panel]
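The behavior described above can be sketched as a per-frame decision. This is an illustration of the policy, not NVIDIA's actual driver logic, and it assumes a 60 Hz display:

# Sketch of the Adaptive VSync policy described above (illustrative only).
REFRESH_HZ = 60.0

def present_mode(current_fps):
    """Engage VSync only when the game outruns the monitor's refresh rate."""
    if current_fps >= REFRESH_HZ:
        return "vsync on"    # cap at refresh; no tearing
    return "vsync off"       # below refresh: don't halve the frame rate, accept possible tearing

for fps in (90, 61, 60, 45):
    print(fps, "->", present_mode(fps))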
 

Durvelle27

You may want to ask yourself why you run 4x8GB in the first place. Cut it in half, get faster sticks, and find yourself 10-15% more performance out of that 2070 you're about to push in there. Up to you. Because you will find your current setup to be a limitation on your new GPU. And 32GB is... well, I'm sure you have reasons.

But if 60 FPS is all you need, you should be fine in that regard. Just wait out the 2070 :)

Adaptive VSync is available for all NVIDIA cards; I'm sure you can translate this to what it's called in English. It only applies VSync when you go over the monitor's refresh rate, so it's VSync minus the input-lag penalty (just a small one).

[screenshot: the Adaptive VSync option in the NVIDIA Control Panel]
I run 32GB because I run VMs and do a lot of editing/rendering, which benefit from more RAM.

Also, 60 FPS isn't the target goal, as when I get the 2070 I'll be running triple monitors at a 120Hz refresh while gaming. 60 FPS is just the limit of the monitor I'm using with my GTX 1060, and both are just placeholders.

Yeah, after you mentioned it, I googled it and got a good understanding of what it is.
 
Joined
Sep 17, 2014
I run 32GB because I run VMs and do a lot of editing/rendering, which benefit from more RAM.

Also, 60 FPS isn't the target goal, as when I get the 2070 I'll be running triple monitors at a 120Hz refresh while gaming. 60 FPS is just the limit of the monitor I'm using with my GTX 1060, and both are just placeholders.

Yeah, after you mentioned it, I googled it and got a good understanding of what it is.

Fair enough. But I can tell you right now that 120 FPS on triple monitors is not going to go well on a 2070 either. I barely sustain 120 FPS at 1080p. The card is too light for triple-monitor high refresh. And when you do get it, first-gen Ryzen will hold you back regardless; 100+ FPS is a hard performance level to sustain across a wide range of games, and the CPU plays a huge role. RAM too.

Personally, I'd limit my resolution to around 1440p, or 2560x1080 if you want ultrawide, with this card.
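The pixel math behind that recommendation is below; the counts are exact, while treating pixel count as a stand-in for GPU load is a rough assumption:

# Rough pixel-count comparison of the display options discussed above.
modes = {
    "1080p single":        (1920, 1080),
    "2560x1080 ultrawide": (2560, 1080),
    "1440p":               (2560, 1440),
    "1080p triple":        (3 * 1920, 1080),
}
base = 1920 * 1080
for name, (w, h) in modes.items():
    px = w * h
    print(f"{name:22s} {px:>10,} px  ({px / base:.2f}x the load of single 1080p)")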
 

Durvelle27

Fair enough. But I can tell you right now that 120 FPS on triple monitors is not going to go well on a 2070 either. I barely sustain 120 FPS at 1080p. The card is too light for triple-monitor high refresh. And when you do get it, first-gen Ryzen will hold you back regardless; 100+ FPS is a hard performance level to sustain across a wide range of games, and the CPU plays a huge role. RAM too.

Personally, I'd limit my resolution to around 1440p, or 2560x1080 if you want ultrawide, with this card.
I mean, just look here. First-gen Ryzen has no problem keeping above 100 FPS in almost all games tested, and that was with only 2666MHz RAM:

https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/10.html

The CPU should have no issues feeding the GPU at all. In every review I've read, Ryzen can still sustain 100+ FPS in almost all games as long as you have a decent GPU.

And I wouldn't consider the RTX 2070 light. You may have trouble sustaining high frame rates for long, but you also have to realize the RTX 2070 is over 20% faster than the GTX 1080, and with an OC it can easily pass the GTX 1080 Ti.
 
Joined
Jan 8, 2017
Messages
8,810 (3.35/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Seems like you are running out of VRAM.

Do the following: drastically decrease your resolution (play on a single monitor if you have three) and put the textures on the lowest quality.
 
Joined
Sep 17, 2014
I mean, just look here. First-gen Ryzen has no problem keeping above 100 FPS in almost all games tested, and that was with only 2666MHz RAM:

https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/10.html

The CPU should have no issues feeding the GPU at all. In every review I've read, Ryzen can still sustain 100+ FPS in almost all games as long as you have a decent GPU.

And I wouldn't consider the RTX 2070 light. You may have trouble sustaining high frame rates for long, but you also have to realize the RTX 2070 is over 20% faster than the GTX 1080, and with an OC it can easily pass the GTX 1080 Ti.

OK. Believe whatever you want, and enjoy the card. I'm not going to debate this with you. All I will say is that I have the actual experience already, while you take it from reviews, where you're also interpreting the data wrong (the 2070 is not 20% faster at all, 5-10% at the very best, i.e., comparable, and you're linking average framerates while minimums are what it's all about).

We'll probably talk again in a future topic where you complain it wasn't all that rosy after all. You have the info now; do what you like with it. But you've already concluded your 3GB GPU purchase wasn't optimal either. The writing's on the wall.
 

Durvelle27

Seems like you are running out of VRAM.

Do the following: drastically decrease your resolution (play on a single monitor if you have three) and put the textures on the lowest quality.
That's what it seems like from the experience I'm getting. Monitoring usage, though, it's around 2.5GB. But I'll definitely try dropping the settings some more and post back.

OK. Believe whatever you want, and enjoy the card. I'm not going to debate this with you. All I will say is that I have the actual experience already, while you take it from reviews, where you're also interpreting the data wrong (the 2070 is not 20% faster at all, 5-10% at the very best, i.e., comparable, and you're linking average framerates while minimums are what it's all about).

We'll probably talk again in a future topic where you complain it wasn't all that rosy after all. You have the info now; do what you like with it. But you've already concluded your 3GB GPU purchase wasn't optimal either. The writing's on the wall.
I mean, you say you're speaking from experience, but you don't own a Ryzen, so it's not direct experience.

And with a 20+ FPS lead in most games tested against the 1080, how can you say they are comparable? And yes, I look at reviews, especially W1zz's, as they give you a baseline, and W1zz does a great job of delivering results close to real-world usage.
 
Joined
Jan 8, 2017
At least according to TPU they're within 10% of each other. Call it whatever you want, but it's ballpark the same.
 
Joined
Sep 17, 2014
That's what it seems like from the experience I'm getting. Monitoring usage, though, it's around 2.5GB. But I'll definitely try dropping the settings some more and post back.


I mean, you say you're speaking from experience, but you don't own a Ryzen, so it's not direct experience.

And with a 20+ FPS lead in most games tested against the 1080, how can you say they are comparable? And yes, I look at reviews, especially W1zz's, as they give you a baseline, and W1zz does a great job of delivering results close to real-world usage.

I own an 8700K at 4.8GHz, correct, and even that struggles to keep a happy 100+ FPS in quite a few situations. The source you linked shows Ryzen vs. a stock 7700K...

And as I pointed out elsewhere, W1zz tests a stock Pascal FE versus an AIB Turing. That's your missing 10-15% in favor of Pascal right there. That is interpretation of the data instead of glancing at some bar charts.

I'm not here to crap on your system, but you may want to reconsider some things before you are actually stuck with the hardware. If you want to fight that... your loss.
 

Durvelle27

At least according to TPU they're within 10% of each other. Call it whatever you want, but it's ballpark the same.
I mean, it's closer to the 1080 Ti than the 1080. TPU shows it 5% behind the 1080 Ti. Now that's comparable.

I own an 8700K at 4.8GHz, correct, and even that struggles to keep a happy 100+ FPS in quite a few situations. The source you linked shows Ryzen vs. a stock 7700K...

And as I pointed out elsewhere, W1zz tests a stock Pascal FE versus an AIB Turing. That's your missing 10-15% in favor of Pascal right there. That is interpretation of the data instead of glancing at some bar charts.

I'm not here to crap on your system, but you may want to reconsider some things before you are actually stuck with the hardware. If you want to fight that... your loss.

Nonetheless, the OP was about having FPS drops in two newer Call of Duty titles and wanted to know if anyone else had experienced this and what the source of the issue could be.

So let’s just stick to that
 
Joined
Sep 17, 2014
I mean, it's closer to the 1080 Ti than the 1080. TPU shows it 5% behind the 1080 Ti. Now that's comparable.



Nonetheless, the OP was about having FPS drops in two newer Call of Duty titles and wanted to know if anyone else had experienced this and what the source of the issue could be.

So let’s just stick to that

Okidoki
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
1. That could be possible. Activision recommends a GTX 1060 6GB or RX 580, but I would think the 1060 is still a viable GPU at 1080p.
2. I do know Ryzen benefits from faster RAM, but it is at AMD's recommended speed. Even then, I can't run any higher than 2666MHz unless I lose 16GB of RAM, as running four sticks is much harder on the IMC than two.

I hope so, and I do use VSync, as the current monitor I'm using is only 60Hz. I mean, even in reviews at 1080p, Ryzen can still handle 100+ FPS; it may not be as high as Intel, but it can still deliver solid performance paired with a decent GPU. And doesn't Adaptive VSync need a compatible monitor to work?

The VRAM isn't going to be an issue at 1080p, but do be aware that the 3 GB model has fewer shaders, so GPU performance is about 6% slower. The only games I have seen impacted with the 3 GB model at 1080p are Hitman and Tomb Raider... but this seems to be the GPU itself. The 1060 6GB also struggles a bit here, which indicates that the shaders are a significant issue in these two games at the highest settings. The TPU review states:

"Tomb Raider, which sees a performance loss of around 25% in even 1080p, requires you to reduce details on both the 6 GB and 3 GB version in order to achieve 60 FPS. If you go with a 3 GB version, you might have to dial down settings just a little more, but not by much."

That's what it seems like from the experience I'm getting. Monitoring usage, though, it's around 2.5GB. But I'll definitely try dropping the settings some more and post back.

Actually, you are not monitoring usage, as no utility exists that measures VRAM usage; you are measuring VRAM allocation. The best analogy is when you go for a car loan and the credit reporting agency says you have $5,000 on your credit card, when what you actually have is $1,000 charged on a credit card with a $5,000 limit. The CC company has 'allocated' you to spend up to $5,000, and that is what gets reported. Same for VRAM. The install program looks at what is physically present and says, "OK, Durv here has 8 GB, so let's allocate 6 GB for our game." If it sees 4 GB, it might allocate 3 GB. Both might never use more than 2 GB, but there is no way to measure actual usage, as no tool is available that does this.

These tools don't "actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested." We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."

https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

As for the original problem: unfortunately, most websites doing game tests only report average FPS; we really need to see average and minimum. Many bad component-selection decisions are made because this info is missing. For example, you may see no difference in FPS when using more of faster / lower-CAS RAM, but go to SLI / CF or look at minimum FPS and we often do see differences. While the graphics card was the bottleneck for average FPS, when we go to dual cards or look at minimum FPS, it may very well be that the RAM is the bottleneck. Not exactly common, but it does happen (e.g., the F1 and STALKER series). Multiplayer can also cause FPS drops.
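To make the average-versus-minimum point concrete, here is how both numbers fall out of a single (hypothetical) frame-time capture; the average hides the stalls that the minimum exposes:

# Average vs. minimum FPS from one hypothetical frame-time capture (ms).
frame_times = [16.7] * 95 + [40.0] * 5      # mostly 60 FPS with a few stalls

avg_fps = 1000.0 * len(frame_times) / sum(frame_times)   # frames / total seconds
min_fps = 1000.0 / max(frame_times)                      # slowest single frame

print(f"average: {avg_fps:.1f} FPS")        # ~56 FPS, looks fine on a chart
print(f"minimum: {min_fps:.1f} FPS")        # 25 FPS, the stutter you feel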

I hate to use Gamers Nexus as a "reliable source", but it was the first one I found that did minimum-versus-average FPS testing:

https://www.gamersnexus.net/images/media/2016/gpu/gtx-1060/3gb/1060-3v6gb-doom-ogl-1080.png

Note the very little performance difference despite the lower clock (they should have used the 3 GB and 6 GB versions of the same MSI card): the 6 GB did 101.3 average FPS to the 3 GB's 98.3, but both had minimum FPS of about 74 (74.3 vs. 74.0).

https://www.gamersnexus.net/images/media/2016/gpu/gtx-1060/3gb/gtx-1060-3v6gb-gta-1080p.png

Here again, in GTA V: the 6 GB had 98.0 average versus 68.3 minimum, while the 3 GB had 89.3 average versus 56.0 minimum.

Same pattern continues....

https://www.gamersnexus.net/images/media/2016/gpu/gtx-1060/3gb/gtx-1060-3v6gb-mordor-1080p.png
https://www.gamersnexus.net/images/media/2016/gpu/gtx-1060/3gb/gtx-1060-3v6gb-acs-1080.png

But this is the one you wanna see...

[Gamers Nexus chart: GTX 1060 6GB vs 3GB average and minimum FPS in Call of Duty]

The difference between the same-clocked EVGA 6 GB (122.7) and the 3 GB (114.7) is small, and due no doubt to the roughly 10% fewer shaders. But the minimum FPS was 77.4% of average on the 6 GB and 75.4% on the 3 GB. In other words, this is normal: the 3 GB has about 10% fewer shaders, which means its GPU can't process pixels as fast as the 6 GB's. There's no evidence that VRAM has anything to do with what you are experiencing. Average FPS means just that; minimum FPS will be substantially lower.
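The arithmetic behind that conclusion, using the shader counts and the same-clocked EVGA figures quoted above:

# Does the FPS gap track the shader-count gap? (numbers from the post above)
shaders_3gb, shaders_6gb = 1152, 1280
fps_3gb, fps_6gb = 114.7, 122.7             # same-clocked EVGA 1060 cards

shader_gap = shaders_6gb / shaders_3gb - 1  # ~11.1% more shaders
fps_gap = fps_6gb / fps_3gb - 1             # ~7.0% more FPS in this run

print(f"shader advantage: {shader_gap:.1%}")
print(f"FPS advantage:    {fps_gap:.1%}")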
 

Durvelle27

Not at 3GB in some titles. Anything using less than 3GB is OK...

You can confirm by looking in MSI AB or GPU-Z how much memory you are using...
At current settings I see usage around 2.5-2.7GB, which is slightly under the 3GB of total VRAM.

My plan is to drop settings when I get home and see if that helps any.


Update: I just came across this:

[embedded benchmark chart: Call of Duty on the GTX 1060 3GB vs 6GB]

So yeah, the 3GB 1060 is the culprit.
 
Joined
Sep 17, 2014
Not at 3GB in some titles. Anything using less than 3GB is OK...

You can confirm by looking in MSI AB or GPU-Z how much memory you are using...

The driver will probably manage that fine. What it means is that the GPU will be swapping a whole lot more, putting the bus under higher stress, which can result in moments where there isn't enough bandwidth to be in time for the next frame. Result: frame dips/stutter. The higher-VRAM cards show how much the game cán use, which is easily double what's on this card.
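A back-of-the-envelope look at why the swapping hurts, assuming the 1060's 192-bit bus at 8 Gbps effective GDDR5 and a PCIe 3.0 x16 swap path of roughly 16 GB/s; real traffic patterns are far messier, so this only bounds the per-frame budget:

# Per-frame data budget on a GTX 1060 (assumed specs, see lead-in).
bus_bits, gbps = 192, 8.0                # 192-bit bus, 8 Gbps effective GDDR5
vram_bw = bus_bits / 8 * gbps            # = 192 GB/s of on-card bandwidth
pcie_bw = 16.0                           # ~16 GB/s over PCIe 3.0 x16 (swap path)

for fps in (60, 120):
    frame_s = 1.0 / fps
    print(f"{fps} FPS: {vram_bw * frame_s:.2f} GB/frame from VRAM, "
          f"but only {pcie_bw * frame_s:.2f} GB/frame if textures must swap over PCIe")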

@John Naylor, can you please stop regurgitating the same old nonsense everywhere? You're looking at a practical use case right here where frame drops are directly caused by a lack of VRAM, and you're saying it isn't happening. I've confronted you on that several times, and you happily ignore it and soldier on with the same 'but shaders' story. Shader counts don't cause big drops like that; a bottleneck does. Core performance is very linear; VRAM shortage is not.

Do you even think people keep reading your walls of BS? There is nothing in the bench you screenshotted over there that even remotely resembles the numbers the OP is seeing. There is also more to gaming than a bench run.
 

Durvelle27

So the CoD engine seems, from these posts, to be a big VRAM hog, and GPUs under 4GB run into stuttering or frame drops because CoD fills their VRAM with textures.

Dropping settings may help.
 
Joined
Mar 18, 2015
Not at 3GB in some titles. Anything using less than 3GB is OK...

You can confirm by looking in MSI AB or GPU-Z how much memory you are using...

Actually, you cannot. I hesitate to say that people are wrong when this statement is made; they have simply been misinformed, not because these aren't useful tools, but because these utilities do not actually do all that folks think they do. These utilities measure VRAM allocation, not usage.

https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

These tools don't "actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested." We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."
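For reference, the counters those tools surface can be queried directly through NVML (here via the nvidia-ml-py bindings); per the quote above, the "used" figure reflects memory allocated to clients, not proof that the bytes are actively touched:

# Query the driver's VRAM counters via NVML (pip install nvidia-ml-py).
# "used" is memory *allocated* on the card, not measured per-frame usage.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {mem.total / 2**30:.1f} GiB")
print(f"allocated (reported as 'used'): {mem.used / 2**30:.1f} GiB")
pynvml.nvmlShutdown()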

@John Naylor, can you please stop regurgitating the same old nonsense everywhere? You're looking at a practical use case right here where frame drops are directly caused by a lack of VRAM, and you're saying it isn't happening. I've confronted you on that several times, and you happily ignore it and soldier on with the same 'but shaders' story. Shader counts don't cause big drops like that; a bottleneck does. Core performance is very linear; VRAM shortage is not.

You have yet to provide any information that supports your case. TPU's testing is in direct and complete conflict with your position; the numbers are what the numbers are, and they show otherwise. If you want to change our minds, show the data. I have always maintained:

1. Except in rare instances (e.g., Hitman), you do NOT see a substantial impact on performance due to VRAM between the 6 GB and 3 GB models until you get to 2160p.

2. And when you get there, in most AAA games, it won't matter, as the GPU is inadequate in any case.

You can confront it and call it nonsense if you want, but with no data, don't expect anyone to be convinced. Kellyanne Conway "confronted" the press a few weeks ago, claiming "speeding up a film is not doctoring" and that "they do this stuff all the time in sports, they speed up the film so they can tell whether it's a touchdown or a 1st down". She didn't make her case with many people, as that "speeding up thing" she was talking about... most of us call it "slow motion".

1. Is it not fair to say that higher resolutions (1440p) need more VRAM than lower ones (1080p)?

2. Then, if we accept that premise, it's a given that if the 6 GB card has a 6% advantage at 1080p, then, if VRAM is an issue, that gap must invariably widen at higher resolutions.

Let's look at the graph above; "the practical case right here" confirms what I am saying:

COD on the 1060 3 GB = 61.6 FPS w/ 1152 shaders
COD on the 1060 6 GB = 68.8 FPS w/ 1280 shaders

So the card with 11.1% more shaders is 11.7% faster in that test. I don't see anything related to VRAM impact. To test that, we need to see whether that advantage increases at higher resolutions. So let's put this to bed once and for all with just a wee bit of data:

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_1920_1080.png
The 6 GB card does 65.2 FPS to the 3 GB card's 60.4 in COD: BO3, meaning it is 7.9% faster at 1080p, not far from TPU's average of 6% across the entire test suite.

So, for your "theory" to be correct, that 7.9% advantage must invariably increase at 1440p. If it doesn't, then VRAM is not a factor.

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_2560_1440.png
The 6 GB card does 41.0 FPS to the 3 GB card's 39.2 in COD: BO3, meaning it is 4.6% faster at 1440p. So if the smaller VRAM is the problem, why isn't the problem bigger at 1440p?

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_3840_2160.png
The 6 GB card does 20.6 FPS to the 3 GB card's 18.6 in COD: BO3, meaning it is 10.8% faster at 2160p, and unplayable in both cases, so VRAM matters zilch at 2160p.
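The test being run across those three links, in one place, using the FPS numbers quoted from the TPU charts:

# If VRAM capacity were the bottleneck, the 6 GB card's lead should widen
# as resolution (and VRAM demand) rises. COD: BO3 numbers from the links above.
results = {                                # resolution: (6 GB FPS, 3 GB FPS)
    "1920x1080": (65.2, 60.4),
    "2560x1440": (41.0, 39.2),
    "3840x2160": (20.6, 18.6),
}
for res, (six, three) in results.items():
    print(f"{res}: 6 GB leads by {six / three - 1:.1%}")
# 7.9% -> 4.6% -> 10.8%: no monotonic widening, which is the post's argument
# that capacity alone doesn't explain the gap in this title.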

Again, as I have always maintained:

1. Except in rare instances (e.g., Hitman), you do NOT see a substantial impact on performance due to VRAM between the 6 GB and 3 GB models at 1080p, and often 1440p. Until you get to 2160p, VRAM is not an issue ***in most games***. Clearly, given the numbers, no data supports it being an issue in COD: BO3. Hitman is one of those games that does show an impact, as the performance advantage of the 6 GB card jumps from 19% at 1080p to 26.3% at 1440p. That doesn't show that VRAM matters at 1080p, but clearly it has an impact in this game at 1440p.

2. And when you get to 2160p, in most AAA games, it won't matter, as the GPU is inadequate in any case.

Clearly, VRAM is not the issue in COD: BO3. Clearly, it's not the issue in most games in TPU's test suite.

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_1920_1080.png
The 6 GB is 6% faster than the 3 GB overall at 1080p.

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_2560_1440.png
The 6 GB is 6% faster than the 3 GB overall at 1440p. So again, if VRAM is inadequate at 1080p, then for the position you present to hold water, it must invariably be more inadequate at 1440p. It is not, so the water leaked out.

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_3840_2160.png
Finally, at 2160p, we see the 6 GB having an impact, as the performance difference jumps to 14%. And yes, it is completely irrelevant, as no games are actually playable above 40 FPS with either card, and most don't break 30 FPS.

Furthermore, walking away from this analysis, I can't agree that COD: BO3 is impacted by VRAM at 1080p. I'm going to continue to agree with W1zzard's conclusion in the 3 GB 1060 review, where he states that most* "games seem completely unaffected by having 3 GB less VRAM at their disposal, especially at 1080p."

* Most being 16 of the 18 games in TPU's testing (all but Hitman and Tomb Raider).
 

Durvelle27

Actually, you cannot. I hesitate to say that people are wrong when this statement is made; they have simply been misinformed... These utilities measure VRAM allocation, not usage. [full post quoted above]
I mean, how else can you explain the problem I'm having in WWII if it's not VRAM-related?
 
Joined
Dec 31, 2009
Messages
19,366 (3.73/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Actually, you cannot. I hesitate to say that people are wrong when this statement is made; they have simply been misinformed, not because these aren't useful tools, but because these utilities do not actually do all that folks think they do. These utilities measure VRAM allocation, not usage.

https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

These tools don't "actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested." We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."
There is a thread for you in the GPU-Z section...

...do you see your notifications, or are they turned off?


The idea I think I now understand better, but it's your analogies in previous posts which threw me off. Those applications will absolutely tell you how much VRAM is being requested; just because not all of it is being actively used that second doesn't mean it isn't allocated and taking up space.

A page file works differently (the bucket size doesn't change unless the bucket gets full). Credit cards are also different (the credit limit = the amount of VRAM on the card, while allocated is your balance).


I asked SEVERAL questions which went unanswered in the recent past when you quoted this passage before... :(

Anyway, this isn't the thread; go check your notifications and poke on over to the thread in the GPU-Z section.
 

Durvelle27

So I swapped in my RTX 2070 and set every setting to max. Played all night at a consistent 130 FPS without a single drop. So yes, the GTX 1060 was the issue; I'm leaning toward the VRAM being the biggest limiting factor.

 
Joined
Sep 17, 2014
Actually, you cannot. I hesitate to say that people are wrong when this statement is made; they have simply been misinformed... These utilities measure VRAM allocation, not usage. [full post quoted above]

Stop reading and start playing, and maybe the penny will drop at some point. Reviews rarely touch edge cases or peak loads, and they take the averages of multiple runs to present to you. There is only one edge case in your wall of text, which is the 3GB card pushing 4K. It is when you ask too much of the VRAM that it loses that linear performance scaling and turns into a major frame drop; this is why higher-end cards remain consistent for so much longer than, for example, a crippled 970 will (you can see those topics popping up now). And yes, that can happen at any resolution, especially on cards with a rather small 192-bit bus. Is it common? No. Is it possible? Absolutely.

W1zz also reviewed a 1060 with faster memory not too long ago, and a 7% core OC plus a 44% memory OC netted him a 13-14% overall performance gain.
How is that possible? Surely it can't... Seems that the 1060 is struggling a little with its VRAM capabilities after all. Even the 6GB version gets beyond-linear performance from its core bump when the memory is faster (or, more accurately, when bandwidth is increased). Swaps can happen much faster and no longer hold up the core.
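A quick sanity check on that scaling claim, using the OC percentages from the linked review; the assumption is that a purely core-bound card would gain only about as much as the core OC:

# If the 1060 were purely core-bound, a 7% core OC should give ~7% more FPS.
core_oc, mem_oc = 0.07, 0.44               # core and memory overclocks
observed_gain = 0.135                      # 13-14% overall, per the review cited

core_only_expectation = core_oc            # rough ceiling if bandwidth didn't matter
extra_from_bandwidth = observed_gain - core_only_expectation
print(f"applied OCs: core {core_oc:.0%}, memory {mem_oc:.0%}")
print(f"expected gain if core-bound: {core_only_expectation:.0%}")
print(f"observed gain:               {observed_gain:.1%}")
print(f"unexplained without a bandwidth bottleneck: {extra_from_bandwidth:.1%}")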

It is commonplace for NVIDIA's midrange: almost every x60 card is tight on VRAM, in capacity, bus width, or both.

https://www.techpowerup.com/reviews/KFA2/GTX_1060_6_GB_GDDR5X/34.html

That is all I'm going to say about this. You can do whatever you want with it :) I got convinced of the importance of VRAM capacity and bus width through experience, not through reading reviews. I even made the stupid move of putting an x60 in SLI: twice the core for similar, crippled VRAM... it was horrible. I followed that up with a 3GB 780 Ti, and again, VRAM-based stutter knocked that one out of the game for me as I started playing more recent (2017+) titles. It's hard to support that with data; I don't run FRAPS while gaming.
 