Actually, you cannot. I hesitate to say people are wrong when they make this claim .... they have simply been misinformed. It's not that these utilities aren't useful tools; they just don't do what folks think they do. These utilities measure VRAM allocation, not usage.
https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
These tools don't
"actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, ..... They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available".
You have yet to provide any information that supports your case. TPU's testing is in direct and complete conflict with your position; the numbers are what they are, and they show otherwise. If you want to change our minds, show the data. I have always maintained ....
1. Except in rare instances (e.g., Hitman), you do NOT see a substantial impact on performance due to VRAM between the 6 GB and 3 GB models until you get to 2160p.
2. And when ya get there, in most AAA games, it won't matter, as the GPU is inadequate in any case.
You can confront it and call it nonsense if ya want ... but with no data, don't expect anyone to be convinced. Kellyanne Conway "confronted" the press a few weeks ago, claiming "speeding up a film is not doctoring" and that "they do this stuff all the time in sports, they speed up the film so they can tell whether it's a touchdown or 1st down". She didn't make her case with many people, as that "speeding up thing" she was talking about .... most of us call it "slow motion".
1. Is it not fair to say that higher resolutions (1440p) need more VRAM than lower ones (1080p)?
2. Then, if we accept that premise, it's a given that if a 6 GB card has a 6% advantage at 1080p,
then, if VRAM is an issue, that gap must invariably widen at higher resolutions.
Let's look at the graph above ..... "the practical case right here" confirms what I am saying.
COD on 1060 3 GB = 61.6 fps w/ 1152 shaders
COD on 1060 6 GB = 68.8 fps w/ 1280 shaders
So the card with 11.1% more shaders is 11.7% faster in that test. I don't see anything related to VRAM impact .... to test that, we need to see whether that advantage increases at higher resolutions. So let's put this to bed once and for all with just a wee bit of data ....
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_1920_1080.png
The 6 GB card does 65.2 fps to the 3 GB card's 60.4 in COD3, meaning it is 7.9% faster at 1080p in COD3 .... not far from TPU's average of 6% across the entire test suite.
So, for your "theory" to be correct, that 7.9% advantage must invariably increase at 1440p ... if it doesn't, then VRAM is not a factor.
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_2560_1440.png
The 6 GB card does 41.0 fps to the 3 GB card's 39.2 in COD3, meaning it is 4.6% faster at 1440p. So if the smaller VRAM is the problem, why isn't the problem bigger at 1440p?
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/codbo3_3840_2160.png
The 6 GB card does 20.6 fps to the 3 GB card's 18.6 in COD3, meaning it is 10.8% faster at 2160p ..... and it's unplayable in both cases, so VRAM matters zilch at 2160p.
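The percentage math above can be sketched as a quick script. This is just a sanity check on the "widening gap" test; the fps figures are the TPU COD: Black Ops 3 numbers quoted above, and the logic is simply: if the 3 GB card's smaller VRAM were the bottleneck, the 6 GB card's lead should grow as resolution (and VRAM demand) rises.

```python
# TPU COD: Black Ops 3 results quoted above, as (6 GB fps, 3 GB fps).
cod3 = {
    "1080p": (65.2, 60.4),
    "1440p": (41.0, 39.2),
    "2160p": (20.6, 18.6),
}

def lead_pct(fps_6gb, fps_3gb):
    """Percentage advantage of the 6 GB card over the 3 GB card."""
    return (fps_6gb / fps_3gb - 1.0) * 100.0

for res, (f6, f3) in cod3.items():
    print(f"{res}: 6 GB card leads by {lead_pct(f6, f3):.1f}%")
# 1080p: 7.9%, 1440p: 4.6%, 2160p: 10.8%
# The lead *shrinks* going from 1080p to 1440p, the opposite of what a
# VRAM bottleneck predicts; the 2160p jump arrives only where neither
# card delivers playable frame rates anyway.
```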
Again, as I have always maintained ....
1. Except in rare instances (e.g., Hitman), you do NOT see a substantial impact on performance due to VRAM between the 6 GB and 3 GB models at 1080p, and often not at 1440p. Until you get to 2160p, VRAM is not an issue ***in most games***. Clearly, given the numbers, no data supports it being an issue in COD3. Hitman is one of those games that does show an impact, as the performance advantage of the 6 GB card jumps from 19% at 1080p to 26.3% at 1440p. That doesn't show that VRAM matters at 1080p, but clearly it has an impact in this game at 1440p.
2. And when ya get to 2160p, in most AAA games, it won't matter, as the GPU is inadequate in any case.
Clearly, VRAM is not the issue in COD3. Clearly, it's not the issue in most games in TPU's test suite.
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_1920_1080.png
The 6 GB card is 6% faster than the 3 GB overall at 1080p.
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_2560_1440.png
The 6 GB card is 6% faster than the 3 GB overall at 1440p ... so again, if VRAM is inadequate at 1080p, then for the position you present to hold water, it must invariably be more inadequate at 1440p. It is not ... so the water leaked out.
https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_3840_2160.png
Finally, at 2160p, we do see the extra VRAM having an impact, as the performance difference jumps to 14%. And yes, it is completely irrelevant, as no game reaches a playable 40 fps with either card and most don't break 30 fps.
Furthermore, walking away from this analysis, I can't agree that COD3 is impacted by VRAM at 1080p .... I'm going to continue to agree with Wizzard's conclusion in the 3 GB 1060 review, where he states that most*
"games seem completely unaffected by having 3 GB less VRAM at their disposal, especially at 1080p."
* "Most" being 16 of the 18 games in TPU's testing (all but Hitman and Tomb Raider).