Right... and I think overall I either misunderstood the point or the analogies the guy was trying to make... here is the support....
I can refute the statement that "cards with larger memory will request more memory" as a blanket statement. It seems to depend on title and amount of vram on the card. But I've seen 4/8/11 GB cards allocate the same amount of memory in most titles... does everyone else share that sentiment?
Thank you for confirming exactly what I am saying. It most certainly, as you said, will "depend on title and amount of vram on the card". So if the game sees 4 GB, it might allocate 2.5 GB ... if it sees 8 GB, it might allocate 4.5 GB.
I have posted multiple references multiple times; perhaps reading them would have helped. Let's start here first with the RAM amount issue:
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims to require 2750MB***. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.
Now the entire review contradicts what everyone was saying back then .... "make sure you get the 4 GB model". Those tests show, beyond any shadow of a doubt, that 2 vs 4 GB was irrelevant. The only time VRAM mattered was when settings were high enough to make the game unplayable anyway. Whoo hoo, the 4 GB model got me 11.8% more fps in Sleeping Dogs at 2560 x 1600 ... who cared, you went from 13.5 fps to 15.9 ... unplayable with either 2 or 4 GB. Same results with the 6xx series from Puget Sound, same results at Guru3D with the 9xx series, same results from ExtremeTech, same results in every review of this type I have ever read ... same results with the 10xx series here on TPU. Most games are not affected at 1080p, and most of those are not affected at 1440p either. No doubt if ya search long and hard enough you can find a game which bucks the results, hard to explain why, but a poor console port is one known cause, as that process gives little concern to such things.
Clearly, in the case of Max Payne, the GPU was requesting 2.75 GB and therefore would not allow those settings with only 2 GB, and yet the game played with no discernible difference .... no significant difference in performance, image quality, or user experience between the two cards. The same was found to be true in the other links previously posted for Puget Sound, Guru3D, ExtremeTech and TechPowerUp. The simple fact is most games are not affected at 1080p with 3 GB; the ones that are, are anomalies. Look at it this way: if 3 GB isn't enough for 1080p, then no card exists with enough VRAM for 2160p, aka 4 x 1080p. If 3 GB is no good at 1080p, then with 4 times the pixels, 4 x 3 GB = 12 GB is no good for 2160p. The data is there, you just have to read it.
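The resolution arithmetic behind that last point is easy to check. A quick sketch (pixel counts only — keep in mind that only screen-sized buffers scale this way; textures and geometry do not, so it is a rough upper-bound argument, not an exact VRAM prediction):

```python
# Pixel-count arithmetic behind the 1080p -> 2160p scaling argument.
# Only screen-sized resources (frame/depth buffers, render targets)
# grow with resolution; this sketch just compares raw pixel counts.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
base = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels = {w * h / base:.2f}x 1080p")
# 2160p is exactly 4x the pixels of 1080p, which is where the
# "4 x 3 GB" figure in the paragraph above comes from.
```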
In Wizzard's MSI 1060 3GB review, he noted unexpected performance drops in Hitman and Tomb Raider; no explanation was presented, nor do we have a means (that I can see) of determining why, given there are multiple differences between the cards. Is it the shaders? Is it the RAM? No way to tell. But it is very clearly stated that
"Other games seem completely unaffected by having 3 GB less VRAM at their disposal, especially at 1080p". I can't see any ambiguity in Wizzard's statement. As was shown, repeatedly, if 3 GB is an issue at 1080p, then it absolutely must follow that it will be a bigger issue at 1440p. Yet Wizzard's test results show no significant difference in performance over the 18-game test suite between 1080p and 1440p, both showing 6%. Even when we get to 2160p, where VRAM does have a performance impact, most of the games are unplayable and the rest barely so, as the GPU can not keep up.
With regard to "VRAM Usage" as reported by available utilities, nVidia is paraphrasing Inigo Montoya (The Princess Bride): "that word doesn't mean what you think it means". The link to nVidia's statement has been posted multiple times and no one has refuted it. It provides the only explanation I have heard that "works" with the results above. If not, the other thread which birthed this one has some issues, as the OP saw 2.0 to 2.5 GB IIRC; I'd like to see how it could possibly show the same if installed with a 2 GB card. I'll check .... I have a pair of 560 2GB cards lying around; I will see what happens.
https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
Note the following sentence in the article quoting Mr. Bell:
"Our own testing backed up this claim; VRAM monitoring is subject to a number of constraints." Call it what you want ... allocation or usage, but clearly just because you ***see*** 2.75 GB of RAM usage in a utility on a 4 GB card does not mean the game won't run at the same fps, quality, etc. on a 2 GB card. Regardless of what name you give it, it clearly is having no impact. Let's drop the semantics and focus on the point: the allegation is that seeing usage of 2.X GB in GPU-Z proves that the game is being impacted. Clearly, in the referenced thread, GPU-Z is not going to show 2.X GB of VRAM usage if run on a 2 GB card. The number in any utility proves nothing of the sort. Max Payne showed this definitively and conclusively .... the other 40 or so games in the AlienBabelTech (7xx) test showed no significant hit. Same with Guru3D (9xx), same with Puget Sound (6xx), and TPU's own tests showed this in 16 of 18 games. Because of the difference in shaders, we have not eliminated all other possible impacts on those two.
If there's evidence to the contrary, I'd be anxious to read it. The Max Payne experience clearly shows that the GPU requests more than it actually **needs**: it assigns 2.75 GB and won't allow the game to run at those settings if that much isn't there. Play the semantic game if ya will, but on a 2 GB card the game clearly is not impacted by not having the 2.75 GB the GPU insists on allocating ... so strictly so that it will not allow you to run it. If Mr. Bell is wrong about allocation versus usage, then explain Max Payne. Until someone does, I will have to take it that nVidia knows their product; of all those commenting on the issue, Mr. Bell has the best resume. In the end, the semantics do not matter. Whether it's allocation, usage, or Keebler elves, the numbers reported by any utility are no indication whatsoever that the game's performance is impacted when the utility reads 2.5 GB on a 3 GB card or even a 2 GB card. If a utility shows [pick the word you prefer] X.Y GB, that is no indication as to whether the game needs that much as a minimum or is just grabbing it "just in case" ... and it is certainly not indicating that performance is in any way impacted if less is available ... and THAT is the only point of relevance.
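To make the allocation-versus-usage distinction concrete, here is a toy sketch. The "grab about half the card" policy and every number in it are made up purely for illustration — no real driver or game is claimed to work exactly this way:

```python
# Toy model of the allocation-vs-usage distinction Mr. Bell describes.
# A game requests a VRAM budget sized to the card it sees, but only
# touches a smaller working set. A GPU-Z-style tool reports the
# budget (allocation), not the working set (actual usage).
# All numbers and the sizing policy are illustrative, not measured.

def reported_vram(card_vram_gb: float, working_set_gb: float) -> float:
    """What a monitoring tool would show: the amount the game
    requests, which scales with available VRAM, not with need."""
    # Hypothetical policy: grab roughly half the card, but never
    # less than the working set the game actually needs.
    requested = max(working_set_gb, card_vram_gb * 0.5)
    return min(requested, card_vram_gb)

# Same game, same real footprint (say 1.8 GB actually touched),
# run on three different cards:
for card in (2, 4, 8):
    print(f"{card} GB card -> tool reports {reported_vram(card, 1.8):.1f} GB")
# The tool shows a different number on each card even though the
# game's real footprint, and hence its performance, is unchanged.
```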
A 1060 3 GB vs 6 GB, or a 580 4 GB vs 8 GB, would be a perfect guinea pig test for this.
I'm gonna see if any of the kids has COD3 in the library. Will come back w/ results if it's there.