There was a post by a member here as follows: I'm looking to find some sort of supporting or contrary evidence for this assertion, be it a link from somewhere stating this, or at minimum some screenshots showing the differences.

Backstory: with a 1GB card at 1920x1080, 4xAA, Ultra, HBAO, I noticed some stuttering in games, and VRAM use maxed out at 950+ MB. Moving up to a 1.5GB card (GTX 580) with the same settings, VRAM usage increased to around 1.2GB with no stuttering. With a 2GB card (GTX 680) at the same settings, usage increased to around 1.8GB. And with a 3GB card (7970), usage seemed to level off at 1.8GB with those same settings.

I have never seen a game dynamically lower IQ based on available VRAM (which doesn't mean it doesn't exist, of course), so I'm looking for something more concrete than a forum post. Can anyone help out with some information on the Frostbite 2 engine doing this, as well as some screenshots showing the visual differences? Thanks!
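For what it's worth, the pattern described above (usage scaling with card size, then plateauing) is consistent with a texture-streaming pool sized from detected VRAM. This is only a hypothetical sketch, not anything from Frostbite 2: the function name, the 256MB reservation, the 90% fraction, and the 1843MB cap are all made-up illustration values, chosen so the plateau roughly mirrors the numbers in the post.

```python
def streaming_budget_mb(total_vram_mb, reserved_mb=256, fraction=0.9, cap_mb=1843):
    """Hypothetical streaming-pool heuristic: reserve headroom for
    framebuffers/render targets, take a fraction of what remains,
    and cap the pool so bigger cards eventually level off."""
    usable = max(total_vram_mb - reserved_mb, 0)
    return int(min(usable * fraction, cap_mb))

# Rough shape of the observation: budget grows with VRAM, then plateaus.
for vram in (1024, 1536, 2048, 3072):
    print(vram, streaming_budget_mb(vram))
```

Under a scheme like this, no image-quality setting ever changes; the engine simply keeps more (or fewer) high-resolution mip levels resident, which is why a smaller pool can show up as stuttering rather than as visibly lower IQ.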