If it really is the delta color compression, that's a very impressive gain. I remember that on my GTX 970 at the same settings, usage goes up to around 3.8GB, which made it stutter. 4GB isn't enough nowadays; even 6GB is borderline. When I play Rise of the Tomb Raider, memory usage exceeds 6GB at Very High texture quality, and that's at 1920x1080.
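For anyone wondering what delta color compression actually does, here's a toy sketch of the general idea (not AMD's or NVIDIA's actual hardware scheme, which is lossless, per-tile, and mainly about bandwidth): instead of storing every pixel at full width, you store a base value plus the differences between neighboring pixels, and smooth gradients produce tiny deltas that fit in far fewer bits.

```python
def delta_compress(row):
    """Toy delta encoding of one row of 8-bit pixel values.

    Stores the first pixel in full, then only the difference to the
    previous pixel. Smooth gradients give small deltas that would fit
    in far fewer bits than 8 per pixel.
    """
    deltas = [row[0]]
    for prev, cur in zip(row, row[1:]):
        deltas.append(cur - prev)
    return deltas

def delta_decompress(deltas):
    """Invert the encoding: a running sum restores the original pixels."""
    row = [deltas[0]]
    for d in deltas[1:]:
        row.append(row[-1] + d)
    return row

row = [100, 101, 103, 103, 104, 102, 101, 100]  # a smoothish gradient
encoded = delta_compress(row)
assert delta_decompress(encoded) == row  # lossless round trip
# every delta here is in [-2, 2], so ~3 bits each instead of 8
```

Real hardware does this per tile and falls back to uncompressed storage when the deltas don't shrink, which is why the gains vary so much from game to game.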
It's a mix of both, I guess (compression, and relative usage that varies from game session to game session and between GPUs).
And just because two or a few more games use more than 2 or 3GB, you can't say that amount isn't enough (I would know as a 3GB card owner); that's just wrong. On average, a game today needs about 2-3GB. That said, I'd never recommend someone a *new* 3 or 4GB card now, because it would be careless with regard to the future. The average bar rises, but it rises slowly, as always. Nothing has changed. Basically the minimum "playable" amount is 2GB now; for Radeon cards it's 3GB (the HD 7970 and R9 280X hold up pretty well, but the R9 380 2GB has problems even with compression). I've already posted evidence on this multiple times.
Call of Duty: AW seems even worse, as it keeps stuffing textures into memory while you play rather than as a level loads, without ever removing anything, in the hope that it might need them later.
Call of Duty is not in any way to be called bad or "worse"; it's a pretty well-functioning engine - I commented the same on the article and I'll say it again now. Just because an engine uses a flexible amount of VRAM doesn't make it bad; the opposite is actually true.
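To illustrate why flexible VRAM usage isn't a flaw, here's a rough toy model (hypothetical names, not the actual CoD streamer): an engine that keeps textures resident after use, so monitoring tools report high "usage", but evicts the least-recently-used ones only when it actually runs out of budget. High reported usage then just means the cache is doing its job.

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of a streamer that treats spare VRAM as a cache."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:
            # Cache hit: mark as most recently used, nothing to upload.
            self.resident.move_to_end(name)
            return
        # Evict least-recently-used textures only under actual pressure.
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb  # "upload" the texture

cache = TextureCache(budget_mb=3000)
for i in range(40):
    cache.request(f"level_tex_{i}", 100)
# Reported usage sits right at the budget, which looks greedy in an
# overlay, but nothing was ever kept that couldn't be dropped for free.
assert sum(cache.resident.values()) <= 3000
```

Under this model a card with more VRAM simply shows higher numbers because the engine fills whatever budget it finds, which is exactly the "varies from session to session and between GPUs" effect mentioned above.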
All in all, people here are overstating how much VRAM is "needed". More is better, but it only matters in a few games anyway. I'd say 4GB is perfectly fine at the moment, even 3GB. 2GB is the point where only Maxwell-based cards such as the 960 still work properly while Radeons start to suffer, thanks to better compression and drivers.