*the CPU difference is not that large - worst case it's about 10%. DX12 will change that for future titles.
*power/heat - valid concern.
"3.5GB will always be enough" - are you new to the PC gaming world? Two years ago, "1GB will always be enough" was common. Times change, fast. For a ~2-year investment, this is not good advice.
*8GB VRAM is irrelevant: fair point on multi-GPU, but again... VRAM usage climbs year after year. If he keeps this card 2-3 years, 4GB will not be enough by then.
my 3GB 7970 was holding me back in a few titles; I'm really confident that in a year or so I'd be turning settings down in all major titles to stay within that card's VRAM - and 3.5GB is only one setting away from that.
Wait, wait, wait.
VRAM usage is not growing linearly with time at all. In fact, it jumps all of a sudden and then comes to a halt for about 5 years or more. Look at the start and the end of the PS3 era for a prime example. The only reason we need more VRAM today is badly optimized console ports that apply the consoles' EDRAM or 6 GB of GDDR5 principle to a PC environment. Now look at The Witcher 3 at max settings, still taking a healthy 1700 MB. That's right: a 2015 game, released on consoles too, staying under a measly 2 GB - even with maxed textures, every bell and whistle, at a glorious 1080p. Crysis 3, arguably still one of the harder games to run on big rigs, is one of the few exceptions that go beyond 2 GB and *actually make use of it* (important little bit there!)... but when you do, CryEngine also doesn't really want to give you 60 fps unless you Crossfire or SLI something.
Now, today, we have 4K. This is the ONLY development and the only reason I see, for now and the coming 5 years, why you would want more than 4GB. And then there is the question of whether you could actually run things at your preferred settings at 4K with a decent FPS on the current lineup of cards *at all*.
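To put a rough number on the 4K point: resolution-dependent buffers scale linearly with pixel count, so 4K (4x the pixels of 1080p) quadruples that slice of the VRAM budget, while texture memory stays resolution-independent. A minimal back-of-envelope sketch - the 20-bytes-per-pixel deferred-rendering figure below is an assumption for illustration, not any real engine's layout:

```python
# Back-of-envelope: render-target memory vs. screen resolution.
# Assumes a hypothetical deferred setup of ~4 color/G-buffer targets
# at 4 bytes/pixel plus a 4-byte depth target = 20 bytes per pixel.
BYTES_PER_PIXEL = 20

def render_target_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Memory for full-screen render targets at this resolution, in MB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

Under these assumptions, 1080p needs roughly 40 MB of render targets and 4K roughly 160 MB - a 4x jump, but still a small fraction of a 4GB card; it is texture packs and streaming pools, not the framebuffer itself, that eat most of the budget.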
Take my word for it: the new consoles have landed and will stay in the game for about 4-5 more years. 4K is going to be enthusiast territory for at least 2 more years - I reckon even 3 to 4 - and by then you will most certainly have incredibly appealing alternatives to the underpowered 8GB junk they are selling now.
VRAM is heavily overrated. The 970 proved the point, being the fastest 1080p card with the lowest TDP... and getting there by cutting into the VRAM budget. That's a trick that AMD, with its cards built around immense VRAM buses, has no way of competing with - even at top resolutions, the gain they get from that bandwidth over Nvidia is minimal at best. They gain, but is it a sensible payoff considering the power budget required? Look back at the release of Hawaii and you have your answer. The only reason they are still competing is price cuts and getting kicked down to Nvidia's smaller xx-107 chip tier.
If any rule of thumb can be followed today, it is that the PC gaming market follows the console gaming market, and that the console market is decisive for the spec requirements of games on PC. Another rule of thumb: console generations run for about 5 to 7 years max.
Bottom line: 8 GB is wasted silicon and power budget on today's cards. Even the 980 Ti is edging past good balance with its 6GB. And unsurprisingly, the power budget limits on the 980 Ti actually constrain the card at stock cooling. 1+1 = ?