Dual 970s should be pretty hot; I saw someone @ TPU benching a pair of MSI Twin Frozr cards, and they do admirably. But the heat...
I'm just sitting here waiting for TNBT (the next big thing). Shame I didn't pick up an RX 580 8GB back when they weren't a hot item. Now a year has passed, and there isn't much left to wait until the next-gen cards. I think I can wait a bit longer... Maybe.
I was tempted to get a used 970 (since its performance when OC'd is good for another year or two), but now they're selling for much more than they used to here.
Because of the mining craze, I went back to the card that took me through 4 years of gaming without many hitches, the GTX 460. Why? Because I can still count on it to play a lot of games today, it overclocks well, and it costs as much as a decent gaming mouse or less.
All I want is to play heavily modded Skyrim/Fallout and be ready for the next in the series. But that requires at LEAST a 1070 to play @ 1080p at a near-constant 75 FPS with third-party post-processing on (I love you, Boris).
Why do most of the games I love have to run on badly optimized engines? And when you start attaching tumors to them, it just keeps getting worse.
But I digress.
If you have a previous-gen xx70 or xx80 card and most games run fine at your preferred resolution, then yes, there is little reason to upgrade. Some people with bigger budgets do sell their high-end cards while they still hold value and upgrade, to stay on top of more demanding games that might come out later.
Most people still play @ 1080p, so that's what a lot of product makers are targeting. When a demanding title comes out, your older card will still be able to play it, just with reduced performance at certain settings, and you can always turn off some effects to get more performance back without much loss in fidelity.
There are cases where resolution plays a large role in whether you need to upgrade. With higher-end cards you see diminishing returns on performance relative to price. There are multiple factors at play in how well a graphics card ages.
Graphics cards are more than pixel pushers nowadays. So I'm wondering: what if Nvidia/AMD stripped the general-purpose compute (CUDA/GCN GPGPU) features from their gaming cards altogether? Games themselves barely use them, though they do help if you're streaming or capturing footage of a game. But what if a pure gaming card existed, instead of the general-purpose cards we get today? They do more, sure, but at what cost?
I'm repeating myself, but it all depends on your needs, really. If you need a card that pushes frames well in a game you like, it exists. Most of the time.
And if the card runs anything you throw at it well, why waste double the money on a card that gets 20 FPS more? Unless you need those extra frames.
The progression is on the efficiency side while still staying capable. By the time the final Kepler cards were released, it was impossible to get much further: the 28nm node and the architecture had hit their limits. The only option was to push that ceiling ever so slightly, just enough to raise clock speeds and cram in more transistors. And games love higher clock speeds.
You really need those "revolutionary" architectural changes to get more out of a limited-size piece of silicon. We were stuck on the 28nm node for three GPU generations. Now it seems like there aren't as many refreshes as there used to be. Still, I'm not holding my breath: there are rumors of Nvidia doing a Pascal refresh instead of releasing Volta, and only time will tell whether they prove true. They might not have to release Volta at all, simply because Pascal sold very well, RX Vega isn't going to make a dent in those sales, most people who needed the extra performance have already upgraded, and Vega is just too late.
More competition and better-optimized triple-A games are what we consumers need. But the GPU waters are so muddied right now. I don't even know.