Everyone seems to forget that the 4GB on Fury is HBM, which behaves differently from GDDR5. Don't compare 4GB of HBM with 4GB of GDDR5; they are not the same. As for HDMI 2.0, I doubt that any card claiming full support actually delivers 4:4:4 at 60Hz at 4K. DP is the future of 4K imho.
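For what it's worth, a rough back-of-the-envelope check (assuming the standard 4400x2250 total video timing with blanking for 4K60) shows why 4:4:4 @ 60Hz at 4K sits right at HDMI 2.0's limit:

```python
# Rough bandwidth check for 4K 4:4:4 @ 60 Hz, 8-bit color, over HDMI 2.0.
# Assumes standard 4400x2250 total timing (3840x2160 active plus blanking).
h_total, v_total, refresh = 4400, 2250, 60
pixel_clock_hz = h_total * v_total * refresh        # 594 MHz pixel clock

# HDMI uses 8b/10b TMDS encoding across 3 data channels.
tmds_gbps = pixel_clock_hz * 10 * 3 / 1e9           # ~17.82 Gbit/s
hdmi20_limit_gbps = 18.0

print(f"pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")
print(f"TMDS bandwidth needed: {tmds_gbps:.2f} Gbit/s "
      f"(HDMI 2.0 limit: {hdmi20_limit_gbps} Gbit/s)")
```

So 8-bit 4:4:4 at 4K60 needs about 17.82 of HDMI 2.0's 18 Gbit/s, with zero headroom left, which is why support has been so spotty and why chroma-subsampled 4:2:0 modes are so common.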
Bullshit. If you have 4GB of geometry and textures, then you fill 4GB. HBM doesn't change how much needs to get rendered; it just changes how it's delivered, not what is delivered.
The points I made are facts, and as a 4K gamer I know these three points are very important.
Actually, VRAM usage doesn't increase by a whole lot just from jumping to a higher resolution. It does increase, but that alone isn't a huge reason to jump ship. HDMI 2.0 matters if you plan on playing on a TV; any half-sensible person using 4K as a computer monitor would get one with DisplayPort, and to do otherwise is simply dumb. As for the DX12.1 features, aren't most of those eye-candy related, with no impact on whether a DX12 game runs at all?
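To put a rough number on the resolution claim: the render targets themselves are small compared to texture assets. A quick sketch, assuming just a double-buffered 32-bit color target plus a 32-bit depth/stencil buffer (ignoring MSAA, G-buffers, and other intermediate targets, which vary per engine):

```python
# Approximate memory for basic render targets at common resolutions.
# Assumes 4 bytes/pixel color (double-buffered) + 4 bytes/pixel depth/stencil.
def render_target_mb(width, height):
    color = width * height * 4 * 2   # front + back buffer
    depth = width * height * 4       # depth/stencil buffer
    return (color + depth) / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB")
```

Even at 4K the basic targets come to roughly 95 MB, versus about 24 MB at 1080p, so a few percent of a 4GB card. It's the texture quality settings, not the output resolution, that eat most of the VRAM.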
If you can afford 4K now, you can probably afford a new GPU when 4GB isn't enough, so I find the things "important to you" laughable at best. At least AMD gives you 4GB that's uniform and doesn't behave differently once you've used a certain amount. As for drivers, I don't recall AMD frying any GPUs with a driver update, which is a plus in my book. CFX aside, I've had very few issues with CCC. So let's leave the general asshattery at home; it's not going to help the thread and only serves to polarize it.
Simple fact of the matter is that Fury X still isn't out yet and we still don't have reliable data to base any assumptions whatsoever on. However, my concern is that power consumption is still going to be a weak point.
Lastly: considering 4K is still an immature technology, I wouldn't even bother investing in it until it has had more time to mature, for these very reasons: when push comes to shove, most hardware can't handle it, and you're paying through the nose to get it.