
AMD "Fiji" Silicon Lacks HDMI 2.0 Support

Well, so much for my interest in Fiji; I was looking forward to ditching my GTX 980 SLI setup in favor of a single 4K-capable card from team red.

It's such a small thing to overlook, yet it completely prevents me from buying this card. I have a 4K 40" TV as my main monitor that uses HDMI 2.0, and having to buy a different display because AMD decided to ignore HDMI 2.0 is crazy.

As much as DisplayPort is the better solution, it's going to be years before HDMI is displaced as a TV connection standard. So for AMD to decide not to include it as a 4K connectivity option is a mistake.
Thank you, that answers the question at the end of R.H.P's post # 71. My point is/was simply that, even though this limitation doesn't hinder some users, AMD is in my opinion still missing a trick and an opportunity with all those 4K TV owners who don't want to spend another pile of cash on a monitor. In my case it's more about commiserating with 4K TV owners than criticising AMD, but the two go hand in hand to a degree.
 
Maxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day one, yet AMD can't get it into its own "high-end" product. That's inexcusable.
Yeah, because after I've spent $550 on a "high-end" video card, I really want to spend more money on a cable just so the card can work properly.
Yeah, guess what: it's still cheaper than the Titan X...

Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be better as it is, and that's what I use for my 4K experience, so it wouldn't bother me one bit.
 
I think AMD should have included HDMI 2.0 support on their high-end cards. They will lose some sales of Fiji over this, but how many? The last Steam Hardware Survey I looked at said that only about 1 in 1,650 users are gaming at 4K.
 
This might help

[Attached image: mDP-to-HDMI adapter (TC-XXMDPHDMI)]
NOooooooope! That's HDMI 1.4 (30Hz max at 4K).

Found this: DisplayPort to HDMI 2.0 - http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining. Is it driver-related?
Noooooope, HDMI 1.4. Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.
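
For anyone wondering why these adapters top out at 30 Hz, here is a rough back-of-the-envelope sketch of the pixel-clock arithmetic. It assumes the standard CTA-861 4K timing (4400×2250 total pixels per frame, including blanking) and the usual TMDS clock ceilings of 340 MHz for HDMI 1.4 and 600 MHz for HDMI 2.0; treat those figures as assumptions, not gospel.

```python
# Which HDMI revisions can carry 3840x2160 at 30 Hz and 60 Hz?
# Assumptions: CTA-861 timing (4400 x 2250 total pixels per frame,
# i.e. active 3840x2160 plus blanking) and per-revision TMDS clock ceilings.

TOTAL_H, TOTAL_V = 4400, 2250
TMDS_LIMIT_MHZ = {"HDMI 1.4": 340, "HDMI 2.0": 600}

for refresh_hz in (30, 60):
    pixel_clock_mhz = TOTAL_H * TOTAL_V * refresh_hz / 1e6
    for version, limit_mhz in TMDS_LIMIT_MHZ.items():
        verdict = "fits within" if pixel_clock_mhz <= limit_mhz else "exceeds"
        print(f"4K@{refresh_hz}: {pixel_clock_mhz:.0f} MHz {verdict} "
              f"{version}'s {limit_mhz} MHz limit")
```

4K@30 needs roughly 297 MHz, which fits under HDMI 1.4's 340 MHz ceiling; 4K@60 needs roughly 594 MHz, which only HDMI 2.0 can carry, so a 1.4-era adapter is stuck at 30 Hz.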

HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't have it, but at the same time, I don't care. DisplayPort is the future, not HDMI.


Look at the reviews. Advertised as 4K, it only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0: they keep trying to ram more bits through the hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuine 2.0-compliant cables debut, they'll probably cost $20+ per foot because of the massive insulation required to prevent crosstalk and interference.

HDMI has always been the laggard of the display standards, taking the DVI spec, tacking audio onto it, and disregarding all the controls VESA put on it to guarantee the cable will work. This was an inevitability with HDMI, because the people behind HDMI haven't a clue what they're doing. It's why VESA didn't get behind HDMI and instead went off and designed its own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.


The Fury X manual only mentions DisplayPort 1.2 repeatedly. I don't know if it supports 1.3:
http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf
 
Yeah, guess what: it's still cheaper than the Titan X...

Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be better as it is, and that's what I use for my 4K experience, so it wouldn't bother me one bit.
And the GTX 980 Ti?
 
Nvidia fanboys were in a coma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary anyway, to a connector.

I'll bite. I wasn't in a coma; you guys have two years to wait for your drivers to mature :D
 
HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't have it, but at the same time, I don't care. DisplayPort is the future, not HDMI.

HDMI is not suitable for 4K and likely never will be.

All new 4K TVs are HDMI 2.0 compliant, and this standard isn't going anywhere anytime soon. UHD Blu-ray is imminent, and all those players will be HDMI 2.0. Beyond 2.0, pretty much every other device uses some variant of HDMI, so it is the standard for connectivity in the home theater world.

I agree that DisplayPort is a better solution, but at the time this card is being released, HDMI 2.0 is the connection the HTPC market requires. If DisplayPort ever takes off as a TV connection, it will be too late for this card, since there will be something better by then. To me it simply doesn't make sense for AMD not to adopt the standard that is available and in use today.

Also, HDMI 2.0 works fine for 4K 60 Hz; yes, it is very close to the threshold of HDMI's bandwidth limits, but it still works, so why not use it?

Not trying to be rude to you specifically, just pointing out another side of the argument.
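
As a rough illustration of just how close to the wire 4K 60 Hz is on HDMI 2.0, here is a minimal sketch; it assumes the standard 594 MHz pixel clock for 4K@60, 8-bit RGB / 4:4:4, and HDMI 2.0's 18 Gb/s TMDS rate reduced to about 14.4 Gb/s of usable payload by 8b/10b coding.

```python
# Headroom check: 3840x2160 @ 60 Hz, 8-bit 4:4:4 over HDMI 2.0.
PIXEL_CLOCK_HZ = 594e6      # CTA-861 timing for 4K@60 (4400 x 2250 x 60)
BITS_PER_PIXEL = 24         # 8 bits per channel, RGB or YCbCr 4:4:4

needed_gbps = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9   # ~14.256 Gb/s
available_gbps = 18.0 * 8 / 10                        # ~14.4 Gb/s after 8b/10b

print(f"needs {needed_gbps:.3f} Gb/s of {available_gbps:.1f} Gb/s available "
      f"({needed_gbps / available_gbps:.1%} of the link)")
```

That works out to roughly 99% of the usable link rate, which is also why marginal cables tend to struggle at 4K@60 even though the spec technically allows it.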
 
Sure, the TVs have HDMI 2.0 inputs, but find a cable that can actually handle 4K @ 60 Hz reliably. Until there are cables that can handle it, the rest is moot.
 
And the GTX 980 Ti?
Its performance (if we believe the leaks) is better, so it's the same cost for more performance. On top of that, it comes with a better cooler; factor an equivalent cooler onto the GTX 980 Ti and it works out roughly even (a better cooler for the GTX 980 Ti versus the R9 Fury X with the cable).

So yes, still a better deal (if the leaks are to be believed), depending on how you factor everything in. Of course, this is all still based on rumor, but factoring everything in, and given that most gaming monitors these days have DP, or you can use a DP-to-(whatever plug you have) adapter, it's pretty irrelevant. The only downside is that it makes things difficult for TVs with HDMI 2.0 running at 4K 60 Hz...
 
I think my curse helped AMD a little.
From the moment I saw that NVIDIA had cut down the 980 Ti, I prayed for AMD to win on that front.
Now what about all of us who couldn't afford €1,200-1,250 for a TITAN X? What does NVIDIA suggest we do now...? :)
But now, suddenly, we're good enough customers for a cheaper, weaker TITAN X, as I said a few days ago.
They were confident enough to cut CUDA cores instead of pushing the full GM200 as far as possible, with a higher base clock and an AIO cooler if needed, and praying to keep the crown; they didn't care.
Now AMD may take the crown by exactly that little piece, and Maxwell will be the first NVIDIA 3DMark loser since Fermi and Kepler.
What now? Feed customers rumors about Pascal for ten months while AMD sells its cards? There's no way they can finish it before spring.
You'll see how much NVIDIA asks for HBM2: a fortune. For the last 3-4 years they have tried, in every underhanded way, to raise prices quietly, $100-150 more every year for the high-end chip. Where does it end? I think it ends now.
And the whole time people thought the TITAN X was some divine miracle; no, it was one strong card overpriced to the max.
Nothing special, though. NVIDIA had GK110 in 2013, and now they have 50% more performance.
Why NVIDIA decided to dole it out little by little is a different story, but they delivered a 50% improvement since 2013.
Over that period they asked $1,000 three times: TITAN, TITAN Black, TITAN X.
You have to be very crafty to get people to pay that.
The real value is $450-500, $550 max. Same as the GTX 580.
Their fans justify it with "no competition"... then what is Fury?
Take Intel as an example... Intel has no competition and holds the CPU market far more firmly than NVIDIA holds the GPU market...
Intel doesn't ask $1,500 or $2,000 for its Extreme processors. Every series of Intel processors is within about 10% of the price of the one from five years before. They didn't double the price. So what is NVIDIA doing?
I'm glad I was right, and that cutting CUDA cores from the GTX 980 Ti will cost NVIDIA the single-chip crown and hundreds of thousands of dollars. We begged for a full GM200 chip with a higher clock, literally for months before launch, even cut down to 6 GB of video memory and priced like a normal GeForce card, and they showed no mercy. No €1,200, no full chip. That was their motto.
It was strange how NVIDIA has trained its fans: the day before the presentation they expected AMD to ask $1,000, almost certain the price would be similar to the TITAN X.
As if new HBM technology alone gives them the right to ask $1,000.
 
I'll bite. I wasn't in a coma; you guys have two years to wait for your drivers to mature :D

Because my GTX 980 didn't need a registry hack to support 4:4:4 over HDMI 2.0 and the driver doesn't reset desktop scaling 4 times out of 10 when resuming from sleep?
 
I think my curse helped AMD a little.
From the moment I saw that NVIDIA had cut down the 980 Ti, I prayed for AMD to win on that front.
Now what about all of us who couldn't afford €1,200-1,250 for a TITAN X? What does NVIDIA suggest we do now...? :)
But now, suddenly, we're good enough customers for a cheaper, weaker TITAN X, as I said a few days ago.
They were confident enough to cut CUDA cores instead of pushing the full GM200 as far as possible, with a higher base clock and an AIO cooler if needed, and praying to keep the crown; they didn't care.
Now AMD may take the crown by exactly that little piece, and Maxwell will be the first NVIDIA 3DMark loser since Fermi and Kepler.
What now? Feed customers rumors about Pascal for ten months while AMD sells its cards? There's no way they can finish it before spring.
You'll see how much NVIDIA asks for HBM2: a fortune. For the last 3-4 years they have tried, in every underhanded way, to raise prices quietly, $100-150 more every year for the high-end chip. Where does it end? I think it ends now.
And the whole time people thought the TITAN X was some divine miracle; no, it was one strong card overpriced to the max.
Nothing special, though. NVIDIA had GK110 in 2013, and now they have 50% more performance.
Why NVIDIA decided to dole it out little by little is a different story, but they delivered a 50% improvement since 2013.
Over that period they asked $1,000 three times: TITAN, TITAN Black, TITAN X.
You have to be very crafty to get people to pay that.
The real value is $450-500, $550 max. Same as the GTX 580.
Their fans justify it with "no competition"... then what is Fury?
Take Intel as an example... Intel has no competition and holds the CPU market far more firmly than NVIDIA holds the GPU market...
Intel doesn't ask $1,500 or $2,000 for its Extreme processors. Every series of Intel processors is within about 10% of the price of the one from five years before. They didn't double the price. So what is NVIDIA doing?
I'm glad I was right, and that cutting CUDA cores from the GTX 980 Ti will cost NVIDIA the single-chip crown and hundreds of thousands of dollars. We begged for a full GM200 chip with a higher clock, literally for months before launch, even cut down to 6 GB of video memory and priced like a normal GeForce card, and they showed no mercy. No €1,200, no full chip. That was their motto.
It was strange how NVIDIA has trained its fans: the day before the presentation they expected AMD to ask $1,000, almost certain the price would be similar to the TITAN X.
As if new HBM technology alone gives them the right to ask $1,000.

Fanboy detected :rolleyes:
 
I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've got a lovely TV that I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K@60).
 
All new 4K TVs are HDMI 2.0 compliant, and this standard isn't going anywhere anytime soon. UHD Blu-ray is imminent, and all those players will be HDMI 2.0. Beyond 2.0, pretty much every other device uses some variant of HDMI, so it is the standard for connectivity in the home theater world.

I agree that DisplayPort is a better solution, but at the time this card is being released, HDMI 2.0 is the connection the HTPC market requires. If DisplayPort ever takes off as a TV connection, it will be too late for this card, since there will be something better by then. To me it simply doesn't make sense for AMD not to adopt the standard that is available and in use today.

Also, HDMI 2.0 works fine for 4K 60 Hz; yes, it is very close to the threshold of HDMI's bandwidth limits, but it still works, so why not use it?

Not trying to be rude to you specifically, just pointing out another side of the argument.

I agree 100%.

Regardless of what the future brings, we still have to live in the here and now. Omitting HDMI 2.0 doesn't lend itself to a harmonious coexistence with UHD TVs in the here and now.

It's better to have it and not need it than to need it and not have it.

I'm not going to say that omitting HDMI 2.0 means I would never buy one of these cards, but it would jumpstart my urge to look elsewhere for a product that does support it.
 
Armchair gamers can keep their HDMI 2.0 :)
 
Armchair gamers can keep their HDMI 2.0 :)
They can certainly keep not getting full color on an uber-expensive 4K TV. If you actually think it through, it's a bunch of garbage, especially when you realize DisplayPort was ahead of HDMI well before 2.0 came out.

Hell yeah, AMD! Toss that waste-of-cash HDMI to the curb!
 
Hell yeah, AMD! Toss that waste-of-cash HDMI to the curb!

And with that they toss a bunch of potential buyers to the curb as well.
 
4:4:4 isn't full color? Interesting...
4K@60 Hz with HDMI 2.0 is limited color because of insufficient bandwidth, so you pay a shitload and your TV isn't looking as good as it could, not living up to its full potential.
 
4K@60 Hz with HDMI 2.0 is limited color because of insufficient bandwidth, so you pay a shitload and your TV isn't looking as good as it could, not living up to its full potential.
I'm playing at 4K60 over HDMI with 4:4:4...
 
Not even close to the deep color range you're able to get on your 4K display with DisplayPort. It's called subsampling: https://en.wikipedia.org/wiki/HDMI
I think @OneMoar was trying to tell me months ago why it was crap, but I didn't fully read the HDMI wiki and the links on it.

"4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."

..... http://blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

I think you're confusing bandwidth with subsampling. DP 1.2 supports 17.28 Gbps; HDMI 2.0 supports 18 Gbps. I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
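
For what it's worth, here is a small sketch of what chroma subsampling does to the numbers. It assumes 8-bit components, the 594 MHz 4K@60 pixel clock, the 17.28 Gb/s DP 1.2 payload rate quoted above, and roughly 14.4 Gb/s of HDMI 2.0 payload (18 Gb/s raw minus 8b/10b overhead); those figures are assumptions for the purpose of the comparison.

```python
# Effect of chroma subsampling on the 4K@60 data rate (8-bit components).
PIXEL_CLOCK_HZ = 594e6
AVG_BITS_PER_PIXEL = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}
LINK_PAYLOAD_GBPS = {"DisplayPort 1.2": 17.28, "HDMI 2.0": 14.4}

for sampling, bpp in AVG_BITS_PER_PIXEL.items():
    rate_gbps = PIXEL_CLOCK_HZ * bpp / 1e9
    fits = [link for link, cap in LINK_PAYLOAD_GBPS.items() if rate_gbps <= cap]
    print(f"{sampling}: {rate_gbps:5.2f} Gb/s -> fits on {', '.join(fits)}")
```

Even 8-bit 4:4:4 at 4K@60 (about 14.26 Gb/s) squeezes under both links' payload rates; subsampling only becomes necessary for deeper bit depths or older HDMI revisions.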
 
"4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."

..... http://blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

I think you're confusing bandwidth with subsampling. DP 1.2 supports 17.28 Gbps; HDMI 2.0 supports 18 Gbps. I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
Well, I'll see what a few other people have to say about it, because you copied what that article said, and I don't know whether it's crap or not unless I've misread something in the more technical material.
 