Thursday, June 18th 2015

AMD "Fiji" Silicon Lacks HDMI 2.0 Support

It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor launched to date - the lack of HDMI 2.0 support hurts the chip's living room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, then its HDMI interface will cap visuals at a console-rivaling 30 Hz.
Source: OCUK Forums
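For reference, here is a rough, back-of-the-envelope sketch of the bandwidth math behind that 30 Hz cap. The link figures are the commonly cited effective (post-8b/10b) video data rates for each standard, and the calculation ignores blanking intervals, so treat it as an illustration rather than a full timing analysis:

```python
# Rough 4K bandwidth sketch. Link figures are the commonly cited effective
# (post-8b/10b) video data rates; blanking intervals are ignored for simplicity.
def active_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Active-pixel data rate in Gbit/s at 8 bits per RGB channel."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

UHD = (3840, 2160)
LINKS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.2": 17.28}  # Gbit/s payload

for hz in (30, 60):
    need = active_rate_gbps(*UHD, hz)
    verdict = ", ".join(f"{name}: {'OK' if need <= cap else 'too slow'}"
                        for name, cap in LINKS.items())
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbps -> {verdict}")
```

4K at 30 Hz (~6 Gbps) fits comfortably inside HDMI 1.4's payload rate, while 4K at 60 Hz (~12 Gbps of active video, more once blanking is included) does not, which is why an HDMI 1.4a-only output is stuck at 30 Hz on a UHD TV.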

139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support

#76
GhostRyder
Assimilator: Maxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.
Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.
Yeah, guess what, it's still cheaper than the Titan X...

Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be better as it is, and that is what I use for my 4K experience, so it would not bother me one bit.
Posted on Reply
#77
64K
I think AMD should have included HDMI 2.0 support on their high end cards. They will lose some sales of Fiji over this but how many? The last Steam Hardware Survey I looked at said that about 1 out of 1,650 are gaming at 4K.
Posted on Reply
#78
FordGT90Concept
"I go fast!1!11!1!"
Xzibit: This might help

NOooooooope! That's HDMI 1.4 (30Hz max at 4K).
jigar2speed: Found this - DisplayPort to HDMI 2.0 - www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining - is it related to drivers?
Noooooope, HDMI 1.4. Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.

HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't, but at the same time, I don't care. DisplayPort is the future, not HDMI.
john_How about this one?

Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics



  • Supports HDMI 2.0 Technology, which increases bandwidth from 10.2 Gbps to 18 Gbps and is 4k and Ultra HD compatible. Increases from 8 Audio Channels to 32 Audio Channels for expanded audio. 60 fps video playback at 4k resolution. Dynamic synchronization of video and audio streams.
$11.56
Look at the reviews. Advertised as 4K: only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0. They keep trying to ram more bits through that hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuine 2.0-compliant cables debut, they'll probably be like $20+ per foot because of the massive insulation required to prevent crosstalk and interference. HDMI has always been the dunce of the display standards, taking the DVI spec, tacking audio on to it, and disregarding all the controls VESA put on it to guarantee the cable will work. This was an inevitability with HDMI because the people behind HDMI haven't a clue what they're doing. This is why VESA didn't get behind HDMI and why they went off and designed their own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.


The Fury X manual repeatedly mentions only DisplayPort 1.2; I don't know if it supports 1.3:
support.amd.com/Documents/amd-radeon-r9-fury-x.pdf
Posted on Reply
#79
mroofie
GhostRyder: Yeah, guess what, it's still cheaper than the Titan X...

Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be better as it is, and that is what I use for my 4K experience, so it would not bother me one bit.
And the GTX 980 Ti?
Posted on Reply
#80
[XC] Oj101
john_: Nvidia fanboys were in a comma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary anyway, to a connector.
I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature :D
Posted on Reply
#81
Whilhelm
FordGT90Concept: HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't, but at the same time, I don't care. DisplayPort is the future, not HDMI.

HDMI is not suitable for 4K and likely never will be.
All new 4k TVs are HDMI 2.0 compliant and this standard isn't going anywhere anytime soon. UHD BluRay is imminent and all those players will be HDMI 2.0. Aside from 2.0 pretty much every other device uses some variant of HDMI so it is the standard for connectivity in the home theater world.

I agree that Displayport is a better solution but at the time this card is being released HDMI 2.0 is the connection that is required by the HTPC market. If Display port ever takes off as a TV connection it will be too late for this card since there will be something better by that time. To me it simply does not make sense for AMD to not adopt the standard that is available and being used today.

Also, HDMI 2.0 works fine for 4K 60 Hz; yes, it is very close to the threshold of HDMI's bandwidth limits, but it still works, so why not use it?

Not trying to be rude to you specifically, just pointing out another side of the argument.
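To put a number on "very close to the threshold": the standard 3840x2160@60 timing uses a 594 MHz pixel clock, and at 8-bit 4:4:4 that sits just under HDMI 2.0's effective payload rate. A quick sketch of that arithmetic (594 MHz and 18 Gbps are the usual published figures):

```python
# Headroom left by HDMI 2.0 for 4K60 at 8 bits per channel, 4:4:4.
PIXEL_CLOCK_HZ = 594e6           # standard 3840x2160@60 timing, incl. blanking
BITS_PER_PIXEL = 24              # 8 bits per channel, RGB / YCbCr 4:4:4

required = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9   # ~14.26 Gbps of video data
payload = 18.0 * 8 / 10                            # 14.4 Gbps after 8b/10b

print(f"required : {required:.2f} Gbps")
print(f"available: {payload:.2f} Gbps")
print(f"headroom : {payload - required:.2f} Gbps "
      f"({100 * (payload / required - 1):.1f}%)")
```

Roughly 1% of headroom: enough for 4K 60 Hz at 8-bit 4:4:4, but nothing more, which is why deeper color over HDMI 2.0 at that resolution and refresh rate has to fall back to chroma subsampling.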
Posted on Reply
#82
FordGT90Concept
"I go fast!1!11!1!"
Sure, the TVs have HDMI 2.0 inputs, but find a cable that can actually handle 4K @ 60 Hz reliably. Until there are cables that can handle it, the rest is moot.
Posted on Reply
#83
Whilhelm
FordGT90Concept: Sure, the TVs have HDMI 2.0 inputs, but find a cable that can actually handle 4K @ 60 Hz.
The one that I have plugged in works fine.
Posted on Reply
#85
GhostRyder
mroofie: And the GTX 980 Ti?
Its performance (if we believe the leaks) is better, so it's the same cost for more performance. On top of that, it comes with a better cooler; factor the cost of one onto the GTX 980 Ti and the two work out to roughly the same (a better cooler for the GTX 980 Ti versus the R9 Fury X plus the cable).

So yes, still a better deal (if the leaks are to be believed), depending on how you factor everything in. Of course, this is all still based on rumor, but factoring in everything, and the fact that most gaming monitors these days have DP, or that you can use a DP-to-whatever adapter, it's pretty irrelevant. The only downside is that it makes things difficult for TVs that need HDMI 2.0 to run 4K at 60 Hz...
Posted on Reply
#86
Vlada011
I think my curse helped AMD a little.
From the moment I saw that NVIDIA cut down the 980 Ti, I prayed for AMD to win on that front.
Now all of us who couldn't afford 1200-1250€ for a TITAN X should just turn back to them, or what? What does NVIDIA suggest we do now...? :)
Now we're supposed to be good, welcomed customers for a cheaper and weaker TITAN X, as I said a few days ago.
They were confident enough to cut CUDA cores instead of pushing the full GM200 as far as possible, with an increased base clock and an AIO if needed, and praying to keep the crown; they didn't care.
Now AMD may take the crown precisely because of that little piece, and Maxwell will be NVIDIA's first 3DMark loser after Fermi and Kepler.
What now? Feeding customers rumors about Pascal for 10 months while AMD sells its cards? There's no way they can finish it before spring.
You will see how much NVIDIA asks for HBM2: a fortune. For the last 3-4 years they have tried, in every perfidious way, to silently increase prices, every year $100-150 more for the high-end chip. Where does it end? I think it ends now.
And people thought the whole time that the TITAN X was some miracle from god; no, it was one strong card overpriced to the max.
Nothing special. NVIDIA had GK110 in 2013, and now they have 50% more performance.
Why NVIDIA decided to give it out little by little is a different story, but they made a 50% improvement since 2013.
For that period they asked $1000 three times: TITAN, TITAN Black, TITAN X.
You have to be very tricky to force people to pay that.
The real value is $450-500-550 max. Same as the GTX 580.
Their fans justify it with "no competition"... Then what is Fury?
Take Intel as an example... Intel has no competition and holds the CPU market much more firmly than NVIDIA holds GPUs...
Intel doesn't ask $1500 or $2000 for extreme processors. Every series of Intel processors is within about 10% of the price of the series from five years before; they didn't double their prices. But what does NVIDIA do?
I'm glad I was right, and that cutting CUDA cores from the GTX 980 Ti will cost NVIDIA the single-chip crown and hundreds of thousands of dollars. We begged, literally for months before launch, for the full GM200 chip with an increased clock, only for them to cut it down, give it 6 GB of video memory, and drop the price to normal GeForce levels. They had no mercy. No 1200€, no full chip. That was their motto.
It's telling how NVIDIA has trained its fans: the day before the presentation they expected AMD to ask $1000, almost sure the price would be similar to the TITAN X.
Just because HBM is new technology, they think they have the right to ask $1000.
Posted on Reply
#87
mister2
[XC] Oj101: I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature :D
Because my GTX 980 didn't need a registry hack to support 4:4:4 over HDMI 2.0 and the driver doesn't reset desktop scaling 4 times out of 10 when resuming from sleep?
Posted on Reply
#88
mroofie
Vlada011: I think my curse helped AMD a little. [...]
Fanboy detected :rolleyes:
Posted on Reply
#89
swirl09
I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K@60).
Posted on Reply
#90
Octavean
Whilhelm: All new 4k TVs are HDMI 2.0 compliant and this standard isn't going anywhere anytime soon. [...] To me it simply does not make sense for AMD to not adopt the standard that is available and being used today.
I agree 100%

Regardless of what the future will bring, we still have to live in the here and now. Omitting HDMI 2.0 doesn't lend itself to a harmonious coexistence with UHD TVs in the here and now.

It's better to have it and not need it than to need it and not have it.

I'm not going to say that the omission of HDMI 2.0 would mean I would never buy one of these cards but it would jumpstart my urge to look elsewhere for a product that does support HDMI 2.0.
Posted on Reply
#91
TheGuruStud
Armchair gamers can keep their HDMI 2.0 :)
Posted on Reply
#92
xfia
TheGuruStud: Armchair gamers can keep their HDMI 2.0 :)
They can certainly keep not getting full color on an uber-expensive 4K TV... If you actually think about it full circle, it's a bunch of garbage, especially when you realize DisplayPort was ahead of HDMI well before 2.0 came out.

Hell yeah, AMD! Toss that waste of cash HDMI to the curb!
Posted on Reply
#93
mister2
4:4:4 isn't full color? Interesting...
Posted on Reply
#94
Whilhelm
xfia: Hell yeah, AMD! Toss that waste of cash HDMI to the curb!
And with that they toss a bunch of potential buyers to the curb as well.
Posted on Reply
#95
xfia
mister2: 4:4:4 isn't full color? Interesting...
4K@60 Hz with HDMI 2.0 is limited color because of insufficient bandwidth. So you pay a shitload and your TV is not looking as good as it could, not living up to its full potential.
Posted on Reply
#96
mister2
xfia: 4K@60 Hz with HDMI 2.0 is limited color because of insufficient bandwidth. So you pay a shitload and your TV is not looking as good as it could, not living up to its full potential.
I'm playing @ 4k60 over HDMI with 4:4:4....
Posted on Reply
#97
xfia
mister2: I'm playing @ 4k60 over HDMI with 4:4:4....
Not even close to the deep color range you're able to get on your 4K with DisplayPort. It's called subsampling: en.wikipedia.org/wiki/HDMI
I think @OneMoar was trying to tell me months ago why it was crap, but I didn't fully read the HDMI wiki and the links on it.
Posted on Reply
#98
mister2
xfia: Not even close to the deep color range you're able to get on your 4K with DisplayPort. It's called subsampling: en.wikipedia.org/wiki/HDMI
I think @OneMoar was trying to tell me months ago why it was crap, but I didn't fully read the HDMI wiki and the links on it.
"4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."

..... blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

I think you're confusing bandwidth with sub sampling. DP 1.2 supports 17.28Gbps, HDMI 2.0 supports 18Gbps. I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
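One caveat on those two numbers: 17.28 Gbps is DP 1.2's payload rate after 8b/10b encoding, while 18 Gbps is HDMI 2.0's raw link rate (about 14.4 Gbps of payload), so they aren't directly comparable; both, however, have room for 8-bit 4:4:4 at 4K60. A small sketch of what each subsampling mode costs in bandwidth, assuming 8-bit channels and the standard 594 MHz 4K60 pixel clock:

```python
# Average bits per pixel for common chroma subsampling modes at 8 bits/channel:
# 4:4:4 keeps full chroma, 4:2:2 halves horizontal chroma, 4:2:0 halves both axes.
MODES = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

PIXEL_CLOCK_HZ = 594e6                          # standard 3840x2160@60 timing
PAYLOAD = {"HDMI 2.0": 14.4, "DP 1.2": 17.28}   # Gbit/s after 8b/10b encoding

for mode, bpp in MODES.items():
    need = PIXEL_CLOCK_HZ * bpp / 1e9
    fits = ", ".join(f"{link}: {'fits' if need <= cap else 'does not fit'}"
                     for link, cap in PAYLOAD.items())
    print(f"{mode}: {need:5.2f} Gbps -> {fits}")
```

At 8 bits per channel, 4K60 4:4:4 fits over both links; it is only with 10-bit or 12-bit color at 4K60 that HDMI 2.0 runs out of room and has to drop to 4:2:2 or 4:2:0.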
Posted on Reply
#99
xfia
mister2: "4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."

..... blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

I think you're confusing bandwidth with sub sampling. DP 1.2 supports 17.28Gbps, HDMI 2.0 supports 18Gbps. I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
Well, I'll see what a few other people have to say about it, because you just copied what that page said, and idk if that is crap or what, unless I misread something in some of the more technical information.
Posted on Reply
#100
mister2
xfia: Well, I'll see what a few other people have to say about it, because you just copied what that page said, and idk if that is crap or what, unless I misread something in some of the more technical information.
I copied it from Adobe. I would hope Adobe knows about color lol. All joking aside, we agree on the core topic (DP > HDMI), but HDMI 2.0 isn't limited in color reproduction compared to DP 1.2. I really wanted to get a Fury X, but since the home theater world relies on HDMI 2.0, it's a must-have for me :(.
Posted on Reply