Thursday, June 18th 2015

AMD "Fiji" Silicon Lacks HDMI 2.0 Support

It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor ever launched - the lack of HDMI 2.0 support hurts the chip's living-room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, then its HDMI interface will cap visuals at a console-rivaling 30 Hz.
Source: OCUK Forums

139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support

#101
FordGT90Concept
"I go fast!1!11!1!"
DisplayPort 1.2 = 17.28 Gbps (effective payload; 21.6 Gbps raw)
HDMI 2.0 = 18 Gbps (raw; ~14.4 Gbps effective payload)
DisplayPort 1.3 = 32.4 Gbps (raw; ~25.92 Gbps effective payload)
The best HDMI cables top out around 25 Gbps, and only the very best of them manage that over very short distances.

There's a chart here on HDMI2: www.dpllabs.com/page/dpl-full-4k-cable-certification

DisplayPort 1.3 should be able to handle 4:4:4 4K @ 16 bits per color, whereas HDMI 2.0 can only handle 8 bits per color. Not to mention DisplayPort 1.3 can carry an HDMI signal. As if that weren't enough, DisplayPort 1.3 is capable of VESA Display Stream Compression, which can further increase the effective payload.
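As a rough sanity check of those figures, here's some back-of-the-envelope payload math (my own numbers, not from either spec; it ignores blanking intervals and audio/aux overhead, so real link requirements run roughly 5-10% higher):

# Uncompressed video payload for 3840x2160, ignoring blanking and audio overhead.
def payload_gbps(width, height, refresh_hz, bits_per_channel, chroma="4:4:4"):
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * refresh_hz * bits_per_channel * samples_per_pixel / 1e9

# Effective payload after line coding (raw link rates are higher), in Gbps.
links = {"HDMI 2.0": 14.4, "DisplayPort 1.2": 17.28, "DisplayPort 1.3": 25.92}

for bpc in (8, 10, 12, 16):
    need = payload_gbps(3840, 2160, 60, bpc)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"4K60 4:4:4 at {bpc} bpc needs ~{need:.1f} Gbps -> {fits if fits else 'nothing fits'}")

Even with those caveats, 4K60 4:4:4 at 10 bpc already spills over HDMI 2.0's effective payload, which is where the 8-bit ceiling comes from.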


If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
Posted on Reply
#102
mister2
FordGT90Concept: DisplayPort 1.3 is 32.4 Gbps
Yup, and it supports 8K, though display adoption is still very slim right now.
Posted on Reply
#103
Tatty_Two
Gone Fishing
swirl09: I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K@60).
So would you mind sharing what cable you have to allow you that @60?
Posted on Reply
#104
Steevo
www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of


Nvidia used 4:2:0 at 8-bit to get 4K 60 Hz working.
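For context, a quick sketch of why 4:2:0 makes that possible (rough math of my own, blanking and audio ignored): 4:2:0 shares one pair of chroma samples across each 2x2 block of pixels, so the per-pixel cost drops from 3 samples to 1.5.

# Bits on the wire per second, ignoring blanking/audio overhead.
def gbps(w, h, hz, bits_per_channel, samples_per_pixel):
    return w * h * hz * bits_per_channel * samples_per_pixel / 1e9

full_chroma = gbps(3840, 2160, 60, 8, 3.0)  # 4:4:4 -> ~11.9 Gbps
sub_sampled = gbps(3840, 2160, 60, 8, 1.5)  # 4:2:0 -> ~6.0 Gbps
print(full_chroma, sub_sampled)
# HDMI 1.4 carries roughly 8.16 Gbps of effective payload, so only the 4:2:0
# stream fits -- at the cost of quarter-resolution color, hence the blockiness
# on fine color detail.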


Great game looks: MSAA with blocky color gradients. But Nvidia users are used to that, since their cards' colors are always fuxxed up. www.reddit.com/r/pcgaming/comments/2p3xs7/nvidia_users_using_hdmi_output_youre_most_likely

forums.geforce.com/default/topic/770359/geforce-drivers/fix-for-the-limited-color-range-issue-in-hdmi-dp/

www.neogaf.com/forum/showthread.php?t=953806
Posted on Reply
#105
xfia
The AMD middle finger to people wasting money? 4K TVs got that price tag.
Posted on Reply
#106
mister2
Steevo: www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of


Nvidia used 4:2:0 at 8-bit to get 4K 60 Hz working.


Great game looks: MSAA with blocky color gradients. But Nvidia users are used to that, since their cards' colors are always fuxxed up. www.reddit.com/r/pcgaming/comments/2p3xs7/nvidia_users_using_hdmi_output_youre_most_likely

forums.geforce.com/default/topic/770359/geforce-drivers/fix-for-the-limited-color-range-issue-in-hdmi-dp/

www.neogaf.com/forum/showthread.php?t=953806
Yup, I had to do the reg hack to get full range.

And that 4:2:0 mode is only for Kepler (600/700 series).
Posted on Reply
#107
xfia
It seems your 4K TV may not even use that much color space.. because I only look at the awesome ones for gaming lol
Posted on Reply
#108
mister2
xfia: It seems your 4K TV may not even use that much color space.. because I only look at the awesome ones for gaming lol
Maybe. I play on a 65" Samsung.
Posted on Reply
#109
xfia
mister2: Maybe. I play on a 65" Samsung.
Well, check it out.. if you've got time to read a forum, you can read and learn while you do it, if that's what interests you.
Welcome to TPU, by the way.. it's probably more than obvious how much misleading information there is after reading this thread, so just ask haha.
I will tell you @HumanSmoke @Steevo @FordGT90Concept are probably going to give you the best straight-up information.
I can give the mods shit, but they are cool for the most part; it's just that some of them seem to be very limited in knowledge for the years they have been around reading articles.
I would like to be wrong about what I said about what Mussels was doing, but it does not seem that way.
Posted on Reply
#110
Steevo
Let's give this a good logical look.


HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI.... you fail.

Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So then we have two choices here: either 60 FPS isn't really that important, or it is, so which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.

AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box; what's the chance they will do the same here then?

Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed input options that sometimes may or may not have worked as expected. Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO.
Posted on Reply
#111
john_
[XC] Oj101: I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature :D
You will be installing newer *hotfix* "stable" drivers every week until then :p
Posted on Reply
#112
newtekie1
Semi-Retired Folder
Xzibit: This might help

TFA
  • Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)
Nope, won't help one bit.
Posted on Reply
#113
$ReaPeR$
buggalugs: lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort......and for the few % that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.
So maybe there is 0.0000000001% of the market that will feel let down.
Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive anyway, and most people will have to buy a new 4K TV to support HDMI 2.0 anyway.
If you're buying 4K TVs and $500-$600 graphics cards I'm sure you can afford an upgrade next year.
+1 to that!!! I really don't get why so many people in here are so worked up.. I mean, if you can afford to pay that amount of $, why the hell wouldn't you buy a panel that supports DP.. I really don't get it.
Ferrum Master: Who the Sock cares?? And even if someone does care -

Buy a TV with DisplayPort then, like this Panasonic TC-L65WT600 has...
Exactly!!! As if 4K displays are cheap in the first place..
john_: I can write it in Greek if you prefer. No misspelling there. By the way, who is the retard here? The person who makes a mistake writing in another language, or the person who comments about that, like you did? Anyway, I can understand you being upset. Try to relax.
Come on, you big Greek!!! ;)
the54thvoid: Such an ill-tempered thread. How about people stop being dicks and stick to the topic.
thank you!
Tatty_One: Thank you, that answers the question at the end of R.H.P's post #71. All my point is/was is that just because this limitation does not hinder some users, AMD is still in my opinion missing a trick and an opportunity with all those 4K TV owners who don't want to spend another bunch of cash on a monitor; in my case it's more about commiserating with 4K TV owners than criticising AMD, but both go hand in hand to a degree.
I really think this is a non-issue.. if you have the $$ to buy a 4K TV and such expensive GPUs, I am certain you can afford a new TV with all the bells and whistles..
FordGT90Concept: NOooooooope! That's HDMI 1.4 (30 Hz max at 4K).
Noooooope, HDMI 1.4. Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.
HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't have it, but at the same time, I don't care. DisplayPort is the future, not HDMI.
Look at the reviews. Advertised as 4K: only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0. They keep trying to ram more bits through that hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuine 2.0-compliant cables debut, they'll probably be like $20+ per foot because of the massive insulation required to prevent crosstalk and interference. HDMI has always been the retard of the display standards, taking the DVI spec, tacking audio onto it, and disregarding all the controls VESA put on it to guarantee the cable will work. This was an inevitability with HDMI because the people behind HDMI haven't a clue what they're doing. This is why VESA didn't get behind HDMI and why they went off and designed their own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.
The Fury X manual only mentions DisplayPort 1.2 repeatedly. I don't know if it supports 1.3:
support.amd.com/Documents/amd-radeon-r9-fury-x.pdf
thank you!
Steevo: Let's give this a good logical look.
HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI.... you fail.
Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So then we have two choices here: either 60 FPS isn't really that important, or it is, so which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.
AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box; what's the chance they will do the same here then?
Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed input options that sometimes may or may not have worked as expected. Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO.
the truth has been spoken!!!!

I really find this a non-issue, and I want to thank you guys for trying to be objective and civil.
Posted on Reply
#114
john_
FordGT90Concept: If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
I wouldn't expect that. They only need 1.2a for Freesync, so they will be fine with that. I believe they decided to spend the last dollars they had on the LEDs instead of implementing DP1.3.
Steevo: HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI.... you fail.
AMD was demonstrating FreeSync over HDMI at Computex, and the same was rumored for Nvidia.

AMD Demonstrates FreeSync-over-HDMI Concept Hardware at Computex 2015
Posted on Reply
#115
Octavean
People have every right to expect that new video cards will support newer standards like HDMI 2.0. If AMD were trying to make some stand against HDMI (which I doubt), then it would be more appropriate for them to omit support for all versions of HDMI rather than stagnating on an older HDMI standard.

Based on that alone it seems more like a mistake than some message. Is it a big mistake? Not IMO, but it still looks like a mistake.

I also expect hardware H.265 encode and decode. If this HDMI 2.0 thing is true, I wouldn't be surprised if that was a bust too.
Posted on Reply
#116
ShurikN
Ok, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have HDMI 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much of the market share do they take?
Posted on Reply
#117
xfia
Steevo: Let's give this a good logical look.


HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI.... you fail.

Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So then we have two choices here: either 60 FPS isn't really that important, or it is, so which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.

AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box; what's the chance they will do the same here then?

Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed input options that sometimes may or may not have worked as expected. Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO.
30 vs 60 fps
I guess that is a question of budget and standards. 30 fps is playable and people do it every day.. I like 50 because going around 50-60 is not really noticeable to me. Won't being locked into a refresh rate eventually cause input lag, if not drive someone crazy for days trying to fix it?
G-Sync vs FreeSync
They are both good, above my standard on refresh rates, and totally spec'd out in my opinion.
I do like how FreeSync works and doesn't need extra parts in the display that you get charged for, since with G-Sync OEMs get charged the cost of the extra hardware along with a license fee.
Yet another open standard AMD helped put on paper way before G-Sync was a thought.
Posted on Reply
#118
Octavean
ShurikN: Ok, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have HDMI 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much of the market share do they take?
That's a good question,....

DisplayPort on UHD TV's has already been addressed in this thread though. Very few UHD TVs have DP and it doesn't look like many will.

However, I have a UHD TV that does 4K/60Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are like hen's teeth now because they do exist and it seems like the direction the UHD TV industry is going in. These UHD TVs are getting cheaper too,...

If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense; but it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
Posted on Reply
#119
Steevo
Octavean: That's a good question,....

DisplayPort on UHD TV's has already been addressed in this thread though. Very few UHD TVs have DP and it doesn't look like many will.

However, I have a UHD TV that does 4K/60Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are like hen's teeth now because they do exist and it seems like the direction the UHD TV industry is going in. These UHD TVs are getting cheaper too,...

If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense; but it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
With 8-bit color.

www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx


So a little color schooling.


What do you see?

i4.minus.com/ibyJcwdIniHUEs.png


en.wikipedia.org/wiki/Color_depth

Even if you bought the highest-end set, the signal may only be processed at 8 bits per color instead of 10, and thus will still show blocking and banded gradients. HDMI 2.0 is still shit compared to DisplayPort.
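To put rough numbers on the banding argument (plain arithmetic, nothing vendor-specific):

# Distinct levels per channel and total RGB colors at each bit depth.
for bpc in (8, 10, 12):
    steps = 2 ** bpc        # shades per channel
    colors = steps ** 3     # total representable colors
    print(f"{bpc}-bit: {steps} steps per channel, {colors:,} colors")
# 8-bit  ->  256 steps, ~16.7 million colors: visible banding on smooth gradients
# 10-bit -> 1024 steps, ~1.07 billion colors: 4x finer tonal transitions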
Posted on Reply
#120
xfia
Octavean: That's a good question,....

DisplayPort on UHD TV's has already been addressed in this thread though. Very few UHD TVs have DP and it doesn't look like many will.

However, I have a UHD TV that does 4K/60Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are like hen's teeth now because they do exist and it seems like the direction the UHD TV industry is going in. These UHD TVs are getting cheaper too,...

If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense; but it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
You're missing that the color space is filled and is not the true color of the content when you're running 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to be what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for true high-quality UHD 4K@60 Hz.
Posted on Reply
#121
Octavean
xfia: You're missing that the color space is filled and is not the true color of the content when you're running 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to be what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for true high-quality UHD 4K@60 Hz.
So what is your point,......?

Tell it to the industry making UHD TV's.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.
Posted on Reply
#122
Steevo
Octavean: So what is your point,......?

Tell it to the industry making UHD TV's.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.



Either you understand that 8-bit color looks like shit, or you don't.

We have two simple scenarios in which you reply to a thread about a new graphics card where HDMI 2.0 is NOT supported:


1) You care, as you have something relevant to add, and understand what it means and why it's important or not.

2) You are an Nvidiot and need to thread-crap elsewhere.
Posted on Reply
#123
HumanSmoke
You guys are arguing two different standpoints that are mutually exclusive.
@Octavean is putting forward that HDMI 2.0 has favour with TV vendors and even if it lacks bandwidth compared with DP, will still be utilized.
@Steevo ...well you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it even though TV manufacturers aren't using it to any great extent.

One is an argument about tech implementation (and a few insults), the other is about practical implementation in a real market.
Posted on Reply
#124
Xzibit
The 4K standard is defined as 4K resolution with 10-bit+ Rec./BT.2020 color. HDMI 2.0 can only do that at 4K/30 Hz.

That still isn't the whole issue, because even then you're upscaling or downscaling somewhere through the chain.
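A quick check of that 30 Hz ceiling (same rough payload math as earlier in the thread; blanking overhead ignored, so the real numbers are a bit worse):

# 10-bit 4:4:4 payload at 4K, in Gbps.
def payload_gbps(refresh_hz):
    return 3840 * 2160 * refresh_hz * 10 * 3 / 1e9

print(payload_gbps(30))  # ~7.5 Gbps  -> fits inside HDMI 2.0's ~14.4 Gbps effective payload
print(payload_gbps(60))  # ~14.9 Gbps -> already over it, even before blanking overhead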
Posted on Reply
#125
FordGT90Concept
"I go fast!1!11!1!"
Octavean: If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense; but it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
DisplayPort 1.3 was released almost exactly a year after HDMI 2.0. If the latter wasn't adopted, the former most certainly isn't. These are probably things they're putting off for 16/14nm.

Titan X has HDMI 2.0 but DisplayPort 1.2 (not the "a" revision that adds Adaptive-Sync support). So right now we either have to go with HDMI 1.4a and DisplayPort 1.2a, or HDMI 2.0 and DisplayPort 1.2. I think I'd have to go with the former, because I loathe proprietary standards like G-Sync, and all HDMI ever has to power for me is a 1920x1200 display via a DVI adapter.
HumanSmoke: You guys are arguing two different standpoints that are mutually exclusive.
@Octavean is putting forward that HDMI 2.0 has favour with TV vendors and even if it lacks bandwidth compared with DP, will still be utilized.
@Steevo ...well you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it even though TV manufacturers aren't using it to any great extent.

One is an argument about tech implementation (and a few insults), the other is about practical implementation in a real market.
The argument Steevo makes, and one I agree with, is that HDMI 2.0 should be terminated and DisplayPort should replace it in full. DisplayPort supports HDMI packets, so DisplayPort has backwards compatibility ingrained. There's no reason HDMI 2.0 exists other than, as Steevo said, "Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO." It's the TV industry trying to dictate what standard people use because they refuse to provide an affordable alternative.
Posted on Reply