
AMD "Fiji" Silicon Lacks HDMI 2.0 Support

Well, I'll see what a few other people have to say about it, because you copied what they said, and I don't know if that's crap or not, unless I misread something in the more technical information.
I copied it from Adobe. I would hope Adobe knows about color, lol. All joking aside, we agree on the core topic (DP > HDMI), but HDMI 2.0 isn't limited in color reproduction compared to DP 1.2. I really wanted to get a Fury X, but since the home theater world relies on HDMI 2.0, it's a must-have for me :(.
 
DisplayPort 1.2 = 17.28 Gbps (effective data rate; 21.6 Gbps raw, less 8b/10b coding)
HDMI 2.0 = 18 Gbps (raw; about 14.4 Gbps effective after 8b/10b coding)
DisplayPort 1.3 = 32.4 Gbps (raw; about 25.92 Gbps effective)
The best HDMI cables can do is 25 Gbps, and those are the best of the best over very short distances.

There's a chart here on HDMI2: http://www.dpllabs.com/page/dpl-full-4k-cable-certification

DisplayPort 1.3 should be able to handle 4:4:4 4K at 16 bits per color, where HDMI 2.0 can only handle 8 bits per color. Not to mention DisplayPort 1.3 can carry an HDMI signal. As if that weren't enough, DisplayPort 1.3 is capable of VESA Display Stream Compression, which can further increase the effective payload.
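For anyone who wants to sanity-check the math, here's a rough sketch (mine, not from the spec sheets) comparing uncompressed 4K60 4:4:4 payloads against the effective link rates, assuming 8b/10b coding on all three links and ignoring blanking overhead (so real-world limits are tighter):

```python
# Rough bandwidth arithmetic for 4K60 4:4:4 at various bit depths.
# Effective rates assume 8b/10b line coding; blanking intervals are
# ignored, so actual headroom is smaller than shown here.

EFFECTIVE_GBPS = {
    "DisplayPort 1.2 (HBR2)": 17.28,  # 21.6 Gbps raw
    "HDMI 2.0":               14.40,  # 18.0 Gbps raw
    "DisplayPort 1.3 (HBR3)": 25.92,  # 32.4 Gbps raw
}

def payload_gbps(width, height, hz, bits_per_color, channels=3):
    """Uncompressed 4:4:4 pixel payload in Gbps, blanking ignored."""
    return width * height * hz * bits_per_color * channels / 1e9

for bpc in (8, 10, 12, 16):
    need = payload_gbps(3840, 2160, 60, bpc)
    fits = [name for name, cap in EFFECTIVE_GBPS.items() if cap >= need]
    print(f"4K60 4:4:4 @ {bpc} bpc: ~{need:.1f} Gbps -> fits: {fits}")
```

By that math, 10 bpc (~14.9 Gbps) already exceeds HDMI 2.0's effective rate, and 16 bpc (~23.9 Gbps) only just squeezes into DP 1.3 before blanking is counted, which is why "should be able to" is the honest phrasing.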


If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
 
I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K@60).
So would you mind sharing what cable you have that allows you that @60?
 
The AMD middle finger to people wasting money? 4K TVs come with that price tag.
 
Yup, I had to do the registry hack to get full range.

And that 4:2:0 is only for Kepler (600/700 series).
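For context on why 4:2:0 matters here: chroma subsampling stores color at quarter resolution, so 4:2:0 averages 12 bits per pixel instead of 24 at 8 bpc, roughly halving the payload. A quick sketch (my own numbers: ~8.16 Gbps effective for an HDMI 1.4-class link, blanking ignored):

```python
# 4:2:0 keeps full-resolution luma but quarter-resolution chroma,
# averaging 12 bits/pixel vs 24 for 4:4:4 at 8 bpc. That halving is
# what lets 4K60 squeeze through HDMI 1.4-class bandwidth at all.
def payload_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

print(payload_gbps(3840, 2160, 60, 24))  # 4:4:4 -> ~11.9 Gbps, too much
print(payload_gbps(3840, 2160, 60, 12))  # 4:2:0 -> ~6.0 Gbps, fits ~8.16
```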
 
It seems your 4K may not even use that much color space... I only look at the awesome ones for gaming, lol.
 
Maybe. I play on a 65" Samsung.
Well, check it out... if you've got time to read a forum, you can read and learn while you do it, if that's what interests you.
Welcome to TPU, by the way. It's probably more than obvious how much misleading information there is after reading this thread, so just ask, haha.
I will tell you @HumanSmoke, @Steevo and @FordGT90Concept are probably going to give you the best straight-up information.
I can give the mods shit, but they are cool for the most part; just some of them seem very limited in knowledge for the years they have been around reading articles.
I would like to be wrong about what I said Mussels was doing, but it does not seem that way.
 
Let's give this a good logical look.


HDMI, does it support G-Sync? Nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI... you fail.

Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So we have two choices here: either 60 FPS isn't really that important, or it is. Which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.

AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box. What's the chance they will do the same here?

Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed options for input, which sometimes may or may not have worked as expected. Samsung should know better; Panasonic, Toshiba, and the others should know better. They are paying for a dying standard, but they are doing it for planned obsolescence, IMO.
 
I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature :D
You will be installing newer *hotfix* "stable" drivers every week until then :p
 
This might help

(image: mini DisplayPort-to-HDMI adapter)

TFA said:
  • Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)

Nope, won't help one bit.
 
lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort... and for the few percent that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't support HDMI 2.0 yet.
So maybe there is 0.0000000001% of the market that will feel let down.
Sure, if you're planning to hold onto the card for 5 years and want to use a TV, it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive, and most people will have to buy a new 4K TV to get HDMI 2.0 anyway.
If you're buying 4K TVs and $500-$600 graphics cards, I'm sure you can afford an upgrade next year.

+1 to that!!! I really don't get why so many people in here are so worked up... I mean, if you can afford to pay that amount of $, why the hell wouldn't you buy a panel that supports DP? I really don't get it.



Who the Sock cares?? And even if someone does care -

Buy a TV with DisplayPort then, like this Panasonic TC-L65WT600...


Exactly!!! As if 4K displays are cheap in the first place...


I can write it in Greek if you prefer. No misspelling there. By the way, who is the retard here? The person who makes a mistake writing in another language, or the person who comments about it, like you did? Anyway, I can understand you being upset. Try to relax.

Come on, mighty Greece!!! ;)



Such an ill tempered thread. How about people stop being dicks and stick to the topic.


thank you!


Thank you, that answers the question at the end of R.H.P's post #71. My point is/was: just because this limitation does not hinder some users, AMD is still, in my opinion, missing a trick and an opportunity with all those 4K TV owners who don't want to spend another bunch of cash on a monitor. In my case it's more about commiserating with 4K TV owners than criticising AMD, but both go hand in hand to a degree.


I really think this is a non-issue... if you have the $$ to buy a 4K TV and such expensive GPUs, I am certain you can afford a new TV with all the bells and whistles...



NOooooooope! That's HDMI 1.4 (30 Hz max at 4K).
Noooooope, HDMI 1.4. Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.
HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't, but at the same time, I don't care. DisplayPort is the future, not HDMI.
Look at the reviews. Advertised as 4K: only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0. They keep trying to ram more bits through that hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuine 2.0-compliant cables debut, they'll probably be like $20+ per foot because of the massive insulation required to prevent crosstalk and interference.

HDMI has always been the retard of the display standards: it took the DVI spec, tacked audio onto it, and disregarded all the controls VESA put on it to guarantee the cable will work. This was an inevitability with HDMI, because the people behind HDMI haven't a clue what they're doing. This is why VESA didn't get behind HDMI, and why they went off and designed their own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.
The Fury X manual only mentions DisplayPort 1.2 repeatedly. I don't know if it supports 1.3:
http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf


thank you!



Let's give this a good logical look. HDMI, does it support G-Sync? Nope. [...]

The truth has been spoken!!!!

I really find this a non-issue, and I want to thank you guys for trying to be objective and civil.
 
If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
I wouldn't expect that. They only need DP 1.2a for FreeSync, so they will be fine with that. I believe they decided to spend the last dollars they had on the LEDs instead of implementing DP 1.3.

HDMI, does it support G-Sync? Nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI... you fail.
AMD was demonstrating FreeSync over HDMI at Computex, and the same was rumored for Nvidia.

AMD Demonstrates FreeSync-over-HDMI Concept Hardware at Computex 2015
 
People have every right to expect that new video cards will support newer standards like HDMI 2.0. If AMD were trying to make some stand against HDMI (which I doubt), then it would be more appropriate for them to omit support for all versions of HDMI rather than stagnating on an older HDMI standard.

Based on that alone, it seems more like a mistake than some message. Is it a big mistake? Not IMO, but it still looks like a mistake.

I also expect hardware H.265 encode and decode. If this HDMI 2.0 thing is true, I wouldn't be surprised if that was a bust too.
 
OK, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have 2.0 (or DisplayPort), and how many 4:4:4?
And most importantly, how much of the market share do they take?
 
Let's give this a good logical look. HDMI, does it support G-Sync? Nope. [...] Then let's look at the whole need for G-Sync [...]
30 vs 60 fps:
I guess that is a question of budget and standards. 30 fps is playable, and people do it every day... I like 50 because it's around 50-60, not really noticeable to me. Won't being locked into a refresh rate eventually cause input lag, if not drive someone crazy for days trying to fix it?
G-Sync vs FreeSync:
They are both good, above my standards on refresh rates, and totally spec'd out in my opinion.
I do like how FreeSync works and doesn't need extra parts in the display that you get charged for, the way OEMs get charged the cost of the extra hardware plus a license fee with G-Sync.
Yet another open standard AMD helped put on paper way before G-Sync was a thought.
 
OK, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have 2.0 (or DisplayPort), and how many 4:4:4?
And most importantly, how much of the market share do they take?
That's a good question...

DisplayPort on UHD TVs has already been addressed in this thread, though. Very few UHD TVs have DP, and it doesn't look like many will.

However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are like hen's teeth now, because they do exist, and it seems like the direction the UHD TV industry is going in. These UHD TVs are getting cheaper too...

If the spec weren't ratified, then HDMI 2.0 omission on a new video card would make perfect sense, but it is ratified. HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
 
However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. [...]


With 8-bit color.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx


So a little color schooling.


What do you see?

http://i4.minus.com/ibyJcwdIniHUEs.png


https://en.wikipedia.org/wiki/Color_depth

Even if you saw the highest-end one, it may only be processed at 8 bits per color instead of 10, and thus will still show blocking and gradient banding. HDMI 2.0 is still shit compared to DisplayPort.
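If you want to see the banding for yourself rather than trust a test image link, here's a minimal sketch (mine; assumes Pillow is installed) that renders the same grayscale ramp quantized to different bit depths:

```python
# Render a horizontal grayscale ramp quantized to 2**bits levels.
# Fewer bits -> fewer levels -> wider, more visible bands.
from PIL import Image

def banded_gradient(width=1024, height=120, bits=8):
    levels = 2 ** bits
    row = bytes(
        round(round(x / (width - 1) * (levels - 1)) / (levels - 1) * 255)
        for x in range(width)
    )
    # Stretch the single row vertically; NEAREST avoids resampling blur.
    return Image.frombytes("L", (width, 1), row).resize(
        (width, height), Image.NEAREST
    )

for bits in (4, 6, 8):
    banded_gradient(bits=bits).save(f"gradient_{bits}bit.png")
```

An 8-bit PNG can't show a true 10-bit ramp, but comparing 4/6/8 bits makes the trend obvious: every extra bit halves the band width, which is what a 10-bit pipe buys you over an 8-bit one.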
 
However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. [...]
You're missing that the color space is filled and is not the true color of the content when you're running 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to be what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for true high-quality UHD 4K@60 Hz.
 
You're missing that the color space is filled and is not the true color of the content when you're running 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to be what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for true high-quality UHD 4K@60 Hz.
So what is your point......?

Tell it to the industry making UHD TVs.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.
 
My point is simple: support the standards that are available in a new card. [...] It's not about the merits of the standards and never was.



Either you understand that 8-bit color looks like shit, or you don't.

We have two simple scenarios in which you reply to a thread about a new graphics card where HDMI 2.0 is NOT supported:


1) You care, as you have something relevant to add, and understand what it means and why it's important or not.

2) You are an Nvidiot and need to thread-crap elsewhere.
 
You guys are arguing two different standpoints that are mutually exclusive.
@Octavean is putting forward that HDMI 2.0 has favour with TV vendors and, even if it lacks bandwidth compared with DP, will still be utilized.
@Steevo ...well, you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it, even though TV manufacturers aren't using it to any great extent.

One is an argument about tech implementation (and a few insults); one is about practical implementation in a real market.
 
The 4K standard is defined as 4K resolution with 10-bit+ Rec./BT.2020 color. HDMI 2.0 can only do that at 4K/30 Hz.

That still isn't the whole issue, because even then you're upscaling or downscaling through the chain.
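To put numbers on that claim, same rough arithmetic as earlier in the thread (my assumption of ~14.4 Gbps effective for HDMI 2.0 after 8b/10b coding, blanking ignored):

```python
# 10-bit 4:4:4 payloads at 4K, vs ~14.4 Gbps effective HDMI 2.0.
rate = lambda hz, bpc: 3840 * 2160 * hz * bpc * 3 / 1e9

print(f"4K30 10-bit: ~{rate(30, 10):.1f} Gbps")  # ~7.5  -> fits
print(f"4K60 10-bit: ~{rate(60, 10):.1f} Gbps")  # ~14.9 -> does not fit
```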
 