Thursday, June 18th 2015

AMD "Fiji" Silicon Lacks HDMI 2.0 Support

It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor launched to date - the lack of HDMI 2.0 support hurts the chip's living-room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, its HDMI interface will cap visuals at a console-rivaling 30 Hz.
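The bandwidth gap behind this limitation can be sketched with quick arithmetic (a rough illustration: the 4400 × 2250 total timing for 3840 × 2160 @ 60 Hz is the CTA-861 figure including blanking, and the effective link rates assume 8b/10b encoding overhead):

```python
# Rough check of why 4K @ 60 Hz doesn't fit in HDMI 1.4a.
def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s, using total (blanked) timing."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_GBPS = 8.16   # 10.2 Gbps link rate minus 8b/10b encoding overhead
HDMI_2_0_GBPS = 14.4   # 18.0 Gbps link rate minus 8b/10b encoding overhead

uhd60 = data_rate_gbps(4400, 2250, 60, 24)  # 8-bit RGB / 4:4:4
uhd30 = data_rate_gbps(4400, 2250, 30, 24)

print(f"4K60: {uhd60:.2f} Gbps")  # ~14.26 Gbps - fits only HDMI 2.0
print(f"4K30: {uhd30:.2f} Gbps")  # ~7.13 Gbps  - fits HDMI 1.4a
```

At full 8-bit color, 4K60 needs roughly 14.3 Gbps, nearly double what HDMI 1.4a can carry - hence the 30 Hz ceiling mentioned above.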
Source: OCUK Forums

139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support

#1
dwade
So much for focusing on 4k when most 4k gamers own a tv instead of smallish monitors. Times have changed, AMD. Get with it.
Posted on Reply
#2
mroofie
lel fail :slap:

RejZoR said:
Sure I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TV's is far greater than it is of computer monitors...
Finally someone who gets it !! :toast:

SimpleTECH said:

lel :D

:peace:

the54thvoid said:
No. Measured responses so far but your own comment is quite blatantly troll bait. Congratulations on starting the self-fulfilling prophecy.
lel that line :laugh:

Here take a cookie :D
Posted on Reply
#3
the54thvoid
Issue or non issue? Let the brand loyalists decide. I use a monitor so no impact to me.
Posted on Reply
#5
Champ
I can see how this would be an issue. I had an HTPC gaming machine and want to build another - a 4K version. When you're gaming on the living room telly, you need the most of everything you can get.
Posted on Reply
#6
RejZoR
Sure I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TV's is far greater than it is of computer monitors...
Posted on Reply
#7
Xaled
You figured that out while testing the card? :> Review date please, I couldn't get any information about that or about the NDA anywhere.
Posted on Reply
#8
HumanSmoke
Xaled said:
You figured that out while testing the card? :> Review date please, I couldn't get any information about that or about the NDA anywhere.
Reading is key.
From the second sentence of the article:
Commenting on OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard.
If you click on the source hyperlink, you'll get it from the proverbial horse's mouth.
Posted on Reply
#10
the54thvoid
jigar2speed said:
And the Nvidia fanboys go wild.
No. Measured responses so far but your own comment is quite blatantly troll bait. Congratulations on starting the self-fulfilling prophecy.
Posted on Reply
#12
the54thvoid
Xzibit said:
This might help


What is this technical wizardry? You magician, begone with.... Oh yeah, a practical work around. Funnily enough.
Posted on Reply
#13
ZoneDymo
Xzibit said:
This might help


I know right? lol wtf are we even talking about here? just get a damn adapter, honestly.
Posted on Reply
#14
ZoneDymo
dwade said:
So much for focusing on 4k when most 4k gamers own a tv instead of smallish monitors. Times have changed, AMD. Get with it.
I would like to know where you got that information
Posted on Reply
#15
nekrik
AMD: "We have a true 4K GPU. It gives 55 fps average in Crysis, yay!!! BUT!! only with reduced 4:2:2 chroma subsampling (HDMI 2.0 bandwidth can do 4:4:4)" - the latter written in small letters.
So it is not a real 4K GPU, and with 4 GB of HBM VRAM it could not be. Benchmarks are one story; real games at 4K will eat that 4 GB of VRAM for breakfast - see GTA V, Shadow of Mordor and others.
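The subsampling trade-off mentioned above can be put into rough numbers (a sketch, assuming 8 bits per sample and the 594 MHz CTA-861 pixel clock for 4K @ 60 Hz including blanking):

```python
# Effective bits per pixel at 8 bits per sample for each chroma format:
# 4:4:4 keeps full chroma, 4:2:2 halves it horizontally, 4:2:0 quarters it.
SUBSAMPLING_BPP = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

PIXEL_CLOCK_HZ = 594e6  # 3840x2160 @ 60 Hz incl. blanking (CTA-861)

for fmt, bpp in SUBSAMPLING_BPP.items():
    gbps = PIXEL_CLOCK_HZ * bpp / 1e9
    print(f"4K60 {fmt}: {gbps:.2f} Gbps")
# 4:4:4 needs ~14.26 Gbps (HDMI 2.0 territory); even 4:2:2 at ~9.50 Gbps
# exceeds HDMI 1.4a's ~8.16 Gbps of effective bandwidth.
```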
Posted on Reply
#16
Lionheart
AMD wtf!! You're bragging about 4K & 5K yet you can't support the new HDMI 2.0 standard?? Guess I'm going to need one of those converting cables that Xzibit posted...
Posted on Reply
#17
Assimilator
Maxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.

ZoneDymo said:
I know right? lol wtf are we even talking about here? just get a damn adapter, honestly.
Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.
Posted on Reply
#18
praze
Xzibit said:
This might help


It won't, that cable is capped at 30 Hz just like the HDMI 1.4a port on the card. All this will do is cost you $45 and introduce lag.

Edit: source
Posted on Reply
#19
RejZoR
It also says this for the adapter cable:
Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)

Meaning DisplayPort supports 60 Hz anyway (I have it at 144 Hz); the output is HDMI 2.0, which means it should work at 60 Hz on an LCD TV that has an HDMI 2.0 input. It's just that at the time of writing, no devices supported this. The question here is: when did they write the cable description...
Posted on Reply
#20
xfia
Well, looks like the rest of the industry needs a kick in the right direction. DisplayPort, which has been in production for around 7 years, is much more advanced than HDMI 2.0, which has been around for about 3 - it's what a 5K display will need to work, and it goes all the way up to 8K at 60 Hz with subsampling, which is what HDMI does at 4K 60 Hz, as mentioned.
Non-issue for almost anyone that will buy them.
Posted on Reply
#21
jigar2speed
praze said:
It won't, that cable is capped at 30 Hz just like the HDMI 1.4a port on the card. All this will do is cost you $45 and introduce lag.

Edit: source
Found this - DisplayPort to HDMI 2.0 - http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining - is it related to drivers?
Posted on Reply
#22
Xaled
HumanSmoke said:
Reading is key.
From the second sentence of the article:

If you click on the source hyperlink, you'll get it from the proverbial horse's mouth.
And the review date? Do you have any info about it?
Posted on Reply
#23
Lou007
Found these interesting facts:

The HDMI (High Definition Multimedia Interface) specification was conceived more than ten years ago by six consumer electronics giants: Hitachi, Panasonic, Philips, Silicon Image, Sony, and Toshiba. Today, HDMI Licensing, LLC, a wholly owned subsidiary of Silicon Image, controls the spec. Manufacturers must pay a royalty for including HDMI into their products.

The DisplayPort specification was developed by, and remains under the control of, the Video Electronics Standards Association (VESA), a large consortium of manufacturers ranging from AMD to ZIPS Corporation. DisplayPort debuted in 2006 as part of an effort to supplant the much older VGA (Video Graphics Array, an analog interface first introduced in 1987) and DVI (Digital Video Interface, introduced in 1999) standards used primarily for computer displays. DisplayPort is a royalty-free product.

Fun fact: Of the six companies responsible for the creation of HDMI, only Hitachi and Philips are not also member companies of VESA.

http://www.pcworld.com/article/2030669/hdmi-vs-displayport-which-display-interface-reigns-supreme.html

When comparing DisplayPort 1.3 with HDMI 2.0, DisplayPort 1.3 has several key advantages. First, the video bandwidth of DisplayPort 1.3 is much higher than HDMI 2.0. This means that DisplayPort 1.3 can support higher resolution timing such as 8K at 60Hz, whereas HDMI 2.0 can support 4K at 60Hz max. Second, DisplayPort 1.3 has the ability to transmit multiple video streams on one cable through the MST feature allowing multiple monitors to be daisy-chained together (although there are limitations as to the number of displays and resolution supported, which makes it more appropriate for desktop uses than video walls). Finally DisplayPort includes installer-friendly locking connectors. HDMI doesn’t natively support locking connectors though many Planar products do provide support for threaded hex nuts for special-locking HDMI connector - See more at: http://www.planar.com/blog/2014/12/15/displayport-13-vs-hdmi-20/#sthash.jrl6QSDr.dpuf

I can now understand why AMD is pushing DisplayPort over HDMI 2.0.
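To put rough numbers on that comparison (a sketch: raw link rates are the headline figures from the respective specs, and effective rates deduct 8b/10b encoding overhead):

```python
# Raw and effective (post-8b/10b) link bandwidth, in Gbit/s.
LINKS = {
    "HDMI 2.0":        (18.0, 18.0 * 8 / 10),  # ~14.4 Gbps effective
    "DisplayPort 1.2": (21.6, 21.6 * 8 / 10),  # 4 lanes x 5.4 Gbps (HBR2)
    "DisplayPort 1.3": (32.4, 32.4 * 8 / 10),  # 4 lanes x 8.1 Gbps (HBR3)
}

for name, (raw, effective) in LINKS.items():
    print(f"{name}: {raw:.1f} Gbps raw, {effective:.2f} Gbps effective")
```

Even DisplayPort 1.3's ~25.9 Gbps of effective bandwidth reaches 8K at 60 Hz only with chroma subsampling, which matches the caveat in the quoted comparison.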
Posted on Reply
#24
xfia
Lou007 said:
When comparing DisplayPort 1.3 with HDMI 2.0, DisplayPort 1.3 has several key advantages. First, the video bandwidth of DisplayPort 1.3 is much higher than HDMI 2.0. This means that DisplayPort 1.3 can support higher resolution timing such as 8K at 60Hz, whereas HDMI 2.0 can support 4K at 60Hz max. Second, DisplayPort 1.3 has the ability to transmit multiple video streams on one cable through the MST feature allowing multiple monitors to be daisy-chained together (although there are limitations as to the number of displays and resolution supported, which makes it more appropriate for desktop uses than video walls). Finally DisplayPort includes installer-friendly locking connectors. HDMI doesn’t natively support locking connectors though many Planar products do provide support for threaded hex nuts for special-locking HDMI connector - See more at: http://www.planar.com/blog/2014/12/15/displayport-13-vs-hdmi-20/#sthash.jrl6QSDr.dpuf
The big picture they have advertised is running 1440p Eyefinity at 60 Hz at full color.
Posted on Reply