
AMD "Fiji" Silicon Lacks HDMI 2.0 Support

Panasonic's Smart VIERA WT600 has both DP1.2a and HDMI 2.0.
 
That will do it, but the bottleneck is still HDMI by most PC gamers' standards. Why buy the best GPUs around and a 4K display when you can't run full color? Fine by some people's standards, but that cuts out most production work and any enthusiast-type artist rendering right off the top.
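For anyone wondering where the "full color" limit actually comes from, here's a rough back-of-envelope. The spec limits (340 MHz TMDS for HDMI 1.4, 600 MHz for HDMI 2.0) and the CTA-861 4K60 timing are the published numbers; the script itself is just illustrative:

```python
# Why 4K60 full-color RGB needs HDMI 2.0: the required pixel clock
# exceeds HDMI 1.4's TMDS limit. Timing assumes the standard
# CTA-861 mode for 3840x2160 @ 60 Hz (4400x2250 total with blanking).

TOTAL_H, TOTAL_V, REFRESH = 4400, 2250, 60
pixel_clock_mhz = TOTAL_H * TOTAL_V * REFRESH / 1e6   # 594 MHz

HDMI_14_LIMIT_MHZ = 340   # HDMI 1.4 max TMDS character rate
HDMI_20_LIMIT_MHZ = 600   # HDMI 2.0 max TMDS character rate

print(f"4K60 RGB pixel clock: {pixel_clock_mhz:.0f} MHz")
print("Fits HDMI 1.4?", pixel_clock_mhz <= HDMI_14_LIMIT_MHZ)  # False
print("Fits HDMI 2.0?", pixel_clock_mhz <= HDMI_20_LIMIT_MHZ)  # True

# YCbCr 4:2:0 subsampling halves the data per pixel, dropping the
# effective rate to ~297 MHz -- which is how some pre-2.0 hardware
# squeezes out 4K60 at the cost of chroma detail ("not full color").
print(f"4:2:0 effective rate: {pixel_clock_mhz / 2:.0f} MHz")
```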
 
It also says this for the adapter cable:
Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)

Meaning DisplayPort supports 60 Hz anyway (I run mine at 144 Hz), and the output is HDMI 2.0, so it should work at 60 Hz on an LCD TV that has an HDMI 2.0 input. It's just that at the time of writing no devices supported it. The question is: when did they write the cable description...
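The DisplayPort side of that math checks out easily; a minimal sketch, assuming a DP 1.2 link at HBR2 (four lanes at 5.4 Gb/s each, 8b/10b encoded) and the same 4K60 timing as above:

```python
# Does 4K60 8-bit RGB fit in a DisplayPort 1.2 link?
# Assumes 4 lanes at HBR2 (5.4 Gb/s per lane), 8b/10b encoding.

LANES, HBR2_GBPS = 4, 5.4
payload_gbps = LANES * HBR2_GBPS * 8 / 10   # 17.28 Gb/s usable

pixel_clock_hz = 4400 * 2250 * 60           # 594 MHz, incl. blanking
needed_gbps = pixel_clock_hz * 24 / 1e9     # 24 bits per pixel (RGB)

print(f"DP 1.2 payload: {payload_gbps:.2f} Gb/s")
print(f"4K60 RGB needs: {needed_gbps:.2f} Gb/s")  # ~14.26 -> fits
```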

Found this: DisplayPort to HDMI 2.0 - http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining. Is it driver-related?

Stop. Neither of these cables supports 4K at 60Hz; you're gonna make someone waste time and money. Just look at the reviews.

Claiming "compatibility" with a backwards-compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.

A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.
 
Stop. Neither of these cables supports 4K at 60Hz; you're gonna make someone waste time and money. Just look at the reviews.

Claiming "compatibility" with a backwards-compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.

A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.
That's interesting.. I just saw similar reviews on 3 websites, haha. I doubt every damn person is doing something wrong, so yeah, that is more than misleading.
From what you're saying it would cost you 100 bucks to do it, but there won't be lag, just not full color.
 
Sure, I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TVs is far greater than that of computer monitors...
Yeah, because after I've spent $550 on a "high-end" video card, I want to have to spend more money on a cable so the card can work properly.
Use DP, and if your monitor doesn't support DP then it is your fault for buying a monitor which uses a soon-to-be-legacy connector (HDMI) which will get phased out in the future. Everyone knows DP is the future and that HDMI doesn't stand a chance.
I am curious why it doesn't have HDMI 2.0, but the HDMI folks probably tried to extort some additional money for HDMI 2.0, so they stayed with HDMI 1.4.
All this fuss over some stupid thing like "it doesn't support a connector which is 4 years too late and for which you have to pay an annual fee plus a royalty rate per unit."
Die HDMI!!!
 
Saying DP is the future when not a single TV supports it is a bit of a blunt statement. I don't think DP will ever be supported in LCD TVs. It hasn't been so far, so why would it be in the future? No device for the living room even has DP...
 
That's interesting.. I just saw similar reviews on 3 websites, haha. I doubt every damn person is doing something wrong, so yeah, that is more than misleading.
From what you're saying it would cost you 100 bucks to do it, but there won't be lag, just not full color.

Yeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.

The $100 adapter scenario would likely pull it off fully featured, but would be laggy due to the active conversion of the signal. This sort of thing has been happening with Dual-Link DVI adapters for years.
 
Yeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.

The $100 adapter scenario would likely pull it off fully featured, but would be laggy due to the active conversion of the signal. This sort of thing has been happening with Dual-Link DVI adapters for years.
Borders on crazy, if it isn't there already.
 
ha ha ha ha ha ha....
AMD has failed once again....
Nvidia fanboys were in a coma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary to a connector anyway.
 
This isn't really a problem for the big bulky Fury (X), but for the Nano I can see this hurting sales...
 
ha ha ha ha ha ha....
AMD has failed once again....

Despite me criticizing them a lot lately, I wouldn't say that. The majority of graphics cards still land in PCs that are connected to monitors.

The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...
 
Despite me criticizing them a lot lately, I wouldn't say that. The majority of graphics cards still land in PCs that are connected to monitors.

The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...
I really don't see it as an HTPC type of card.. it should be around the performance of a 290 while being better at some things due to HBM. Is it even worth an HDMI upgrade for TVs when they are only 1080p or 4K? I honestly don't know a single person that uses a 4K TV. I have 3 friends with 4K monitors, and I use one at work for charting.
 
The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...

Wow, that does amplify this problem a bit now that you mention it. And that "console-sized" Project Quantum rig they built with the dual-Fiji card and a power brick the size of a small dog is very confusing now.

I hope they get it together, compromise is the last thing they need in the current market. Lack of competition hurts the consumers in the long run.
 
Maxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.



Yeah, because after I've spent $550 on a "high-end" video card, I want to have to spend more money on a cable so the card can work properly.

Because after spending 550 dollars on a video card you are going to cry about 12 dollars' worth of adapter cable.... I'm guessing you run your GPU with a Pentium 4 and 1 GB of RAM to save cost, right? Because after spending 550 dollars on a GPU, who would want to have to spend more money on other parts....

Can't even believe I wasted time typing out the obvious, as everyone here knows that's a non-argument troll comment.
 
Nvidia fanboys were in a coma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary to a connector anyway.

He is just mad because he was lied to by Nvidia and did not get the full 4 GB he paid for.
The wound is still too fresh, and it hurts every time he uses the PC.
 
I really don't see it as an HTPC type of card.. it should be around the performance of a 290 while being better at some things due to HBM.

I think its size is more of a happy side-effect of HBM than a push into the living room, for sure.

Is it even worth an HDMI upgrade for TVs when they are only 1080p or 4K? I honestly don't know a single person that uses a 4K TV. I have 3 friends with 4K monitors, and I use one at work for charting.

While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season, when the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.
 
lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort......and for the few % that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.

So maybe there is 0.0000000001% of the market that will feel let down.

Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive, and most people will have to buy a new 4K TV to get HDMI 2.0 anyway.

If you're buying 4K TVs and $500-$600 graphics cards I'm sure you can afford an upgrade next year.
 
Who the Sock cares?? And even if someone does care:

Buy a TV with DisplayPort then, like this Panasonic TC-L65WT600 has...
 
These comment sections always deliver.

Hehe.
 
I think its size is more of a happy side-effect of HBM than a push into the living room, for sure.



While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season, when the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.

Heck yeah, a push for higher quality and resolutions. You know, rendering efficiency is increased by like 1000 percent in some cases with DX12 and GDDR5. It will be practically off the chart with HBM, especially with DX12.1. That could mean the Fury X 4 GB is roughly comparable to the Titan X 12 GB, if not even more in favor of the Fury X.
@buggalugs it's not the cores being 28 nm that could make them obsolete, but DX12.1.. games and apps will certainly still be able to use 12.0 though, if not 11.0 or 11.1, by then.
 
Facts:
-you cannot play a decent video game (not those crappy indie games) at 4K resolution and 60 FPS
-if you need 4K resolution for viewing and editing text or images because 1080p does not look smooth enough, 30 Hz is enough

Edit: You will not even get a decent minimum of 30 FPS when playing at 4K
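For rough context on why 4K60 is such a heavy target, here's a pure pixel-count scaling sketch. The 100 FPS starting point is a made-up example, and real games rarely scale this cleanly (CPU limits, memory bandwidth, etc.), so treat it as a ballpark only:

```python
# 4K pushes 4x the pixels of 1080p, so a purely pixel-bound game
# would run at roughly a quarter of its 1080p frame rate.

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
ratio = pixels_4k / pixels_1080p        # exactly 4.0

fps_1080p = 100                         # hypothetical 1080p result
print(f"4K has {ratio:.0f}x the pixels of 1080p")
print(f"~{fps_1080p / ratio:.0f} FPS at 4K, down from {fps_1080p} FPS")
```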
 
I personally don't use a TV for gaming, it's for watching the news or DVDs to relax, lol. I have a 28" 4K monitor running DP from an R9 290X for gaming, a perfect combo in my opinion. This new AMD Fury product looks awesome, reminds me of around the year 2000 when the AMD Athlon was kicking the Intel P4, now it is back to kick Nvidia..... I am so proud lol :D
 