
AMD Radeon R9 Nano CrossFire

If it were "high tech", would you not prefer DisplayPort?

HDMI FAQ
4K@60 Hz 10-bit 4:2:0

DisplayPort FAQ

4K@60 Hz 10-bit 4:4:4


You can always channel your diatribe toward the non-inclusion by both GPU vendors of DisplayPort 1.3, which was ratified last year. At least you'd be advocating for a superior standard instead of going backward for convenience.
It's not about the color options. It's all about what other products carry, and for something like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issue. HTPC use pretty much implies living-room use, which means HDTV products, not monitors.

That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a video card to my home theatre. And yes, I would likely have bought a Nano if it had a connector that could give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.

Since AMD and NVIDIA launch new GPUs on a yearly basis, any notion of being "forward-looking" by going DP-only is asinine. HDMI 2.0 chips cost money and board real estate, and that's why they were not used.
 
It's not about the color options. It's all about what other products carry, and for something like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issue. HTPC use pretty much implies living-room use, which means HDTV products, not monitors.

That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a video card to my home theatre. And yes, I would likely have bought a Nano if it had a connector that could give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.
As I understand it, the difference between 4:2:0 and 4:4:4 has to do with color but actually impacts sharpness as well. Although I'm not the authoritative source on this; @Xzibit seems to know a lot more about it than I do.
 
My Acer B286HK has four connections (DP, HDMI, Mini-DP, and DVI) and came with a DP cable too. It has Ultra 4K/2K support (60 Hz refresh rate, 2 ms response time) and it looks great. Not a bad deal for $400.00.
 
It's not about the color options. It's all about what other products carry, and for something like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issue. HTPC use pretty much implies living-room use, which means HDTV products, not monitors.

It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough in some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.

Buy a 4K TV with proper connections and capabilities.
Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.
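
For anyone wondering where that 75% figure comes from, here's a quick back-of-the-envelope sketch (plain Python, just counting chroma samples in a 2x2 pixel block, which is how the J:a:b subsampling notation is usually explained):

```python
# Chroma samples retained per 2x2 pixel block for each subsampling scheme.
# In J:a:b notation, a = chroma samples in the first row of a J-pixel-wide
# block, b = additional chroma samples in the second row.
def chroma_fraction(a: int, b: int, j: int = 4) -> float:
    """Fraction of chroma samples kept, relative to 4:4:4 (two rows of J pixels)."""
    return (a + b) / (2 * j)

for name, (a, b) in {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:2:0": (2, 0)}.items():
    kept = chroma_fraction(a, b)
    print(f"{name}: keeps {kept:.0%} of the chroma -> discards {1 - kept:.0%}")

# 4:4:4: keeps 100% of the chroma -> discards 0%
# 4:2:2: keeps 50% of the chroma -> discards 50%
# 4:2:0: keeps 25% of the chroma -> discards 75%
```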
 
It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough in some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.

Buy a proper TV.

Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.
Out of budget, for sure. But you missed my point. Nobody complaining about the lack of HDMI is doing so because of the difference in colour space offered by DP or HDMI. This is a product, available now, that doesn't actually connect to the devices it is really intended to be used with (Nano = HTPC card, which in most instances connects to HDTVs), because it lacks HDMI 2.0. That's it. Sure, you CAN find HDTV panels with DP, but they are by far a minority.

This is a design oversight that has limited the product's reach. But maybe this is intentional, since it seems that AMD is not truly capable of releasing any of the Fury-based designs in decent numbers.


At the same time, you can argue that only a few people are interested in 4K HTPC gaming, so whatever.
 
Architectural changes? You do know how active DisplayPort-to-HDMI converters work, right? Those little dongles contain a tiny chip that processes the DP video signal and converts it to an HDMI video signal, a very small piece of silicon no bigger than a few square millimeters. And how much do those cost companies like AMD? Yeah, you guessed right: cents. There's no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board and correctly support HDMI 2.0, but they decided to transfer that burden to the people who want to use this card in a home-theater setup, plugged into thousands of existing 4K TVs. So it seems like you should get your information right.
I'm still waiting for factual information that it would cost "a few cents"; that's just blah-blah from you. You simply want to criticize AMD for not implementing it, and you think yourself superior to a whole company. If someone like you can come up with such an idea, don't you think a company like AMD can? Your arrogance is just... wow. On topic, other people have already explained why they haven't done it, and my own opinion is still that an architectural change would have cost a lot more than a few cents. You're just wrong.

As for the Nano being just an "SFF card": you can also see the Nano as an air-cooled Fury X with much better energy efficiency and only 5-15% less performance, or with almost the same performance at the same power level (just raise the power cap). It's more than just an SFF card, and the pricing is right. Also, you can take a Nano, mod it with a stronger cooler, and just use it as a Fury X with the same specs, or at -50 MHz. That all being said, maybe you don't want that radiator/pump thing; not everyone likes that stuff or has the space for it.
 
It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough in some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.

Buy a 4K TV with proper connections and capabilities.
Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.

It's not laziness or not being able to afford it. A lot of people like me bought 4K TV sets these past couple of years when prices became more palatable (in my case, a 2015 Sony Bravia 4K model). To think that people would go out of their way and buy a new TV just because this card cannot output at 60 Hz to their TV's HDMI 2.0 connectors, and because there simply aren't DP converters currently able to support this card, does not make much sense; it is not a very sound reason to invest almost $2K in a new TV. Mind you, this is a top-of-the-line 2015 set from a well-known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.

I like gaming in my living room, but I have full ATX systems in both my home theaters, so I'm not the target market for this card in particular. But people looking to build a killer mini-ITX system will have to look elsewhere if they want to game at 60 FPS at 4K.

I'm still waiting for factual information that it would cost "a few cents"; that's just blah-blah from you. You simply want to criticize AMD for not implementing it, and you think yourself superior to a whole company. If someone like you can come up with such an idea, don't you think a company like AMD can? Your arrogance is just... wow. On topic, other people have already explained why they haven't done it, and my own opinion is still that an architectural change would have cost a lot more than a few cents. You're just wrong.

As for the Nano being just an "SFF card": you can also see the Nano as an air-cooled Fury X with much better energy efficiency and only 5-15% less performance, or with almost the same performance at the same power level (just raise the power cap). It's more than just an SFF card, and the pricing is right. Also, you can take a Nano, mod it with a stronger cooler, and just use it as a Fury X with the same specs, or at -50 MHz. That all being said, maybe you don't want that radiator/pump thing; not everyone likes that stuff or has the space for it.

Am I arrogant because I'm just expressing my opinion on this card? I expressed my opinion in a respectful way and even used multiple links to validate it when presenting a counterargument to yours. It is common knowledge that tiny chips like the ones found inside those dongles cost only cents to manufacture; how else do you think you can find active converters for less than $10 out there, or any other electronic device powered by small processors that sells for a few dollars?

Take this IC commonly found inside a DP to HDMI converter:

http://datasheet.octopart.com/STDP2650-AC-STMicroelectronics-datasheet-16348534.pdf

It sells for $0.10 when you order 100 or more from China:

http://www.alibaba.com/product-detail/-new-for-ic-in-integrated_60284045204.html

So there, see? A few cents. Satisfied? What's more, what do you think the actual manufacturing cost of a similar IC would be to AMD? Even taking into consideration the extra PCB traces needed to place this 8x8 mm IC between the GPU and the HDMI connector, do you think that adds a big chunk of money to the BOM for this card? Just think about it for a minute.

It's all about maximizing profits by reducing costs. I'm pretty sure the BOM for the Fury X is much higher than for the Nano, and yet both sell for the exact same price; does pointing that out make me arrogant? I think not. Nvidia did the same with the Titan X by saving a few cents and not adding a backplate, something the least expensive 980 featured out of the box from day one, and I called them out back then as well. Both companies have a board of directors to answer to, and a few cents here and there add up in the end when talking about your bottom line.

I don't even know you, and I never resorted to insults like you did from your first post; in my view, you're the one who comes across as arrogant. Funny how you just sidestepped the whole DP adapter topic you brought up in the first place, and yet you accuse me of not backing my argument. Double standards much?

Oh, and btw, it's not me presenting this card as the king of SFF cards; it's AMD, in pretty much all of their marketing presentations for the Nano so far.

I'm done with you. Seldom have I had to deal with people who resort to belittling and insulting others just for the sake of coming out on top of an argument when I'm expressing a valid point of view, and seldom have I used the ignore feature in the long time I've been a member of this forum, as most people here are mature enough to discuss any given topic without resorting to insults. So I'm gonna take the high road and hope you learn to appreciate, or at least respect, the opinions of others.

Have a good day, sir.
 
It's not laziness or not being able to afford it. A lot of people like me bought top-of-the-line 4K TV sets these past couple of years when prices became more palatable (in my case, a 2015 Sony Bravia 4K model). To think that people would go out of their way and buy a new TV just because this card cannot output at 60 Hz to their TV's HDMI 2.0 connectors, and because there simply aren't DP converters currently able to support this card, does not make much sense; it is not a very sound reason to invest almost $2K in a new TV. Mind you, this is a top-of-the-line 2015 set from a well-known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.

I like gaming in my living room, but I have full ATX systems in both my home theaters, so I'm not the target market for this card in particular. But people looking to build a killer mini-ITX system will have to look elsewhere if they want to game at 60 FPS at 4K.

Just looked at some of the Sony 4K TV manuals, and unless you have a different one:

Video (2D): 4096 × 2160p (60 Hz)*, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*,
3840 × 2160p (24, 25, 30 Hz), 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
720p (30, 60 Hz), 720/24p, 480p, 480i, PC Formats
*1 YCbCr 4:2:0 / 8 bit
*2 3840 × 2160p is displayed when 4096 × 2160p is input


You'd be down-sampled to 8-bit 4:2:0, which is worse. Better to invest that $2K in a 4K 10-bit monitor with a DP connection and go as big as you can, regardless of your GPU choice.

Personally I'd wait for DP 1.3, but if I had to buy now I'd weigh the pros and cons.
 
Just looked at some of the Sony 4K TV manuals, and unless you have a different one:

Video (2D): 4096 × 2160p (60 Hz)*, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*,
3840 × 2160p (24, 25, 30 Hz), 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
720p (30, 60 Hz), 720/24p, 480p, 480i, PC Formats
*1 YCbCr 4:2:0 / 8 bit
*2 3840 × 2160p is displayed when 4096 × 2160p is input


You'd be down-sampled to 8-bit 4:2:0, which is worse. Better to invest that $2K in a 4K 10-bit monitor with a DP connection and go as big as you can, regardless of your GPU choice.

Personally I'd wait for DP 1.3, but if I had to buy now I'd weigh the pros and cons.

Thanks for mentioning that; I had to double-check, as honestly I wasn't aware of that limitation on my TV :( I have a Sony XBR55X850C, and it apparently does support 4:4:4 mode at 4K 60 Hz after the latest firmware update, according to this site:

http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3

PC Monitor:

1080p @ 60Hz @ 4:4:4: Yes
1080p @ 120Hz: Yes
4k @ 30Hz @ 4:4:4: Yes
4k @ 60Hz: Yes
4k @ 60Hz @ 4:4:4: Yes
To enable chroma 4:4:4, set the mode to either Game or Graphics.
Update: With the new firmware update PKG2.463.0010NAB, This TV now supports 4k @ 60Hz @ 4:4:4. To enable this, go to Settings - External Inputs - HDMI Signal Format - Enhanced (new).

I checked and my TV has the latest firmware, so it seems it supports 10-bit after all, as it shows the enhanced signal format in the settings menu. Thanks for the heads up :p
 
It's all about what other products carry, and for something like 98% of panels in the HDTV space, DisplayPort is NOT an option.
Which is nonsensical. If I walk around my house and look at all my TVs that have HDMI, every single one of them has at least VGA + 3.5mm too. Some even have DVI. I think it's pretty clear what's going on here: the TV industry is deliberately trying to sabotage DisplayPort in the name of keeping HDMI around.
 
Thanks for mentioning that; I had to double-check, as honestly I wasn't aware of that limitation on my TV :( I have a Sony XBR55X850C, and it apparently does support 4:4:4 mode at 4K 60 Hz after the latest firmware update, according to this site:

http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3



I checked and my TV has the latest firmware, so it seems it supports 10-bit after all, as it shows the enhanced signal format in the settings menu. Thanks for the heads up :p

No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.

The person who posted that didn't read the HDMI FAQ, as some of the following posts point out. One even posted the firmware changes, and there is no mention of it.
 
No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.

The person who posted that didn't read the HDMI FAQ, as the posts following his imply. One even posted the firmware changes, and there is no mention of it.

You're right, I stand corrected: in order to display 4:4:4 chroma, the bit depth is downsampled to 8-bit.

And I agree it sucks that most TVs don't support DisplayPort out of the box; it clearly is the best alternative among video interfaces :(
 
Awesome review. I particularly like the editorial comments in the conclusion about the irrationality of it all. Puts enthusiasm in perspective :)
 
No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.

The person who posted that didn't read the HDMI FAQ, as some of the following posts point out. One even posted the firmware changes, and there is no mention of it.

I didn't know what 4:4:4 was, so I read up on it.

http://hdguru.com/hdmi-2-0-what-you-need-to-know/

Color crunching
Here's how chroma subsampling works. The human eye is more sensitive to black-and-white detail than to color detail. Chroma subsampling compression takes advantage of this fact by sending full-resolution black-and-white (luma) information and only partial-resolution color (chroma) information. The result is a reduction of image data with no accompanying visual degradation.

There are three main types of chroma subsampling for video content: 4:4:4; 4:2:2; and 4:2:0. With 4:4:4 there is no subsampling. With 4:2:2, half of the color detail is thrown away. And with 4:2:0, 75% of color information is discarded. Blu-ray, HDTV, and DVD all use 4:2:0 subsampling. We don’t notice the loss of color detail right now with those formats, and we aren’t likely to notice it after the move to UHD.

Given that 4:4:4 is ideal but not what the industry works to, it is still remiss of AMD to omit the HDMI 2.0 standard from their 'marketed' dedicated home-PC graphics card. All other arguments aside, and the 4:4:4 drum-banging put away: the industry has dictated the format, and graphics vendors need to deal with that. The simple question to ask is, would my CrossFired Nanos in my (now slightly larger mini-ITX) case be better with an HDMI 2.0 connection for 60 FPS gaming? The answer is absolutely yes.

Would it be great if the industry all adopted a non-subsampled chroma format? Yes. BUT they haven't yet. So my hypothetical Nano CrossFire setup is limited to 30 FPS in most circumstances on my living-room TV, because AMD didn't use HDMI 2.0.
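
As a footnote to the quoted explanation: that 75% applies to the chroma information only. Counting the full-resolution luma plane as well, the total raw image data shrinks less dramatically; here's a rough sketch of the arithmetic (assuming one luma sample per pixel and a Cb+Cr pair at each retained chroma position):

```python
# Total samples per 2x2 pixel block: 4 luma samples plus 2 chroma components
# (Cb + Cr) at each retained chroma position. 4:4:4 retains 4 positions,
# 4:2:2 retains 2, and 4:2:0 retains 1.
def relative_size(chroma_positions: int) -> float:
    luma = 4                        # one Y sample per pixel in the 2x2 block
    chroma = 2 * chroma_positions   # Cb + Cr at each retained position
    full = 4 + 2 * 4                # the 4:4:4 reference block
    return (luma + chroma) / full

for name, positions in [("4:4:4", 4), ("4:2:2", 2), ("4:2:0", 1)]:
    print(f"{name}: {relative_size(positions):.0%} of the 4:4:4 data")

# 4:4:4: 100% of the 4:4:4 data
# 4:2:2: 67% of the 4:4:4 data
# 4:2:0: 50% of the 4:4:4 data
```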
 
One more thought.

HDMI is a competitor to DisplayPort; it's not a VESA standard. VESA did AMD a favor by supporting AdaptiveSync/FreeSync in DisplayPort, so maybe what we see here is just AMD doing VESA a favor by not supporting HDMI 2.0. Maybe it's just politics.
 
I didn't know what 4:4:4 was, so I read up on it.

http://hdguru.com/hdmi-2-0-what-you-need-to-know/



Given that 4:4:4 is ideal but not what the industry works to, it is still remiss of AMD to omit the HDMI 2.0 standard from their 'marketed' dedicated home-PC graphics card. All other arguments aside, and the 4:4:4 drum-banging put away: the industry has dictated the format, and graphics vendors need to deal with that. The simple question to ask is, would my CrossFired Nanos in my (now slightly larger mini-ITX) case be better with an HDMI 2.0 connection for 60 FPS gaming? The answer is absolutely yes.

Would it be great if the industry all adopted a non-subsampled chroma format? Yes. BUT they haven't yet. So my hypothetical Nano CrossFire setup is limited to 30 FPS in most circumstances on my living-room TV, because AMD didn't use HDMI 2.0.

You've convinced me.

If only I could force my system to work at 4:2:0, I'd be set. The bigger the screen the better.

/s

Here is something some of you will be able to duplicate if your system gives you the 4:2:2 option.
 
You've convinced me.

If only I could force my system to work at 4:2:0, I'd be set. The bigger the screen the better.

/s

Here is something some of you will be able to duplicate if your system gives you the 4:2:2 option.

Hmm, I wasn't trying to convince you of anything, just pointing out market conditions among the TV manufacturers. No amount of techno-posturing will change that. You still won't address whether AMD should have included HDMI 2.0 as an output. What may affect people's choices, though, is the FPS options for gaming. Fast first-person shooters require higher FPS for better gameplay; 30 FPS isn't ideal.

All that being said, I personally wouldn't game on a TV; I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TVs use consoles. For AMD, though, the fact is that most reviewers have criticised AMD for not adopting HDMI 2.0 on Fiji for its 'living room TV use'. Whether it has any real-world impact or not, the negative impression is there from the start. I know 4:4:4 is preferred; it's the way the image should be, but it's generally not delivered to us that way. Hell, Blu-ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0, I think, per that link?).

Anyway, this thread is about CrossFire Nano, so the colour discussion is for another thread. No point talking about it if AMD can't give it to you via a TV that won't support it. It's 4:2:2 or 4:2:0; for now, we need to accept that.

EDIT: I did watch that wonderful sales presentation, and I've now bought a Roland V-800HD. It added nothing to the argument, though; in fact, it was irrelevant to the discussion of the Nano being used in a living-room environment on a 4K TV with no HDMI 2.0... go figure. Slow clap.
 
Hmm, I wasn't trying to convince you of anything, just pointing out market conditions among the TV manufacturers. No amount of techno-posturing will change that. You still won't address whether AMD should have included HDMI 2.0 as an output. What may affect people's choices, though, is the FPS options for gaming. Fast first-person shooters require higher FPS for better gameplay; 30 FPS isn't ideal.

All that being said, I personally wouldn't game on a TV; I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TVs use consoles. For AMD, though, the fact is that most reviewers have criticised AMD for not adopting HDMI 2.0 on Fiji for its 'living room TV use'. Whether it has any real-world impact or not, the negative impression is there from the start. I know 4:4:4 is preferred; it's the way the image should be, but it's generally not delivered to us that way. Hell, Blu-ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0, I think, per that link?).

Anyway, this thread is about CrossFire Nano, so the colour discussion is for another thread. No point talking about it if AMD can't give it to you via a TV that won't support it. It's 4:2:2 or 4:2:0; for now, we need to accept that.

EDIT: I did watch that wonderful sales presentation, and I've now bought a Roland V-800HD. It added nothing to the argument, though; in fact, it was irrelevant to the discussion of the Nano being used in a living-room environment on a 4K TV with no HDMI 2.0... go figure. Slow clap.

Should they have included it? Sure; I think I said so in another thread. But not all 4K TVs even fully support HDMI 2.0 functionality, whereas monitors, and those TVs that do have DisplayPort, are likely to support all of it. 15th Warlock just provided an example...

If your only option is HDMI 2.0, you'd better make sure you know what your 4K TV supports. I would say use 4K 8-bit 4:4:4, but at the same time that defeats the purpose of 4K 10-bit content, so you'll have to switch back and forth between settings. That's if your 4K TV supports the 10-bit 4K signal and doesn't down-sample you; some 4K TVs down-sample you as soon as you pop up the menu or use PiP. Until HDMI 2.0 functionality is ironed out in 4K TVs, it's just a checkbox.

You could just take one of the TVs I linked (there are a few others), or preferably a 4K 60 Hz 10-bit monitor w/DP, plug it in, and forget about having to switch between settings.

Seems you'll buy anything that doesn't have AMD's name attached to it... You set that one up.

Like I said HERE.
 
You could just take one of the TVs I linked (there are a few others), or preferably a 4K 60 Hz 10-bit monitor w/DP, plug it in, and forget about having to switch between settings.


Bleh. I already got my TV, and paid nearly 10 times what a Nano would cost locally. Spending that money again just to get a DP port... psh... it's cheaper to not buy any AMD card and go NVIDIA.

All I hope is that AMD corrects this with the next generation of GPUs.

Anyway, I kind of realized that AMD touts this card as an SFF card, not an HTPC card. It's the HTPC designation that makes things look bad when it comes to HDMI/DP. A product should meet the needs of the market NOW, not the future, and also not force consumers into limited purchasing choices of supporting hardware. SFF PC = monitor. HTPC = HDTV. It's quite different.

Nobody in their right mind will say HDMI is better than DP... nobody has... yet you keep harping on this point like it matters, when it doesn't. It's the fact that DP-only connectivity prevents many users interested in this card from actually putting it to use, due to their pre-existing hardware.
 
Nvidia did the same with the Titan X by saving a few cents and not adding a backplate, something the least expensive 980 featured out of the box from day one, and I called them out back then as well. Both companies have a board of directors to answer to, and a few cents here and there add up in the end when talking about your bottom line.
Nvidia said they scrapped the backplate on some (or all) of their cards because it caused heat issues in SLI, since room between cards is scarce. I don't know if it's true, but it could be. Or it's a smart way to earn some more money. Either way, Nvidia's point is probably valid.

On the HDMI 2.0 vs DP topic in TVs:
It saves money not to include DP, so manufacturers try not to add it to their TVs, or they charge extra for it in another model of the same type. That would be another point of view on this subject.

HDMI is a competitor to DisplayPort; it's not a VESA standard. VESA did AMD a favor by supporting AdaptiveSync/FreeSync in DisplayPort, so maybe what we see here is just AMD doing VESA a favor by not supporting HDMI 2.0. Maybe it's just politics.
Interesting thought. Maybe you're right, but I still think it's just AMD being short on money, trying to get away with HDMI 1.4 and adding 2.0 to their graphics card line later, in 2016.
 
One more thought.

HDMI is a competitor to DisplayPort; it's not a VESA standard. VESA did AMD a favor by supporting AdaptiveSync/FreeSync in DisplayPort, so maybe what we see here is just AMD doing VESA a favor by not supporting HDMI 2.0. Maybe it's just politics.
HDMI has no support for adaptive sync and likely never will. It would cost TV manufacturers too much to implement.

HDMI 2.0 has enough bandwidth for 4K @ 60 Hz 24-bit color, but it does not have enough bandwidth for 4K @ 60 Hz 30-bit color. DisplayPort 1.3 can handle 4K @ 60 Hz 48-bit color. All figures are for 4:4:4.
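
If anyone wants to check those numbers, here's a rough sketch in Python. The payload figures are the nominal rates after 8b/10b encoding (my assumption for a simple comparison), and blanking overhead is ignored, so real-world margins are tighter than shown:

```python
# Rough link-budget check for 4K @ 60 Hz at 4:4:4 (active pixels only; the
# blanking interval adds overhead, so real-world margins are tighter).
GBIT = 1e9

# Approximate payload rates after 8b/10b encoding.
LINKS = {
    "HDMI 2.0": 14.4 * GBIT,        # 18.0 Gbps raw
    "DP 1.2 (HBR2)": 17.28 * GBIT,  # 21.6 Gbps raw
    "DP 1.3 (HBR3)": 25.92 * GBIT,  # 32.4 Gbps raw
}

def required_rate(width, height, hz, bits_per_pixel):
    """Data rate needed for the active picture, in bits per second."""
    return width * height * hz * bits_per_pixel

for bpp in (24, 30, 48):  # 8-, 10-, and 16-bit per channel at 4:4:4
    need = required_rate(3840, 2160, 60, bpp)
    fits = [name for name, cap in LINKS.items() if need <= cap]
    print(f"{bpp}-bit: {need / GBIT:5.1f} Gbps -> {', '.join(fits) or 'none'}")

# 24-bit:  11.9 Gbps -> HDMI 2.0, DP 1.2 (HBR2), DP 1.3 (HBR3)
# 30-bit:  14.9 Gbps -> DP 1.2 (HBR2), DP 1.3 (HBR3)
# 48-bit:  23.9 Gbps -> DP 1.3 (HBR3)
```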
 
Last edited:
Interesting thought. Maybe you're right, but I still think it's just AMD being short on money, trying to get away with HDMI 1.4 and adding 2.0 to their graphics card line later, in 2016.

They are still paying royalties on HDMI either way, since 1.4 was included.

What they said at the Fiji event is probably true (there is a video somewhere): they didn't see HDMI 2.0 as necessary because their focus was on DisplayPort, which could do it better and supports FreeSync.

Next gen you might see a similar outcome: HDMI 2.0 might be there, but it's not going to be emphasized, especially if DisplayPort 1.3 is introduced.
 
I expect all 14/16 nm cards to have one HDMI 2.0 port and one or more DisplayPorts. I don't know about DVI-I, DVI-D, and VGA; low-end cards may still have them, but I suspect NVIDIA will take after AMD and exclude them on top-of-the-line cards.
 
I expect all 14/16 nm cards to have one HDMI 2.0 port and one or more DisplayPorts. I don't know about DVI-I, DVI-D, and VGA; low-end cards may still have them, but I suspect NVIDIA will take after AMD and exclude them on top-of-the-line cards.
All current HDMI 2.0 implementations that I have seen use a MegaChips support IC. This IC requires additional PCB real estate (and is probably why the Fury cards don't have such support). So I expect that eventually such connectivity will be put inside the GPU silicon, but I'd simply be happy with all DP if a suitable HDMI adapter came in the box rather than things like DVI-to-VGA or whatever. It's just weird how little HDMI 2.0 support there really is in hardware (though many Z170 motherboards support it), considering it is a relatively old spec.
 
They're never going to put DisplayPort-to-anything adapters in the box (except Mini DisplayPort to DisplayPort, in the case of Eyefinity cards) because they're too expensive. DVI-I to VGA is as simple as changing the pin-out (a <$1 adapter), which is why those are all over the place. There is no native backwards compatibility in DisplayPort.

Z170 only supports HDMI 2.0 through Thunderbolt via the Alpine Ridge chip (which achieves it by converting a DisplayPort signal).

Article about MegaChips (it is 7 x 7 mm):
http://www.reuters.com/article/2015/06/15/megachips-hdmi-chip-idUSnPn5PfYPz+96+PRN20150615

DisplayPort to HDMI 2.0 requires a level shifter and protocol converter (LSPCON).
 