
AMD Radeon R9 Nano CrossFire

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
If it were "high tech", would you not prefer DisplayPort?

HDMI FAQ
4K@60 Hz: 10-bit 4:2:0

DisplayPort FAQ

4K@60 Hz: 10-bit 4:4:4


You can always channel your diatribe towards the non-inclusion, by both GPU vendors, of DisplayPort 1.3, which was ratified last year. At least you'd be advocating for a superior standard instead of going backward for convenience.
It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issues. HTPC use implies living-room use, which means HDTV products, not monitors.

That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a VGA to my home theatre. And yes, I would likely have bought a Nano if it had a connector that would give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.

Since AMD and NVidia launch new GPUs on a yearly basis, any notion of being "forward-looking" by going DP-only is asinine. HDMI chips cost money and board real estate, and that's why they were not used.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issues. HTPC use implies living-room use, which means HDTV products, not monitors.

That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a VGA to my home theatre. And yes, I would likely have bought a Nano if it had a connector that would give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.
As I understand it, the difference between 4:2:0 and 4:4:4 has more to do with color, though it actually impacts sharpness as well. I'm not the authoritative source on this, though; @Xzibit seems to know a lot more about it than I do.
 
Joined
Jun 28, 2014
Messages
2,388 (0.67/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
My Acer B286HK has four connections (DP, HDMI, Mini-DP, and DVI) and came with a DP cable too... it has Ultra 4K/2K support (60 Hz refresh rate, 2 ms response time) and it looks great. Not a bad deal for $400.00.
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issues. HTPC use implies living-room use, which means HDTV products, not monitors.

It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough for some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.
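For anyone who wants to check the arithmetic behind that 75% figure, here is a small Python sketch counting the Cb/Cr sample pairs each scheme carries per 2x2 block of pixels. These are the standard chroma subsampling definitions, nothing vendor-specific:

```python
# Chroma (Cb/Cr) sample pairs carried per 2x2 block of pixels:
# 4:4:4 keeps a pair for every pixel, 4:2:2 for every column pair,
# and 4:2:0 shares one pair across the whole 2x2 block.
CHROMA_PAIRS_PER_2X2 = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}

def chroma_discarded_pct(scheme):
    # Percentage of chroma samples thrown away relative to full 4:4:4.
    full = CHROMA_PAIRS_PER_2X2["4:4:4"]
    return 100 * (1 - CHROMA_PAIRS_PER_2X2[scheme] / full)

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{s}: {chroma_discarded_pct(s):.0f}% of chroma samples discarded")
```

Running it confirms the figures quoted later in the thread: 0% for 4:4:4, 50% for 4:2:2, and 75% for 4:2:0. Luma is untouched in all three cases, which is why sharpness of fine black-and-white detail survives.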

Buy a 4K TV with proper connections and capability.
Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough for some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.

Buy a proper TV.

Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.
Out of budget, for sure. But you missed my point. Nobody complaining about the lack of HDMI is saying those things because of the difference in colour space offered by DP or HDMI. This is a product, available now, that doesn't actually connect to the devices it is really intended to be used with (Nano = HTPC card, which in most instances connects to HDTVs), because it lacks HDMI 2.0. That's it. Sure, you CAN find HDTV panels with DP, but they are by far a minority.

This is a design oversight that has limited the product's reach. But maybe it is intentional, since it seems AMD is not truly capable of releasing any of the Fury-based designs in decent numbers.


At the same time, you can argue that the people interested in 4K HTPC gaming are just a few, so whatever.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Architectural changes? You do know how active DisplayPort to HDMI converters work, right? Those little dongles contain a tiny chip that processes the DP video signal and converts it to an HDMI video signal, a piece of silicon no bigger than a few square millimeters. And how much do those cost companies like AMD? Yeah, you guessed right: cents. There is no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board and correctly support HDMI 2.0, but they decided to transfer that burden to the people who want to use this card in a home theater, plugged into one of the thousands of existing 4K TVs. So it seems like you should get your information right.
I'm still waiting for the factual information that it would cost "a few cents"; that's just blah-blah from you. You simply want to criticize AMD for not implementing it, and you think yourself superior to a whole company. If someone like you can come up with such an idea, don't you think a company like AMD can? Your arrogance is just... wow. On topic: other people have already explained why they haven't done it, and my own opinion is still that an architectural change would have cost a lot more than a few cents. You're just wrong.

As for the Nano just being an "SFF card": you can see the Nano as an air-cooled Fury X too, with much better energy efficiency and only 5-15% less performance, or with almost the same performance at the same power level (just raise the power cap). It's more than just an SFF card, and the pricing is right. Also, you can take a Nano, mod it with a stronger cooler, and just use it as a Fury X, with the same specs or -50 MHz. All that without the radiator/pump assembly, since not everyone likes that stuff or has the space for it.
 
Joined
Aug 16, 2004
Messages
3,275 (0.46/day)
Location
Sunny California
Processor Intel Core i9 13900KF
Motherboard Asus ROG Maximus Z690 Hero EVA Edition
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6800MHz G.Skill Trident Z5 Neo Series
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 980 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply EVGA Supernova G2 1300W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough for some situations, and how acceptable it is will vary from person to person depending on eyesight, tolerance, and equipment.

Buy a 4K TV with proper connections and capability.
Panasonic TC-65AX900U
Panasonic TC-85AX850U

Saying DP is not an option is being lazy, or it's out of one's budget. It certainly is an option, just not one many are willing to afford.

It's not laziness or not being able to afford it. A lot of people like me bought 4K TV sets this past couple of years, when prices became more palatable (in my case a 2015 Sony Bravia 4K model). To think that people would go out of their way and buy a new TV just because this card cannot output at 60 Hz to their TV's HDMI 2.0 connectors, and there simply aren't DP converters currently able to support this card, does not make much sense; it is not a very sound reason to invest almost $2K in a new TV. Mind you, this is a top-of-the-line 2015 set from a well-known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.

I like gaming in my living room, but I have full ATX systems in both my home theaters, so I'm not the target market for this card in particular; people looking to build a killer mini-ITX system, though, will have to look elsewhere if they want to game at 60 FPS at 4K.

I'm still waiting for the factual information that it would cost "a few cents"; that's just blah-blah from you. You simply want to criticize AMD for not implementing it, and you think yourself superior to a whole company. If someone like you can come up with such an idea, don't you think a company like AMD can? Your arrogance is just... wow. On topic: other people have already explained why they haven't done it, and my own opinion is still that an architectural change would have cost a lot more than a few cents. You're just wrong.

As for the Nano just being an "SFF card": you can see the Nano as an air-cooled Fury X too, with much better energy efficiency and only 5-15% less performance, or with almost the same performance at the same power level (just raise the power cap). It's more than just an SFF card, and the pricing is right. Also, you can take a Nano, mod it with a stronger cooler, and just use it as a Fury X, with the same specs or -50 MHz. All that without the radiator/pump assembly, since not everyone likes that stuff or has the space for it.

Am I arrogant because I'm just expressing my opinion on this card? I expressed my opinion in a respectful way and even used multiple links to validate it when presenting a counter-argument to yours. It is common knowledge that tiny chips like the ones found inside those dongles cost only cents to manufacture; how else do you think you can find active converters for less than $10 out there, or any other electronic device powered by small processors that sells for a few dollars?

Take this IC commonly found inside a DP to HDMI converter:

http://datasheet.octopart.com/STDP2650-AC-STMicroelectronics-datasheet-16348534.pdf

It sells for $0.10 when you order 100 or more from China:

http://www.alibaba.com/product-detail/-new-for-ic-in-integrated_60284045204.html

So there, see? A few cents. Satisfied? What's more, what do you think the actual manufacturing cost of a similar IC would be to AMD? Let's take adding more traces to the PCB into consideration, to place this 8x8 mm IC between the GPU and the HDMI connector; do you think that adds a big chunk of money to the BOM for this card? Just think about it for a minute.

It's all about maximizing profits by reducing costs. I'm pretty sure the BOM for the Fury X is much higher than for the Nano, and yet both sell for the exact same price; does pointing that out make me arrogant? I think not. Nvidia did the same with the Titan X, saving a few cents by not adding a backplate, something the less expensive 980 featured out of the box from day one, and I called them out back then as well. Both companies have a board of directors to answer to, and a few cents here and there add up when you're talking about the bottom line.

I don't even know you, and I never resorted to insults like you did from your first post. In my view, you're the one who comes across as arrogant. Funny how you just sidestepped the whole DP adapter topic you brought up in the first place, and yet you accuse me of not backing my argument; double standards much?

Oh, and btw, it's not me presenting this card as the king of SFF cards; it's AMD, in pretty much all of their marketing presentations for the Nano so far.

I'm done with you. Seldom have I had to deal with people who resort to belittling and insulting others just for the sake of coming out on top of an argument when I'm expressing a valid point of view, and seldom have I used the ignore feature in the long time I've been a member of this forum, as most people here are mature enough to discuss any given topic without resorting to insults. So I'm gonna take the high road and hope you learn to appreciate, or at least respect, the opinions of others.

Have a good day sir.
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
It's not laziness or not being able to afford it. A lot of people like me bought top-of-the-line 4K TV sets this past couple of years, when prices became more palatable (in my case a top-of-the-line 2015 Sony Bravia 4K model). To think that people would go out of their way and buy a new TV just because this card cannot output at 60 Hz to their TV's HDMI 2.0 connectors, and there simply aren't DP converters currently able to support this card, does not make much sense; it is not a very sound reason to invest almost $2K in a new TV. Mind you, this is a top-of-the-line 2015 set from a well-known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.

I like gaming in my living room, but I have full ATX systems in both my home theaters, so I'm not the target market for this card in particular; people looking to build a killer mini-ITX system, though, will have to look elsewhere if they want to game at 60 FPS at 4K.

Just looked at some of the Sony 4K TV manuals and, unless you have a different model, this is what they list:

Video (2D): 4096 × 2160p (60 Hz)*, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*,
3840 × 2160p (24, 25, 30 Hz), 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
720p (30, 60 Hz), 720/24p, 480p, 480i, PC Formats
*1 YCbCr 4:2:0 / 8 bit
*2 3840 × 2160p is displayed when 4096 × 2160p is input


You'd be down-sampled to 8-bit 4:2:0, which is worse. Better to invest that $2K in a 4K 10-bit monitor with a DP connection and go as big as you can, regardless of your GPU choice.

Personally, I'd wait for DP 1.3, but if I had to buy now I'd weigh the pros and cons.
 
Joined
Aug 16, 2004
Messages
3,275 (0.46/day)
Location
Sunny California
Processor Intel Core i9 13900KF
Motherboard Asus ROG Maximus Z690 Hero EVA Edition
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6800MHz G.Skill Trident Z5 Neo Series
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 980 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply EVGA Supernova G2 1300W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
Just looked at some of the Sony 4K TV manuals and, unless you have a different model, this is what they list:

Video (2D): 4096 × 2160p (60 Hz)*, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*,
3840 × 2160p (24, 25, 30 Hz), 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
720p (30, 60 Hz), 720/24p, 480p, 480i, PC Formats
*1 YCbCr 4:2:0 / 8 bit
*2 3840 × 2160p is displayed when 4096 × 2160p is input


You'd be down-sampled to 8-bit 4:2:0, which is worse. Better to invest that $2K in a 4K 10-bit monitor with a DP connection and go as big as you can, regardless of your GPU choice.

Personally, I'd wait for DP 1.3, but if I had to buy now I'd weigh the pros and cons.

Thanks for mentioning that; I had to double-check, as honestly I wasn't aware of that limitation on my TV :( I have a Sony XBR55X850C, and it does apparently support 4:4:4 mode at 4K 60Hz after the latest firmware update, according to this site:

http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3

PC Monitor:

1080p @ 60Hz @ 4:4:4: Yes
1080p @ 120Hz: Yes
4k @ 30Hz @ 4:4:4: Yes
4k @ 60Hz: Yes
4k @ 60Hz @ 4:4:4: Yes
To enable chroma 4:4:4, set the mode to either Game or Graphics.
Update: With the new firmware update PKG2.463.0010NAB, This TV now supports 4k @ 60Hz @ 4:4:4. To enable this, go to Settings - External Inputs - HDMI Signal Format - Enhanced (new).

I checked and my TV has the latest firmware, so it seems it supports 10-bit after all, as it shows the enhanced signal format in the settings menu. Thanks for the heads up :p
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option.
Which is nonsensical. If I walk around my house and look at all my TVs that have HDMI, every single one of them also has at least VGA + 3.5 mm. Some even have DVI. I think it's pretty clear what's going on here: the TV industry is deliberately trying to sabotage DisplayPort in the name of keeping HDMI around.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Thanks for mentioning that; I had to double-check, as honestly I wasn't aware of that limitation on my TV :( I have a Sony XBR55X850C, and it does apparently support 4:4:4 mode at 4K 60Hz after the latest firmware update, according to this site:

http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3



I checked and my TV has the latest firmware, so it seems it supports 10-bit after all, as it shows the enhanced signal format in the settings menu. Thanks for the heads up :p

No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.
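A back-of-the-envelope bandwidth check shows why that is. This Python sketch uses the commonly cited figures (18 Gb/s raw TMDS rate, 8b/10b encoding leaving 14.4 Gb/s for video data, and the standard 4400 x 2250 total 4K60 timing with blanking); it's an illustration of the arithmetic, not an exact link-layer model:

```python
# HDMI 2.0: 18 Gb/s raw TMDS; 8b/10b encoding leaves 80% for video data.
HDMI20_DATA_RATE = 18e9 * 8 / 10           # 14.4 Gb/s
# Standard 4K60 timing transmits blanking too: 4400 x 2250 total pixels.
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60        # 594 MHz

def avg_bits_per_pixel(bit_depth, chroma):
    # Luma is full resolution; chroma samples per pixel shrink with subsampling.
    chroma_samples_per_pixel = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[chroma]
    return bit_depth * (1 + chroma_samples_per_pixel)

def fits_hdmi20(bit_depth, chroma):
    # Does this format's data rate fit within HDMI 2.0's video bandwidth?
    return PIXEL_CLOCK_4K60 * avg_bits_per_pixel(bit_depth, chroma) <= HDMI20_DATA_RATE

for depth, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:0")]:
    gbps = PIXEL_CLOCK_4K60 * avg_bits_per_pixel(depth, chroma) / 1e9
    verdict = "fits" if fits_hdmi20(depth, chroma) else "does NOT fit"
    print(f"{depth}-bit {chroma}: {gbps:.2f} Gb/s -> {verdict}")
```

Under those assumptions, 8-bit 4:4:4 needs about 14.26 Gb/s of the 14.4 Gb/s available, 10-bit 4:4:4 needs about 17.82 Gb/s, and 10-bit 4:2:0 only about 8.91 Gb/s, which matches the FAQ entries quoted earlier in the thread.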

The person who posted that didn't read the HDMI FAQ, as some of the following posts point out. One even posted the firmware changes, and there is no mention of it.
 
Last edited:
Joined
Aug 16, 2004
Messages
3,275 (0.46/day)
Location
Sunny California
Processor Intel Core i9 13900KF
Motherboard Asus ROG Maximus Z690 Hero EVA Edition
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6800MHz G.Skill Trident Z5 Neo Series
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 980 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply EVGA Supernova G2 1300W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.

The person who posted that didn't read the HDMI FAQ, as the posts following his imply. One even posted the firmware changes, and there is no mention of it.

You're right, I stand corrected: in order to display 4:4:4 chroma, the signal is knocked down to 8-bit.

And I agree it sucks that most TVs don't support DisplayPort out of the box; it clearly is the best of the video interfaces :(
 
Joined
Dec 6, 2005
Messages
10,881 (1.62/day)
Location
Manchester, NH
System Name Senile
Processor I7-4790K@4.8 GHz 24/7
Motherboard MSI Z97-G45 Gaming
Cooling Be Quiet Pure Rock Air
Memory 16GB 4x4 G.Skill CAS9 2133 Sniper
Video Card(s) GIGABYTE Vega 64
Storage Samsung EVO 500GB / 8 Different WDs / QNAP TS-253 8GB NAS with 2x10Tb WD Blue
Display(s) 34" LG 34CB88-P 21:9 Curved UltraWide QHD (3440*1440) *FREE_SYNC*
Case Rosewill
Audio Device(s) Onboard + HD HDMI
Power Supply Corsair HX750
Mouse Logitech G5
Keyboard Corsair Strafe RGB & G610 Orion Red
Software Win 10
Awesome review. I particularly like the editorial comments in the conclusion about the irrationality of it all. Puts enthusiasm in perspective :)
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,450 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do. HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60 Hz.

The person who posted that didn't read the HDMI FAQ, as some of the following posts point out. One even posted the firmware changes, and there is no mention of it.

I didn't know what 4:4:4 was. So I read up on it.

http://hdguru.com/hdmi-2-0-what-you-need-to-know/

Color crunching
Here’s how chroma subsampling works. The human eye is more sensitive to black-and-white detail than color detail. Chroma subsampling compression takes advantage of this fact by sending full-resolution black-and-white (luma) information and only partial-resolution color (chroma) information. The result is a reduction of image data with no accompanying visual degradation.

There are three main types of chroma subsampling for video content: 4:4:4; 4:2:2; and 4:2:0. With 4:4:4 there is no subsampling. With 4:2:2, half of the color detail is thrown away. And with 4:2:0, 75% of color information is discarded. Blu-ray, HDTV, and DVD all use 4:2:0 subsampling. We don’t notice the loss of color detail right now with those formats, and we aren’t likely to notice it after the move to UHD.
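The per-block arithmetic the quoted article describes can be sketched in a few lines of Python. This is an illustrative toy that integer-averages each 2x2 chroma block (real encoders use proper filtering); the luma plane would be passed through untouched:

```python
# Toy 4:2:0 chroma subsampling: average each 2x2 block of a chroma plane,
# producing a plane with a quarter of the samples (75% discarded).
def subsample_420(chroma):
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            total = (chroma[y][x] + chroma[y][x + 1] +
                     chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(total // 4)  # integer average of the 2x2 block
        out.append(row)
    return out

# A 4x4 Cb plane becomes a 2x2 plane: 16 samples reduced to 4.
cb = [[16, 18, 200, 202],
      [16, 18, 200, 202],
      [90, 90,  90,  90],
      [90, 90,  90,  90]]
print(subsample_420(cb))  # -> [[17, 201], [90, 90]]
```

Note how the sharp 18-to-200 transition in the top rows collapses into two averaged values, which is exactly the loss of fine color detail the article says we mostly don't notice.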

Given 4:4:4 is ideal but not what the industry works to, it is still remiss of AMD to omit the HDMI 2.0 standard from their 'marketed' dedicated home-theater PC graphics card. All other arguments aside, and the 4:4:4 drum-banging put away: the industry has dictated the format, and graphics vendors need to deal with that. The simple question to ask is, would my CrossFired Nanos in my (now slightly larger mini-ITX) case be better with an HDMI 2.0 connection for 60 fps gaming? The answer is absolutely yes.

Would it be great if the industry all adopted an unsubsampled chroma format? Yes. BUT they haven't yet. So my hypothetical Nano CrossFire setup is limited to 30 fps in most circumstances on my living-room TV because AMD didn't use HDMI 2.0.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
One more thought.

HDMI is a competitor to DisplayPort; it's not VESA's standard. VESA did AMD a favor by supporting AdaptiveSync/FreeSync in DisplayPort, so maybe what we see here is just AMD doing VESA a favor by not supporting HDMI 2.0. Maybe it's just politics.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I didn't know what 4:4:4 was. So I read up on it.

http://hdguru.com/hdmi-2-0-what-you-need-to-know/



Given 4:4:4 is ideal but not what the industry works to, it is still remiss of AMD to omit the HDMI 2.0 standard from their 'marketed' dedicated home-theater PC graphics card. All other arguments aside, and the 4:4:4 drum-banging put away: the industry has dictated the format, and graphics vendors need to deal with that. The simple question to ask is, would my CrossFired Nanos in my (now slightly larger mini-ITX) case be better with an HDMI 2.0 connection for 60 fps gaming? The answer is absolutely yes.

Would it be great if the industry all adopted an unsubsampled chroma format? Yes. BUT they haven't yet. So my hypothetical Nano CrossFire setup is limited to 30 fps in most circumstances on my living-room TV because AMD didn't use HDMI 2.0.

You've convinced me.

If only I could force my system to work at 4:2:0, I'd be set. The bigger the screen, the better.

/s

Here is something some of you will be able to duplicate if your system gives you the 4:2:2 option.
 
Last edited:

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,450 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
You've convinced me.

If only I could force my system to work at 4:2:0, I'd be set. The bigger the screen, the better.

/s

Here is something some of you will be able to duplicate if your system gives you the 4:2:2 option.

Hmm, I wasn't trying to convince you of anything, just pointing out market conditions from the TV manufacturers. No amount of techno-posturing will change that. You still won't address whether AMD should have included HDMI 2.0 as an output. What may affect people's choices, though, is the fps options for gaming: fast first-person shooters require higher fps for better gameplay, and 30 fps isn't ideal.

All that being said, I personally wouldn't game on a TV; I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TVs use consoles. For AMD, though, the fact is that most reviewers have criticised AMD for not adopting HDMI 2.0 for Fiji, given its 'living room TV use'. Whether it has any real-world impact or not, the negative impression is there from the start. I know 4:4:4 is preferred; it's the way the image should be, but it's generally not delivered to us that way. Hell, Blu-ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0, I think, per that link?).

Anyway, this thread is about CrossFire Nano, so the colour discussion is for another thread. No point talking about it if AMD can't give it to you via a TV that won't support it. It's 4:2:2 or 4:2:0; for now, we need to accept that.

EDIT: I did watch that wonderful sales presentation, and I've now bought a Roland V-800HD. It added nothing to the argument, though; in fact it was irrelevant to the discussion of the Nano being used in a living-room environment on a 4K TV with no HDMI 2.0... go figure. Slow clap.
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Hmm, I wasn't trying to convince you of anything, just pointing out market conditions from the TV manufacturers. No amount of techno-posturing will change that. You still won't address whether AMD should have included HDMI 2.0 as an output. What may affect people's choices, though, is the fps options for gaming: fast first-person shooters require higher fps for better gameplay, and 30 fps isn't ideal.

All that being said, I personally wouldn't game on a TV; I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TVs use consoles. For AMD, though, the fact is that most reviewers have criticised AMD for not adopting HDMI 2.0 for Fiji, given its 'living room TV use'. Whether it has any real-world impact or not, the negative impression is there from the start. I know 4:4:4 is preferred; it's the way the image should be, but it's generally not delivered to us that way. Hell, Blu-ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0, I think, per that link?).

Anyway, this thread is about CrossFire Nano, so the colour discussion is for another thread. No point talking about it if AMD can't give it to you via a TV that won't support it. It's 4:2:2 or 4:2:0; for now, we need to accept that.

EDIT: I did watch that wonderful sales presentation, and I've now bought a Roland V-800HD. It added nothing to the argument, though; in fact it was irrelevant to the discussion of the Nano being used in a living-room environment on a 4K TV with no HDMI 2.0... go figure. Slow clap.

Should they have included it? Sure; I think I said so in another thread. But not all 4K TVs even fully support HDMI 2.0 functionality, whereas all monitors, and those TVs that do have DisplayPort, are likely to support full functionality. 15th Warlock just provided an example...

If your only option is HDMI 2.0, you'd better make sure you know what your 4K TV supports. I would say use 4K 8-bit 4:4:4, but at the same time that defeats the purpose of 4K 10-bit content, so you'd have to switch back and forth between settings. That's if your 4K TV supports the 10-bit 4K signal and doesn't down-sample you; some 4K TVs down-sample you as soon as you pop up the menu or use PiP. Until HDMI 2.0 functionality is ironed out in 4K TVs, it's just a check-box.

You could just take one of the TVs I linked (there are a few others), or preferably a 4K 60 Hz 10-bit monitor w/DP, plug it in, and forget about having to switch between settings.

Seems you'll buy anything that doesn't have AMD's name attached to it... You set that one up.

Like I said HERE.
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
You could just take one of the TVs I linked (there are a few others), or preferably a 4K 60 Hz 10-bit monitor w/DP, plug it in, and forget about having to switch between settings.


Bleh. I already got my TV, and paid nearly 10 times what a Nano would cost locally. Spending that money again just to get a DP port... psh... it's cheaper to skip AMD cards entirely and go NVIDIA.

All I hope is that AMD corrects this with the next generation of GPUs.

Anyway, I kind of realized that AMD touts this card as an SFF VGA, not an HTPC VGA. It's the HTPC designation that makes things look bad when it comes to HDMI/DP. A product should meet the needs of the market NOW, not the future, and it also shouldn't force consumers into limited purchasing choices for supporting hardware. SFF PC = monitor. HTPC = HDTV. It's quite different.

Nobody in their right mind will say HDMI is better than DP... nobody has... yet you keep harping on this point like it matters when it doesn't. It's the fact that DP-only connectivity prevents many users interested in this card from actually putting it to use, due to their pre-existing hardware.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Nvidia did the same with Titan X by saving a few cents and not adding a backplate to it, something the least expensive 980 featured out of the box from day one, and I called them out back then as well, both companies have a board of directors to respond to and a few cents here and there add up in the end when talking about your bottom line.
Nvidia said they scrapped the backplate on some (or all) of their cards because it caused heat issues with SLI, since room between cards is scarce. I don't know if it's true, but it could be. Or it's a smart way to earn some more money. Either way, Nvidia's reasoning is probably valid.

On HDMI 2.0 vs DP topic in TVs:
It saves money not to include DP, so they try not to add it to their TVs, or they charge extra for it in another model of the same type. That would be another point of view on this subject.

HDMI is a competitor to DisplayPort; it's not VESA's standard. VESA did a favor to AMD by supporting AdaptiveSync/FreeSync with DisplayPort; maybe what we see here is just AMD doing a favor to VESA by not supporting HDMI 2.0. Maybe it's just politics.
Interesting thought. Maybe you're right, but I still think it's just AMD being short on money, trying to get away with HDMI 1.4 and adding 2.0 to their graphics card line later, in 2016.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
One more thought.

HDMI is a competitor to DisplayPort; it's not VESA's standard. VESA did a favor to AMD by supporting AdaptiveSync/FreeSync with DisplayPort; maybe what we see here is just AMD doing a favor to VESA by not supporting HDMI 2.0. Maybe it's just politics.
HDMI has no support for adaptive sync and likely never will. It would cost TV manufacturers too much to implement.

HDMI 2.0 has enough bandwidth for 4K @ 60 Hz 24-bit color, but it does not have enough bandwidth for 4K @ 60 Hz 30-bit color. DisplayPort 1.2 can handle 4K @ 60 Hz 30-bit color, and DisplayPort 1.3 stretches to 48-bit. All figures are for 4:4:4.
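As a back-of-the-envelope check on those figures, here's a sketch of the link budget. The timing constants are my assumptions, not from this post: pixel clocks include blanking (594 MHz for HDMI's standard 4K60 timing, ~533 MHz for DisplayPort's reduced-blanking timing), and payload rates subtract the 8b/10b line-coding overhead both links use.

```python
# Usable payload after 8b/10b coding (Gbit/s)
HDMI20_PAYLOAD_GBPS = 18.0 * 8 / 10   # 14.4  (HDMI 2.0 TMDS)
DP12_PAYLOAD_GBPS   = 21.6 * 8 / 10   # 17.28 (DP 1.2, HBR2 x4)
DP13_PAYLOAD_GBPS   = 32.4 * 8 / 10   # 25.92 (DP 1.3, HBR3 x4)

def required_gbps(pixel_clock_mhz: float, bits_per_pixel: int) -> float:
    """Raw video bandwidth in Gbit/s at 4:4:4 (no chroma subsampling)."""
    return pixel_clock_mhz * bits_per_pixel / 1000

for bpp in (24, 30, 36, 48):
    hdmi_need = required_gbps(594.0, bpp)    # 4K60 with full blanking
    dp_need   = required_gbps(533.28, bpp)   # 4K60 with reduced blanking
    print(f"{bpp}-bit: HDMI needs {hdmi_need:5.2f} "
          f"(fits 2.0: {hdmi_need <= HDMI20_PAYLOAD_GBPS}), "
          f"DP needs {dp_need:5.2f} "
          f"(fits 1.2: {dp_need <= DP12_PAYLOAD_GBPS}, "
          f"fits 1.3: {dp_need <= DP13_PAYLOAD_GBPS})")
```

24-bit just squeaks under HDMI 2.0's 14.4 Gbit/s payload (14.26 needed), while 30-bit overshoots it; DP 1.2 clears 30-bit, and only DP 1.3 clears 48-bit.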
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Interesting thought. Maybe you're right, but I still think it's just AMD being short on money, trying to get away with HDMI 1.4 and adding 2.0 to their graphics card line later, in 2016.

They are still paying royalties on HDMI either way, since 1.4 was included.

What they said at the Fiji event is probably true (there is a video somewhere): they didn't see HDMI 2.0 as necessary because their focus was on DisplayPort, which could do it better and has FreeSync.

Next gen you might see a similar outcome: HDMI 2.0 might be there, but it's not going to be emphasized, especially if DisplayPort 1.3 is introduced.
 
Last edited:

FordGT90Concept

"I go fast!1!11!1!"
I expect all 14/16nm cards to have one HDMI 2.0 and one or more DisplayPorts. I don't know about DVI-I, DVI-D, and VGA--low end cards may still have them but I suspect NVIDIA will take after AMD and exclude them on top of the line cards.
 

cadaveca

My name is Dave
I expect all 14/16nm cards to have one HDMI 2.0 and one or more DisplayPorts. I don't know about DVI-I, DVI-D, and VGA--low end cards may still have them but I suspect NVIDIA will take after AMD and exclude them on top of the line cards.
All current HDMI 2.0 implementations that I have seen use a MegaChips support IC. This IC requires additional PCB real-estate (and is probably why Fury cards lack such support). So I expect such connectivity will eventually be integrated into the GPU silicon, but I'd simply be happy with all DP if a suitable HDMI adapter came in the box, rather than other things like DVI-to-VGA or whatever. It's just weird how little HDMI 2.0 support there really is in hardware (many Z170 motherboards support it) considering it is a relatively old spec.
 

FordGT90Concept

"I go fast!1!11!1!"
They're never going to put DisplayPort-to-anything adapters in the box (except miniDisplayPort-to-DisplayPort in the case of Eyefinity cards) because they're too expensive. DVI-I to VGA is as simple as changing the pinout (a <$1 adapter), which is why they're all over the place. There is no native backwards compatibility in DisplayPort.

Z170 only supports HDMI 2.0 through Thunderbolt via the Alpine Ridge chip (which achieves it by converting a DisplayPort signal).

Article about MegaChips (it is 7 x 7 mm):
http://www.reuters.com/article/2015/06/15/megachips-hdmi-chip-idUSnPn5PfYPz+96+PRN20150615

DisplayPort to HDMI 2.0 requires a level-shifter and active-protocol converter (LSPCON).
 
Last edited: