
AMD "Fiji" Silicon Lacks HDMI 2.0 Support

Joined
Jun 18, 2015
Messages
10 (0.00/day)
Well, I will see what a few other people have to say about it, because you copied what that source said, and I don't know if that is crap or what, unless I translated something wrong in some of the more technical information.
I copied it from Adobe. I would hope Adobe knows about color lol. All joking aside, we agree on the core topic (DP > HDMI), but HDMI 2.0 isn't limited in color reproduction compared to DP 1.2. I really wanted to get a Fury X, but since the home theater world relies on HDMI 2.0, it's a must-have for me :(.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.62/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
DisplayPort 1.2 = 17.28 Gbps effective (21.6 Gbps raw)
HDMI 2.0 = 14.4 Gbps effective (18 Gbps raw)
DisplayPort 1.3 = 25.92 Gbps effective (32.4 Gbps raw)
The best HDMI cables manage about 25 Gbps, and those are the best of the best over very short distances.

There's a chart on HDMI 2.0 cable certification here: http://www.dpllabs.com/page/dpl-full-4k-cable-certification

DisplayPort 1.3 should be able to handle 4:4:4 4K at 16 bits per color, where HDMI 2.0 can only handle 8 bits per color. Not to mention DisplayPort 1.3 can carry an HDMI signal. As if that weren't enough, DisplayPort 1.3 supports VESA Display Stream Compression, which can further increase the effective payload.
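For anyone who wants to sanity-check those numbers, here's a rough sketch (Python; the ~7% reduced-blanking overhead is my assumption, and real timings vary):

```python
# Rough 4K @ 60 Hz bandwidth check against effective link payloads (Gbps).
links = {"DisplayPort 1.2": 17.28, "HDMI 2.0": 14.4, "DisplayPort 1.3": 25.92}

def required_gbps(width, height, hz, bits_per_channel, blanking=1.07):
    """Uncompressed RGB/4:4:4 bitrate; ~7% CVT reduced-blanking overhead assumed."""
    return width * height * hz * 3 * bits_per_channel * blanking / 1e9

for bpc in (8, 10, 12, 16):
    need = required_gbps(3840, 2160, 60, bpc)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{bpc:2d} bpc: {need:5.1f} Gbps -> fits on: {', '.join(fits) or 'none'}")
```

By that estimate, 8-bit 4K60 squeezes into all three links, 10-bit already overflows HDMI 2.0, and only DisplayPort 1.3 has room for 12- and 16-bit.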


If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,803 (3.86/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K@60).
So would you mind sharing what cable you have that allows you that @60?
 
Joined
Nov 4, 2005
Messages
11,716 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
The AMD middle finger to people wasting money? 4K TVs come with that price tag.
 
Joined
Jun 18, 2015
Messages
10 (0.00/day)
Yup, I had to do the registry hack to get full-range RGB.

And that 4:2:0 mode is only for Kepler (600/700 series).
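For context, a rough sketch (Python) of why 4:2:0 is what let those cards claim 4K60 over HDMI 1.4; the ~8.16 Gbps effective figure and ignoring blanking are my assumptions:

```python
# Why 4:2:0 squeezes "4K60" into HDMI 1.4 (rough; blanking ignored).
HDMI_14_EFFECTIVE = 8.16  # Gbps, approx. effective data rate of HDMI 1.4

def gbps(width, height, hz, bpc, samples_per_pixel):
    return width * height * hz * bpc * samples_per_pixel / 1e9

full = gbps(3840, 2160, 60, 8, 3.0)  # 4:4:4 -> 3 samples per pixel
sub = gbps(3840, 2160, 60, 8, 1.5)   # 4:2:0 -> 1.5 samples per pixel
print(f"4:4:4 needs {full:.1f} Gbps, 4:2:0 needs {sub:.1f} Gbps, "
      f"link offers ~{HDMI_14_EFFECTIVE} Gbps")
```

4:4:4 (~11.9 Gbps) doesn't fit; halving the chroma samples (~6.0 Gbps) does, at the cost of color resolution.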
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
It seems your 4K may not even use that much color space.. because I only look at the awesome ones for gaming lol
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Maybe. I play on a 65" Samsung.
Well, check it out.. if you've got time to read a forum, you can read and learn while you do it, if that's what interests you.
Welcome to TPU, by the way.. it's probably more than obvious how much misleading information there is after reading this thread, so just ask haha.
I will tell you @HumanSmoke @Steevo @FordGT90Concept are probably going to give you the best straight-up information.
I can give the mods shit, but they are cool for the most part; just some of them seem very limited in knowledge for the years they have been around reading articles.
I would like to be wrong about what I said mussels was doing, but it does not seem that way.
 
Joined
Nov 4, 2005
Messages
11,716 (1.73/day)
Let's give this a good logical look.


HDMI: does it support G-Sync? Nope. So if you want to bash the lack of HDMI support on this front while NVIDIA and AMD are both actively pushing technologies (G-Sync and FreeSync) that aren't compatible with HDMI... you fail.

Then let's look at the whole need for G-Sync. It rests on the assumption that NVIDIA can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at whatever frame rate we can drive instead. So we have two choices here: either 60 FPS isn't really that important, or it is. Which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.

AMD has released cards without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box. What's the chance they do the same here?

Lastly, let's place the blame at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HDTVs that had strangely mixed input options which sometimes may or may not have worked as expected. Samsung should know better; Panasonic, Toshiba and the others should know better. They are paying for a dying standard, and they are doing it for planned obsolescence IMO.
 
Joined
Sep 6, 2013
Messages
3,003 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
I'll bite. I wasn't in a coma; you guys have two years to wait for your drivers to mature :D
You will be installing newer *hotfix* "stable" drivers every week until then :p
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
This might help


TFA said:
  • Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)

Nope, won't help one bit.
 
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort... and for the few percent that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.
So maybe 0.0000000001% of the market will feel let down.
Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive, and most people will have to buy a new 4K TV to get HDMI 2.0 anyway.
If you're buying 4K TVs and $500-$600 graphics cards, I'm sure you can afford an upgrade next year.

+1 to that!!! I really don't get why so many people in here are so worked up.. I mean, if you can afford to pay that amount of money, why the hell wouldn't you buy a panel that supports DP.. I really don't get it.



Who the Sock cares?? And even if someone does care -

Buy a TV with DisplayPort then, like this Panasonic TC-L65WT600 has...


Exactly!!! As if 4K displays are cheap in the first place..


I can write it in Greek if you prefer. No misspelling there. By the way, who is the retard here? The person who makes a mistake writing in another language, or the person who comments about it, like you did? Anyway, I can understand you being upset. Try to relax.

Come on, good old Greece!!! ;)



Such an ill-tempered thread. How about people stop being dicks and stick to the topic.


thank you!


Thank you, that answers the question at the end of R.H.P's post #71. All my point is/was: just because this limitation does not hinder some users, AMD is still, in my opinion, missing a trick and an opportunity with all those 4K TV owners who don't want to spend another pile of cash on a monitor. In my case it's more about commiserating with 4K TV owners than criticising AMD, but both go hand in hand to a degree.


I really think this is a non-issue.. if you have the money to buy a 4K TV and such expensive GPUs, I am certain you can afford a new TV with all the bells and whistles..



NOooooooope! That's HDMI 1.4 (30 Hz max at 4K).
Noooooope, HDMI 1.4. Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.
HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't, but at the same time, I don't care. DisplayPort is the future, not HDMI.
Look at the reviews. Advertised as 4K; only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0: they keep trying to ram more bits through the hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuine 2.0-compliant cables debut, they'll probably be $20+ per foot because of the massive insulation required to prevent crosstalk and interference. HDMI has always been the problem child of the display standards: it took the DVI spec, tacked audio onto it, and disregarded all the controls VESA put on DVI to guarantee the cable will work. This was an inevitability with HDMI, because the people behind it haven't a clue what they're doing. It's why VESA didn't get behind HDMI and instead went off and designed their own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.
The Fury X manual only mentions DisplayPort 1.2 repeatedly. I don't know if it supports 1.3:
http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf


thank you!



Let's give this a good logical look. [...]

The truth has been spoken!!!!

I really find this a non-issue, and I want to thank you guys for trying to be objective and civil.
 
Joined
Sep 6, 2013
Messages
3,003 (0.77/day)
Location
Athens, Greece
If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
I wouldn't expect that. They only need DP 1.2a for FreeSync, so they will be fine with that. I believe they decided to spend the last dollars they had on the LEDs instead of implementing DP 1.3.

HDMI: does it support G-Sync? Nope. So if you want to bash the lack of HDMI support on this front while NVIDIA and AMD are both actively pushing technologies (G-Sync and FreeSync) that aren't compatible with HDMI... you fail.
AMD was demonstrating FreeSync over HDMI at Computex, and the same was rumored for NVIDIA.

AMD Demonstrates FreeSync-over-HDMI Concept Hardware at Computex 2015
 
Joined
Apr 24, 2008
Messages
1,888 (0.32/day)
Processor RyZen R9 3950X
Motherboard ASRock X570 Taichi
Cooling Coolermaster Master Liquid ML240L RGB
Memory 64GB DDR4 3200 (4x16GB)
Video Card(s) RTX 3050
Storage Samsung 2TB SSD
Display(s) Asus VE276Q, VE278Q and VK278Q triple 27” 1920x1080
Case Zulman MS800
Audio Device(s) On Board
Power Supply Seasonic 650W
VR HMD Oculus Rift, Oculus Quest V1, Oculus Quest 2
Software Windows 11 64bit
People have every right to expect that new video cards will support newer standards like HDMI 2.0. If AMD was trying to make some stand against HDMI (which I doubt), then it would be more appropriate for them to omit support for all versions of HDMI rather than stagnating on an older HDMI standard.

Based on that alone it seems more like a mistake than some message. Is it a big mistake? Not IMO, but it still looks like a mistake.

I also expect hardware H.265 encode and decode. If this HDMI 2.0 thing is true, I wouldn't be surprised if that was a bust too.
 
Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
OK, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have HDMI 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much market share do they take?
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Let's give this a good logical look. [...]
30 vs 60 FPS:
I guess that is a question of budget and standards. 30 FPS is playable and people do it every day.. I like 50 because around 50-60 the difference is not really noticeable to me. Won't being locked into a refresh rate eventually cause input lag, if not drive someone crazy for days trying to fix it?
G-Sync vs FreeSync:
They are both good and beyond my needs on refresh rates, and totally spec'd out in my opinion.
I do like how FreeSync works and doesn't need extra parts in the display that you get charged for, unlike G-Sync, where OEMs get charged the cost of the extra hardware on top of a license.
Yet another open standard AMD helped put on paper way before G-Sync was a thought.
 
Joined
Apr 24, 2008
Messages
1,888 (0.32/day)
OK, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have HDMI 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much market share do they take?
That's a good question...

DisplayPort on UHD TVs has already been addressed in this thread, though. Very few UHD TVs have DP and it doesn't look like many will.

However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter that they are like hen's teeth now, because they do exist and that seems to be the direction the UHD TV industry is going. These UHD TVs are getting cheaper too...

If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense. But it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out. Especially so when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
 
Joined
Nov 4, 2005
Messages
11,716 (1.73/day)
However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. [...]


With 8-bit color.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx


So a little color schooling.


What do you see?

http://i4.minus.com/ibyJcwdIniHUEs.png


https://en.wikipedia.org/wiki/Color_depth

Even if you could see the highest-end gradient, it may only be processed at 8 bits per color instead of 10, and thus will still show blocking and banding in gradients. HDMI 2.0 is still shit compared to DisplayPort.
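To make the banding point concrete, here's a toy sketch (Python with NumPy, illustrative only) that quantizes a smooth ramp across one 4K scanline at different bit depths:

```python
import numpy as np

# A smooth horizontal luminance ramp, one 4K scanline wide.
ramp = np.linspace(0.0, 1.0, 3840)

for bits in (8, 10, 12):
    levels = 2 ** bits
    quantized = np.round(ramp * (levels - 1))  # quantize to the target bit depth
    steps = len(np.unique(quantized))          # distinct output values actually used
    print(f"{bits:2d}-bit: {steps:4d} steps, ~{3840 / steps:.1f} px per band")
```

At 8 bits you get 256 steps, i.e. visible ~15-pixel-wide bands across a 4K gradient; at 10 bits the bands shrink below 4 pixels and effectively disappear.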
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. [...]
You're missing that the color gets truncated and is not the true color of the content when you're doing 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to get what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for truly high-quality UHD 4K@60 Hz.
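A tiny sketch (Python, purely illustrative) of what extra bit depth buys in distinct gradient steps and total colors:

```python
# Steps per channel and total RGB colors at each bit depth (4:4:4).
for bpc in (8, 10, 12, 16):
    steps = 2 ** bpc
    print(f"{bpc:2d} bpc: {steps:6,} steps per channel, {steps**3:,} total colors")
```

Going from 8 to 10 bits per channel is the jump from ~16.8 million to ~1.07 billion colors, which is exactly the headroom the 8-bit HDMI 2.0 path throws away.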
 
Joined
Apr 24, 2008
Messages
1,888 (0.32/day)
You're missing that the color gets truncated and is not the true color of the content when you're doing 4:4:4 via HDMI 2.0 at 4K@60 Hz. So if your display can do better than 8 bits, it's not going to get what it should.
DisplayPort 1.3 can do twice the color depth, and with it the accuracy, for truly high-quality UHD 4K@60 Hz.
So what is your point...?

Tell it to the industry making UHD TVs.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.
 
Joined
Nov 4, 2005
Messages
11,716 (1.73/day)
So what is your point...?

Tell it to the industry making UHD TVs.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.



Either you understand that 8-bit color looks like shit, or you don't.

We have two simple scenarios in which you reply to a thread about a new graphics card where HDMI 2.0 is NOT supported:


1) You care, as you have something relevant to add, and you understand what it means and why it's important or not.

2) You are an Nvidiot and need to go thread-crap elsewhere.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
You guys are arguing two different standpoints that are mutually exclusive.
@Octavean is putting forward that HDMI 2.0 has favour with TV vendors and, even if it lacks bandwidth compared with DP, will still be utilized.
@Steevo ...well, you're basically arguing that DP is better than HDMI and that graphics vendors should concentrate on it, even though TV manufacturers aren't using it to any great extent.

One is an argument about tech implementation (and a few insults); one is about practical implementation in a real market.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
The 4K standard is defined as 4K resolution with 10-bit+ Rec./BT.2020 color. HDMI 2.0 can only do that at 4K/30 Hz.

That still isn't the whole issue, because even then you're upscaling or downscaling through the chain.
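A quick check of that 30 Hz ceiling (sketch; active pixels only, blanking ignored, so real margins are slightly tighter):

```python
# 10-bit 4:4:4 UHD payload vs HDMI 2.0's ~14.4 Gbps effective data rate.
def payload_gbps(hz):
    return 3840 * 2160 * hz * 3 * 10 / 1e9  # active pixels only

print(f"4K30: {payload_gbps(30):.1f} Gbps (fits in ~14.4)")
print(f"4K60: {payload_gbps(60):.1f} Gbps (does not fit)")
```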
 