Tuesday, November 28th 2017

HDMI 2.1 Specification Sets New Resolution Standard

HDMI Forum, Inc. today announced the release of Version 2.1 of the HDMI Specification which is now available to all HDMI 2.0 adopters. This latest HDMI Specification supports a range of higher video resolutions and refresh rates including 8K60 and 4K120, and resolutions up to 10K. Dynamic HDR formats are also supported, and bandwidth capability is increased up to 48Gbps. Supporting the 48Gbps bandwidth is the new Ultra High Speed HDMI Cable. The cable ensures high-bandwidth-dependent features are delivered including uncompressed 8K video with HDR. It features exceptionally low EMI (electro-magnetic interference) which reduces interference with nearby wireless devices. The cable is backwards compatible and can be used with the existing installed base of HDMI devices.

Version 2.1 of the HDMI Specification is backward compatible with earlier versions of the specification, and was developed by the HDMI Forum's Technical Working Group whose members represent some of the world's leading manufacturers of consumer electronics, personal computers, mobile devices, cables and components.
"The HDMI Forum's mission is to develop specifications meeting market needs, growing demands for higher performance, and to enable future product opportunities," said Robert Blanchard of Sony Electronics, president of the HDMI Forum.

HDMI Specification 2.1 Features Include:
  • Higher video resolutions support a range of high resolutions and faster refresh rates including 8K60Hz and 4K120Hz for immersive viewing and smooth fast-action detail. Resolutions up to 10K are also supported for commercial AV, and industrial and specialty usages.
  • Dynamic HDR support ensures every moment of a video is displayed at its ideal values for depth, detail, brightness, contrast and wider color gamuts, on a scene-by-scene or even a frame-by-frame basis.
  • The Ultra High Speed HDMI Cable supports the 48G bandwidth for uncompressed HDMI 2.1 feature support. The cable also features very low EMI emission and is backwards compatible with earlier versions of the HDMI Specification and can be used with existing HDMI devices.
  • eARC simplifies connectivity, provides greater ease of use, and supports the most advanced audio formats and highest audio quality. It ensures full compatibility between audio devices and upcoming HDMI 2.1 products.
  • Enhanced refresh rate features ensure an added level of smooth and seamless motion and transitions for gaming, movies and video. They include:
    • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better detailed gameplay.
    • Quick Media Switching (QMS) for movies and video eliminates the delay that can result in blank screens before content is displayed.
    • Quick Frame Transport (QFT) reduces latency for smoother no-lag gaming, and real-time interactive virtual reality.
    • Auto Low Latency Mode (ALLM) allows the ideal latency setting to automatically be set allowing for smooth, lag-free and uninterrupted viewing and interactivity.
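As a rough sanity check on the figures above, the raw active-pixel data rate of a video mode is simply width × height × refresh rate × bits per pixel. The sketch below is generic video math, not taken from the HDMI Specification, and it ignores blanking intervals and link-encoding overhead (which add more); it shows why 4K120 fits comfortably within 48Gbps while 8K60 at 10-bit RGB does not without chroma subsampling or compression.

```python
def raw_rate_gbps(width, height, hz, bits_per_pixel):
    """Raw active-pixel data rate in Gbit/s. Ignores blanking
    intervals and link-encoding overhead, so real link usage
    is somewhat higher than this."""
    return width * height * hz * bits_per_pixel / 1e9

# 4K120 at 10-bit RGB (30 bits per pixel)
print(round(raw_rate_gbps(3840, 2160, 120, 30), 1))  # 29.9
# 8K60 at 10-bit RGB -- exceeds 48Gbps before overhead
print(round(raw_rate_gbps(7680, 4320, 60, 30), 1))   # 59.7
```

At 8-bit RGB the raw 8K60 rate is about 47.8Gbit/s, right at the link limit before overhead, which is why the highest 8K modes lean on 4:2:0 subsampling or Display Stream Compression.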
The HDMI 2.1 Compliance Test Specification (CTS) will be published in stages during Q1-Q3 2018, and HDMI adopters will be notified when it is available.

The HDMI Forum Reaches Out to Grow Global Membership
The HDMI Forum is an open trade association that guides the future direction of HDMI technology and develops new versions of the HDMI Specification. The HDMI Forum currently has a membership of 92 companies, and is actively inviting more companies to apply for membership and help shape the future of HDMI technology. The Forum is also focused on encouraging more companies to participate as the global presence of HDMI-enabled products and solutions continues to grow.

41 Comments on HDMI 2.1 Specification Sets New Resolution Standard

#1
R-T-B
FordGT90Concept said:
Broadcasts. NTSC -> ATSC
Still a silly argument. They could've upped the analog ante just as easily, and introduced weird standards there.

The fact is the reason they went digital was more efficient use of bandwidth channels, nothing more.
Posted on Reply
#2
Solidstate89
FordGT90Concept said:
Broadcasts. NTSC -> ATSC
That's not even a cogent argument. A TV I bought a decade ago will still work. A TV I bought 20 years ago will still work. Converting from an analog source to a digital source has less than nothing to do with making people upgrade "every few years", since none of the digital standards we're talking about force any upgrades at all. And since this thread is talking about a connector standard specifically, it makes even less sense.
Posted on Reply
#3
TheLostSwede
"The HDMI Forum is an open trade association" no, they're a mafia that forces companies to pay up for the HDMI license or else...
They even keep a list to name and shame companies that are no longer licensees of the HDMI standard - https://www.hdmi.org/learningcenter/adopters_terminated.aspx :kookoo:

DP on the other hand is royalty free...
Posted on Reply
#4
FordGT90Concept
"I go fast!1!11!1!"
R-T-B said:
Still a silly argument. They could've upped the analog ante just as easily, and introduced weird standards there.

The fact is the reason they went digital was more efficient use of bandwidth channels, nothing more.
Solidstate89 said:
That's not even a cogent argument. A TV I bought a decade ago will still work. A TV I bought 20 years ago will still work. Converting from an analog source to a digital source has less than nothing to do with making people upgrade "every few years", since none of the digital standards we're talking about force any upgrades at all. And since this thread is talking about a connector standard specifically, it makes even less sense.
In 2009, NTSC broadcasts ended. If you wanted to watch anything on an NTSC TV, you had to use an external ATSC tuner. They sucked. People put up with them for a while but when they could, they just bought a new TV.

Now, 4K TVs are all over the place but content is not. ATSC is to NTSC as ATSC 3.0 is to ATSC 1.0. It will introduce HEVC, 4K, and HDR standards making them as benign as MPEG2, 720p, and 1080i are today. HDMI 2.0 was needed to transmit that signal. HDMI 2.1? First it was black and white TV, then it was color, then it was DTV, then it was HDTV, now we're getting to 4K UHD, and HDMI 2.1 exists solely to begin the push to 8K UHD. Not only does everyone have to buy new TVs to support 8K UHD, broadcasters have to buy new cameras, recording, editing, and transmission equipment, and people have to buy new receivers so their content isn't downscaled. We're talking billions of dollars easily, and repeatable about every decade or less.
Posted on Reply
#5
R-T-B
FordGT90Concept said:


In 2009, NTSC broadcasts ended. If you wanted to watch anything on an NTSC TV, you had to use an external ATSC tuner. They sucked. People put up with them for a while but when they could, they just bought a new TV.

Now, 4K TVs are all over the place but content is not. ATSC is to NTSC as ATSC 3.0 is to ATSC 1.0. It will introduce HEVC, 4K, and HDR standards making them as benign as MPEG2, 720p, and 1080i are today. HDMI 2.0 was needed to transmit that signal. HDMI 2.1? First it was black and white TV, then it was color, then it was DTV, then it was HDTV, now we're getting to 4K UHD, and HDMI 2.1 exists solely to begin the push to 8K UHD. Not only does everyone have to buy new TVs to support 8K UHD, broadcasters have to buy new cameras, recording, editing, and transmission equipment, and people have to buy new receivers so their content isn't downscaled. We're talking billions of dollars easily, and repeatable about every decade or less.
Of course they want you to upgrade every few years. Our point is digital vs analog had nothing to do with it. They could've introduced some special HD analog standard just as easily.

But we are way off topic now.
Posted on Reply
#6
The Von Matrices
Bansaku said:
" The cable ensures high- bandwidth dependent features are delivered including uncompressed 8K video with HDR. It features exceptionally low EMI (electro-magnetic interference) which reduces interference with nearby wireless devices. The cable is backwards compatible and can be used with the existing installed base of HDMI devices. "

Must use the same magical tech that USB3 cables use. :p

While I completely understand the need for better materials to ensure consistent, uninhibited signal flow (like CAT-5 vs CAT-6), this will only give 3rd party vendors a reason to charge a premium for these cables, slapping on star-bursts and bullet points on how superior they are to their predecessor. Unless the pin configurations have changed (looking around at the available data, it hasn't), any good quality HDMI cable should work. :cool:
I've had a different experience with HDMI 2.0. I tried six different cables, all '4K certified', and only one of them was able to transmit the full 18Gbps HDMI 2.0 signal over the 7m between my media PC and TV (props to Blue Jeans Cable for being the only good one). So while I agree that the bullet points are meaningless, it's not because cable quality is irrelevant but because the bullet points don't correspond to the actual quality of the cable. I suspect HDMI 2.1 will be even worse in this regard. I would have preferred if they had used a backwards compatible port but not a backwards compatible cable, so that there was a clear differentiation on which cables were designed for the standard. And I suspect that my 7m cable may be impossible on HDMI 2.1.
Posted on Reply
#7
bonehead123
FordGT90Concept said:
That's exactly why they changed to digital: encourage everyone to replace their TVs every few years instead of every decade.
Actually the change to digital, at least in the US, was a government-mandated change due to the fact that they told all TV stations and broadcasters that they had until 2010 to get with the program or be fined by the FCC and be subject to losing their licenses .....

The fact that the mandate seems to have coincided with other significant technical improvements in tv's, monitors, GPU's etc. is just more icing on the cake for us :)
Posted on Reply
#8
FordGT90Concept
"I go fast!1!11!1!"
R-T-B said:
Of course they want you to upgrade every few years. Our point is digital vs analog had nothing to do with it. They could've introduced some special HD analog standard just as easily.
Um, no. Not technically possible.
bonehead123 said:
Actually the change to digital, at least in the US, was a government-mandated change due to the fact that they told all TV stations and broadcasters that they had until 2010 to get with the program or be fined by the FCC and be subject to losing their licenses .....

The fact that the mandate seems to have coincided with other significant technical improvements in tv's, monitors, GPU's etc. is just more icing on the cake for us :)
And a huge boon to their bottom line. ATSC 1.0 is going to be phased out too sometime after ATSC 3.0 rolls out. Reason: ATSC 1.0 tuners don't know what to do with an ATSC 3.0 signal.

R-T-B said:
But we are way off topic now.
Nope, they go hand in hand. A "TV" is a TV because it has an NTSC/ATSC/PAL tuner in it. HDMI was practically invented to handle ATSC/PAL signals uncompressed with audio.

HDMI 2.0 was invented to handle 4K UHD BluRay. HDMI 2.1 is for 8K UHD.
Posted on Reply
#9
R-T-B
FordGT90Concept said:
Um, no. Not technically possible.
No? Funny. I seem to remember proprietary RF decoders and analog filters for the Disney Channel being a thing. As for upping the resolution? Of course that's possible, they do it over component video. Please elaborate. It seems very possible. All you need to do is make it so the TVs of the day can't view it.

Of course it's possible.

FordGT90Concept said:

Nope, they go hand in hand. A "TV" is a TV because it has an NTSC/ATSC/PAL tuner in it. HDMI was practically invented to handle ATSC/PAL signals uncompressed with audio.

HDMI 2.0 was invented to handle 4K UHD BluRay. HDMI 2.1 is for 8K UHD.
The argument you are driving that the analog to digital conversion was primarily to drive upgrades (and furthermore, that no other standard would do) is insanely offtopic.
Posted on Reply
#10
FordGT90Concept
"I go fast!1!11!1!"
R-T-B said:
No? Funny. I seem to remember proprietary RF decoders and analog filters for the Disney Channel being a thing. As for upping the resolution? Of course that's possible, they do it over component video. Please elaborate. It seems very possible. All you need to do is make it so the TVs of the day can't view it.

Of course it's possible.
You can always try to recover lost quality with an analog signal but the received signal will always be degraded from the transmitted signal. Digital is 1:1 so long as enough packets are received to decode the complete signal.


R-T-B said:
The argument you are driving that the analog to digital conversion was primarily to drive upgrades (and furthermore, that no other standard would do) is insanely offtopic.
If there was no push to digital TVs, the HDMI Forum would never have been created. On the other hand, DisplayPort was always an inevitability because professional displays (especially medical) require that 1:1 signal quality. They were the driving force behind the creation of DVI in the first place.
Posted on Reply
#11
wiyosaya
TechSpot did an article on using a 4K TV as a monitor a few months back for anyone interested.

FordGT90Concept said:

And a huge boon to their bottom line. ATSC 1.0 is going to be phased out too sometime after ATSC 3.0 rolls out. Reason: ATSC 1.0 tuners don't know what to do with an ATSC 3.0 signal.
Agreed. I, for one, am looking forward to ATSC 3.0 for the main reason that reception promises to be what it should have been with ATSC 1.0.
Posted on Reply
#12
FordGT90Concept
"I go fast!1!11!1!"
That's not changing. They're increasing compression and bandwidth.

The problem with DTV is that it's an all-or-nothing proposition. In most cases people don't have an antenna big enough (approximately a foot of antenna length per 10 miles of distance to the broadcast tower) to handle the signal from the distance they're receiving it, or they're not amplifying it enough to compensate for the number of connected devices. I get 95%+ reception here 24/7 and I'm 40-50 miles from the towers.

A lot of people buy dinky 3' circular antennas from Wal-Mart and expect it to magically work. If you aren't living in the cities where the broadcast towers are, mileage will vary greatly.
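The foot-per-10-miles rule of thumb above can be put into numbers; the function below is just a hypothetical illustration of that estimate, not an engineering formula.

```python
def antenna_length_ft(miles_to_tower):
    """Rule-of-thumb antenna sizing from the post above:
    roughly one foot of antenna length per 10 miles of
    distance to the broadcast tower."""
    return miles_to_tower / 10

# At 40-50 miles from the towers, a 3 ft antenna falls short:
print(antenna_length_ft(45))  # 4.5
```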
Posted on Reply
#13
Captain_Tom
Excellent. I would like a 10K@120Hz HDR display please......Please?

Oh right these idiots haven't even made a 5K@144Hz display yet, let alone a decent one with REAL HDR...

To think the OLED 4K HDR screen on my phone is probably nicer than ANY monitor you can buy right now for gaming under $5000 lol.
Posted on Reply
#15
Owen1982
lol Techpowerup

8K60, 4K120, Dynamic HDR etc etc

Shows picture of $3 HDMI cable from China o_O
Posted on Reply