
HDMI 2.1 Specification Sets New Resolution Standard

Broadcasts. NTSC -> ATSC

Still a silly argument. They could've upped the analog ante just as easily, and introduced weird standards there.

The fact is the reason they went digital was more efficient use of bandwidth channels, nothing more.
 
Broadcasts. NTSC -> ATSC
That's not even a cogent argument. A TV I bought a decade ago will still work. A TV I bought 20 years ago will still work. Converting from an analog source to a digital source has less than nothing to do with making people upgrade "every few years," since none of the digital standards we're talking about force any upgrades at all. And since this thread is about a connector standard specifically, it makes even less sense.
 
"The HDMI Forum is an open trade association" no, they're a mafia that forces companies to pay up for the HDMI license or else...
They even keep a list to name and shame companies that are no longer licensees of the HDMI standard - https://www.hdmi.org/learningcenter/adopters_terminated.aspx :kookoo:

DP on the other hand is royalty free...
 
Still a silly argument. They could've upped the analog ante just as easily, and introduced weird standards there.

The fact is the reason they went digital was more efficient use of bandwidth channels, nothing more.
That's not even a cogent argument. A TV I bought a decade ago will still work. A TV I bought 20 years ago will still work. Converting from an analog source to a digital source has less than nothing to do with making people upgrade "every few years," since none of the digital standards we're talking about force any upgrades at all. And since this thread is about a connector standard specifically, it makes even less sense.
In 2009, NTSC broadcasts ended. If you wanted to watch anything on an NTSC TV, you had to use an external ATSC tuner. They sucked. People put up with them for a while but when they could, they just bought a new TV.

Now, 4K TVs are all over the place but content is not. ATSC is to NTSC as ATSC 3.0 is to ATSC 1.0. It will introduce HEVC, 4K, and HDR standards, making them as commonplace as MPEG2, 720p, and 1080i are today. HDMI 2.0 was needed to transmit that signal. HDMI 2.1? First it was black and white TV, then it was color, then it was DTV, then it was HDTV, now we're getting to 4K UHD, and HDMI 2.1 exists solely to begin the push to 8K UHD. Not only does everyone have to buy new TVs to support 8K UHD, broadcasters have to buy new cameras, recording, editing, and transmission equipment, and people have to buy new receivers so their content isn't downscaled. We're talking billions of dollars easily, repeated about every decade or less.
 
In 2009, NTSC broadcasts ended. If you wanted to watch anything on an NTSC TV, you had to use an external ATSC tuner. They sucked. People put up with them for a while but when they could, they just bought a new TV.

Now, 4K TVs are all over the place but content is not. ATSC is to NTSC as ATSC 3.0 is to ATSC 1.0. It will introduce HEVC, 4K, and HDR standards, making them as commonplace as MPEG2, 720p, and 1080i are today. HDMI 2.0 was needed to transmit that signal. HDMI 2.1? First it was black and white TV, then it was color, then it was DTV, then it was HDTV, now we're getting to 4K UHD, and HDMI 2.1 exists solely to begin the push to 8K UHD. Not only does everyone have to buy new TVs to support 8K UHD, broadcasters have to buy new cameras, recording, editing, and transmission equipment, and people have to buy new receivers so their content isn't downscaled. We're talking billions of dollars easily, repeated about every decade or less.

Of course they want you to upgrade every few years. Our point is digital vs analog had nothing to do with it. They could've introduced some special HD analog standard just as easily.

But we are way off topic now.
 
" The cable ensures high- bandwidth dependent features are delivered including uncompressed 8K video with HDR. It features exceptionally low EMI (electro-magnetic interference) which reduces interference with nearby wireless devices. The cable is backwards compatible and can be used with the existing installed base of HDMI devices. "

Must use the same magical tech that USB3 cables use. :p

While I completely understand the need for better materials to ensure consistent, uninhibited signal flow (like CAT-5 vs CAT-6), this will only give 3rd party vendors a reason to charge a premium for these cables, slapping on star-bursts and bullet points on how superior they are to their predecessor. Unless the pin configurations have changed (looking around at the available data, it hasn't), any good quality HDMI cable should work. :cool:

I've had a different experience with HDMI 2.0. I tried six different cables, all '4K certified', and only one of them was able to transmit the full 18 Gb/s of HDMI 2.0 over the 7 m between my media PC and TV (props to Blue Jeans Cable for being the only good one). So while I agree that the bullet points are meaningless, it's not because cable quality is irrelevant but because the bullet points don't correspond to the actual quality of the cable. I suspect HDMI 2.1 will be even worse in this regard. I would have preferred a backwards-compatible port but not a backwards-compatible cable, so that there was a clear differentiation as to which cables were designed for the standard. And I suspect that my 7 m cable may be impossible on HDMI 2.1.
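For context on that 18 Gb/s figure: it falls out of pixel clock times bits per pixel, plus TMDS 8b/10b encoding overhead. A quick sketch (using the standard CTA total timing of 4400x2250 for 4K60 with 8-bit RGB; the function name is mine):

```python
def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Pixel clock (including blanking) x bpp, then x 10/8 for the
    TMDS 8b/10b encoding overhead -> line rate on the wire in Gb/s."""
    pixel_clock = h_total * v_total * refresh_hz          # Hz
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9    # Gb/s

# 4K60, 8-bit RGB, standard 4400x2250 total timing:
print(round(tmds_rate_gbps(4400, 2250, 60), 2))  # 17.82
```

That ~17.82 Gb/s is why HDMI 2.0 tops out at 18 Gb/s, and why a marginal cable that passes 1080p can still fail at full 4K60.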
 
That's exactly why they changed to digital: encourage everyone to replace their TVs every few years instead of every decade.

Actually the change to digital, at least in the US, was a government-mandated change: the FCC told all TV stations and broadcasters that they had until 2010 to get with the program or face fines and risk losing their licenses.

The fact that the mandate seems to have coincided with other significant technical improvements in TVs, monitors, GPUs, etc. is just more icing on the cake for us :)
 
Of course they want you to upgrade every few years. Our point is digital vs analog had nothing to do with it. They could've introduced some special HD analog standard just as easily.
Um, no. Not technically possible.
Actually the change to digital, at least in the US, was a government-mandated change: the FCC told all TV stations and broadcasters that they had until 2010 to get with the program or face fines and risk losing their licenses.

The fact that the mandate seems to have coincided with other significant technical improvements in TVs, monitors, GPUs, etc. is just more icing on the cake for us :)
And a huge boon to their bottom line. ATSC 1.0 is going to be phased out too sometime after ATSC 3.0 rolls out. Reason: ATSC 1.0 tuners don't know what to do with an ATSC 3.0 signal.

But we are way off topic now.
Nope, they go hand in hand. A "TV" is a TV because it has an NTSC/ATSC/PAL tuner in it. HDMI was practically invented to handle ATSC/PAL signals uncompressed with audio.

HDMI 2.0 was invented to handle 4K UHD BluRay. HDMI 2.1 is for 8K UHD.
 
Um, no. Not technically possible.

No? Funny. I seem to remember proprietary RF decoders and analog filters for the Disney Channel being a thing. As for upping the resolution? Of course that's possible; they do it over component video. Please elaborate. It seems very possible. All you need to do is make it so the TVs of the day can't view it.

Of course it's possible.

Nope, they go hand in hand. A "TV" is a TV because it has an NTSC/ATSC/PAL tuner in it. HDMI was practically invented to handle ATSC/PAL signals uncompressed with audio.

HDMI 2.0 was invented to handle 4K UHD BluRay. HDMI 2.1 is for 8K UHD.


The argument you are driving that the analog to digital conversion was primarily to drive upgrades (and furthermore, that no other standard would do) is insanely offtopic.
 
No? Funny. I seem to remember proprietary RF decoders and analog filters for the Disney Channel being a thing. As for upping the resolution? Of course that's possible; they do it over component video. Please elaborate. It seems very possible. All you need to do is make it so the TVs of the day can't view it.

Of course it's possible.
You can always try to recover lost quality with an analog signal but the received signal will always be degraded from the transmitted signal. Digital is 1:1 so long as enough packets are received to decode the complete signal.
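That degradation-vs-1:1 point can be shown with a toy relay chain in Python (a sketch with made-up noise figures, nothing to do with real ATSC modulation): an analog repeater passes accumulated noise along, while a digital receiver re-decides each bit at every hop and throws small noise away.

```python
import random
random.seed(1)

NOISE = 0.15  # per-hop noise amplitude (arbitrary choice for the demo)

def analog_hop(samples):
    """Analog repeater: signal and noise get passed along together."""
    return [s + random.uniform(-NOISE, NOISE) for s in samples]

def digital_hop(samples):
    """Digital repeater: thresholds back to a clean 0/1 at every hop."""
    return [1.0 if s + random.uniform(-NOISE, NOISE) > 0.5 else 0.0
            for s in samples]

bits = [random.choice([0.0, 1.0]) for _ in range(1000)]
a = d = bits
for _ in range(20):  # 20 hops through the chain
    a, d = analog_hop(a), digital_hop(d)

analog_errors = sum(1 for x, b in zip(a, bits) if abs(x - b) > 0.5)
digital_errors = sum(1 for x, b in zip(d, bits) if x != b)
print(analog_errors, digital_errors)  # digital side stays at 0 errors
```

Since the per-hop noise never exceeds the decision threshold, the digital chain reproduces the input exactly, while the analog chain's noise keeps growing with each hop; once the noise does exceed what the code can correct, digital fails completely rather than gracefully, which is the all-or-nothing behavior mentioned below.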


The argument you are driving that the analog to digital conversion was primarily to drive upgrades (and furthermore, that no other standard would do) is insanely offtopic.
If there was no push to digital TVs, the HDMI Forum would never have been created. On the other hand, DisplayPort was always an inevitability because professional displays (especially medical) require that 1:1 signal quality. They were the driving force behind the creation of DVI in the first place.
 
TechSpot did an article on using a 4K TV as a monitor a few months back for anyone interested.

And a huge boon to their bottom line. ATSC 1.0 is going to be phased out too sometime after ATSC 3.0 rolls out. Reason: ATSC 1.0 tuners don't know what to do with an ATSC 3.0 signal.
Agreed. I, for one, am looking forward to ATSC 3.0 for the main reason that reception promises to be what it should have been with ATSC 1.0.
 
That's not changing. They're increasing compression and bandwidth.

The problem with DTV is that it's an all or nothing proposition. In most cases people don't have an antenna big enough (approximately a foot of antenna length per 10 miles of distance to the broadcast tower) to handle the signal from the distance they're receiving it or they're not amplifying it enough to compensate for the number of connected devices. I get 95%+ reception here 24/7 and I'm 40-50 miles from the towers.

A lot of people buy dinky 3' circular antennas from Wal-Mart and expect it to magically work. If you aren't living in the cities where the broadcast towers are, mileage will vary greatly.
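Taking the rule of thumb above at face value (it's the poster's rough heuristic, not an engineering formula), the sizing works out like this:

```python
def antenna_length_ft(miles_to_tower):
    """Rule of thumb from the post above (a heuristic, not RF engineering):
    roughly one foot of antenna length per 10 miles to the broadcast tower."""
    return miles_to_tower / 10

print(antenna_length_ft(45))  # 4.5 -> ~4.5 ft for the 40-50 mile case above
```

By that heuristic, a 3' antenna is only good for roughly 30 miles, which is why the dinky circular ones disappoint anyone living outside the city.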
 
Excellent. I would like a 10K@120Hz HDR display please......Please?

Oh right these idiots haven't even made a 5K@144Hz display yet, let alone a decent one with REAL HDR...

To think the OLED 4K HDR screen on my phone is probably nicer than ANY monitor you can buy right now for gaming under $5000 lol.
 
Bring on TVs with Variable Refresh Rate!
 
lol Techpowerup

8K60, 4K120, Dynamic HDR etc etc

Shows picture of $3 HDMI cable from China o_O
 