
AMD's Next-Gen UDNA-Based Radeon GPUs to Use 80 Gbit/s HDMI 2.2

Little is known of the HDMI 2.2 spec yet, but if they're "only" claiming 4K 240 Hz at 12-bit, that would be an effective bandwidth of 82.27 Gbit/s against DP 2.1 UHBR20's 77.37 Gbit/s. A win is a win, but they're really grasping at straws here, especially when their protocol requires a 96 Gbit link while for DisplayPort 80 Gbit is enough. The cable requirements for DP 2.1 are already hard enough; HDMI will be even worse for barely any gain.
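Back-of-the-envelope check of those figures (a sketch only: it assumes a flat ~10% blanking overhead instead of exact CVT-RB v2 timings, and assumes HDMI 2.2 keeps 16b/18b FRL coding, which is not confirmed by the spec):

```python
# Rough uncompressed video bandwidth estimate for RGB output.
# blanking=1.10 approximates typical reduced-blanking overhead;
# exact CVT-RB v2 timings differ per mode.
def required_gbps(width, height, refresh_hz, bits_per_component, blanking=1.10):
    bits_per_pixel = 3 * bits_per_component  # RGB, no subsampling
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# Effective payload after line coding:
DP80_UHBR20 = 80 * 128 / 132  # DP 2.1 UHBR20, 128b/132b coding
HDMI22_96G = 96 * 16 / 18     # assumption: HDMI 2.2 reuses 16b/18b FRL coding

print(f"4K 240 Hz 12-bit: ~{required_gbps(3840, 2160, 240, 12):.1f} Gbit/s")
print(f"DP80 payload: ~{DP80_UHBR20:.2f} Gbit/s, HDMI 2.2 payload: ~{HDMI22_96G:.2f} Gbit/s")
```

Under these assumptions 4K240 12-bit lands just above what DP80 can carry uncompressed, and comfortably inside a 96 Gbit link, which is the gap the post above is pointing at.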

Why are you bothering, AMD?! People won't play games with 12-bit colors and they won't watch movies with 240 Hz motion; the "advantages" of HDMI 2.2 don't really matter. HDMI 2.1 is enough in the places where HDMI matters, and we have DisplayPort for the rest.
AMD's bothering because HDMI won't let 2.1 be implemented under Mesa so they're leapfrogging to 2.2 so they can implement it before HDMI says no this time :pimp:
 
Why are you bothering, AMD?! People won't play games with 12-bit colors and they won't watch movies with 240 Hz motion,
12-bit? Nobody cares about 12-bit. The point of the upgrade is to have similar bandwidth available in both ports, so that there is flexibility for users to connect GPUs and consoles to either a TV or a monitor with HDMI 2.2.

The whole ecosystem will transition to 2.2 bandwidth in 2027 and onwards, as current 48 Gbps on HDMI cannot do more than 4K/144 10-bit RGB.
the "advantages" of HDMI 2.2 don't really matter, HDMI 2.1 is enough in the places where HDMI matters, we have Display Port for the rest.
Bandwidth does matter on both interfaces. Users need to have a choice to connect whichever source device to whichever display. If both interfaces have similar bandwidth, this makes it a lot easier.
 
Drip, drip, drip. Both AMD and nGreedia deliberately hold the monitor industry back.

I can hear the nGreedia fanbois frothing at the mouth and screaming "who actually NEEDS more than display port 1.4!!!"
 
Why bother? I'll tell you why. It's because this will enable things like 10 bpc 8K at 60 Hz, 8 bpc 4K at 360 Hz, 8 bpc 1080p at 1440(!) Hz. It's also the stepping stone to resolutions beyond 8K. There's a monitor revolution simply waiting on such a link standard to become widely adopted, and it'll be DP that will have to up their game to match it.
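Those modes can be sanity-checked with the same kind of napkin math (a sketch: flat ~10% blanking assumed, and the ~85 Gbit/s payload figure assumes HDMI 2.2 keeps 16b/18b FRL coding, which is my assumption, not spec):

```python
# Highest uncompressed refresh rate a link payload can carry for a given mode,
# assuming RGB and a flat ~10% blanking overhead (not exact CVT timing).
def max_refresh_hz(width, height, bits_per_component, payload_gbps, blanking=1.10):
    bits_per_frame = width * height * 3 * bits_per_component * blanking
    return payload_gbps * 1e9 / bits_per_frame

payload = 96 * 16 / 18  # ~85.3 Gbit/s, assumed HDMI 2.2 payload
for w, h, bpc, label in [(7680, 4320, 10, "8K 10 bpc"),
                         (3840, 2160, 8, "4K 8 bpc"),
                         (1920, 1080, 8, "1080p 8 bpc")]:
    print(f"{label}: up to ~{max_refresh_hz(w, h, bpc, payload):.0f} Hz uncompressed")
```

The results land near the quoted figures: 8K 10 bpc clears 60 Hz with room to spare, 4K 8 bpc sits a bit under 400 Hz, and 1080p 8 bpc lands around 1.5 kHz.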
DP80 is fine for next couple of years. VESA will probably publish an update around 2027.

Edge scenarios are 8K/90 Hz at 8-bit, 8K/75 Hz at 10-bit, 5K/160 Hz at 10-bit, etc. Anything above that will force DSC. Such monitors will be extremely niche before 2030, so there's no urgency for DP 2.1 in the same way as there is urgency for HDMI to catch up.

Both AMD and nGreedia deliberately hold the monitor industry back
This was the case for Nvidia until the 5000 cards. AMD has offered both DP54 and DP80 ports since 2022. However, as monitor vendors are lazy and had been waiting passively for Nvidia to make the first move, we have just a handful of DP 2.1 monitors.

On the other hand, HDMI 2.2 will have huge traction in TV-based entertainment and gaming. I am sure that both LG and Samsung are already developing in-house display ICs for their TVs. They need to prepare new TVs for 2027 for new consoles that should also support 2.2, so that the entire industry can start the transition, just like they did in 2020.

It looks like UDNA arch will natively support HDMI 2.2 across all graphics domains, from GPUs to consoles to APUs.
 
How are they getting away with not implementing the full 96 Gbit/s spec?
 
AMD's bothering because HDMI won't let 2.1 be implemented under Mesa so they're leapfrogging to 2.2 so they can implement it before HDMI says no this time :pimp:
Hahaha. It doesn't work like that. The 2.2 spec is protected in the same way as 2.1, so no 2.2 in Linux.

I can hear the nGreedia fanbois frothing at the mouth and screaming "who actually NEEDS more than display port 1.4!!!"
No. Now they will be saying, "Who needs HDMI 2.2 when we have DP80?"

Just reply to them that gaming OLED TVs will not have DP80 in the near future, but HDMI 2.2. The same happened with HDMI 2.1 in 2019 and 2020. It was LG TVs leading the adoption, and monitors were lagging behind.

How are they getting away with not implementing the full 96 Gbit/s spec?
Hardware is not ready for this. The same happened with DP 2.1. The full-size DP80 physical port was not ready in 2022 when AMD was releasing cards with DP54. Read my other posts in this thread to find out.
 
How are they getting away with not implementing the full 96 Gbit/s spec?
All part of the drip, drip, drip plan. You will just need to buy another new card.

It will be exactly the same in the domestic HDTV market, where you will only get one HDMI 2.2 port at first; then you will get another couple of ports on next year's model, but some of the ports will be missing some feature that makes them useless.

There are STILL high-end HDTVs on the market which do not have a full array of HDMI 2.1 ports.

The home cinema receiver/soundbar market is more of the same, it takes multiple models, many years later to fully support a new standard.

It's all to screw you over and force you to buy updated models.
 
It's more about signal integrity and the availability of ports that can produce a perfect eye diagram. This technology is probably not yet mature. It will take time.
Yeah, that must be it! LG did 4x HDMI 2.1 ports on their OLED TVs 6 years ago, and there are still brand-new models from other manufacturers that limit their HDMI 2.1 ports!

But eye diagrams...
 
The problem with the display world is a lack of consistency. Most monitors now come with DisplayPort only, or some with HDMI only. Most newer video cards now only have DisplayPort and no HDMI. Then you have to get converters and adapters. They need to standardize, period. One thing I hate with DisplayPort is that sometimes you cannot get into the BIOS on certain boards unless you're connected via HDMI.
 
Hardware is not ready for this. The same happened with DP 2.1. The full-size DP80 physical port was not ready in 2022 when AMD was releasing cards with DP54. Read my other posts in this thread to find out.
Funny, their pro cards had it and it seemed an awful lot like a firmware lock in consumer land, considering they literally didn't change dies between the two parts.

The hardware was plenty well ready, AMD just decided the consumer wasn't.
 
Funny, their pro cards had it and it seemed an awful lot like a firmware lock in consumer land, considering they literally didn't change dies between the two parts.
The hardware was plenty well ready, AMD just decided the consumer wasn't.
Read my post #19 where things are explained as they happened. There is a linked video of Wendell's investigation where he tests video connections. Very informative. At that point in time, the physical characteristics of the full-size DP port were not ready enough for a stable DP80 signal. It's enough if you remember this idea for the time being. The port signal integrity test in Wendell's video clearly shows this, as he was also curious to find out what was going on.

Pro cards had DP80 speed on the mini-DP port only; it was DP54 on full-size DP ports, even on Pro cards. Wendell explains why this was the case. You may remember that Samsung released around that time a 57-inch 8K/2K 240 Hz monitor. That monitor has a mini-DP port, but it's not DP80 like on AMD Pro cards; it's DP54.

In 2022, AMD was the only company in the world that did their homework with a DP80 signal on any product, plus the first one to certify a VESA DP 2.1 signal on any laptop. It's on other companies to put in more effort too and implement the solutions on offer in their products. You can criticize AMD for not putting DP80 on client cards, which is a good discussion point, as I expected the same, but the choice available at that point in time was DP80 on mini-DP ports only. This would have meant that every client card would have had one mini-DP port in order to support the highest speed, in addition to other DP and HDMI ports. As you may imagine, many gamers would have hated to see yet another interface, as barely any monitor hosts a mini-DP port. This did not happen, and the world at that time was also screaming that there should be a return of USB-C ports. So, they can't make everyone happy when there are so many contradictory voices around.

It's enough if you watch Wendell's video to find out for yourself. I would not have known either, had I not seen the video.
 
I always use DisplayPort that is the superior port, so this is a meaningless move by AMD.
 
Read my post #19 where things are explained as they happened. There is a linked video of Wendell's investigation where he tests video connections. Very informative. At that point in time, the physical characteristics of the full-size DP port were not ready enough for a stable DP80 signal. It's enough if you remember this idea for the time being. The port signal integrity test in Wendell's video clearly shows this, as he was also curious to find out what was going on.

Pro cards had DP80 speed on the mini-DP port only; it was DP54 on full-size DP ports, even on Pro cards. Wendell explains why this was the case. You may remember that Samsung released around that time a 57-inch 8K/2K 240 Hz monitor. That monitor has a mini-DP port, but it's not DP80 like on AMD Pro cards; it's DP54.

In 2022, AMD was the only company in the world that did their homework with a DP80 signal on any product, plus the first one to certify a VESA DP 2.1 signal on any laptop. It's on other companies to put in more effort too and implement the solutions on offer in their products. You can criticize AMD for not putting DP80 on client cards, which is a good discussion point, as I expected the same, but the choice available at that point in time was DP80 on mini-DP ports only. This would have meant that every client card would have had one mini-DP port in order to support the highest speed, in addition to other DP and HDMI ports. As you may imagine, many gamers would have hated to see yet another interface, as barely any monitor hosts a mini-DP port. This did not happen, and the world at that time was also screaming that there should be a return of USB-C ports. So, they can't make everyone happy when there are so many contradictory voices around.

It's enough if you watch Wendell's video to find out for yourself. I would not have known either, had I not seen the video.

Mini DP is just DP my dude. You can get one cable that plugs into standard port monitors. They don't have to have the same form factor plug on both ends lmao. If AMD felt like it they could do USB C to DP. Indeed some vendors do exactly that. But good attempt to paint AMD as perfect, as usual, "oh they segmented bandwidth because gamers would actually prefer lower bandwidth ports for their $1000+ monitors, instead of buying a $10 cable or using the one in the box, instead of limiting high bandwidth to more expensive cards".
 
Yeah, I get the explanation and plausible reasons, but I also don't think AMD did the right thing here by intentionally limiting the connection speed. GPUs usually stick around for some time in the market, and disabling this functionality serves no meaningful purpose, especially since inducing upselling when you're AMD is quite the double-edged sword: if a user is forced to upgrade their GPU due to an imposed limitation on the display engine, chances are they are going to buy something faster to ensure a better experience at such high resolutions, and AMD does not offer something faster at this point in time. Like I said, I don't claim to know anything about the electrical limitations, but mDP is used in Pro GPUs so they can have 6 display outs, as far as I'm aware. Not because it allows higher stability or bandwidth.

Good job securing another 5080/5090 sale, I guess?
 
Mini DP is just DP my dude. You can get one cable that plugs into standard port monitors. They don't have to have the same form factor plug on both ends lmao. If AMD felt like it they could do USB C to DP. Indeed some vendors do exactly that. But good attempt to paint AMD as perfect, as usual, "oh they segmented bandwidth because gamers would actually prefer lower bandwidth ports for their $1000+ monitors, instead of buying a $10 cable or using the one in the box, instead of limiting high bandwidth to more expensive cards".
Wendell specifically says the mini-DP to DP cables don't work for 80 gigabit. Exact timecode link:
and auto-generated transcript: "so here's the bad news about that display port 2.1 mini display port connector all of the cables and other connectors and anything that you might get to go from mini display port 2.1 to full-size display port like if you were going to hook it up to a fullsize display port 2.1 50 gbit monitor are terrible so don't be tempted to do that"

I think there comes a time, maybe it's after someone you're arguing with has said four times that their claims are backed by the contents of the video, that you just gotta watch the video before writing any more retorts. (Or maybe you did and glossed over this part. I also hate watching videos or deciphering transcripts instead of reading. It's why I'm a monthly supporter of TPU and not of any YT channels.)
 
Wendell specifically says the mini-DP to DP cables don't work for 80 gigabit. Exact timecode link:
and auto-generated transcript: "so here's the bad news about that display port 2.1 mini display port connector all of the cables and other connectors and anything that you might get to go from mini display port 2.1 to full-size display port like if you were going to hook it up to a fullsize display port 2.1 50 gbit monitor are terrible so don't be tempted to do that"

I think there comes a time, maybe it's after someone you're arguing with has said four times that their claims are backed by the contents of the video, that you just gotta watch the video before writing any more retorts. (Or maybe you did and glossed over this part. I also hate watching videos or deciphering transcripts instead of reading. It's why I'm a monthly supporter of TPU and not of any YT channels.)
People being unable to source high-quality certified cables is not my problem. I also had to check VESA's certification lists before buying my latest cables; the vast majority of advertised cables were not on that list. The actual cable is the same between the connectors. So perhaps, instead of linking a Wendell video I don't have time to sit down and watch, you can explain to me in this text-based forum (not YouTube) how, in your mind, the mini version of a connector couldn't connect to the full-size version, considering it's the same standard with the same cable between the connectors. That's ignoring the fact that most monitors don't have a port that supports that bitrate in the first place, and that most well-reviewed cables on Amazon and other places are not in fact certified, being Chinesium instead.

I mean even if it is impossible, which I'm not discounting the possibility of, just buy a monitor with mini DP? People buying specific cards for specific connector standards and specific bitrates are probably capable of checking the specs of the monitor they buy too?
 
Wendell specifically says the mini-DP to DP cables don't work for 80 gigabit. Exact timecode link:
and auto-generated transcript: "so here's the bad news about that display port 2.1 mini display port connector all of the cables and other connectors and anything that you might get to go from mini display port 2.1 to full-size display port like if you were going to hook it up to a fullsize display port 2.1 50 gbit monitor are terrible so don't be tempted to do that"

I think there comes a time, maybe it's after someone you're arguing with has said four times that their claims are backed by the contents of the video, that you just gotta watch the video before writing any more retorts. (Or maybe you did and glossed over this part. I also hate watching videos or deciphering transcripts instead of reading. It's why I'm a monthly supporter of TPU and not of any YT channels.)

There was some truth to this. Such cables were commercially unavailable at that time, when RDNA 3 was new. However, equipment to ensure this worked was available and such cables do exist in 2025.


What I know of mDP and DP is that they are only different form factors, and otherwise electrically compatible. What I do not know is whether there is anything peculiar to RDNA 3 itself that would prevent this from happening; my initial theory is that it's a simple FW lock, but I have no evidence nor substantial knowledge on this subject, which is why I am happy to concede here. I've always been an HDMI guy, anyway.
 
There's a monitor revolution simply waiting on such a link standard to become widely adopted, and it'll be DP that will have to up their game to match it.

Is there? First, HDMI 2.2 barely increases the bandwidth over DP 2.1, and both standards are enough for everything the monitor market has even hinted at having available.

12-bit? Nobody cares about 12-bit. The point of the upgrade is to have similar bandwidth available in both ports, so that there is flexibility for users to connect GPUs and consoles to either a TV or a monitor with HDMI 2.2.

The whole ecosystem will transition to 2.2 bandwidth in 2027 and onwards, as current 48 Gbps on HDMI cannot do more than 4K/144 10-bit RGB.

Dolby Vision cares about 12-bit, but it also doesn't need anywhere near this bandwidth, because it doesn't use 240 Hz.

HDMI 2.2 will have huge traction in TV-based entertainment and gaming.

How? Films don't push anywhere near this bandwidth (low frame rates), and consoles won't push frame rates meaningfully higher either: they barely do 60 fps now, so there's no way they're jumping to 240 Hz by next generation.

The problem with the display world is a lack of consistency. Most monitors now come with DisplayPort only, or some with HDMI only. Most newer video cards now only have DisplayPort and no HDMI. Then you have to get converters and adapters. They need to standardize, period. One thing I hate with DisplayPort is that sometimes you cannot get into the BIOS on certain boards unless you're connected via HDMI.

Display Port and HDMI are standards. Your comment doesn't make any sense.

Mini DP is just DP my dude

You missed the point: maybe there weren't full-size DP ports available that were able to meet the requirements for an 80 Gbit connection, while they existed for mDP, with AMD choosing to limit the bandwidth instead of using mDP ports and having people complain about buying a cheap adapter cable.
 
Is there? First, HDMI 2.2 barely increases the bandwidth over DP 2.1, and both standards are enough for everything the monitor market has even hinted at having available.

Yup. 4K360 will exceed DP80's capabilities, as will 8K90, 7680x2160 DUHDs at 165Hz, etc.

Seems extreme now, but eventually these monitors will become commonplace as GPUs start being able to power them
 
Yup. 4K360 will exceed DP80's capabilities, as will 8K90, 7680x2160 DUHDs at 165Hz, etc.

Seems extreme now, but eventually these monitors will become commonplace as GPUs start being able to power them

They're not using the required and available interfaces for 4K above 120 Hz; I don't see why they are, or will be, in any rush to use them for 4K360.
 
Dolby Vision cares about 12-bit, but it also doesn't need anywhere near this bandwidth, because it doesn't use 240 Hz.
DV is usually encoded in 12-bit 4:2:2, but it's delivered in a container for viewing on 10-bit displays. The commercial display industry will not move to native 12-bit images and content in the next decade. It's far away: beyond 2035.

How? Films don't push anywhere near this bandwidth (low frame rates), and consoles won't push frame rates meaningfully higher either: they barely do 60 fps now, so there's no way they're jumping to 240 Hz by next generation.
The current LG OLED is 4K/165 Hz, and it's forced to use DSC on HDMI 2.1, like any display with an image beyond 4K/144 Hz 10-bit RGB.

LG and Samsung create in-house HDMI chipsets for their TVs. They will be racing to be the first to release the new ports. Last time, LG beat Samsung in 2019 with 48 Gbps ports. Samsung didn't forget this. Their first TVs with HDMI 2.2 are expected around CES 2027, in time for new consoles and graphics cards with the same ports.

New consoles will target 4K/120 native in popular titles, and more with upscaling. Several shooters and other less demanding games can easily reach 4K/200 and above.

So, if you plan home tech upgrade around 2027 and your gear needs more video bandwidth, make sure you buy devices with the new HDMI 2.2 ports.

Mini DP is just DP my dude. You can get one cable that plugs into standard port monitors. They don't have to have the same form factor plug on both ends lmao. If AMD felt like it they could do USB C to DP. Indeed some vendors do exactly that. But good attempt to paint AMD as perfect, as usual, "oh they segmented bandwidth because gamers would actually prefer lower bandwidth ports for their $1000+ monitors, instead of buying a $10 cable or using the one in the box, instead of limiting high bandwidth to more expensive cards".
Did you actually watch the video that I linked in post 19? If you did, what did you learn from it?

Besides, it's irrelevant for gamers, not only because no monitor vendor at that time offered any displays with DP 2.1 ports, but because it's not an issue. AMD was alone in the industry, waiting for others to get their act together, including Nvidia and the monitor industry at large. The few initial displays that did come out in 2023 featured only DP40 and DP54 ports. Check it out.

The only cards that were actually able to fully run the most demanding monitor bandwidth-wise, the 57-inch Samsung 8K/2K 240 Hz, were in fact RDNA3 cards. Nvidia 4000 cards could manage only 120 Hz due to limited display heads on DP 1.4. The 4090, the then most expensive client card on the planet, couldn't run the monitor at its full image. That was funny.

Only now, with the 5000 series, can those cards properly run this monitor. Guess what? It doesn't matter whether this monitor runs over a DP80 or DP54 port from the GPU. In both cases it works, as DSC is used.

Gamers have played on 4K/144Hz monitors for years on DP 1.4 ports, and somehow nobody complained that there was not enough bandwidth. So, don't be silly.
I always use DisplayPort that is the superior port, so this is a meaningless move by AMD.
Nonsense. Try using DisplayPort on a large, beautiful 4K OLED TV to play games. Good luck with adapters.
Yeah, I get the explanation and plausible reasons but I also don't think AMD did the right thing here by intentionally limiting the connection speed.
Did you understand the explanation, or do you pretend to have understood it, since you mentioned the word 'intentionally'?

I have a feeling that several comments I have read in this forum from a few members sound conspiratorial at times, for no apparent reason.
GPUs usually stick around for some time in the market, and disabling this functionality serves no meaningful purpose, especially since inducing upselling when you're AMD is quite the double-edged sword: if a user is forced to upgrade their GPU due to an imposed limitation on the display engine, chances are they are going to buy something faster to ensure a better experience at such high resolutions, and AMD does not offer something faster at this point in time.
This is pure nonsense. Could you give an example of a monitor whose image either RDNA3 or RDNA4 cards cannot fully display? I will tell you the answer: there is no such display.

Gamers have played on 4K/144 Hz displays with limited bandwidth on DP 1.4 ports for many years. Have you heard anyone complaining about their experience?

Users are forced to upgrade from Nvidia 4000 to Nvidia 5000 because the older cards have DP 1.4 ports and cannot fully drive monitors such as the 57-inch Samsung 8K/2K 240 Hz. The 4090 managed only 120 Hz due to display-head limitations. The 5000 cards do bring a better experience on this monitor because those cards can finally display a full image. Neither RDNA3 nor RDNA4 cards have ever had any issue with this monitor, thanks to more bandwidth on DP 2.1 with DSC.
Like I said, I don't claim to know anything about the electrical limitations, but mDP is used in Pro GPUs so they can have 6 display outs, as far as I'm aware. Not because it allows higher stability or bandwidth.
No. You can have six HDMI or six full size DP ports too. It would simply take more space.

If you don't know something, that's fine. Ask questions and we will do our best to answer. The worst thing I always find in tech fora is that many pretend that they know things and rush into superficial judgements.
Good job securing another 5080/5090 sale, I guess?
I hope black screens are sorted out and all ROPs correctly counted.

you can explain to me in this text-based forum (not YouTube) how, in your mind, the mini version of a connector couldn't connect to the full-size version, considering it's the same standard with the same cable between the connectors.
I explained it a few times. Did you read it? Here it is again. Initially, the physical characteristics of full-size DP ports produced too much noise in the DP80 signal, and therefore signal integrity was not great. Wendell showed the eye diagram to demonstrate this. A DP80 signal would work, but it could produce artifacts, such as snow, smudges, etc. So, the designers of the port needed to go back to the drawing board and procure ports with better materials for such high speed. Mini-DP didn't have this issue. Plus, there was an obvious practical issue with 1-meter cables, as you already know.

So, the early days with the full-speed port were rough due to a lack of opportunities to troubleshoot at the plugfests that were cancelled during the pandemic.

There is also a reason why DP 2.1 was upgraded from DP 2.1a to DP 2.1b, to allow DP80LL cables.

I mean even if it is impossible, which I'm not discounting the possibility of, just buy a monitor with mini DP?
Very rare in the gaming segment. Those are more prevalent on Apple workstations or in professional space, usually at 60 Hz.

That's why I said that gamers would have been very upset to see mini-DP on AMD GPUs, or Nvidia cards for that matter, as that port is not popular on mainstream gaming monitors. There would have been a hysterical outcry. You know very well how hysterical and noisy the Internet and tech press can be... I can hear it already: "waste of a port", "no monitors with mDP", etc.

There was some truth to this. Such cables were commercially unavailable at that time, when RDNA 3 was new. However, equipment to ensure this worked was available and such cables do exist in 2025
Sure, but it will not make a difference in 2025. All connected monitors will work anyway, on both DP54 and DP80 ports. You wouldn't even know which port is connected unless you looked into the EDID to check the signaling details of the video signal.

Like Nvidia in 2022, AMD reused PCB with the same video traces on RDNA4 cards. They are designing a new PCB for HDMI 2.2 port and DP80 upgrade. That's more important.

What I know of mDP and DP is that they are only different form factors, and otherwise electrically compatible. What I do not know is whether there is anything peculiar to RDNA 3 itself that would prevent this from happening; my initial theory is that it's a simple FW lock, but I have no evidence nor substantial knowledge on this subject, which is why I am happy to concede here. I've always been an HDMI guy, anyway.
The linked video literally answers this question. It's signal integrity on full-size ports. Wendell measured it with the specialist equipment that is used for standard cable testing in the industry.

Yup. 4K360 will exceed DP80's capabilities, as will 8K90, 7680x2160 DUHDs at 165Hz, etc.

Seems extreme now, but eventually these monitors will become commonplace as GPUs start being able to power them
This can all work on today's ports with DSC.
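A quick sketch of why DSC makes this fit (assuming a common 12 bits-per-pixel DSC target, roughly 3:1 for 10-bit RGB; actual DSC configurations vary per display):

```python
# With DSC the link carries a fixed compressed target (commonly 12 bpp)
# instead of the raw pixel depth; flat ~10% blanking overhead assumed.
def dsc_gbps(width, height, refresh_hz, target_bpp=12, blanking=1.10):
    return width * height * refresh_hz * target_bpp * blanking / 1e9

print(f"4K 360 Hz with DSC: ~{dsc_gbps(3840, 2160, 360):.1f} Gbit/s")
print(f"8K 90 Hz with DSC: ~{dsc_gbps(7680, 4320, 90):.1f} Gbit/s")
```

Both come out around 40 Gbit/s, inside even an HDMI 2.1 or DP54 link; compressed, today's ports already have the headroom.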
 