
AMD's Next-Gen UDNA-Based Radeon GPUs to Use 80 Gbit/s HDMI 2.2

Did you actually understand the explanation, or are you just pretending to have understood it, given that you used the word 'intentionally'?

All RTX 50 series cards can do it, even the cheap RTX 5060s. Addressing something you mentioned later regarding signal integrity, I do not think that AIB 9070 XTs such as the Nitro or Taichi have a worse PCB than, say, a 5060 Ventus 2X. The conclusion that remains is that this is an intentional decision for market segmentation purposes, at least on RDNA 4.

This is pure nonsense. Could you give an example of a monitor whose image either RDNA 3 or RDNA 4 cards cannot fully display? I will tell you the answer: there is no such display.

You are already aware of the Odyssey Neo G9 with its 7680x2160/240 resolution and its insatiable bandwidth demands. You are also aware that RDNA 3 could handle it (with DSC) while Ada could not. It's safe to assume you are also aware that the situation is presently inverted and will eventually show itself again, once similarly extreme monitors with even higher specs appear on the market. At that point, the Nvidia cards will be ready for them, and the AMD ones won't. So, no, it's not nonsense, and even though it's a "theoretical", it's one that has presented itself very recently, after all.
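As a rough sanity check on the Neo G9's demands, the required bitrate can be worked out in a few lines. The figures below assume 10-bit RGB (30 bpp uncompressed), DSC at a typical 12 bpp target, and ignore blanking overhead, so the real numbers run slightly higher; the link efficiencies are the published line-code overheads (128b/132b for DP 2.1, 16b/18b for HDMI FRL):

```python
# Bandwidth estimate for the Odyssey Neo G9 (7680x2160 @ 240 Hz).
# Assumptions: 10-bit RGB = 30 bpp uncompressed, DSC at 12 bpp,
# blanking overhead ignored (real requirements are a bit higher).

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uncompressed = video_bitrate_gbps(7680, 2160, 240, 30)   # ~119.4 Gbit/s
with_dsc     = video_bitrate_gbps(7680, 2160, 240, 12)   # ~47.8 Gbit/s

links = {
    "DP 2.1 UHBR13.5 (DP54)": 54 * 128 / 132,  # ~52.4 Gbit/s effective
    "DP 2.1 UHBR20 (DP80)":   80 * 128 / 132,  # ~77.6 Gbit/s effective
    "HDMI 2.1 FRL (48G)":     48 * 16 / 18,    # ~42.7 Gbit/s effective
}

print(f"uncompressed: {uncompressed:.1f} Gbit/s")
print(f"with DSC:     {with_dsc:.1f} Gbit/s")
for name, cap in links.items():
    verdict = "fits" if with_dsc <= cap else "does not fit"
    print(f"{name}: {cap:.1f} Gbit/s -> {verdict} with DSC")
```

Even with DSC, the G9 overshoots HDMI 2.1's effective rate while squeezing under DP54, which matches the RDNA 3 vs. Ada situation described above.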

I hope black screens are sorted out and all ROPs correctly counted.


Like Nvidia in 2022, AMD reused PCBs with the same video traces on RDNA 4 cards. They are designing a new PCB for the HDMI 2.2 port and DP80 upgrade. That's more important.

Hold up, though. So back then, Nvidia could have added things like DP 2.0 on Ada boards despite the "PCB" purportedly being reused from Ampere, but now reusing RDNA 3 PCBs for RDNA 4 somehow excuses the throttled bandwidth? Wouldn't this be a double standard?

This can all work on today's ports with DSC.

I've never been one to be picky regarding DSC, because it's supposed to be visually lossless. I'll grant you this one: honestly, as long as the image looks visually lossless, I couldn't care less (example: raw PCM audio vs. FLAC). However, if we're to take DSC into account, the higher-bandwidth port could drive even higher resolutions still. The aforementioned G9 was designed to run at the limits of DP54 with DSC, for example. We've been at 1080p, 1440p and thereabouts for over 20 years now, and 4K is no longer a new thing either. It's time to start preparing for the future, and both DP 2.1 and HDMI 2.2 will have a crucial role here IMHO - the more bandwidth, the better.
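The "higher bandwidth takes you even higher with DSC" point can be put in numbers. This sketch assumes DSC at 12 bpp and ignores blanking overhead, so the results are optimistic upper bounds rather than exact mode limits:

```python
# Rough ceiling on refresh rate for a resolution over a given link,
# assuming DSC at 12 bpp and ignoring blanking overhead.

def max_refresh_hz(width, height, link_gbps, bits_per_pixel=12):
    """Upper-bound refresh rate achievable at the given link rate."""
    return link_gbps * 1e9 / (width * height * bits_per_pixel)

dp54 = 54 * 128 / 132   # DP 2.1 UHBR13.5, effective ~52.4 Gbit/s
dp80 = 80 * 128 / 132   # DP 2.1 UHBR20,   effective ~77.6 Gbit/s

for name, link in [("DP54", dp54), ("DP80", dp80)]:
    for w, h in [(7680, 2160), (7680, 4320)]:
        print(f"{name} {w}x{h}: ~{max_refresh_hz(w, h, link):.0f} Hz with DSC")
```

By this estimate, DP80 lifts the DSC ceiling at the G9's resolution from roughly 260 Hz to nearly 390 Hz, and at full 8K from about 130 Hz to about 195 Hz.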

The linked video literally answers this question. It's signal integrity on full-size ports. Wendell measured it with specialist equipment that is used for standard cable testing in the industry.

The only issue being that it was measured on previous-generation hardware. The W7800 is just a cut-down Navi 31 GPU, no different from the 7900 XTX. Both AMD and NVIDIA designed all-new display engines for RDNA 4 and Blackwell. The due diligence to enable DP80 on current-generation hardware was done by both companies; this much seems clear. On an end note, since I've quote-spammed enough: while you technically can have six full-size DP ports on a card, it's generally not done that way, to avoid obstructing the blower exhaust on these workstation cards. That much is also obvious, so I don't even know what we are arguing about at this point :D
 
You are already aware of the Odyssey Neo G9 with its 7680x2160/240 resolution and its insatiable bandwidth demands. You are also aware that RDNA 3 could handle it (with DSC) while Ada could not. It's safe to assume you are also aware that the situation is presently inverted and will eventually show itself again, once similarly extreme monitors with even higher specs appear on the market. At that point, the Nvidia cards will be ready for them, and the AMD ones won't. So, no, it's not nonsense, and even though it's a "theoretical", it's one that has presented itself very recently, after all.
Such a scenario will not happen with commercial monitors meant for mid-range and entry cards. It's a purely academic point at this moment, i.e. theoretical without inverted commas. Monitors beyond 8K/165 and beyond 4K/480 are not coming to market any time soon, not even by 2030, and those resolutions and refresh rates are supported. So, there is no practical point to be made here.
Hold up, though. So back then, Nvidia could have added things like DP 2.0 on Ada boards despite the "PCB" purportedly being reused from Ampere, but now reusing RDNA 3 PCBs for RDNA 4 somehow excuses the throttled bandwidth? Wouldn't this be a double standard?
No, because once the entire ecosystem moves away from DP 1.4, issues such as driving halo monitors like the 57-inch Samsung 8K/2K/240 Hz disappear. AMD competes in the mid-range and entry segments this generation, so the current solution is appropriate. The difference between reusing a PCB with DP 1.4 and one with DP54 is colossal: a different order of magnitude, to the point that the then most expensive card on the planet, the 4090, wasn't able to drive Samsung's display. Such a situation will not happen now with cards from either vendor. There is no point to be made here either.
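The 8K/165 and 4K/480 figures mentioned as the practical ceiling can be checked against a DP80 link under the same assumptions as before (DSC at roughly 12 bpp, blanking overhead ignored):

```python
# Check whether 8K/165 and 4K/480 fit within a DP80 link.
# Assumptions: DSC at 12 bpp, blanking overhead ignored.

def dsc_bitrate_gbps(width, height, refresh_hz, bpp=12):
    """Approximate DSC-compressed bitrate in Gbit/s."""
    return width * height * refresh_hz * bpp / 1e9

dp80 = 80 * 128 / 132   # UHBR20 effective rate, ~77.6 Gbit/s

for w, h, hz in [(7680, 4320, 165), (3840, 2160, 480)]:
    need = dsc_bitrate_gbps(w, h, hz)
    verdict = "fits" if need <= dp80 else "exceeds"
    print(f"{w}x{h}@{hz}: ~{need:.1f} Gbit/s -> {verdict} DP80")
```

Both modes land comfortably under the DP80 effective rate with DSC (roughly 66 and 48 Gbit/s respectively), which supports the claim that those resolutions and refresh rates are covered.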
It's time to start preparing for the future and both DP2.1 and HDMI 2.2 will have a crucial role here IMHO - and the more bandwidth, the better.
Good point. I agree.
That much is also obvious, so I don't even know what we are arguing about at this point :D
We are not arguing; we have fine-tuned the area of reasonable and practical consensus without being theoretically stubborn. A good outcome. Well done.
 
I've never been one to be picky regarding DSC, because it's supposed to be visually lossless. I'll grant you this one: honestly, as long as the image looks visually lossless, I couldn't care less (example: raw PCM audio vs. FLAC)
Nit: FLAC is actually lossless. DSC is not. It's perceptually lossless, which is a confusing way of saying "lossy, but not in a way that people usually notice." The level of perceptual transparency achieved by DSC (check Wikipedia and one of its references for specifics) is, I'd say, more comparable to the transparency of medium-to-high-bitrate MP3. In other words, some images are perceptually altered by DSC (you'd notice if DSC was rapidly toggled on and off), but most images aren't.
 
In other words, some images are perceptually altered by DSC (you'd notice if DSC was rapidly toggled on and off), but most images aren't.
This is true, and it has been measured in studies. Some participants notice it more than others, so there are individual differences in perception, just as there are individual differences in the perception of refresh rate and motion.

Professional photographers working with large HDR image files and high colour accuracy are usually advised to avoid connecting to monitors over DSC, as some photographs might show compression artefacts. But their monitors are usually 60-100 Hz, so the issue is unlikely to occur in the first place, as bandwidth is not a problem. Again, it's a niche effect.
 