
AMD Navi Radeon Display Engine and Multimedia Engine Detailed

btarunr

Editor & Senior Moderator
Two of the often overlooked components of a new graphics architecture are its I/O and multimedia capabilities. With its Radeon RX 5700-series "Navi 10" graphics processor, AMD gave the two their first major update in over two years with the new Radeon Display Engine and Radeon Multimedia Engine. The Display Engine is the hardware component that handles the graphics card's physical display I/O. The Radeon Multimedia Engine is a set of fixed-function hardware blocks that provide codec-specific acceleration to offload video encoding and decoding from your CPU.

The Navi Radeon Display Engine features an updated DisplayPort 1.4 HDR implementation capable of driving 8K displays at 60 Hz over a single cable, or 4K UHD at up to 240 Hz over a single cable, in both cases with HDR and 10-bit color. It achieves this by implementing DSC 1.2a (Display Stream Compression). The display controller also supports 30 bpp internal color-depth. The HDMI implementation remains HDMI 2.0. The multi-plane overlay (MPO) implementation now supports a low-power mode, which should, in theory, reduce the GPU's power draw when idling or playing back video.
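As a rough illustration of why DSC is required for those modes, here is a minimal Python sketch of the bandwidth arithmetic. It counts active pixels only, ignores blanking and any protocol overhead beyond 8b/10b coding, and assumes DSC's nominal ~3:1 "visually lossless" ratio, so treat the figures as approximations rather than spec numbers.

```python
# Back-of-the-envelope check (approximate, not an official spec calculation):
# raw pixel data rates vs. DisplayPort 1.4's effective bandwidth, and what a
# nominal ~3:1 DSC ratio does to them. Active pixels only; blanking and
# protocol overhead beyond 8b/10b coding are ignored.

DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10  # HBR3 x4 lanes after 8b/10b coding ≈ 25.92 Gbps

def raw_rate_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed pixel data rate in Gbit/s (active pixels only)."""
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "8K @ 60 Hz, 10-bit RGB":  (7680, 4320, 60, 30),
    "4K @ 240 Hz, 10-bit RGB": (3840, 2160, 240, 30),
}

for name, args in modes.items():
    raw = raw_rate_gbps(*args)
    dsc = raw / 3  # DSC is typically run at roughly 3:1 ("visually lossless")
    print(f"{name}: raw ≈ {raw:.1f} Gbps "
          f"({'fits' if raw <= DP14_EFFECTIVE_GBPS else 'exceeds'} DP 1.4 alone), "
          f"with ~3:1 DSC ≈ {dsc:.1f} Gbps")
```

Both modes come out near 60 Gbps uncompressed, well past DP 1.4's roughly 25.9 Gbps of usable bandwidth, which is why the compression step is what makes single-cable 8K60 and 4K240 possible.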



The Radeon Multimedia Engine is updated with support for more codecs. The "Navi 10" GPU provides hardware acceleration for decoding VP9 video at up to 4K @ 90 fps (frames per second) or 8K @ 24 fps. The H.265 HEVC implementation is more substantial, with hardware-accelerated encoding of 4K at frame rates of up to 60 fps, while HEVC decoding is accelerated at 8K @ 24 fps, 4K @ 90 fps, and 1080p at up to 360 fps. H.264 (MPEG-4 AVC) gets a boost to 4K @ 150 fps and 1080p @ 600 fps decoding, and 4K @ 90 fps and 1080p @ 150 fps encoding.
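As a hypothetical example of how software typically taps fixed-function blocks like these instead of the CPU, here is a minimal Python sketch driving FFmpeg's VAAPI path on Linux. The device node, file names, and bitrate are assumptions made for illustration, and actual codec and resolution limits depend on the GPU and driver stack.

```python
# Hypothetical sketch: offloading an HEVC transcode to the GPU's fixed-function
# video block via FFmpeg's VAAPI path on Linux. The device node, file names and
# bitrate below are assumptions for illustration; ffmpeg must be built with
# VAAPI support, and codec/resolution limits depend on the GPU and drivers.
import subprocess

VAAPI_DEVICE = "/dev/dri/renderD128"  # typical render node for the first GPU

def transcode_to_hevc(src: str, dst: str, bitrate: str = "20M") -> None:
    """Decode the source and re-encode it to H.265 on the GPU, not the CPU."""
    cmd = [
        "ffmpeg",
        "-hwaccel", "vaapi",                # hardware-accelerated decode
        "-hwaccel_device", VAAPI_DEVICE,
        "-hwaccel_output_format", "vaapi",  # keep decoded frames in GPU memory
        "-i", src,
        "-c:v", "hevc_vaapi",               # hardware HEVC encoder
        "-b:v", bitrate,
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_to_hevc("gameplay_4k.mp4", "gameplay_4k_hevc.mkv")
```

On Windows, the equivalent route is typically FFmpeg's AMF encoders (hevc_amf, h264_amf) with a DXVA2 or D3D11VA decode path.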

View at TechPowerUp Main Site
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol
Why doesn't Nvidia's superior architecture have HDMI 2.1, then?
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol

You're right, it's not HDMI 2.1, but they will be adopting DSC

It achieves this by implementing DSC 1.2a (Display Stream Compression). The display controller also supports 30 bpp internal color-depth.

If you have a compatible monitor, you won't be dumbed down to 8-bit 4:2:0 when going over 4K 90 Hz like current cards. You'll be able to do up to 8K 16-bit 4:4:4
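For context, a quick sketch (assumptions noted in the comments) of why current cards fall back to 8-bit 4:2:0 at high 4K refresh rates without DSC: subsampled chroma carries far fewer bits per pixel than full 4:4:4.

```python
# Rough sketch of why chroma subsampling gets used once uncompressed RGB no
# longer fits: 8-bit 4:2:0 carries 12 bits per pixel vs. 30 bits for 10-bit
# 4:4:4. Active pixels only; blanking adds a bit more on top.

DP14_GBPS = 32.4 * 8 / 10  # ≈ 25.92 Gbps effective on DP 1.4

def bits_per_pixel(bit_depth, subsampling):
    # 4:4:4 = 3 full-resolution components per pixel; 4:2:0 = 1 luma plus
    # quarter-resolution chroma, i.e. 1.5 samples per pixel on average.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return bit_depth * samples

for desc, (w, h, fps, depth, sub) in {
    "4K 120 Hz,  8-bit 4:2:0": (3840, 2160, 120, 8, "4:2:0"),
    "4K 120 Hz, 10-bit 4:4:4": (3840, 2160, 120, 10, "4:4:4"),
}.items():
    gbps = w * h * fps * bits_per_pixel(depth, sub) / 1e9
    verdict = "fits uncompressed" if gbps <= DP14_GBPS else "needs DSC"
    print(f"{desc}: ≈ {gbps:.1f} Gbps -> {verdict}")
```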
 
You're right, it's not HDMI 2.1, but they will be adopting DSC



If you have a compatible monitor, you won't be dumbed down to 8-bit 4:2:0 when going over 4K 90 Hz like current cards. You'll be able to do up to 8K 16-bit 4:4:4

This isn't about monitors, but TVs. Not everyone wants to game on paper-tissue-sized displays ;)
LG's OLED lineup this year already has HDMI 2.1, so you could use 4K 120 Hz with HDR and full RGB if you want, and even VRR.
(Last year's lineup can also do 4K 60 Hz or FHD 120 Hz, but in HDR you can't get full RGB at 10-12 bit, since the HDMI 2.0 bandwidth is not enough.)
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol
This isn't about monitors, but TVs. Not everyone wants to game on paper-tissue-sized displays ;)
LG's OLED lineup this year already has HDMI 2.1, so you could use 4K 120 Hz with HDR and full RGB if you want, and even VRR.
(Last year's lineup can also do 4K 60 Hz or FHD 120 Hz, but in HDR you can't get full RGB at 10-12 bit, since the HDMI 2.0 bandwidth is not enough.)
From what I've read, this comes down to the HDMI consortium to a large degree - they focus their example hardware designs and other early support on the AV industry almost exclusively, leading to PC hardware always being late to adopt new HDMI standards. Designing and implementing the hardware required to drive a display output is not a trivial task, and without the support of the organization designing the interface, everything is bound to be slow work. Heck, HDMI 2.0 was on TVs and STBs at least a year before the first PCs and GPUs.
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol

It should be noted that almost no one is implementing HDMI 2.1, because HDMI 2.1 is quite a large leap forward. Furthermore, you can implement HDMI 2.1 features into HDMI 2.0 and still be compliant with HDMI 2.0, which is what a lot of companies have been doing. I'm pretty sure the only company with HDMI 2.1 at the moment is LG.

The funny thing is, Sony's rated specs for the PS5 mean the PS5 will be running HDMI 2.1, so it's obvious that AMD has already mostly figured out its implementation.
 
From what I've read, this comes down to the HDMI consortium to a large degree - they focus their example hardware designs and other early support on the AV industry almost exclusively, leading to PC hardware always being late to adopt new HDMI standards. Designing and implementing the hardware required to drive a display output is not a trivial task, and without the support of the organization designing the interface, everything is bound to be slow work. Heck, HDMI 2.0 was on TVs and STBs at least a year before the first PCs and GPUs.

Yeah, but what else will utilize HDMI 2.1, if not PC video cards? What else could do 120 Hz in 4K with VRR, and why...
So the decision at RTG for the new Navis not to have HDMI 2.1 is not really understandable. Or they rushed the announcement too much. Either way, it wasn't the right decision.

...Furthermore, you can implement HDMI 2.1 features into HDMI 2.0 and still be compliant with HDMI 2.0, which is what a lot of companies have been doing....

Please explain this? Seems like science fiction to me...
 
It could be the hardware supports 2.1 but it's not enabled yet, perhaps.

Or they will have an IP block done for the next versions/consoles.

The comment about putting 2.1 features onto 2.0 should be correct; if you can keep the total bandwidth under the 2.0 spec (18 Gbps?), you could in theory add in eARC and other things, e.g. if you were running at 4K/60 SDR there might be room for eARC to send 3D sound. Not ideal, but it could work.
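For context, a quick sketch of the bandwidth arithmetic behind that reasoning, assuming HDMI 2.0's 18 Gbps raw TMDS rate and counting active pixels only (real timings add blanking, so actual requirements are somewhat higher). It shows which modes are bandwidth-bound and therefore have to wait for HDMI 2.1, versus what fits today:

```python
# Quick arithmetic (active pixels only, so real requirements are a bit higher
# once blanking is included): which modes fit inside HDMI 2.0's ~14.4 Gbps of
# video data (18 Gbps raw TMDS minus 8b/10b coding overhead)?

HDMI20_DATA_GBPS = 18.0 * 8 / 10  # ≈ 14.4 Gbps usable for video

def needs_gbps(w, h, fps, bpp):
    return w * h * fps * bpp / 1e9

for name, mode in {
    "4K60  8-bit 4:4:4 (SDR)": (3840, 2160, 60, 24),
    "4K60 10-bit 4:4:4 (HDR)": (3840, 2160, 60, 30),
    "4K120 8-bit 4:4:4":       (3840, 2160, 120, 24),
}.items():
    g = needs_gbps(*mode)
    print(f"{name}: ≈ {g:.1f} Gbps -> "
          f"{'fits in HDMI 2.0' if g < HDMI20_DATA_GBPS else 'needs HDMI 2.1'}")
```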
 
Either way, it wasn't the right decision.
Please explain this? Seems like science fiction to me...

Gotta release the card at some point. It's also pretty obvious we are getting a second Navi refresh next year. Shrug. For a PC card in the PC space, DisplayPort is more important (and Navi is fully compliant there anyway). Sure, I'd like more features, but PCs connected to high-end TVs are pretty niche.

As for HDMI 2.1 features on HDMI 2.0, a few manufacturers have made some features available that are slated for 2.1. ALLM is the one that immediately comes to mind. What nemesis.ie said is pretty much correct: anything that requires bandwidth won't come until HDMI 2.1, while things that don't, manufacturers have been adding via firmware (especially Samsung).
 
Where's the USB-C connector for VR headsets though? They claim "Single IO connectivity" for head mounted displays, but apparently don't have it...
AMD is even part of the VirtualLink consortium...
 
Maybe they mean for a future DP1.4 standard? Or you have a splitter box on the headset/user's belt or an adapter on the DP?
 
So LG was able to come out with HDMI 2.1 in May, but AMD didn't manage the same? They both had the same time.

And as someone wrote here earlier, AMD HAS to adopt the technology too, since the new consoles can't give you the announced 4K 120 Hz without it, and they do deliver hardware for them.

So all in all, this was just a rushed release, on a small budget. Like everything in the past years of RTG was.
 
So all in all, this was just a rushed release, on a small budget. Like everything in the past years of RTG was.

So Samsung, Sony, TCL, HiSense, insert any other brand here, are on small budgets and rushed releases as well? Your statement is cherry-picking at its finest.
 
So all in all, this was just a rushed release, on a small budget. Like everything in the past years of RTG was.

Oh man, I am sure people will go crazy that they can't get 4K 120 Hz with their budget card and the almost nonexistent, prohibitively expensive monitors that support it. A tragedy, really.
 
Yes.

And you CAN get it, if you buy a DP monitor in due course - which will likely be very expensive (sadly).

It's "only" excluding 8K TVs at the moment, and maybe there are a few 4K/120 Hz TVs appearing too, both of which are also expensive.

It would be nice, but it's still very "early adopter" territory, which you normally pay a lot extra for.

These screens will probably start appearing in the mainstream next year, and the next Navi/consoles will be ready.
 
See, this is what's wrong with the general public... the list of entitlement never ends. It's sad that some of the most ignorant comments/statements come from some of the smartest people. Smh
 
Navi's development was simply too far advanced to implement HDMI 2.1 when the HDMI consortium released the final 2.1 specs.
 
"optimized for high resolution HDR displays"

And yet, they fail to implement HDMI 2.1, lol

Why complain though? Expecting HDMI 2.1 from Navi on PC is really dumb when even the hyper-expensive 2080 Ti only supports HDMI 2.0b.
 
Yeah, but what else will utilize HDMI 2.1, if not PC video cards? What else could do 120 Hz in 4K with VRR, and why...
So the decision at RTG for the new Navis not to have HDMI 2.1 is not really understandable. Or they rushed the announcement too much. Either way, it wasn't the right decision.
It sounds like you're oversimplifying this a bit - it's not a "decision" in the sense of "do we include this feature, y/n?". The HDMI 2.1 spec was released in November 2017, which is very likely too short a time to get working hardware implemented into Navi (a hard launch in July for Navi means mass production started in May at the latest, meaning the silicon design was likely near-finalized early in the new year - that leaves barely a year to integrate a brand-new I/O standard). The hardware required to implement HDMI 2.1 output on a PC GPU doesn't exist yet, and the HDMI standard likely only specifies the required functionality of that hardware, not how to implement it in an IC. In other words, companies either need time to develop this hardware themselves, or assistance from organizations that already have it in place. When the HDMI Forum prioritizes assisting the AV industry (as that's their main focus and where their main contributors/partners are), PC hardware manufacturers are left to themselves - leading to longer development cycles and slower adoption in this industry. The only other option would be for AMD and Nvidia to allocate significant hardware engineering resources to fast-track an HDMI 2.1 implementation, which would have direct detrimental effects on the parts of their hardware that actually matter for performance, or result in delayed launches anyway. In short: this would be a meaningless and silly thing to prioritize, and the only real solution would be for the HDMI Forum to more actively support PC hardware development. Until that happens, we have DP - which is arguably a better standard anyhow.

As for what will utilize HDMI 2.1 in the eyes of the HDMI Forum: 8K TV broadcasts, specifically of the upcoming Olympics. Remember, TV/AV is a far bigger industry than PC hardware. How many TVs are sold globally in a year? I'd be surprised if gaming PCs capable of driving high refresh rates even at 1080p amounted to 1% of that. TV sports, on the other hand, sell TVs like crazy.
 
So LG was able to come out with HDMI 2.1 in May, but AMD didn't manage the same? They both had the same time.

And as someone wrote here earlier, AMD HAS to adopt the technology too, since the new consoles can't give you the announced 4K 120 Hz without it, and they do deliver hardware for them.

So all in all, this was just a rushed release, on a small budget. Like everything in the past years of RTG was.

Is that HDMI 2.1 on their TVs or monitors?
 
I wonder: is designing a TV HDMI sink (probably as part of a scaler chip that will be mass-produced) easier than doing a source embedded in a GPU?
 
We don't have to worry about HDMI 2.1. We already have DP 1.4 on PC video cards. That supports 8K at 60 Hz or 50 Hz, depending on an 8-bit or 10-bit panel, and 4K at 120 Hz or 100 Hz, but from the OP it can support up to 240 Hz (too bad only 4K TVs have that). The only thing is that GPU performance needs to catch up before we can enjoy any of those, and in the monitor space, the cost of displays with any of those specs is eye-watering.
 
That said, HDMI 2.1 would be nice if you have a big TV in the living room or want to buy a possibly more affordable TV instead of a monitor.

Panasonic had DP on some of their TVs a couple of years back but it seems to be gone now. :(
 