Wednesday, June 12th 2019

AMD Navi Radeon Display Engine and Multimedia Engine Detailed

Two of the often-overlooked components of a new graphics architecture are its I/O and multimedia capabilities. With its Radeon RX 5700-series "Navi 10" graphics processor, AMD gave the two their first major update in over two years, in the form of the new Radeon Display Engine and Radeon Multimedia Engine. The Display Engine is the hardware component that handles the graphics card's physical display I/O. The Radeon Multimedia Engine is a set of fixed-function hardware blocks that accelerate specific video CODECs, offloading encode and decode work from the CPU.

The Navi Radeon Display Engine features an updated DisplayPort 1.4 HDR implementation capable of driving an 8K display at 60 Hz, or a 4K UHD display at 240 Hz, over a single cable, in both cases with HDR and 10-bit color. It achieves this by implementing DSC 1.2a (Display Stream Compression). The display controller also supports 30 bpp internal color depth. The HDMI implementation remains HDMI 2.0. The multi-plane overlay (MPO) implementation now supports a low-power mode, which should, in theory, reduce the GPU's power draw when idling or playing back video.
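For a sense of why DSC is the enabling piece here, the following is a rough back-of-the-envelope check (a sketch only: the ~3:1 compression ratio and the omission of blanking and packet overhead are simplifying assumptions, not figures published by AMD):

```python
# Rough bandwidth check: why a single DisplayPort 1.4 link needs DSC for
# 8K60 or 4K240 at 10-bit color. Blanking and packet overhead are ignored,
# so these are ballpark numbers only.

DP14_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10  # HBR3: 4 lanes x 8.1 Gb/s, 8b/10b coding ~= 25.9 Gb/s

def raw_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed bandwidth for the active pixels at 30 bpp (10-bit RGB)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for name, mode in {"8K @ 60 Hz": (7680, 4320, 60), "4K UHD @ 240 Hz": (3840, 2160, 240)}.items():
    raw = raw_gbps(*mode)
    with_dsc = raw / 3  # assume DSC 1.2a at roughly 3:1, "visually lossless"
    verdict = "fits" if with_dsc < DP14_PAYLOAD_GBPS else "does not fit"
    print(f"{name}: ~{raw:.1f} Gb/s raw, ~{with_dsc:.1f} Gb/s with ~3:1 DSC "
          f"(link payload ~{DP14_PAYLOAD_GBPS:.1f} Gb/s, {verdict})")
```

Uncompressed, both modes work out to roughly 60 Gb/s, more than double what DP 1.4's HBR3 link can carry, which is why compression rather than extra link bandwidth is what makes these modes possible on a single cable.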
The Radeon Multimedia Engine is updated with support for more CODECs. The "Navi 10" GPU provides hardware acceleration for decoding VP9 video at up to 4K @ 90 fps (frames per second) or 8K @ 24 fps. The H.265 HEVC implementation is more substantial, with hardware-accelerated encoding of 4K at frame rates of up to 60 fps. H.265 HEVC decoding is accelerated at 8K @ 24 fps, 4K @ 90 fps, and 1080p at up to 360 fps. H.264 (MPEG-4 AVC) gets a boost as well: decoding at up to 4K @ 150 fps and 1080p @ 600 fps, and encoding at up to 4K @ 90 fps and 1080p @ 150 fps.
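As a rough sanity check on those HEVC decode figures (an illustration, not an AMD specification), 8K @ 24 fps, 4K @ 90 fps and 1080p @ 360 fps all land at roughly the same pixel rate, which is what you would expect from a fixed-function decoder with a single throughput budget spread across different resolutions:

```python
# Pixel throughput of the quoted HEVC decode limits. 8K24, 4K90 and 1080p360
# are all roughly the same number of pixels per second, i.e. different
# resolution/frame-rate slices of one fixed-function decode budget.

MODES = {
    "8K @ 24 fps":     (7680, 4320, 24),
    "4K @ 90 fps":     (3840, 2160, 90),
    "1080p @ 360 fps": (1920, 1080, 360),
}

for name, (w, h, fps) in MODES.items():
    print(f"{name}: ~{w * h * fps / 1e6:.0f} megapixels/s")
```

All three come out between roughly 750 and 800 megapixels per second.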

37 Comments on AMD Navi Radeon Display Engine and Multimedia Engine Detailed

#26
Valantar
MikeMurphy: AMD had 8K60 decode support back in 2013 with the Tonga architecture. These were the R9 285 and R9 380 cards.

Why don't we have 8K60 decode support now?
Wikipedia disagrees with you, saying the R9 285 supported 4k60. Also, if they had, what codec would that be back in 2014? H.264? H.264 in 8k would be uselessly large. Not to mention that those cards supported DP1.3 output at best. Definitely not getting 8k over that.

@eidairaman1 care to elaborate on that reaction to my post? Can't say I can quite figure out what you mean :p
#27
FordGT90Concept
"I go fast!1!11!1!"
MikeMurphy: Why don't we have 8K60 decode support now?
Because adding support for that when displays barely exist for it...is pointless? The fact it can do 8K24 via VP9 and HEVC is impressive, especially for a midrange card.
#28
JAB Creations
LocutusH: Yeah, but what else will utilize HDMI 2.1, if not the PC video cards?
Your post suggests that the PS5 is coming out this summer along with the new Radeon line. None of your posts make sense and you're making a lot of armchair commander comments; I'd recommend reading and learning more before sharing such sharp opinions.
#29
Arvint
I would also like to see HDMI 2.1 support soon on the AMD Radeon cards. For sure it will come next year, hopefully before. New 8K TVs this year from LG, Sony and Samsung support HDMI 2.1. I would like to run a simulation (own software) project at 8K on a big screen, but for now it seems impossible. None of the 10 or so big 8K TVs on the market yet support DisplayPort 1.4, and no graphics card yet supports HDMI 2.1. The Dell 32” 8K monitor with DP 1.4 seems the only option, but the screen is too small (for showing the simulation to a group of people). I don’t need video decoding, and I would be OK with 8K @ 30 fps on a big TV (minimum 65”). The new Radeon RX 5700 XT (or Radeon VII) would be great if it could be connected to any screen >= 65” at 8K @ 30 fps. Currently using Macs.
#30
FordGT90Concept
"I go fast!1!11!1!"
Wait for Arcturus and Ampere. They should have HDMI 2.1. Well, maybe not Ampere, but I'd be shocked if Arcturus doesn't.
#32
nemesis.ie
@Zoolook

How much do those cost?

I imagine if they are included on a monitor PCB or as part of the main monitor chipset, they will be cheaper than buying separately.
#33
Valantar
nemesis.ie: @Zoolook

How much do those cost?

I imagine if they are included on a monitor PCB or as part of the main monitor chipset, they will be cheaper than buying separately.
Why would a monitor have an onboard DP-to-HDMI adapter? Nobody would ever know it's there, as it would look like a bog-standard DP input unless you read the spec sheet thoroughly. And does HDMI 2.1 do anything DP 1.4 doesn't? On an output device it makes a lot more sense, but I doubt that will happen for a while - for cost reasons. Of course, most iGPU HDMI outputs are already based on on-board DP-to-HDMI adapters, but most iGPUs don't support DP 1.4.
#34
nemesis.ie
I mean the monitor having an HDMI 2.1 input.
#35
LocutusH
This discussion is pointless, as there are NO DP 1.4 to HDMI 2.1 converters.
That Realtek stuff is still in development... who knows when a product will be available, and at what cost.
Fact is, HDMI 2.1 cannot be fed as of yet, even if the TV supports it.
#36
Valantar
nemesis.ie: I mean the monitor having an HDMI 2.1 input.
Well, then it would have an HDMI port and not a DP port, so an onboard DP-to-HDMI converter would be rather useless. Unless you're suggesting they make an HDMI port that also supports DP 1.4 input signalling, which again would be a feature that nobody would know about, that breaks any and all standards, and would in a worst-case scenario even require bespoke cables. Doesn't sound like a good idea.
LocutusH: This discussion is pointless, as there are NO DP 1.4 to HDMI 2.1 converters.
That Realtek stuff is still in development... who knows when a product will be available, and at what cost.
Fact is, HDMI 2.1 cannot be fed as of yet, even if the TV supports it.
I guess we'll see HDMI 2.1 arrive on source devices once there are source devices that can realistically need it. In a year or two would be my guess (it's more or less announced that next-gen consoles will support it, but for other products - we'll see).