
AMD RDNA2 Graphics Architecture Features AV1 Decode Hardware-Acceleration

btarunr

Editor & Senior Moderator
Staff member
AMD's RDNA2 graphics architecture features hardware-accelerated decoding of the AV1 video format, according to a Microsoft blog announcing the format's integration with Windows 10. The blog mentions the three latest graphics architectures among those that support accelerated decoding of the format: Intel Gen12 Iris Xe, NVIDIA's RTX 30-series "Ampere," and AMD's RX 6000-series "RDNA2." The AV1 format is being actively promoted by major hardware vendors to online streaming content providers, as it offers 50% better compression than the prevalent H.264 (translating into equivalent bandwidth savings) and 20% better compression than VP9. You don't need one of these GPUs to use AV1: anyone on Windows 10 (version 1909 or later) can use it by installing the AV1 Video Extension from the Microsoft Store. The codec will fall back to software (CPU) decoding in the absence of hardware acceleration.
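To put the quoted savings in perspective, here is a quick back-of-the-envelope sketch in Python. The 8 Mbps H.264 bitrate is a hypothetical example; the 50% and 20% figures are the ones from the article above.

```python
# Hypothetical 1080p H.264 stream, using the savings figures quoted above:
# AV1 ~50% smaller than H.264, ~20% smaller than VP9.
H264_BITRATE_MBPS = 8.0

av1_bitrate = H264_BITRATE_MBPS * (1 - 0.50)  # ~50% saving vs. H.264
vp9_bitrate = av1_bitrate / (1 - 0.20)        # AV1 is ~20% smaller than VP9

def gb_per_hour(mbps: float) -> float:
    """Convert a bitrate in Mbit/s to data volume in GB per hour."""
    return mbps / 8 * 3600 / 1000  # Mbit/s -> MB/s -> MB/hour -> GB/hour

print(f"H.264: {gb_per_hour(H264_BITRATE_MBPS):.2f} GB/h")  # -> 3.60 GB/h
print(f"AV1:   {gb_per_hour(av1_bitrate):.2f} GB/h")        # -> 1.80 GB/h
```

At scale, halving every video stream is exactly the "that much bandwidth savings" the article refers to.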



View at TechPowerUp Main Site
 
That is a lot of bandwidth saved if it does become the norm. Wow that is impressive.
 
Hows it compare to H265 tho?


Coming soon to next gen chromecast and rokus etc, in about 5 years...
 
That is a lot of bandwidth saved if it does become the norm. Wow that is impressive.
Iirc, encoding media at home to AV1 is still not feasible. I wanted to do it, but too little software supports it, and even then it's too damn slow. Had to go with x265 10-bit.

Atm the gains are only for big powerhouses like Google (for YouTube), Netflix, etc., where they can just brute-force the encodes with their server farms.
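For anyone who still wants to experiment with a home AV1 encode, here is a minimal sketch of what an ffmpeg invocation with the libaom-av1 software encoder could look like. It assumes an ffmpeg build that includes libaom; the input and output file names are placeholders.

```python
import shlex

# Sketch of a software AV1 encode using ffmpeg's libaom-av1 encoder.
# -cpu-used trades speed for quality (0 = slowest/best, 8 = fastest);
# even at higher values libaom is far slower than x265, as noted above.
cmd = [
    "ffmpeg",
    "-i", "input.mkv",        # placeholder input file
    "-c:v", "libaom-av1",
    "-crf", "30",             # constant-quality mode
    "-b:v", "0",              # required alongside -crf for libaom-av1
    "-cpu-used", "4",         # speed/quality trade-off
    "-row-mt", "1",           # row-based multithreading
    "-c:a", "copy",           # pass audio through untouched
    "output.mkv",             # placeholder output file
]
print(shlex.join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Even with `-row-mt` and a high `-cpu-used`, expect encode times far beyond x265, which matches the experience described above.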
 
Hows it compare to H265 tho?


Coming soon to next gen chromecast and rokus etc, in about 5 years...

It was hyped to save up to 50% bandwidth over HEVC, but I think in reality it's more like 30%. Ofc, AV1 is open source, so it has that going for it. Fwiw, AV1 requires more powerful hardware during encoding. But HEVC has more adoption... er, is more or less fully adopted.
 
Good, nice to have all major vendors decoding AV1.
I hope it's not a meme decoder like VP9, which recently got disabled in the latest Radeon drivers.
 
Is there any software decoding fallback for CPUs or GPUs with no AV1 support?
 
Hows it compare to H265 tho?


Coming soon to next gen chromecast and rokus etc, in about 5 years...

That is my question also.

My current Android box does H.264 and H.265.
 
Hows it compare to H265 tho?


Coming soon to next gen chromecast and rokus etc, in about 5 years...

AOMedia Video 1 (AV1) is an open, royalty-free video coding format.

High Efficiency Video Coding
• 0 - 100,000 units/year: no royalty (available to one Legal Entity in an affiliated group)
• US $0.20 per unit after the first 100,000 units each year
• Maximum annual royalty payable by an Enterprise (Legal Entity and Affiliates): US $25M for present coverage during the first License Term
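Those HEVC terms amount to a simple tiered formula. A sketch in Python, using only the numbers quoted above (first 100,000 units free each year, US $0.20 per unit after that, US $25M annual cap):

```python
# Tiered HEVC royalty schedule as quoted above.
FREE_UNITS = 100_000
ANNUAL_CAP = 25_000_000  # USD

def hevc_royalty(units_per_year: int) -> float:
    """Annual royalty in USD for a given number of units shipped."""
    billable = max(0, units_per_year - FREE_UNITS)
    # $0.20 per unit, computed in cents to avoid float drift, capped at $25M.
    return min(billable * 20 / 100, ANNUAL_CAP)

print(hevc_royalty(50_000))       # -> 0.0 (under the free allowance)
print(hevc_royalty(1_000_000))    # -> 180000.0
print(hevc_royalty(500_000_000))  # -> 25000000 (hits the annual cap)
```

AV1, being royalty-free, makes this entire calculation zero by definition, which is the point the post is making.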
 
Why is data compression still used, wasting hardware resources? Haven't HDD and SSD capacities grown enough that there is no need for compression? At least for certain types of files, can't compression and decompression just be left behind as a historical fact?
 
Why is data compression still used, wasting hardware resources? Haven't HDD and SSD capacities grown enough that there is no need for compression? At least for certain types of files, can't compression and decompression just be left behind as a historical fact?


Uuuhhh... I somehow think you don't understand this at all.
 
Why is data compression still used, wasting hardware resources? Haven't HDD and SSD capacities grown enough that there is no need for compression? At least for certain types of files, can't compression and decompression just be left behind as a historical fact?


Format: Uncompressed 1080 10-bit | Resolution: 1920x1080 | Frame rate: 30 | Video length: 10 minutes | Total space: 130.36 GB

Format: H.264 1080 | Resolution: 1920x1080 | Frame rate: 30 | Video length: 10 minutes | Total space: 7.23 GB

The actual space taken up may differ slightly due to embedded audio, differing frame sizes and aspect ratios, and inter-frame compression / pulldown

Worth it?
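The uncompressed figure above is easy to sanity-check. Assuming 3 colour components at 10 bits each (4:4:4 sampling) and binary (GiB-style) gigabytes, the quoted 130.36 GB falls out directly:

```python
# Reconstructing the "Uncompressed 1080 10-bit" figure from the post above,
# assuming 3 colour components at 10 bits each (4:4:4 sampling) and
# binary gigabytes (GiB), which is what the quoted 130.36 GB matches.
width, height = 1920, 1080
bits_per_pixel = 3 * 10      # 10 bits per component, 3 components
fps = 30
seconds = 10 * 60            # 10-minute clip

total_bytes = width * height * bits_per_pixel * fps * seconds / 8
total_gib = total_bytes / 2**30

print(f"Uncompressed: {total_gib:.2f} GB")                     # -> 130.36
print(f"vs. 7.23 GB H.264: ~{total_gib / 7.23:.0f}x smaller")  # -> ~18x
```

So H.264 alone already buys roughly an 18x reduction, before AV1's further savings even enter the picture.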
 
Code:
https://www.twitch.tv/videos/637388605

Not rendered in real time, but it shows what might be possible at those bitrates at some point down the line with a real-time hardware encoder.
 
Hmm, I don't mean the difference between the raw video file and the initially processed one; I mean removing the secondary compression, which saves no more than 50% of the volume.
 
Format: Uncompressed 1080 10-bit | Resolution: 1920x1080 | Frame rate: 30 | Video length: 10 minutes | Total space: 130.36 GB

Format: H.264 1080 | Resolution: 1920x1080 | Frame rate: 30 | Video length: 10 minutes | Total space: 7.23 GB



Worth it?

It's more than worth it. Sure, one might have the space to store it, but for moving it over the internet in the amounts being moved today, we don't have enough bandwidth. In fact, it's predicted that network speed development and the integration of new, faster technologies aren't evolving fast enough to cope with the amount of data being moved, and that's with compression. Most internet traffic is video.
 
Well, it would have been really surprising if it wasn't supported, but nice to get confirmation.
 
It was hyped to save up to 50% bandwidth over HEVC, but I think in reality it's more like 30%. Ofc, AV1 is open source, so it has that going for it. Fwiw, AV1 requires more powerful hardware during encoding. But HEVC has more adoption... er, is more or less fully adopted.
AV1 isn't even 3 years old IIRC, it can still get there & yeah you're probably looking at a best case scenario for that like most other such claims.

And it's still more expensive than AV1, which is why it will be superseded eventually.
 
AV1 isn't even 3 years old IIRC, it can still get there & yeah you're probably looking at a best case scenario for that like most other such claims.

And it's still more expensive than AV1, which is why it will be superseded eventually.
When you talk adoption, it helps to specify a metric for that. I mean, with Google/Android and Netflix behind AV1 you have two players, but probably half the video traffic covered. 2020 TVs also do AV1. With that big picture in mind, PC is almost irrelevant (but still close to my heart).
 
It's more than worth it. Sure, one might have the space to store it, but for moving it over the internet in the amounts being moved today, we don't have enough bandwidth. In fact, it's predicted that network speed development and the integration of new, faster technologies aren't evolving fast enough to cope with the amount of data being moved, and that's with compression. Most internet traffic is video.
This is another side of the problem, and I don't see how it can change in the near future. Over ten years ago there was talk of 1 terabit per second by 2013 and up to 100 terabits per second for internet providers' backbones around 2020, but the big players decided that people don't deserve high speeds, so they saved a dollar and invested in a meager 400 gigabits per second instead. But that's more of a discussion for a dedicated networking section of the forum, if one exists.
 
Hows it compare to H265 tho?


Coming soon to next gen chromecast and rokus etc, in about 5 years...
Nah, it might already be in the new Chromecast with remote, depending on which Amlogic chip they're using.
It's also in some chips from Broadcom, Qualcomm and Realtek, so it'll be here sooner rather than later.
 
I really don't care about the amount of bandwidth being used to stream content. What I'd like is an option from all the streaming services to have them offer up the best quality streams they possibly can. Let me choose how much bandwidth I use while watching content, please.
 
Hmm, I don't mean the difference between the volume of the raw video file and the primary processed, but to remove the secondary compression, which saves no more than 50% volume.
What secondary compression? You seem to be clueless about this topic.
As pointed out by others, Netflix and Google are moving to AV1 as their future standard compression format, so everything they stream in the future will be in AV1 from their servers to you. Right now, several different standards are in use, as older content hasn't always been re-encoded to newer codecs, and new content uses HEVC or something similar, as it's the most efficient but also most widely supported standard out there. A lot of content is also available in multiple codecs, as the player requests the content it can play. It's not as if Netflix and YouTube were transcoding content on the fly; that would be too slow. Please see the link below for an example of how Netflix encodes their shows in multiple formats for different devices.
That said, weirdly enough the Netflix Windows app seems to be using AVC1 (H.264), whereas going through Chrome I get VP9-encoded content. If you want to check the tech specs on a PC, hit Ctrl+Alt+Shift+Q while playing a video.
 
I really don't care about the amount of bandwidth being used to stream content. What I'd like is an option from all the streaming services to have them offer up the best quality streams they possibly can. Let me choose how much bandwidth I use while watching content, please.
So you'd be ok with video stuttering and skipping when your connection acts up?
I mean, I also find it pathetic how little control end users have, but you have to look at this from all angles.
 
So you'd be ok with video stuttering and skipping when your connection acts up?
I mean, I also find it pathetic how little control end users have, but you have to look at this from all angles.
Yes, some user control would be nice, but within reason. As pointed out above, why do I end up with worse-encoded content using the Netflix app than a browser? In Edge, videos are encoded as H.264, btw. Browsers only give me 720p content though, whereas the app gives me 1080p (I don't pay for 4K as I have no 4K TV yet).
Edit: Seems like the browser resolution thing is a Netflix limitation to prevent people from bypassing the software DRM... Even so, I don't understand why Chrome then uses a more efficient codec than their own app.
It should also be pointed out that there's something of a quality setting in Netflix that you can change on their website:

[Image: netflix_codecs.jpg (Netflix playback quality settings)]
 