
[AV1, AV2, AV3 codecs...] Have they given up on the AOMedia project?

Have they given up on improving the AV1 codec and developing its successors?

From what I've seen on the Internet, the AV1 codec was born inferior to the H.266 codec in every aspect: H.266 can maintain video quality with a lower bitrate than AV1 and is faster than AV1 in encoding. And the H.267 codec project is already underway.

From what I know, encoding videos with the AV1 codec using CPUs is a disaster: AV1 encoders are extremely inefficient, poorly optimized, ultra-slow and some of them only use a single CPU core. What can be done is to use a hardware encoder, which generates videos with much lower image quality compared to videos encoded by a CPU at the maximum preset.

In fact, the only advantage of AV1 is that it is free, but it is inferior to H.266 in all other aspects of technology.

I do not see a huge commitment and involvement from the world's major computer companies to launch a revolutionary "AV2" codec, as good or better than the future H.267 and still being completely free, like AV1.

I do not see cameras of all types and prices with the AV1 codec, TV receivers, TVs, video receivers, or subscription streaming platforms (like Netflix) supporting the AV1 codec.
Unfortunately, what I still see is the omnipresence of the archaic and quite flawed H.264 codec.
 
It doesn't look like anyone is giving up on AV1; it's more of a really stupid standoff and stagnation with the streaming platforms that don't support it.
I am not optimistic about the future of AV1 at the moment even on the platforms that do have it, which is a problem, because I have no home with them.
 
AV1 is royalty-free. Companies like saving money :).

GPU encoding of AV1 is super fast. Yes, the file size is bigger, but I get 45 fps vs 1-2 using the CPU.

 
As an AMD user I'm disappointed with AV1 quality; the older HEVC is better and more efficient.
 
There was an 8-year gap between H.265 and H.266. On the other hand, AV1 is 7 years old.
Also, H.266 is yet to see any significant adoption. Even H.265 is still struggling to dominate. As you said yourself, most probably still use H.264 (I would not call it "flawed", though).

@Other posts: People use hardware encoders for offline compression? Why, just, why?
 
SVT-AV1 specifically runs multicore just fine on my home server's i5-8400, but I'm guessing that's part of the library and beyond the scope of your original complaint?
 
People use hardware encoders for offline compression
Because time. Or just don't care much. Maybe don't even know about the tradeoffs.

I never really tested how much Handbrake settings can improve things when using the hardware encoder tho. For recordings, on my RX 580 card, I do notice 20 Mbps at H265 gets me quite near lossless quality so I've more or less settled on using that when using Radeon Relive (which uses the hardware encoder only).
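For anyone wanting to script that kind of fixed-bitrate hardware HEVC pass outside of ReLive or Handbrake, a minimal sketch around ffmpeg might look like the following; it assumes an ffmpeg build with AMD's AMF encoder (hevc_amf), and the file names and bitrate are just placeholders mirroring the 20 Mbps example above.

```python
# Sketch only: a fixed-bitrate hardware HEVC re-encode via ffmpeg's AMD AMF encoder.
# Assumes an ffmpeg build with AMF support; NVENC/Quick Sync builds use different encoder names.
import subprocess

def hw_hevc_encode(src: str, dst: str, bitrate: str = "20M") -> None:
    """Re-encode a capture at a constant target bitrate using the GPU's HEVC block."""
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", "hevc_amf",   # hardware HEVC encoder on AMD cards
        "-b:v", bitrate,      # 20 Mbps is the near-lossless-at-1080p figure mentioned above
        "-c:a", "copy",       # leave audio untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    hw_hevc_encode("relive_capture.mkv", "relive_capture_hevc.mp4")  # hypothetical file names
```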
From what I've seen on the Internet, the AV1 codec was born inferior to the H.266 codec in every aspect
AV1 was born as an alternative to HEVC, at a time when browsers were considering that they might have to pay royalties over simply having decoding capabilities.

As it's based on Google's VP10 project, alongside other contemporary projects, the expectation is that it performs well quality-wise for an early/mid 2010s video codec. Same as HEVC.
Though, yes, encoding speed sucks, at least last I checked, which was a while ago... I think VP9 also had the same problem? Though I'm not really sure. Then again, that has been a recurring theme with all codecs that replace older ones: quality and compression go up, but the performance required to encode a whole video within a reasonable time also goes up. Although with AV1 it seems to be worse.

I do not see cameras of all types and prices with the AV1 codec, TV receivers, TVs, video receivers, or subscription streaming platforms (like Netflix) supporting the AV1 codec.
Netflix does support it, as does YouTube, but it depends on hardware support for devices such as TVs where all decoding needs to be offloaded to a specialized piece of hardware. YouTube, as far as I know, is selective about which codec to use in any given situation, so it switches between H264, VP9 and AV1 accordingly. Most likely Netflix does the same.

As for devices/hardware, it's worth mentioning that support for both playing and capturing h265 videos on smartphones is still a bit of a recent development, like three years old or so. And with the time it takes for hardware encoders/decoders to be made (like a year at least since the codec specification is complete) and then being put into devices people use, and the time it takes for a significant number of people to switch from an old device to a new one, I think it's safe to say AV1 adoption still has a ways to go.
Unfortunately, what I still see is the omnipresence of the archaic and quite flawed H.264 codec.
H265 was a disaster, basically, with regards to licensing and AV1 is still not widely supported enough. So most places have stuck to H264. Maybe VP9 at most.
I do not see a huge commitment and involvement from the world's major computer companies to launch a revolutionary "AV2" codec, as good or better than the future H.267 and still being completely free, like AV1.
AV1 was originally started as a codec geared towards web video consumption, so in that regard you could say it's more than enough currently.

Like, sure, we can nitpick all we want, but the majority of people don't give two flying ducks about what codec Youtube or Netflix use as long as the quality is not terrible. And you just don't hear people shouting "Youtube quality sucks" outside of forums like this one or content creation communities. It's not a generalized complaint, because most people are fine with the current quality provided by AV1/VP9. Heck, H264 is also good enough in a number of cases.

All that aside, however, they're already working on a successor codec, since 2020 or so. There was a recent-ish update on developments, you can read about that here:

They also have a public Gitlab repository, which is fairly active.
 
Yep. AMD encode quality has been garbage for years.
HEVC is not garbage; AV1 is garbage. That would be a more accurate explanation.
 
I never really tested how much Handbrake settings can improve things when using the hardware encoder tho. For recordings, on my RX 580 card, I do notice 20 Mbps at H265 gets me quite near lossless quality so I've more or less settled on using that when using Radeon Relive (which uses the hardware encoder only).
Could be my ageing 1080's NVENC (and whatever Intel Quick Sync chips I have lying around). Never could get those things to output decent quality when compressing good-quality sources, even with Handbrake. Very useful for transcoding (recordings, etc.) for their speed, yeah, but I don't really care much whether my presentations have pixel-perfect serifs or band like an 8-bit grayscale. Let the kids suffer. :cool:

But then again, 20Mbps is much higher than my usual targets.

HEVC is not garbage; AV1 is garbage. That would be a more accurate explanation.
One implementation out of many. Hardly a measure for the standard's efficiency.
 
Love AV1 on my 4090: small hit, but no HDR, so I use HEVC for HDR.
 
HEVC is not garbage; AV1 is garbage. That would be a more accurate explanation.
Compared to CPU, Quick Sync and NVENC, it is when measured with FFMetrics. So while maybe your eye can't see the blocky video, it is detected.
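For context, FFMetrics is essentially a front end for objective metrics like VMAF; a minimal sketch of running the same kind of check directly with ffmpeg's libvmaf filter is below. It assumes an ffmpeg build compiled with libvmaf, and the file names are placeholders.

```python
# Sketch: score an encode against its source with VMAF (the metric FFMetrics reports).
# Assumes an ffmpeg build with libvmaf enabled; the score is printed in ffmpeg's log output.
import subprocess

def vmaf_check(encoded: str, reference: str) -> None:
    """Compare a hardware or CPU encode to the untouched source capture."""
    cmd = [
        "ffmpeg",
        "-i", encoded,        # first input: the encode under test
        "-i", reference,      # second input: the original source
        "-lavfi", "libvmaf",  # compute VMAF over the two inputs
        "-f", "null", "-",    # no output file; we only want the score
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    vmaf_check("gpu_encode.mp4", "source_capture.mkv")  # hypothetical file names
```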
 
Love AV1 on my 4090 ,small hit ,but no HDR , I use HEVC for HDR .
Without HDR it's washed out; that's AV1's downside. Live is where AV1 wins, if you have the hardware.
 
From what I've seen on the Internet, the AV1 codec was born inferior to the H.266 codec in every aspect: H.266 can maintain video quality with a lower bitrate than AV1 and is faster than AV1 in encoding. And the H.267 codec project is already underway.

Source?

From every source I'm able to find, AV1 is significantly faster:




From what I know, encoding videos with the AV1 codec using CPUs is a disaster: AV1 encoders are extremely inefficient, poorly optimized, ultra-slow and some of them only use a single CPU core. What can be done is to use a hardware encoder, which generates videos with much lower image quality compared to videos encoded by a CPU at the maximum preset.

The second link above is a study showing AV1 to be faster.

It's not the trade-off you are presenting here.

In fact, the only advantage of AV1 is that it is free, but it is inferior to H.266 in all other aspects of technology.

Just focusing on the free part here as the rest of this comment has already been mentioned above,

Being free is a HUGE advantage when you are talking about the amount of video that is served over the internet. That cannot be overstated.

There is significant interest from online video providers in continuing to push it. In fact, everyone should want to promote it; the internet should be open and free.

I do not see a huge commitment and involvement from the world's major computer companies to launch a revolutionary "AV2" codec, as good or better than the future H.267 and still being completely free, like AV1.

AV2 is in active development: https://gitlab.com/AOMediaCodec/avm

Early reports point to around a 22% bitrate saving.

It won't be revolutionary, but then again neither will H266. The only thing that could perhaps create a huge lead is AI-based encoding; otherwise moderate gains are the norm.

I do not see cameras of all types and prices with the AV1 codec, TV receivers, TVs, video receivers, or subscription streaming platforms (like Netflix) supporting the AV1 codec.
Unfortunately, what I still see is the omnipresence of the archaic and quite flawed H.264 codec.

Roku, Fire Stick, Chromecast, and Apple TV all support AV1. By extension this means most smart TVs do as well. Not sure about TV receivers, but those have mostly been replaced by the aforementioned smart devices and are largely outmoded.

Most streaming services already use AV1, including Hulu, Amazon, and Netflix. Most skipped right over H265, and that really says something about the confidence in the AV1 codec.

I'm not sure where you are getting your info from, but AV1 has been widely adopted on hardware and software platforms since 2023 and that mass adoption process was kicked off in 2021.

H265 was a disaster, basically, with regards to licensing and AV1 is still not widely supported enough. So most places have stuck to H264. Maybe VP9 at most.

All the major platforms have transitioned to AV1.

Yep. AMD encode quality has been garbage for years. Intel Quick Sync or Nvidia NVENC are the only real options for quick and good encoding. CPU is the best, but might take weeks to complete a single task lol.

Key word is has, as AMD significantly improved its HEVC and H264 implementations over the last two generations, although AV1 still provides the best quality among the three regardless of which GPU vendor you use.

Notice how he said he prefers AMD's HEVC over AV1.

SVT-AV1 encoding at the highest preset (-1) with an RF of 13 (very high) for a 1080p video takes about a day per video. Mind you, -1 is the placebo setting and makes very little difference compared to 0 other than the increased time it takes. At 0 you reduce the time to 15 hours per video. There are graphs out there that show the correlation of video quality to time; typically presets 3-5 are considered the sweet spot for quality versus time taken. An RF of 13 typically results in a drop in VMAF (relative to the source) of between 0.2 and 2, and an RF of 20 between 2 and 4. RF also has a direct correlation to encode time.
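For illustration, a CPU SVT-AV1 encode with the preset/RF knobs described above can be scripted roughly as follows; this assumes an ffmpeg build with libsvtav1, and the preset, CRF and file names are only example values in the "sweet spot" range mentioned.

```python
# Sketch of a software SVT-AV1 encode via ffmpeg's libsvtav1 wrapper.
# Presets -1/0 are the placebo-slow settings discussed above; 3-5 trade little quality for far less time.
import subprocess

def svtav1_encode(src: str, dst: str, preset: int = 4, crf: int = 20) -> None:
    """Encode with SVT-AV1: lower preset = slower/better, lower CRF = higher quality/bigger file."""
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", "libsvtav1",
        "-preset", str(preset),
        "-crf", str(crf),
        "-c:a", "copy",
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    svtav1_encode("episode_1080p.mkv", "episode_1080p_av1.mkv")  # hypothetical file names
```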
 
One way for AMD to easily increase sales of its GPUs would be to add a hardware AV1 encoder with a maximum-quality preset and support for 2-pass encoding.
 
As an AMD user i'm disappointed with AV1 quality older HEVC is better more efficient.......
Yeah even h264 looks significantly better than AV1 on Radeons.
But RDNA4 has improved it a lot.
 
Yeah even h264 looks significantly better than AV1 on Radeons.
But RDNA4 has improved it a lot.
Keep in mind there is that weird hardware bug that keeps the 7xxx series from natively encoding 1080p without rescaling, so yeah, they will always look bad... my bet is that were it not for that bug, it'd be halfway decent.
 
Have they given up on improving the AV1 codec and developing its successors?

From what I've seen on the Internet, the AV1 codec was born inferior to the H.266 codec in every aspect: H.266 can maintain video quality with a lower bitrate than AV1 and is faster than AV1 in encoding. And the H.267 codec project is already underway.

From what I know, encoding videos with the AV1 codec using CPUs is a disaster: AV1 encoders are extremely inefficient, poorly optimized, ultra-slow and some of them only use a single CPU core. What can be done is to use a hardware encoder, which generates videos with much lower image quality compared to videos encoded by a CPU at the maximum preset.

In fact, the only advantage of AV1 is that it is free, but it is inferior to H.266 in all other aspects of technology.

I do not see a huge commitment and involvement from the world's major computer companies to launch a revolutionary "AV2" codec, as good or better than the future H.267 and still being completely free, like AV1.

I do not see cameras of all types and prices with the AV1 codec, TV receivers, TVs, video receivers, or subscription streaming platforms (like Netflix) supporting the AV1 codec.
Unfortunately, what I still see is the omnipresence of the archaic and quite flawed H.264 codec.
Successors to AV1 are being developed as we speak, with AVM being the public result of that development.

Slow adoption of AV1 (and of any other recent codec or format) is normal; companies only switch once they know they will benefit 100% with no caveats, even though most modern browsers can play AVIF/AV1 just fine and decoding performance is fast enough on most hardware using dav1d. All that being said, the adoption of AV1 has been quicker than that of most other formats in the past.

While VVC does have better compression quality than AV1, that really doesn't mean much outside of hobbyists compressing local videos, as the licensing fees are going to stop most people from using it, and the lack of hardware encoders/decoders stops wider adoption.

Also, AV1 encoding is much faster than VVC, lol, what are you on about? From what I've seen in the community, even the reference encoder (libaom) is miles faster than any current VVC encoder, never mind SVT-AV1, which can allow for real-time recording performance as long as you either have a recent high-core-count CPU (7950X or 9950X) or use a lowered preset and resolution (preset 8, 720p60).

Community efforts through SVT-AV1-PSY prove that you can improve the quality of AV1 encoding decently; x264 is as good as it is because of how old it is and how much fidelity has been squeezed out of it.

Here are some of Emre's testing results for what it's worth
(benchmark charts attached)


I didn't even care for VVC after seeing it underperform. One of the encodes took 7 hours on my machine, and I have top-of-the-line hardware/software.

On the other hand, with these settings, VP9 and X265 are extremely slow (VP9 even slower). These are not realistic settings at all.

If we exclude x264, SVT-AV1 was the fastest here even with --preset -1. If we compared preset 2 or 4, and competitive speeds for the other encoders, I am 100% sure the difference would have been huge. But still, even with the speed difference, SVT-AV1 is extremely competitive.
Edit:
I forgot to mention the OpenBenchmarking results for both codecs!
While at the slower presets all the encoders are quite close to each other, SVT-AV1 still beats VVenC but is neck and neck with UVG.
VVenC scales worse at the faster presets compared to SVT-AV1; see how the Ryzen AI 365 gets double the fps when comparing SVT to VVenC?
UVG266 is better and worse than VVenC at the same time, to the point where a Ryzen AI 370 is almost the same as a 5600X (Zen 5 vs Zen 3!). Not sure about x266, but you can see how inconsistent the results are when it comes to current VVC encoders. It will take a long while until VVC is actually worth using; who knows, maybe AV2/AVM will be out by then.
(this scaling is also seen at lower resolutions)
(benchmark charts attached)

One way for AMD to easily increase sales of its GPUs would be to add a hardware AV1 encoder with a maximum-quality preset and support for 2-pass encoding.
You heavily overestimate how many people care about encoding quality. Even as someone who is interested in this topic, I don't really care about minute encoding quality differences, since I can capture replays at a higher bitrate and then optimize later by using SVT-AV1 and running the encode all night (roughly the workflow sketched below).
If people cared about encoding quality so much, why is Twitch still the most popular streaming platform while capping bitrate at 8 Mbps?
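A minimal sketch of that capture-big-then-shrink-overnight workflow, assuming an ffmpeg build with libsvtav1; the replay folder, preset and CRF are hypothetical.

```python
# Sketch: sweep a replay folder overnight and re-encode each capture to AV1 with SVT-AV1.
# Faster preset / higher CRF than an archival encode, since these are just game replays.
import subprocess
from pathlib import Path

REPLAY_DIR = Path("~/Videos/replays").expanduser()  # hypothetical capture location

def shrink_replays(preset: int = 6, crf: int = 28) -> None:
    for src in sorted(REPLAY_DIR.glob("*.mkv")):
        dst = src.with_name(src.stem + "_av1.mkv")
        if dst.exists():
            continue  # already processed on a previous night
        subprocess.run([
            "ffmpeg",
            "-i", str(src),
            "-c:v", "libsvtav1", "-preset", str(preset), "-crf", str(crf),
            "-c:a", "copy",
            str(dst),
        ], check=True)

if __name__ == "__main__":
    shrink_replays()
```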
 

One way for AMD to easily increase sales of its GPUs would be to add a hardware AV1 encoder with a maximum-quality preset and support for 2-pass encoding.
Nah, just add 4:4:4 chroma and it'd be fine by me. Intel and NVIDIA have had that for a bit now.
 
Nah, just add 4:4:4 chroma and it'd be fine by me. Intel and NVIDIA have had that for a bit now.

I disagree. The reason for the existence of AV1 and its successors is precisely to be able to generate videos with very good image quality at a low bitrate.

So hardware manufacturers should develop chips that can convert videos to AV1 at maximum quality in a short time.
 
Good for you but I know I'm not alone in that want. It'd be useful for real time game streaming, which is a growing thing.
 
Good for you but I know I'm not alone in that want. It'd be useful for real time game streaming, which is a growing thing.
Correct, and lately it's starting to grow out of control. I'm catching Twitch streamers with insane bitrates, like 9600, sending 2K60 by default.
How? No idea, but even on desktop it's annoying to manage and locked to an assload of bandwidth.

I'm perfectly okay with 1080p as the max, and depending on the content even that might be too much. Lucky loadout for me:
(screenshot attached)


If I ever need better, I'll use H.264 encoding on a GPU.

Since x264 is basically the fastest and most broadly supported safe bet, the move to AV1 will take ages.
 