Thursday, September 13th 2018

YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

YouTube began posting its first test videos encoded with the AV1 video CODEC, which aims to significantly reduce video stream bandwidth without sacrificing quality, exceeding even the compression efficiency of HEVC. AV1 provides an architecture for both moving and still images, and Google, which is partly funding its development, foresees a future in which it replaces entrenched standards such as JPEG and H.264. Besides better compression, its key selling point is its royalty-free license, which could translate to tangible operating-cost savings for YouTube and other video streaming services.

YouTube developers posted this playlist with a selection of videos that are encoded in AV1. You may not notice a reduction in your data consumption just yet, because the first batch of videos has been encoded at a very high bitrate to test performance. Future playlists (which will pop up on the YouTube Developers channel) could test the CODEC's other, more important aspects, such as data savings. To watch them, and test YouTube's AV1 player with them, you'll need either the Chrome 70 beta or the latest nightly build of Firefox (64.0a1), both of which pack AV1 support.
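If the AV1 player doesn't engage on its own even on the right build, the decoder can usually be toggled manually. In Firefox Nightly this is an about:config preference (pref name taken from current Nightly builds; it may change while support is in flux):

code:
media.av1.enabled = true

Chrome 70 beta similarly gates AV1 behind a flag (chrome://flags/#enable-av1-decoder on current builds).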
Add your own comment

80 Comments on YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

#51
R0H1T
lexluthermiester
Take your fanboyism elsewhere. We're not talking about the price of hardware at all.
What? First you accused me of making absurd arguments about skipping RTX; now that I've posted more than a single reason, you're accusing me of fanboyism. FYI, my last two GPUs were Nvidia!
Maybe take your penchant for a fight elsewhere? When did price stop factoring into a purchase decision, for a GPU or CPU ~ or does everyone on this forum get hardware for free :rolleyes:
Posted on Reply
#52
lexluthermiester
R0H1T
What? First you accused me of making absurd arguments about skipping RTX; now that I've posted more than a single argument, you're accusing me of fanboyism. FYI, my last two GPUs were Nvidia!
Maybe take your penchant for a fight elsewhere? When did price stop factoring into a purchase decision, for a GPU or CPU ~ or does everyone on this forum get hardware for free :rolleyes:
Because we're talking about a codec that runs on all platforms, not just PC, therefore price isn't part of the argument. Stop making such silly comments and you won't be called out on them.
Posted on Reply
#53
R0H1T
lexluthermiester
Because we're talking about a codec that runs on all platforms, not just PC, therefore price isn't part of the argument. Stop making such silly comments and you won't be called out on them.
No, I was talking about one of the (many) reasons why I'd skip RTX, and you're telling me those reasons are absurd? How about you stop telling me why it's wrong, for me, to skip Turing? As for platform-agnostic: where does that fit into what you're replying to? H.264, HEVC, VP9, VP8 et al. are available on multiple platforms; you just have to deal with software decoding in each case.
Posted on Reply
#54
lexluthermiester
R0H1T
No, I was talking about one of the (many) reasons why I'd skip RTX, and you're telling me those reasons are absurd?
Not the reasons, the comment. You implied with your original comment that RTX might have problems running the codec (which is wildly absurd), to which I responded. In the context of the discussion about this new codec, none of us care or need to know whether you think it's a good reason to skip RTX.
R0H1T
As for platform-agnostic: where does that fit into what you're replying to? H.264, HEVC, VP9, VP8 et al. are available on multiple platforms; you just have to deal with software decoding in each case.
Because software decoding only happens in the absence of hardware support. My point was that even low-end ARM SoCs can and do decode in software without flaw; therefore ANY modern CPU/GPU can do software decoding. Your original points are pointless.
Posted on Reply
#55
R0H1T
lexluthermiester
Not the reasons, the comment. You implied with your original comment that RTX might have problems running the codec (which is wildly absurd), to which I responded. In the context of the discussion about this new codec, none of us care or need to know whether you think it's a good reason to skip RTX.

Because software decoding only happens in the absence of hardware support. My point was that even low-end ARM SoCs can and do decode in software without flaw; therefore ANY modern CPU/GPU can do software decoding.
Your original points are pointless.
No, I didn't; I explained exactly what I meant on the first page.

Your arguments are noted; however, exceptions taken to my posts will be handled by the mods. I don't care what others think of my posts.

That's called hybrid decoding ~ HEVC on the 750/Ti, for instance ~ while pure software decoding is always done by the CPU, AFAIK.

I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!
Posted on Reply
#56
FordGT90Concept
"I go fast!1!11!1!"
Software decoding...is the last resort. The higher the resolution gets, the less likely software can do it in real time.
Posted on Reply
#57
hat
Enthusiast
R0H1T
No, I didn't; I explained exactly what I meant on the first page.

Your arguments are noted; however, exceptions taken to my posts will be handled by the mods. I don't care what others think of my posts.

I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!
I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support? Beyond that, though, you seemed to just be trolling about how awful Turing is... :laugh:

FordGT90Concept
Software decoding...is the last resort. The higher the resolution gets, the less likely software can do it in real time.
You'd need one hell of a CPU. I wouldn't expect my i5 2400 to handle HEVC beyond 1080p24 (and even that would be flaky).
Posted on Reply
#58
R0H1T
hat
I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support? Beyond that, though, you seemed to just be trolling about how awful Turing is... :laugh:



You'd need one hell of a CPU. I wouldn't expect my i5 2400 to handle HEVC beyond 1080p24 (and even that would be flaky).
Just trying to lay out the rest of the reasons, when the other poster calls me a fanboy!

Yes, no GPU can do "software" decoding without partial "hardware" support for something like AV1, or HEVC previously.

https://www.tomshardware.com/reviews/video-transcoding-amd-app-nvidia-cuda-intel-quicksync,2839-6.html
Posted on Reply
#59
FordGT90Concept
"I go fast!1!11!1!"
AMD, NVIDIA, and Intel will likely bake AV1 into their hardware decoders soonish (Navi could have it, for example).
Posted on Reply
#60
hat
Enthusiast
FordGT90Concept
AMD, NVIDIA, and Intel will likely bake AV1 into their hardware decoders soonish (Navi could have it, for example).
You know, it would really upset me if suddenly there's all this content flying around on a new codec while HEVC and others are still relatively new, and my fast and expensive GTX 1070 can't decode it. How hard would it be to "brute force" it with CUDA or something?
Posted on Reply
#61
FordGT90Concept
"I go fast!1!11!1!"
My R9 390 can't do HEVC, so what I watch and record is AVC (even on YouTube, as my first post in this thread indicated). It really doesn't matter what codec is used so long as it can be decoded in real time.

Why would you bother trying to brute force it? YouTube just put this playlist up for developers to easily test their codecs.
Posted on Reply
#62
hat
Enthusiast
I don't mean now, I mean if/when AV1 content actually appears. No way it's gonna run on the CPU...
Posted on Reply
#63
FordGT90Concept
"I go fast!1!11!1!"
It'll be the same as now, they'll keep supporting VP9, HEVC, and AVC for backwards compatibility. They use the most efficient one the device supports to cut down on their bandwidth use.
Posted on Reply
#64
R0H1T
ZeDestructor
How's the CPU usage looking?
On a quad-core (8-thread) Intel ULV (laptop) CPU, usage generally stays below 50%, even for a full-HD stream. I'm still trying all the videos at all resolutions, also with video players ~ sadly, the few (free) alternatives I tried haven't worked.
Tomorrow
Nope. No luck. Even after double-checking that I have today's build: 64.0a1 (2018-09-13) (64-bit), setting both preferences to true, and restarting Nightly, testtube still says AV1 is not supported in my browser:


Maybe YouTube does not like that I'm running Windows 7?
It could be an FF thing; you can try it with Chrome 70 :confused:
Posted on Reply
#65
lexluthermiester
R0H1T
I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!
Yeah, that must be it, right? LOL! :kookoo:
hat
I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support?
Which is a silly notion, because all modern GPUs do.
hat
Beyond that though, you seemed to just be trolling about how awful Turing is... :laugh:
That's how it seemed to me... Maybe we're both idiots?
Posted on Reply
#66
hat
Enthusiast
lexluthermiester
That's how it seemed to me... Maybe we're both idiots?
I don't know about you, but I sure am! :kookoo::laugh:
Posted on Reply
#67
Renald
For a news item, this is a "heavy" one!!

Ok, I'm out already :D
Posted on Reply
#68
mtcn77
Can anyone tell me what is so hard about codec optimisation? It should be pretty obvious which parameters matter most with respect to accrued performance cost... It should have been even easier than Nvidia's AI driver update proposal.
Posted on Reply
#69
R-T-B
You know, when everyone is confused about what I said, I usually figure I probably said something confusing.

That's the only advice I got here.
Posted on Reply
#70
Tomorrow
Ok, so after today's Firefox Nightly update I was able to play in AV1. On my 2500K, CPU usage was in the 25-40% range and GPU usage 3-5% (GTX 1080). That was at 1080p.
Posted on Reply
#71
ZeDestructor
_A.T.Omix_
It's not only available in 480p; youtube-dl can download higher-resolution AV1 videos from those links.

MPV can do the playback; it stutters a bit even though my Ryzen 1700 is under 20% load (with VMware Workstation running VMs and other software in the background). Videos look like they've been sharpened.

code:
youtube-dl -f 399 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS

(to grab the videos in 1080p AV1)
I'm liking these increased bitrates, on a codec with better bandwidth efficiency no less!

TheGuruStud
Yes, you are exactly what I'm talking about. Current quality isn't even acceptable.

This isn't progress, it's status quo.

Streaming is a joke and this won't help.
I do agree that the current H.265/VP9 bitrates are terrible, though; upgrading to AV1 is most definitely not just keeping the status quo: AV1 is 25-50% better than HEVC/VP9 at the same bit rate (when under 25 Mbit/s), depending on the scene. With that in mind, not only is YouTube shipping AV1, they're shipping it at higher bitrates, so this is very much a large leap forward.

Also, you have to remember that Google isn't serving just you and your gigabit-everywhere connection; they're also serving third-world countries like India and Brazil, where things are quite a bit slower, never mind the massive transit bills YouTube as a whole generates just by existing at all!
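The 25-50% figure above can be turned into a rough back-of-the-envelope sketch (purely illustrative arithmetic, not a benchmark):

```python
# Rough sketch: what bitrate would AV1 need to match a given HEVC/VP9
# stream, assuming the 25-50% per-bit efficiency gain quoted above?

def av1_equivalent_bitrate(ref_bitrate_mbps: float, savings: float) -> float:
    """Bitrate (Mbit/s) AV1 needs to match a reference HEVC/VP9 stream,
    given a fractional saving (0.25 = 25%)."""
    return ref_bitrate_mbps * (1.0 - savings)

# A 10 Mbit/s HEVC/VP9 stream (well under the 25 Mbit/s threshold above):
for savings in (0.25, 0.50):
    needed = av1_equivalent_bitrate(10.0, savings)
    print(f"{savings:.0%} saving: ~{needed:.1f} Mbit/s of AV1")
# 25% saving: ~7.5 Mbit/s of AV1
# 50% saving: ~5.0 Mbit/s of AV1
```

Multiplied across every stream YouTube serves, even the low end of that range is an enormous transit-bill reduction.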

mtcn77
Can anyone tell me what is so hard about codec optimisation? It should be pretty obvious which parameters matter most with respect to accrued performance cost... It should have been even easier than Nvidia's AI driver update proposal.
Depends what you mean by that. If you mean designing the codec itself, that's a matter of optimising some very hard math, and all the ensuing tradeoffs. It's hard work and not many people are able to do it at all.

If you mean achieving good image quality while using the codec, that's a matter of balancing resolution, framerate, image quality, encoding time, and the bandwidth you can serve. On something like a Blu-ray you can really crank up the bitrate (FHD is 30-50 Mbit/s; UHD 4K is 50-120 Mbit/s, averaging 70-80 Mbit/s), but with something like web streaming or broadcast TV you're quite pressed by how much bandwidth you have and/or can serve to your users.
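For a sense of scale, those disc bitrates translate into storage with nothing more than size = bitrate x duration / 8 (a quick sketch; the bitrates are the mid-points of the ranges quoted above):

```python
# size (GB) = bitrate (Mbit/s) * duration (s) / 8 (bits per byte) / 1000 (MB per GB)
def stream_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    return bitrate_mbps * duration_min * 60.0 / 8.0 / 1000.0

# A two-hour film at typical Blu-ray average bitrates:
print(f"FHD @ 40 Mbit/s: {stream_size_gb(40, 120):.1f} GB")  # 36.0 GB
print(f"UHD @ 75 Mbit/s: {stream_size_gb(75, 120):.1f} GB")  # 67.5 GB
```

Numbers like those are fine on a disc or a NAS, but clearly not something you can push over a typical home connection, which is exactly why streaming services lean so hard on codec efficiency.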
Posted on Reply
#72
mtcn77
ZeDestructor
Depends what you mean by that. If you mean designing the codec itself, that's a matter of optimising some very hard math, and all the ensuing tradeoffs. It's hard work and not many people are able to do it at all.

If you mean achieving good image quality while using the codec, that's a matter of balancing resolution, framerate, image quality, encoding time, and the bandwidth you can serve. On something like a Blu-ray you can really crank up the bitrate (FHD is 30-50 Mbit/s; UHD 4K is 50-120 Mbit/s, averaging 70-80 Mbit/s), but with something like web streaming or broadcast TV you're quite pressed by how much bandwidth you have and/or can serve to your users.
Turns out that is indeed scheduled for development:
Content Aware Encoding (CAE), which uses machine learning to compare content against known parameters for a given device and/or media-player type, can boost picture quality and reduce distribution cost, and is proving to prolong the life of AVC.
I also recall these past codecs (HEVC and VP9) having set bitrates for a given quality level, so bandwidth-bound streaming will surely continue to be an issue.
This part is interesting as well;
“The AV1 reference encoder is [as of today] a hundred times slower than an HEVC one,” says Fautier. “That will likely improve in the hands of encoding vendors but I still expect the additional complexity of the encoder to be around ten times vs HEVC.”
[Source]
Posted on Reply
#73
ZeDestructor
mtcn77
Turns out that is indeed scheduled for development:
Back in the day, encoders would do it manually scene by scene. Neat that that's getting automated though, I do like.

mtcn77
I also recall these past codecs(HEVC and VP9) having set bitrates for a given quality level, so bandwidth bound streaming will sure continue as an issue.
Not quite. H.264, H.265, VP8 and VP9 all offer two modes ~ constant quality, and constant bitrate ~ with a separate "preset" that defines which sub-encoder to use (slower gives better quality per bit, at the cost of longer encoding times).

mtcn77
This part is interesting as well;
[Source]
Early days. H.265 and VP9 were really slow at the start too. In fact libvpx, the VP9 encoder, still only uses something like 2 or 4 threads to this day (IIRC). Ultimately, though, encoder performance isn't too much of a problem ~ decoder performance is, and that'll be solved when we get true fixed-function decode blocks in our GPUs, probably next year (remember, AV1 was only finalized a few months ago ~ too soon to go into any immediately upcoming GPU).
Posted on Reply
#74
mtcn77
ZeDestructor
Not quite. H264, H265, VP8 and VP9 other two modes: constant quality, and constant bitrate with a seperate "preset" that defines which sub-encoder to use (slower gives better quality per bit, at the cost of longer encoding times).
You drop from the channel stream unless you can dedicate the bandwidth, constant or otherwise. The quality levels don't do a good job of artifact clearance with respect to bandwidth use. Let's see if content awareness offers a solution.
Posted on Reply
#75
ZeDestructor
mtcn77
You drop from the channel stream unless you can dedicate the bandwidth, constant or otherwise. The quality levels don't do a good job of artifact clearance with respect to bandwidth use. Let's see if content awareness offers a solution.
Much of that comes from being too bandwidth-constrained. On something like Blu-ray, things look excellent (this is why I have a big ol' NAS...)
Posted on Reply
Add your own comment