Thursday, September 13th 2018

YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

YouTube began posting its first test videos encoded with the AV1 video CODEC, which aims to significantly reduce video stream bandwidth without sacrificing quality, exceeding even the compression efficiency of HEVC. AV1 provides an architecture for both moving and still images, and Google, which is partly funding its development, foresees a future in which it replaces entrenched standards such as JPEG and H.264. Besides better compression, its key USP is its royalty-free license, which could translate to tangible operating-cost savings for YouTube and other video streaming services.

YouTube developers posted this playlist with a selection of videos encoded in AV1. You may not notice a reduction in your data consumption just yet, because the first batch of videos has been encoded at a very high bitrate to test performance. Future playlists (which will pop up on the YouTube Developers channel) could test the CODEC's other, more important aspects, such as data savings. To watch them, and to test YouTube's AV1 player with them, you'll need either the Chrome 70 beta or the latest nightly build of Firefox (64.0a1), both of which pack AV1 support.
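If you want to check whether your browser will even attempt AV1 playback, the Media Source Extensions API can tell you. A minimal sketch (TypeScript; the codec string below is a typical AV1 example, not necessarily the exact string YouTube probes for):

code:
// Ask the browser whether it can play AV1 in an MP4 container.
// "av01.0.05M.08" = AV1, profile 0, level 3.1, Main tier, 8-bit depth.
const av1Type = 'video/mp4; codecs="av01.0.05M.08"';

if (MediaSource.isTypeSupported(av1Type)) {
  console.log("AV1 playback should work in this browser.");
} else {
  console.log("No AV1 support; YouTube will fall back to VP9 or AVC.");
}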

80 Comments on YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

#26
Tomorrow
Nope, no luck. Even after double-checking that I have today's build (64.0a1 (2018-09-13), 64-bit), setting both preferences to true, and restarting Nightly, TestTube still says AV1 is not supported in my browser:
AV1 decoding is not available on this browser yet. For full support, use Chrome 70 or newer, or Firefox 63 or newer with the media.av1.enabled pref set.
Maybe YouTube does not like that I'm running Windows 7?
#27
lexluthermiester
R0H1T
One of the (many) reasons why you can skip Turing.
That is easily one of the most absurd statements made lately. How, in any reasonable technological way, is this development a valid reason to skip RTX cards? It can't be for hardware optimizations, because even low-end ARM SoCs can render 10-bit UHD60 HEVC flawlessly. An RTX card will have zero trouble with it. Low-end GTX, GT, or RX cards have no problems running such. Even Intel's offerings have zero issues. So please explain your logic in a way that actually makes reasonable sense. My bet is that you cannot.

Tomorrow
Maybe YouTube does not like that I'm running Windows 7?
Doubtful. Might be a driver issue, though that is also unlikely. Keep in mind this is all in beta right now. Given time it'll be fixed.
#28
FordGT90Concept
"I go fast!1!11!1!"
So... I tried to compare the old codec and the new one, but if you right-click on the video and select "Stats for nerds," it shows "AVC1" for all videos... at least on Edge.

I think AVC is H.264, which makes sense because that's what my R9 390 can decode, and indeed it was, at about 10% load. So...???
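A more direct way to ask the browser what it can decode, and whether it expects hardware assistance, is the newer Media Capabilities API. A rough sketch (TypeScript; the stream parameters are illustrative guesses, not YouTube's actual encoding settings):

code:
// Query whether 1080p30 AV1 decode is supported, and whether the browser
// expects it to be smooth and power-efficient (a hint of hardware decode).
const config: MediaDecodingConfiguration = {
  type: "file",
  video: {
    contentType: 'video/mp4; codecs="av01.0.08M.08"', // profile 0, level 4.0, 8-bit
    width: 1920,
    height: 1080,
    bitrate: 4000000, // ~4 Mbps, illustrative
    framerate: 30,
  },
};

navigator.mediaCapabilities.decodingInfo(config).then((info) => {
  console.log(`supported=${info.supported}, smooth=${info.smooth}, powerEfficient=${info.powerEfficient}`);
});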
#29
_A.T.Omix_
It's not only available in 480p; youtube-dl can download higher-resolution AV1 videos from those links.

[screenshot: the available formats and their bitrates]

MPV can handle the playback, though it stutters a bit even with my Ryzen 1700 under 20% load (with VMware Workstation running VMs and other software in the background). The videos look like they have been sharpened.
code:
# youtube-dl -F lists the available formats; 399 is the 1080p AV1 stream:
youtube-dl -f 399 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS

This grabs the videos in 1080p with AV1.
#30
FordGT90Concept
"I go fast!1!11!1!"
av01 streams available:
425x240 (mp4 container)
640x360 (mp4 container)
854x480 (mp4 container)
1280x720 (mp4 container)
1920x1080 (mp4 container)

At 854x480 there's avc1 (21.22 MiB mp4), av01 (22.17 MiB mp4), and vp9 (18.06 MiB webm).
#31
TheGuruStud
I don't care about junk encoders. HIGHER bitrate, now. Youtube is such garbage that good channels upscale to get the higher bitrates.
#32
hat
Enthusiast
TheGuruStud
I don't care about junk encoders. HIGHER bitrate, now. Youtube is such garbage that good channels upscale to get the higher bitrates.
Why would you not want better codecs? Ramming things with bitrate is just silly. Smaller bitrate means the content is more easily accessible to people with less than stellar internet connections (be it plain low speed, data caps or both). It means being able to fit more content in the same storage space.
#33
TheGuruStud
hat
Why would you not want better codecs? Ramming things with bitrate is just silly. Smaller bitrate means the content is more easily accessible to people with less than stellar internet connections (be it plain low speed, data caps or both). It means being able to fit more content in the same storage space.
B/c they're not used responsibly. Image quality is clearly not a concern. If they're not going to maintain bitrate with newer codecs, then we don't need it.

And I don't care about the plebs. They can watch 360 garbage for all I care.
#34
hat
Enthusiast
Not used responsibly? Hmm... I will rip a Blu-ray disc and convert it to HEVC while cutting it down to 720p, using a high-ish CRF value (with CRF, higher numbers mean lower quality and better compression), which cuts down the bitrate, and therefore the resulting file size, considerably. I've even done DVD rips the same way (though I don't reduce the resolution of DVD video any), and my rips look perfect on a 50+ inch TV. Image quality is very much a concern, though, because I spent a very long time arriving at the process I use to transcode video to achieve the best results.

What about maintaining bitrate? If you have a 5000 kbps video, and a new codec can give you the same quality at half the bitrate, why not go with the smaller video? You can only push quality so far. There are a lot of "plebs" who would welcome a new codec that can chop the bitrate in half without sacrificing quality. Ask anybody with a data cap and/or slow internet.
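To put rough numbers on that, a back-of-envelope sketch (TypeScript; the bitrates and duration are illustrative, not any service's actual encoding ladder):

code:
// Approximate data transferred by a stream of a given bitrate and length.
function megabytes(bitrateKbps: number, seconds: number): number {
  return (bitrateKbps * 1000 * seconds) / 8 / 1e6;
}

console.log(megabytes(5000, 600)); // 10 minutes at 5000 kbps -> 375 MB
console.log(megabytes(2500, 600)); // same video at half the bitrate -> 187.5 MB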
#35
_A.T.Omix_
TheGuruStud
B/c they're not used responsibly. Image quality is clearly not a concern. If they're not going to maintain bitrate with newer codecs, then we don't need it.

And I don't care about the plebs. They can watch 360 garbage for all I care.
Look at the screenshot I posted: av01 @ 1080p has more than 2x the bitrate of avc1 and 3x the bitrate of VP9.
What's the matter, then?
#36
TheGuruStud
hat
Not used responsibly? Hmm... I will rip a Blu-ray disc and convert it to HEVC while cutting it down to 720p, using a high-ish CRF value (with CRF, higher numbers mean lower quality and better compression), which cuts down the bitrate, and therefore the resulting file size, considerably. I've even done DVD rips the same way (though I don't reduce the resolution of DVD video any), and my rips look perfect on a 50+ inch TV. Image quality is very much a concern, though, because I spent a very long time arriving at the process I use to transcode video to achieve the best results.

What about maintaining bitrate? If you have a 5000 kbps video, and a new codec can give you the same quality at half the bitrate, why not go with the smaller video? You can only push quality so far. There are a lot of "plebs" who would welcome a new codec that can chop the bitrate in half without sacrificing quality. Ask anybody with a data cap and/or slow internet.
Yes, you are exactly what I'm talking about. Current quality isn't even acceptable.

This isn't progress, it's status quo.

Streaming is a joke and this won't help.
#37
hat
Enthusiast
There's no discernible quality loss, to my eyes. Certainly not when actually watching something. Maybe it would show when dissecting it frame by frame (which I actually did when I noticed interlace artifacts left behind by HandBrake's decomb filter... a terrible method for handling interlaced media, but it was my first attempt as a n00b).

Current quality not acceptable? 25 Mb/s Blu-ray discs not clear enough? 4K not enough?
#38
TheGuruStud
hat
There's no discernible quality loss, to my eyes. Certainly not when actually watching something. Maybe it would show when dissecting it frame by frame (which I actually did when I noticed interlace artifacts left behind by HandBrake's decomb filter... a terrible method for handling interlaced media, but it was my first attempt as a n00b).

Current quality not acceptable? 25 Mb/s Blu-ray discs not clear enough? 4K not enough?
Youtube/streaming services, they're awful.
#39
hat
Enthusiast
Then advanced codecs are good for you. If Netflix adopts a superior codec, for example, they can put out the same quality video at lower bitrates (or better quality at the same bitrate).

Now YouTube is a whole 'nother mess. Anybody and their brother can upload to YouTube. If I started a YouTube channel, I can assure you my content would be high quality (at least as far as audio/video is concerned :p), because I don't like to produce garbage, and I know enough about that sort of thing to produce a quality result. There's no standard, no guidelines... if I wanted to, I could capture a video of a cool base I built in 7 Days to Die, for example, but upload it as a terrible 100 kbps 320x240 video. YouTube also goes back quite a few years; the TeamFourStar channel is a wonderful example of this. When they started DBZ Abridged, the audio sucked and the videos were small and of poor quality. As the years went on, the quality of their video (and audio) improved significantly.
#40
Prima.Vera
How does AV1 compare to H.265, from both a quality and a size perspective?
#41
FordGT90Concept
"I go fast!1!11!1!"
AV1 is supposed to be better (in picture quality and compression) than HEVC (H.265), but HEVC is likely going to be used for ATSC 3.0 (because the MPEG lobby wants their money), so like AVC before it, HEVC is likely to be much more popular. AV1 is something Google is pushing to replace VP9 (if memory serves), so Google services will likely gravitate toward AV1. Because AV1 is royalty-free, it is also likely to see more widespread adoption in the open-source community. Going forward, there's gonna be a lot of both, sadly.
#42
hat
Enthusiast
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
#43
lexluthermiester
hat
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
Exactly. AV1 seems like it's actually going to be better, spec-wise, and save everyone a ton of money.
#44
FordGT90Concept
"I go fast!1!11!1!"
hat
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
The primary reason is that HEVC tolerates more packet loss than AV1.
#45
hat
Enthusiast
FordGT90Concept
The primary reason is that HEVC tolerates more packet loss than AV1.
That seems silly. Even streaming is not an application that relies on internet performance like that. That's why buffering exists: it loads the content ahead of time to ensure smooth playback.
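For what it's worth, the buffer level is easy to inspect from script. A small sketch (TypeScript, assuming a page with a <video> element; it takes the last buffered range as an approximation):

code:
// How many seconds of content are buffered ahead of the playhead.
const video = document.querySelector("video");
if (video && video.buffered.length > 0) {
  const ahead = video.buffered.end(video.buffered.length - 1) - video.currentTime;
  console.log(`${ahead.toFixed(1)} s buffered ahead of the playhead`);
}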
#46
FordGT90Concept
"I go fast!1!11!1!"
ATSC is intended for terrestrial broadcast where it's impossible to request resending of lost packets.
#47
R0H1T
lexluthermiester
That is easily one of the most absurd statements made lately. How, in any reasonable technological way, is this development a valid reason to skip RTX cards? It can't be for hardware optimizations, because even low-end ARM SoCs can render 10-bit UHD60 HEVC flawlessly. An RTX card will have zero trouble with it. Low-end GTX, GT, or RX cards have no problems running such. Even Intel's offerings have zero issues. So please explain your logic in a way that actually makes reasonable sense. My bet is that you cannot.
So you're not counting the absurd pricing, the RT botch job & of course the Nvidia tax?

If it doesn't have dedicated hardware (or hybrid) encode/decode blocks for AV1, then how do you suppose the GPU will handle it? AFAIK the CPU will do most, if not all, of the decode/encode for AV1, unless my understanding of this matter is lacking.

What logic is that ~ HEVC was late, more expensive & still not nearly as popular (as it should've been) wrt H.264?

Remind me why a freer/better alternative to HEVC is bad ~ because that's what my argument is!
#48
hat
Enthusiast
FordGT90Concept
ATSC is intended for terrestrial broadcast where it's impossible to request resending of lost packets.
You mean like Cable TV? Sounds like someone needs to rework that a little bit.
#49
FordGT90Concept
"I go fast!1!11!1!"
hat
You mean like Cable TV? Sounds like someone needs to rework that a little bit.
Cable, satellite, and terrestrial all carry the same payload, but terrestrial has no encryption, whereas cable and satellite do. In the case of all three, there's no way to request lost packets by design, so the encoding has some recovery capability baked in.
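To illustrate "recovery baked in," here is a toy sketch of forward error correction via XOR parity (TypeScript; real broadcast systems use far stronger codes such as LDPC and BCH, so this shows only the principle):

code:
// Build one parity packet as the XOR of a group of equal-length packets.
function xorParity(packets: Uint8Array[]): Uint8Array {
  const parity = new Uint8Array(packets[0].length);
  for (const p of packets) {
    for (let i = 0; i < p.length; i++) parity[i] ^= p[i];
  }
  return parity;
}

// If exactly one packet in the group is lost, XOR-ing the survivors with
// the parity packet reconstructs the lost one, with no retransmission.
function recoverLost(survivors: Uint8Array[], parity: Uint8Array): Uint8Array {
  return xorParity(survivors.concat([parity]));
}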
#50
lexluthermiester
R0H1T
So you're not counting the absurd pricing, the RT botch job & of course the Nvidia tax?
Take your nonsense elsewhere. We're not talking about the price of hardware at all.