YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

There are a couple of settings you need to change in about:config & then choose AV1 at youtube.com/testtube :cool:
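For anyone hunting for the prefs: the only one I can name with confidence is the one YouTube's own error message mentions, media.av1.enabled; the second AV1-related pref seems to vary by build, so searching about:config for "av1" is the safer bet. In user.js form it would look something like:
Code:
// user.js sketch, Firefox 63+/Nightly: enable AV1 decoding (pref name taken from YouTube's own error message)
user_pref("media.av1.enabled", true);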

I chose Always Prefer AV1 & that enabled AV1 at 480p & resolutions above.

How's the CPU usage looking?
 
Nope, no luck. Even after double-checking that I have today's build, 64.0a1 (2018-09-13) (64-bit), setting both preferences to true, and restarting Nightly, testtube still says AV1 is not supported in my browser:
AV1 decoding is not available on this browser yet. For full support, use Chrome 70 or newer, or Firefox 63 or newer with the media.av1.enabled pref set.

Maybe YouTube does not like that I'm running Windows 7?
 
One of the (many) reasons why you can skip Turing.
That is easily one of the most absurd statements made lately. How, in any reasonable technological way, is this development a valid reason to skip RTX cards? It can't be for hardware optimizations, because even low-end ARM SoCs can decode 10-bit UHD60 HEVC flawlessly. An RTX card will have zero trouble with it. Low-end GTX, GT, or RX cards have no problems either. Even Intel's offerings have zero issues. So please explain your logic in a way that actually makes reasonable sense. My bet is that you cannot.

Maybe YouTube does not like that I'm running Windows 7?
Doubtful. Might be a driver issue, though that is also unlikely. Keep in mind this is all in beta right now. Given time it'll be fixed.
 
So... I tried to compare the old codec and the new one, but if you right-click on the video and select "stats for nerds", it shows "AVC1" for all videos... at least on Edge.

I think AVC is H.264, which makes sense because that's what my R9 390 can decode, and it was decoding it, at about 10% load. So...???
 
It's not only available in 480p; youtube-dl can download higher-resolution AV1 videos from those links.

MPV can handle the playback, though it stutters a bit even with my Ryzen 1700 under 20% load (with VMware Workstation running VMs and other software in the background). The videos look like they have been sharpened.

To grab the videos in 1080p with AV1:
Code:
youtube-dl -f 399 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS
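If I'm reading the format list right, 399 is the video-only DASH stream, so to end up with a file that also has audio you'd merge it with an audio track (youtube-dl hands the muxing to ffmpeg), something like:
Code:
# hypothetical variant: grab the AV1 1080p video plus the best available audio and mux them together
youtube-dl -f 399+bestaudio https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS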
 
av01 streams available:
425x240 (mp4 container)
640x360 (mp4 container)
854x480 (mp4 container)
1280x720 (mp4 container)
1920x1080 (mp4 container)


854x480 has avc1 (21.22 MiB mp4), av01 (22.17 MiB mp4), and vp9 (18.06 MiB webm)
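For anyone who wants to pull a breakdown like this themselves, youtube-dl can print every available format for a video without downloading anything; the URL below is just a placeholder for whichever video you're checking:
Code:
# -F (--list-formats) lists the format code, codec, resolution, container and size for each stream
youtube-dl -F https://www.youtube.com/watch?v=VIDEO_ID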
 
I don't care about junk encoders. HIGHER bitrate, now. Youtube is such garbage that good channels upscale to get the higher bitrates.
 
I don't care about junk encoders. HIGHER bitrate, now. Youtube is such garbage that good channels upscale to get the higher bitrates.
Why would you not want better codecs? Ramming things with bitrate is just silly. Smaller bitrate means the content is more easily accessible to people with less than stellar internet connections (be it plain low speed, data caps or both). It means being able to fit more content in the same storage space.
 
Why would you not want better codecs? Ramming things with bitrate is just silly. Smaller bitrate means the content is more easily accessible to people with less than stellar internet connections (be it plain low speed, data caps or both). It means being able to fit more content in the same storage space.

B/c they're not used responsibly. Image quality is clearly not a concern. If they're not going to maintain bitrate with newer codecs, then we don't need it.

And I don't care about the plebs. They can watch 360p garbage for all I care.
 
Not used responsibly? Hmm... I will rip a Blu-ray disc and convert it to HEVC while cutting it down to 720p with a CRF value on the lower-quality side (that is, a higher CRF number, which compresses harder), which cuts down the bitrate, and therefore the resulting file size, considerably. I've even done DVD rips the same way (though I don't reduce the resolution of DVD video at all) and my rips look perfect on a 50+ inch TV. Image quality is very much a concern, though, because I spent a very long time arriving at the process I use to transcode video to achieve the best results.
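I do this through Handbrake, but as a rough command-line sketch of the same idea (the file names, CRF number, and preset below are placeholders, not my actual settings):
Code:
# downscale to 720p, encode with x265 in constant-quality (CRF) mode, and copy the audio untouched
ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx265 -crf 24 -preset slow -c:a copy output-720p-hevc.mkv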

What about maintaining bitrate? If you have a 5000kbps video, and you have a new codec that can give you the same quality at half the bitrate, why not go with a smaller video? You can only push quality so much. There are a lot of "plebs" who would welcome a new codec that can chop bitrate in half without sacrificing quality. Ask anybody with a data cap and/or slow internet.
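Rough numbers for a two-hour video, just to put that in perspective:
Code:
5000 kbps x 7200 s = 36,000,000 kbit ≈ 4.5 GB
2500 kbps x 7200 s = 18,000,000 kbit ≈ 2.25 GB  (same quality, half the download)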
 
B/c they're not used responsibly. Image quality is clearly not a concern. If they're not going to maintain bitrate with newer codecs, then we don't need it.

And I don't care about the plebs. They can watch 360p garbage for all I care.

Look at the screenshot I posted: av01 @ 1080p has more than 2x the bitrate of avc1 and 3x the bitrate of VP9.
What's the matter then?
 
Not used responsibly? Hmm... I will rip a Blu-ray disc and convert it to HEVC while cutting it down to 720p with a CRF value on the lower-quality side (that is, a higher CRF number, which compresses harder), which cuts down the bitrate, and therefore the resulting file size, considerably. I've even done DVD rips the same way (though I don't reduce the resolution of DVD video at all) and my rips look perfect on a 50+ inch TV. Image quality is very much a concern, though, because I spent a very long time arriving at the process I use to transcode video to achieve the best results.

What about maintaining bitrate? If you have a 5000kbps video, and you have a new codec that can give you the same quality at half the bitrate, why not go with a smaller video? You can only push quality so much. There are a lot of "plebs" who would welcome a new codec that can chop bitrate in half without sacrificing quality. Ask anybody with a data cap and/or slow internet.

Yes, you are exactly what I'm talking about. Current quality isn't even acceptable.

This isn't progress, it's the status quo.

Streaming is a joke and this won't help.
 
There's no discernible quality loss, to my eyes. Certainly not when actually watching something. Maybe you'd find something dissecting it frame by frame (which I actually did when I noticed interlace artifacts left behind by Handbrake's decomb filter... a terrible method for handling interlaced media, but it was my first attempt as a n00b).

Current quality not acceptable? 25 Mb/s Blu-ray discs not clear enough? 4K not enough?
 
There's no discernible quality loss, to my eyes. Certainly not when actually watching something. Maybe you'd find something dissecting it frame by frame (which I actually did when I noticed interlace artifacts left behind by Handbrake's decomb filter... a terrible method for handling interlaced media, but it was my first attempt as a n00b).

Current quality not acceptable? 25 Mb/s Blu-ray discs not clear enough? 4K not enough?

Youtube/streaming services, they're awful.
 
Then advanced codecs are good for you. If Netflix adopts a superior codec, for example, they can put out higher quality video, and they can do so at lower bitrates (or even better quality at the same bitrate).

Now YouTube is a whole other mess. Anybody and their brother can upload to YouTube. If I started a YouTube channel, I can assure you my content would be high quality (at least as far as audio/video is concerned :p), because I don't like to produce garbage, and I know enough about that sort of thing to produce a quality result. There's no standard, no guidelines... if I wanted to, I could capture a video of a cool base I built in 7 Days to Die, for example, but upload a terrible 100kbps 320x240 video. YouTube also goes back quite a few years. The TeamFourStar channel is a wonderful example of this. When they started DBZ Abridged, the audio sucked, and the videos were small and of poor quality. As the years went on, the quality of their video (and audio) improved significantly.
 
How does AV1 compare to H.265, from both a quality and a size perspective?
 
AV1 is supposed to be better (in picture quality and compression) than HEVC (H.265), but HEVC is likely going to be used for ATSC 3.0 (because the MPEG lobby wants its money), so like AVC, HEVC is likely to be much more popular. AV1 is something Google is pushing to replace VP9 (if memory serves), so Google services will likely gravitate towards AV1. Because AV1 is royalty-free, it is also likely to see more widespread adoption in the open source community. Going forward, there's gonna be a lot of both, sadly.
 
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
 
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
Exactly. AV1 seems like it's actually going to be better, spec-wise, and save everyone a ton of money.
 
Not sure why we wouldn't move forward with AV1. Why use a worse product that costs money when there's a free alternative?
The primary reason is that HEVC tolerates packet loss better than AV1.
 
The primary reason is that HEVC tolerates packet loss better than AV1.
That seems silly. Even streaming is not an application that relies on internet performance like that. That's why buffering exists - it loads the content ahead of time to ensure smooth play.
 
ATSC is intended for terrestrial broadcast where it's impossible to request resending of lost packets.
 
That is easily one of the most absurd statements made lately. How, in any reasonable technological way, is this development a valid reason to skip RTX cards? It can't be for hardware optimizations, because even low-end ARM SoCs can decode 10-bit UHD60 HEVC flawlessly. An RTX card will have zero trouble with it. Low-end GTX, GT, or RX cards have no problems either. Even Intel's offerings have zero issues. So please explain your logic in a way that actually makes reasonable sense. My bet is that you cannot.


Doubtful. Might be a driver issue, though that is also unlikely. Keep in mind this is all in beta right now. Given time it'll be fixed.
So you're not counting the absurd pricing, the RT botch job & of course the Nvidia tax?

If it doesn't have dedicated hardware (or hybrid) encode/decode blocks for AV1, then how do you suppose the GPU will handle it? AFAIK the CPU will do most, if not all, of the AV1 decode/encode, unless my understanding of this matter is lacking.

What logic is that ~ HEVC was late, more expensive, & still not nearly as popular as it should've been compared with H.264?

Remind me why a freer/better alternative to HEVC is bad ~ because that's what my argument is!
 
ATSC is intended for terrestrial broadcast where it's impossible to request resending of lost packets.
You mean like Cable TV? Sounds like someone needs to rework that a little bit.
 
You mean like Cable TV? Sounds like someone needs to rework that a little bit.
Cable, satellite, and terrestrial all carry the same payload, but terrestrial has no encryption while cable and satellite do. In the case of all three, there's no way to request lost packets by design, so the encoding has some recovery capability baked in.
 