
YouTube Begins Beta-testing AV1 CODEC on Beta Web-browsers

Take your fanboyism elsewhere. We're not talking about the price of hardware at all.
What? First of all you accused me of making absurd arguments about skipping RTX! Now that I've posted more than a single reason, you're accusing me of fanboyism. FYI, my last 2 GPUs were Nvidia!
Maybe take your penchant for a fight elsewhere? When did price not factor into a purchase decision, for a GPU or a CPU? Or does everyone on this forum get hardware for free :rolleyes:
 
What? First of all you accused me of making absurd arguments about skipping RTX! Now that I've posted more than a single argument, you're accusing me of fanboyism. FYI, my last 2 GPUs were Nvidia!
Maybe take your penchant for a fight elsewhere? When did price not factor into a purchase decision, for a GPU or a CPU? Or does everyone on this forum get hardware for free :rolleyes:
Because we're talking about a codec that runs on all platforms, not just PC, so price isn't part of the argument. Stop making such silly comments and you won't be called out on them.
 
Because we're talking about a codec that runs on all platforms, not just PC, so price isn't part of the argument. Stop making such silly comments and you won't be called out on them.
No, I was talking about one of the (many) reasons why I would skip RTX, and then you're telling me that the reason(s) are absurd? How about you stop telling me why it's wrong, for me, to skip Turing? As for being platform agnostic, where does that fit into what you're replying to? H.264, HEVC, VP9, VP8 et al. are available on multiple platforms; you just have to deal with software decoding in each case.
 
No, I was talking about one of the (many) reasons why I would skip RTX, and then you're telling me that the reason(s) are absurd?
Not the reasons, the comment. You implied with your original comment that RTX might have problems running the codec (which is wildly absurd), to which I responded. In the context of the discussion about this new codec, none of us care or need to know if you think it's a good reason to skip RTX.
As for being platform agnostic, where does that fit into what you're replying to? H.264, HEVC, VP9, VP8 et al. are available on multiple platforms; you just have to deal with software decoding in each case.
Because software decoding only happens in the absence of hardware support. My point was that even low-end ARM SoCs can and do decode in software without flaw. Therefore ANY modern CPU/GPU can do software decoding. Your original points are pointless.
 
Not the reasons, the comment. You implied with your original comment that RTX might have problems running the codec (which is wildly absurd), to which I responded. In the context of the discussion about this new codec, none of us care or need to know if you think it's a good reason to skip RTX.

Because software decoding only happens in the absence of hardware support. My point was that even low-end ARM SoCs can and do decode in software without flaw. Therefore ANY modern CPU/GPU can do software decoding.
Your original points are pointless.
No, I didn't; I explained exactly what I meant on the first page.

Your arguments are noted; however, any exceptions taken to my posts will be handled by the mods. I don't care what others think about my posts.

That's called hybrid decoding, for instance HEVC in the case of the GTX 750/Ti, while pure software decoding is always done by the CPU AFAIK.

I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!
 
Software decoding...is the last resort. The higher the resolution gets, the less likely software can do it in real time.
 
No, I didn't; I explained exactly what I meant on the first page.

Your arguments are noted; however, any exceptions taken to my posts will be handled by the mods. I don't care what others think about my posts.

I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!

I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support? Beyond that though, you seemed to just be trolling about how awful Turing is... :laugh:

Software decoding...is the last resort. The higher the resolution gets, the less likely software can do it in real time.

You'd need one hell of a CPU. I wouldn't expect my i5 2400 to handle HEVC beyond 1080p24 (and even that would be flaky).
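For anyone who wants to check rather than guess, a rough way to gauge whether a given CPU can software-decode a clip in real time is to let ffmpeg decode it and throw the output away (the file name here is just a placeholder, and this assumes ffmpeg is installed):
Code:
# decode only, discard the output; -benchmark prints CPU time and peak memory at the end
ffmpeg -benchmark -i sample_2160p_hevc.mkv -f null -
If the "speed=" readout stays above 1x, the CPU keeps up; below that, real playback would stutter or drop frames.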
 
I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support? Beyond that though, you seemed to just be trolling about how awful Turing is... :laugh:



You'd need one hell of a CPU. I wouldn't expect my i5 2400 to handle HEVC beyond 1080p24 (and even that would be flaky).
Just trying to lay out the rest of the reasons, since the other poster is calling me a fanboy!

Yes, no GPU can do "software" decoding without at least partial "hardware" support for something like AV1, or HEVC before it.

https://www.tomshardware.com/review...d-app-nvidia-cuda-intel-quicksync,2839-6.html
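If anyone wants to see what their own setup exposes, ffmpeg can list the hardware acceleration back-ends and GPU-assisted decoders it was built with (the exact names vary per build, so treat this as a sketch; the grep pipe assumes a Unix-style shell):
Code:
# list available hardware acceleration methods (dxva2, cuvid, vaapi, qsv, ...)
ffmpeg -hwaccels

# list GPU-assisted decoders compiled into this build, e.g. hevc_cuvid on NVIDIA
ffmpeg -decoders | grep -iE "cuvid|qsv"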
 
AMD, NVIDIA, and Intel will likely bake AV1 into their hardware decoders soonish (Navi could have it, for example).
 
AMD, NVIDIA, and Intel will likely bake AV1 into their hardware decoders soonish (Navi could have it, for example).
You know, it would really upset me if suddenly there's all this content flying around on a new codec when HEVC and others are still relatively new, and my fast and expensive GTX 1070 can't decode it. How hard would it be to "brute force" it with CUDA or something?
 
My R9 390 can't do HEVC, so what I watch and record is AVC (even on YouTube, as my first post in this thread indicated). It really doesn't matter what codec is used so long as it can be decoded in real time.

Why would you bother trying to brute force it? YouTube just put this playlist up for developers to easily test their codecs.
 
I don't mean now, I mean if/when AV1 content actually appears. No way it's gonna run on the CPU...
 
It'll be the same as now, they'll keep supporting VP9, HEVC, and AVC for backwards compatibility. They use the most efficient one the device supports to cut down on their bandwidth use.
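You can actually see that fallback ladder for yourself: asking youtube-dl to list the formats of one of the test videos shows the same clip offered as AVC (avc1), VP9, and AV1 (av01) at multiple resolutions. A sketch, using the beta playlist from the article:
Code:
# list every format YouTube serves for the first video in the AV1 beta playlist
youtube-dl -F --playlist-items 1 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS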
 
How's the CPU usage looking?
On a quad-core (8-thread) Intel ULV laptop the usage generally stays below 50%, even for a full HD stream. I'm still trying all the videos at all resolutions, also with video players ~ sadly the few (free) alternatives I tried haven't worked.
Nope, no luck. Even after double-checking that I have today's build, 64.0a1 (2018-09-13) (64-bit), setting both preferences to true, and restarting Nightly, TestTube still says AV1 is not supported on my browser.


Maybe YouTube doesn't like that I'm running Windows 7?
It could be an FF thing; you can try it with Chrome 70 :confused:
 
I'll stop right here, but I'd like to remind you again not to take forums too personally ~ which, it seems, you're doing!
Yeah, that must be it, right? LOL! :kookoo:
I gotta say, you've got me confused here as well. I understood the original point, which seemed to be Turing lacking hardware AV1 support?
Which is a silly notion because all modern GPUs do.
Beyond that though, you seemed to just be trolling about how awful Turing is... :laugh:
That's how it seemed to me... Maybe we're both idiots?
 
Can anyone tell me what is so hard about codec optimisation? It should be pretty obvious which parameters matter most with respect to the accrued performance cost... It should have been even easier than Nvidia's AI driver update proposal.
 
You know, when everyone is confused about what I said, I usually figure I probably said something confusing.

That's the only advice I got here.
 
OK, so after today's Firefox Nightly update I was able to play AV1. On my 2500K, CPU usage was in the 25-40% range and GPU usage was 3-5% (GTX 1080). That was at 1080p.
 
It's not only available in 480p; youtube-dl can download higher-resolution AV1 videos from those links.



MPV can do the playback, but it stutters a bit even though my Ryzen 1700 is under 20% load (with VMware Workstation running VMs and other software in the background). The videos look like they have been sharpened.



Code:
youtube-dl -f 399 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS
This grabs the videos in 1080p AV1.
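Worth noting that itag 399 is a video-only stream. If memory serves, pairing it with the AAC audio track should work along these lines (youtube-dl needs ffmpeg on the PATH to merge the two):
Code:
# 399 = 1080p AV1 video only, 140 = AAC audio; the "+" tells youtube-dl to mux them together
youtube-dl -f 399+140 https://www.youtube.com/playlist?list=PLyqf6gJt7KuHBmeVzZteZUlNUQAVLwrZS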

I'm liking these increased bitrates, on a codec with better bandwidth efficiency no less!

Yes, you are exactly what I'm talking about. Current quality isn't even acceptable.

This isn't progress, it's status quo.

Streaming is a joke and this won't help.

I do agree that the current H.265/VP9 bitrates are terrible, but upgrading to AV1 is most definitely not just keeping the status quo: AV1 is 25-50% better than HEVC/VP9 at the same bitrate (when under 25 Mbit/s), depending on the scene. With that in mind, not only is YouTube shipping AV1, they're shipping it at higher bitrates, so this is very much a large leap forward.

Also, you have to remember Google isn't serving just you and your gigabit-everywhere connection; they're serving third-world countries like India and Brazil where things are quite a bit slower, never mind the massive transit bills YouTube as a whole generates just by existing at all!
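If anyone wants to sanity-check those efficiency claims at home, a crude way is to encode the same clip with both encoders at a roughly comparable quality target and compare file sizes (file names are placeholders, the CRF values are not directly equivalent between encoders, and a 2018-era ffmpeg still wants the experimental flag for libaom):
Code:
# HEVC at a constant-quality target
ffmpeg -i clip.y4m -an -c:v libx265 -crf 28 hevc_test.mkv

# AV1 via libaom (very slow at this point in time)
ffmpeg -i clip.y4m -an -c:v libaom-av1 -crf 30 -b:v 0 -strict experimental av1_test.mkv

# compare the resulting sizes
ls -lh hevc_test.mkv av1_test.mkv
It's only an eyeball test, but it gives a feel for the kind of bitrate savings people are quoting.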

Can anyone tell me what is so hard about codec optimisation? It should be pretty obvious which parameters matter most with respect to the accrued performance cost... It should have been even easier than Nvidia's AI driver update proposal.

Depends what you mean by that. If you mean designing the codec itself, that's a matter of optimising some very hard math, and all the ensuing tradeoffs. It's hard work and not many people are able to do it at all.

If you mean achieving good image quality while using the codec, that's a matter of balancing resolution, framerate, image quality, encoding time, and the bandwidth you can serve. On something like a Blu-ray you can really crank up the bitrate (FHD is 30-50 Mbit/s, UHD 4K is 50-120 Mbit/s, averaging 70-80 Mbit/s), but for something like web streaming or broadcast TV you get quite squeezed by how much bandwidth you have and/or can serve to your users.
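To put those numbers in perspective, here's the back-of-the-envelope size math for a two-hour film (pure arithmetic, nothing codec-specific):
Code:
# GB ~= Mbit/s x seconds / 8 (bits -> bytes) / 1000 (MB -> GB)
echo "scale=1; 70 * 7200 / 8 / 1000" | bc   # ~63 GB at a 70 Mbit/s UHD Blu-ray average
echo "scale=1; 5 * 7200 / 8 / 1000" | bc    # ~4.5 GB at a typical 5 Mbit/s web stream
That gap is a big part of why streaming services lean so hard on more efficient codecs.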
 
Depends what you mean by that. If you mean designing the codec itself, that's a matter of optimising some very hard math, and all the ensuing tradeoffs. It's hard work and not many people are able to do it at all.

If you mean achieving good image quality while using the codec, that's a matter of balancing resolution, framerate, image quality, encoding time, and the bandwidth you can serve. On something like a Blu-ray you can really crank up the bitrate (FHD is 30-50 Mbit/s, UHD 4K is 50-120 Mbit/s, averaging 70-80 Mbit/s), but for something like web streaming or broadcast TV you get quite squeezed by how much bandwidth you have and/or can serve to your users.
Turns out that is indeed scheduled for development:
Content Aware Encoding (CAE), which uses machine learning to compare content against known parameters for a given device and/or media player type, can boost picture quality and reduce distribution cost, and is proving to prolong the life of AVC.
I also recall these past codecs (HEVC and VP9) having set bitrates for a given quality level, so bandwidth-bound streaming will surely continue to be an issue.
This part is interesting as well:
“The AV1 reference encoder is [as of today] a hundred times slower than an HEVC one,” says Fautier. “That will likely improve in the hands of encoding vendors but I still expect the additional complexity of the encoder to be around ten times vs HEVC.”
[Source]
 
Turns out that is indeed scheduled for development:

Back in the day, encoders would do it manually, scene by scene. Neat that that's getting automated though; I do like it.

I also recall these past codecs (HEVC and VP9) having set bitrates for a given quality level, so bandwidth-bound streaming will surely continue to be an issue.

Not quite. H.264, H.265, VP8, and VP9 offer two modes: constant quality and constant bitrate, with a separate "preset" that defines which sub-encoder to use (slower gives better quality per bit, at the cost of longer encoding times).
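In x264 terms, the two modes look something like this (a sketch with made-up targets):
Code:
# constant quality: pick a quality level, let the bitrate float
ffmpeg -i in.mkv -c:v libx264 -preset slow -crf 20 out_cq.mp4

# constrained bitrate: pin the average (and cap the peaks) to fit a bandwidth budget
ffmpeg -i in.mkv -c:v libx264 -preset slow -b:v 5M -maxrate 5M -bufsize 10M out_cbr.mp4
The -preset knob is the "sub-encoder" choice being described: veryfast gets frames out quickly, while slow/veryslow squeeze more quality out of every bit.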

This part is interesting as well:
[Source]

Early days. H.265 and VP9 were really slow at the start. In fact, libvpx, the VP9 encoder, still only uses something like 2 or 4 threads to this day (IIRC). Ultimately though, encoder performance isn't too much of a problem. Decoder performance is, and that'll be solved when we get true fixed-function decode blocks in our GPUs, probably next year (remember, AV1 was only finalized a few months ago... too soon to go into any immediately upcoming GPU).
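For what it's worth, the reference encoder already exposes a speed/quality dial through ffmpeg's libaom wrapper; something like this (settings are just an illustration, and ffmpeg still wants the experimental flag at this point) trades some compression efficiency for a large speed-up:
Code:
# -cpu-used: higher values are faster, lower values compress better;
# the "100x slower than HEVC" numbers quoted above presumably come from the slow end of this dial
ffmpeg -i in.y4m -c:v libaom-av1 -b:v 2M -cpu-used 6 -strict experimental out_fast.mkv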
 
Not quite. H.264, H.265, VP8, and VP9 offer two modes: constant quality and constant bitrate, with a separate "preset" that defines which sub-encoder to use (slower gives better quality per bit, at the cost of longer encoding times).
You drop from the channel stream unless you can dedicate the bandwidth, constant or otherwise. The quality levels don't do a good job of clearing artifacts relative to the bandwidth they use. Let's see if content awareness offers a solution.
 