
NVIDIA Updates Video Encode and Decode Matrix with Reference to Ampere GPUs

AleksandarK
News Editor
Staff member
NVIDIA has today updated its video encode and decode matrix with references to the latest Ampere GPU family. The video encode/decode matrix is a table of the video encoding and decoding standards supported by different NVIDIA GPUs. The matrix reaches back to the Maxwell generation of NVIDIA graphics cards, showing which video codecs each generation supports. It is a useful reference tool, as customers can check whether their existing or upcoming GPUs support a specific codec standard for video playback. The latest update adds the Ampere GPUs to the matrix.

For example, the table shows that, while supporting all encoding standards of the previous generations, Ampere-based GPUs feature support for the HEVC B-frame standard. For decoding, the Ampere lineup now includes support for AV1 in 8-bit and 10-bit formats, while also supporting all previous-generation formats. For a more detailed look at the table, please go to NVIDIA's website here.
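As a quick cross-check of the matrix against your own hardware, here is a minimal sketch (assuming an ffmpeg build with NVENC/NVDEC support on your PATH; the helper name is just for illustration) that lists the NVIDIA hardware encoders and decoders your local ffmpeg exposes:

```python
# Minimal sketch: list NVIDIA hardware encoders/decoders exposed by the local
# ffmpeg build. Assumes ffmpeg is on PATH and was built with NVENC/NVDEC.
import subprocess

def list_nvidia_codecs(kind: str) -> list[str]:
    """kind is '-encoders' or '-decoders'."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", kind],
        capture_output=True, text=True, check=True,
    ).stdout
    # NVENC encoders are named like 'h264_nvenc'; NVDEC decoders like 'av1_cuvid'.
    return [line.split()[1] for line in out.splitlines()
            if "_nvenc" in line or "_cuvid" in line]

print("Encoders:", list_nvidia_codecs("-encoders"))
print("Decoders:", list_nvidia_codecs("-decoders"))
```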


 
nGreedia really dropped the ball on this. No excuse for not improving the hardware encoding of NVENC, as it's pretty awful for any serious use.
 
Microsoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
 
Microsoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.

It is odd to see this shift in leadership, slowly but surely, manifest. I mean, it's just a video codec... but the writing's on the wall.

Fascinating, too.
 
I wonder when AV1 hardware encoding will come to consumers. Hopper?
Tell me honestly, do you really want that? NVENC is good for streaming, but CPU (software) encoding still produces smaller files than NVENC (hardware) encoding ever will. At least it did when I last used x265.
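If you want to test that tradeoff on your own material, here is a rough sketch (file names and quality settings are placeholders, and NVENC's -cq scale is not strictly comparable to x265's CRF) that encodes the same clip both ways and compares wall time and file size:

```python
# Rough sketch: NVENC HEVC vs. software x265 on the same clip, comparing wall
# time and output size. Assumes ffmpeg with hevc_nvenc and libx265 support;
# the input file name is a placeholder.
import os
import subprocess
import time

CLIP = "sample_1080p.mp4"  # placeholder input

def encode(codec_args, outfile):
    t0 = time.perf_counter()
    subprocess.run(["ffmpeg", "-y", "-i", CLIP, *codec_args, outfile], check=True)
    return time.perf_counter() - t0, os.path.getsize(outfile)

nv_time, nv_size = encode(["-c:v", "hevc_nvenc", "-preset", "slow", "-cq", "28"],
                          "out_nvenc.mkv")
sw_time, sw_size = encode(["-c:v", "libx265", "-crf", "28"], "out_x265.mkv")
print(f"NVENC: {nv_time:.0f} s, {nv_size / 1e6:.0f} MB")
print(f"x265:  {sw_time:.0f} s, {sw_size / 1e6:.0f} MB")
```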
 
Microsoft Confirms...
AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
I am not even going to look for a source and just say this is decoding support, not encoding.
 
Wow.
Still no 10-bit 4:2:2 hardware decoding.
So f*kin BRILLIANT.
Just curious, what exactly comes in 4:2:2, 10-bit or not? Some higher-end video cameras are what come to mind, anything else?
Movies/series from Blu-ray or practically all streaming services should be 4:2:0, and RGB/no subsampling is 4:4:4.
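To put rough numbers on the difference between those layouts, here is a toy calculation (my own illustration, not from the matrix) of raw, uncompressed bandwidth per subsampling scheme at 4K30, 10-bit:

```python
# Toy calculation: raw (uncompressed) video bandwidth for each chroma layout.
# 4:4:4 keeps 3 samples per pixel, 4:2:2 averages 2, 4:2:0 averages 1.5.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def raw_mbps(width, height, fps, layout, bit_depth):
    bits_per_pixel = SAMPLES_PER_PIXEL[layout] * bit_depth
    return width * height * fps * bits_per_pixel / 1e6

for layout in SAMPLES_PER_PIXEL:
    print(f"{layout}: {raw_mbps(3840, 2160, 30, layout, 10):,.0f} Mbps raw")
# 4:4:4: 7,465 Mbps; 4:2:2: 4,977 Mbps; 4:2:0: 3,732 Mbps
```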
 
YouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?
 
I would like to suggest or propose - and by no means demand or request - that the staff include an AV1 encoding (and perhaps decoding) test in the standard benchmark suite (both CPU and GPU can be tested).

[This thesaurus-like sentence is because I was misunderstood the last time I made a suggestion, and I want to be clear this time - it's just a suggestion]

I'm sure the staff knows and understands AV1, but I'll put down a couple of lines about it for people not so familiar with codecs.

AV1 is, in my opinion, the next god of video compression codecs. H.264 is quite old (but really fast to encode, real-time in most cases). The same goes for H.265/HEVC, which is several times slower to encode but gives considerably better quality. VP9 (made by Google, used partially on YouTube) offers even better quality but is much slower to encode - it's the newest codec in relatively wide use, and... it's from 2013.

AV1 is a huge step forward, but (I'd guess) slow to decode - though almost everything can handle it, even with just a software decoder; in fact, a hardware decoder just takes some load off the CPU (on anything reasonably capable - mobile phones can do it, if I remember correctly). Encoding, however, is painfully slow and far, far away from real-time. AV1-compressed files can be found in certain dedicated places on YouTube.

Who needs this test? Well, people who are into streaming, and those who encode a lot of uncompressed material - if the speed problem is solved, probably everyone would use it in some way (note that we can't watch AV1-compressed files if the people who do the actual encoding need too much time) - the same applies to VP9, to a point... In a way, it's of more importance than, say, Cinebench.

General note: I'm aware of the current problems with good encoders, but I believe an adequate solution will be found soon...
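For what it's worth, here is a minimal sketch of what such a test could time: a CPU AV1 encode with libaom driven through ffmpeg (the clip name and settings are placeholders, not a proposed TPU methodology):

```python
# Minimal sketch: time a software AV1 encode with libaom via ffmpeg.
# Assumes ffmpeg was built with libaom-av1; the input clip is a placeholder.
import subprocess
import time

CLIP = "sample_1080p.y4m"  # placeholder input

start = time.perf_counter()
subprocess.run(
    ["ffmpeg", "-y", "-i", CLIP,
     "-c:v", "libaom-av1",
     "-crf", "30", "-b:v", "0",  # constant-quality mode
     "-cpu-used", "4",           # libaom speed preset (0 = slowest/best)
     "out_av1.mkv"],
    check=True,
)
print(f"AV1 encode took {time.perf_counter() - start:.1f} s")
```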
 
I would like to suggest or propose - and by no means demand or request - that the staff include an AV1 encoding (and perhaps decoding) test in the standard benchmark suite (both CPU and GPU can be tested).

[...]
Is there any industry-standard application that supports AV1? I don't think HandBrake does, and at a guess it's the only widely used application in benchmarks.
 
Microsoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.

Decoding, not encoding. Same as NVIDIA.

I am not even going to look for a source and just say this is decoding support, not encoding.

It is.

YouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?

You do realize people encode videos for all sorts of things besides YouTube, right?
 
YouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?
YouTube is not profiting from its video services. AV1 could provide a stimulus for them to cut down on (bandwidth) expenses.
 
So uh, Turing NVENC = Ampere NVENC?

(attached screenshot)

I am 90% certain the 960 Ti exists.....
 
So uh, Turing NVENC = Ampere NVENC?

Yes, the same NVENC. For example, the 1660 Ti is the lowest graphics card with the full-featured Turing NVENC and without RTX. NVENC improvements on Ampere will come only through software updates distributed with drivers.
 
nGreedia really dropped the ball on this.
I always have to wonder at the age of the writer, or their experience with or exposure to real-life business, when I see this. It makes me chuckle.
 
It is odd to see this shift in leadership, slowly but surely, manifest. I mean, it's just a video codec... but the writing's on the wall.

Fascinating, too.

The writing’s on the wall that you are completely oblivious to what’s going on and didn’t realize Big Navi will have AV1 decode, not encode.
 
So it's a pointless upgrade for anyone with a GPU for media consumption using something like madVR.
 
I always have to wonder at the age of the writer, or their experience with or exposure to real-life business, when I see this. It makes me chuckle.
Maybe if all you do is play Fortnite all day, then quality probably doesn't matter to your addled brain, as it would lack the intelligence to even see the difference.

...Meanwhile people who use their graphics cards for a living would quite like to see progress made on these non-gamer features.
 
Maybe if all you do is play Fortnite all day, then quality probably doesn't matter to your addled brain, as it would lack the intelligence to even see the difference.

...Meanwhile people who use their graphics cards for a living would quite like to see progress made on these non-gamer features.
Excuse me? You know nothing about me (education level, life experiences, successes, accomplishments, etc). Nothing that would qualify you to comment on my supposedly addled brain or intelligence. Nice try at trolling tho. To properly troll, you actually need to be in a superior position to begin with. But you're not, as evidenced by your support of the childish term "ngreedia".

Now, calling corporations like Nvidia "ngreedia" merely demonstrates your immaturity and complete lack of understanding of economics. You know, the world of grownups. It's nice that you were able to share with us though.

Oh, and on the "you know nothing about me" point: I don't play Fortnite and never will. I guess that was your go-to for obvious reasons. If you use gaming GPUs for your living, you're doing it wrong... unless you're playing Fortnite professionally. ;)
 
Excuse me? You know nothing about me (education level, life experiences, successes, accomplishments, etc). Nothing that would qualify you to comment on my supposedly addled brain or intelligence. Nice try at trolling tho. To properly troll, you actually need to be in a superior position to begin with. But you're not, as evidenced by your support of the childish term "ngreedia".
You literally said NVENC is awful. That tells us all we need to know about your intelligence level.
 
You literally said NVENC is awful. That tells us all we need to know about your intelligence level.
You are an imaginative, yet dimwitted sort. Look at my quote. The only thing I talked about was the person I quoted using the term "ngreedia", which tells adults all they need to know about the writer's experience baseline. Thanks for playing!
 
nGreedia really dropped the ball on this. No excuse for not improving the hardware encoding of NVENC, as it's pretty awful for any serious use.
Explain how it is awful for professional use. It's better than AMD's encoder in a heartbeat, better than Intel Quick Sync, better than EVEN MOST CPU encoding scenarios; and a 1650 Super only costs what, $180? It is EXTREMELY cost-effective and one of the best encoders for streaming (not to mention it uses WAY less energy than CPU encoding).
 