Friday, October 16th 2020

NVIDIA Updates Video Encode and Decode Matrix with Reference to Ampere GPUs

NVIDIA has today updated its video encode and decode matrix with references to the latest Ampere GPU family. The video encode/decode matrix is a table of the video encoding and decoding standards supported by different NVIDIA GPUs. The matrix dates back to the Maxwell generation of NVIDIA graphics cards, showing which video codecs each generation supports. It is a useful reference, as customers can check whether their existing or upcoming GPUs support a specific codec standard before relying on it for video playback. The update adds the Ampere GPUs, which are now present in the table.

For example, the table shows that, while supporting all of the previous generations' encoding standards, the Ampere-based GPUs add support for the HEVC B-frame standard. For decoding, the Ampere lineup now includes support for AV1 in 8-bit and 10-bit formats, while also supporting all of the previous generations' formats. For a more detailed look at the table, please go to NVIDIA's website here.
NVIDIA Encoding and Decoding Standards
Source: NVIDIA Video Encode and Decode Matrix
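For readers who want to check what their own card and driver expose, a minimal sketch follows (it assumes an ffmpeg build with NVDEC/NVENC support is on the PATH; the codec names such as hevc_nvenc and av1_cuvid are standard ffmpeg identifiers, but availability depends on the build and GPU):

# Minimal sketch: list the NVDEC decoders and NVENC encoders an ffmpeg build exposes.
# Assumes an ffmpeg binary built with NVDEC/NVENC support is on the PATH.
import subprocess

def ffmpeg_lines(*args):
    result = subprocess.run(["ffmpeg", "-hide_banner", *args],
                            capture_output=True, text=True)
    return (result.stdout + result.stderr).splitlines()

# Hardware decoders show up as "*_cuvid", hardware encoders as "*_nvenc".
decoders = [line.strip() for line in ffmpeg_lines("-decoders") if "cuvid" in line]
encoders = [line.strip() for line in ffmpeg_lines("-encoders") if "nvenc" in line]

print("NVDEC decoders:")
print("\n".join(decoders) if decoders else "  none found")
print("NVENC encoders:")
print("\n".join(encoders) if encoders else "  none found")

Note that this only reports what the ffmpeg build was compiled with; whether a given GPU actually accelerates a codec (for example, AV1 decode on Ampere only) is what the matrix above documents.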

30 Comments on NVIDIA Updates Video Encode and Decode Matrix with Reference to Ampere GPUs

#1
stimpy88
nGreedia really dropped the ball on this. No excuse for not improving the hardware encoding of NVENC, as it's pretty awful for any serious use.
#2
Verpal
I wonder when AV1 hardware encoding will come to consumers? Hopper?
#3
Unregistered
Microsoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
#4
Vayra86
beedooMicrosoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
It is odd to see this shift in leadership, slowly but surely, manifest. I mean it's just a video codec... but the writing's on the wall.

Fascinating, too.
#5
Sybaris_Caesar
VerpalI wonder when will AV1 hardware encoding come to consumer? Hopper?
Tell me honestly, do you really want that? NVENC is good for streaming, but CPU (software) encoding still produces a smaller file than NVENC (hardware) encoding ever will. At least it did when I last used x265.
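For anyone who wants to reproduce that file-size comparison, here is a minimal sketch (assuming an ffmpeg build with NVENC support on the PATH; the clip name, bitrate and CRF values are placeholders, and comparing a fixed bitrate against CRF is only a rough approximation of quality-per-bit):

# Rough sketch: encode the same clip with NVENC (hardware) and x265 (software)
# and compare the resulting file sizes. "input.mp4" is a placeholder test clip.
import os
import subprocess

SRC = "input.mp4"

jobs = {
    "nvenc_hevc.mp4": ["-c:v", "hevc_nvenc", "-b:v", "6M"],   # hardware HEVC at ~6 Mbps
    "x265_crf22.mp4": ["-c:v", "libx265", "-crf", "22"],      # software HEVC, quality-targeted
}

for out_name, codec_args in jobs.items():
    subprocess.run(["ffmpeg", "-y", "-i", SRC, *codec_args, "-an", out_name], check=True)
    print(f"{out_name}: {os.path.getsize(out_name) / 1e6:.1f} MB")

A fair comparison would also score both outputs with a quality metric such as SSIM or VMAF rather than looking at size alone.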
#6
londiste
beedooMicrosoft Confirms...
AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
I am not even going to look for a source and just say this is decoding support, not encoding.
#7
JJJJJamesSZH
Wow
Still lack of 10bit 4:2:2 HW Decoding
So f*kin BRILLIANT
#8
londiste
JJJJJamesSZHWow
Still lack of 10bit 4:2:2 HW Decoding
So f*kin BRILLIANT
Just curious, what exactly comes in 4:2:2, 10-bit or not? Some better video cameras are what come to mind, anything else?
Movies/series from Blu-ray or practically all streaming services should be 4:2:0, and RGB/no subsampling is 4:4:4.
#9
kiriakost
YouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?
#10
Mouth of Sauron
I would like to suggest or propose - and by no means demand or request - that the staff include an AV1 encoding (and perhaps decoding) test in the standard benchmark suite (both CPU and GPU can be tested).

[This thesaurus-like sentence is because of a misunderstanding the last time I made a suggestion - I was misunderstood and want to be clear this time: it's just a suggestion]

I'm sure that the staff knows and understands AV1, but I'll put a couple of lines about it, for people not so familiar with codecs.

AV1 is, in my opinion, The Next God of video compression codecs. x264 is quite old (but really fast to encode, real-time in most cases). The same goes for x265/HEVC, which is several times slower to encode but gives considerably better quality. VP9 (made by Google, used partially on YouTube) is even better quality but much slower to encode - it's the newest codec in relatively wide use, and... it's from 2013.

AV1 is a huge step forward, but (I guess) slow to decode - though mostly everything can do it, even with just a software decoder; in fact, a hardware decoder just takes some load off the CPU (on anything reasonably capable - mobile phones can do it, if I remember correctly). Encoding is, however, painfully slow and far, far away from real-time. AV1-compressed files can be found in certain dedicated places on YouTube.

Who needs this test? Well, people who are into streaming, or who encode lots of uncompressed material - if the speed problem is solved, probably everyone would use it in some way (note that we can't watch AV1-compressed files if the people who do the actual encoding need too much time) - the same applies to VP9, to a point... In a way, it's of more importance than, say, Cinebench.

General note: I'm aware of the current problems with good encoders, but I believe an adequate solution will be found soon...
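For anyone curious how slow that really is, here is a minimal encode-speed sketch (assuming an ffmpeg build with libaom on the PATH; the source clip name and the -cpu-used speed setting are placeholders to adjust):

# Sketch: time a software AV1 encode with libaom, as a rough speed benchmark.
# "clip.y4m" is a placeholder raw source file.
import subprocess
import time

SRC = "clip.y4m"

start = time.time()
subprocess.run([
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0",  # quality-targeted AV1 encode
    "-cpu-used", "4",                                 # speed/quality trade-off, 0 = slowest/best
    "av1_test.mkv",
], check=True)
print(f"libaom AV1 encode took {time.time() - start:.1f} s")

The same loop could be repeated with -c:v libvpx-vp9 or -c:v libx265 to get the relative speeds the post describes.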
#11
Sybaris_Caesar
Mouth of SauronI would like to suggest or propose - and by no means demand or request - that the staff include an AV1 encoding (and perhaps decoding) test in the standard benchmark suite (both CPU and GPU can be tested)... In a way, it's of more importance than, say, Cinebench.
Is there any industry-standard application that supports AV1? I don't think HandBrake does, and it's the only widely used application in benchmarks, by my guess.
#12
R-T-B
beedooMicrosoft Confirms...

AMD RDNA2 graphics cards will provide hardware acceleration for AV1 (AOMedia Video 1) video codec.
Decoding, not encoding. Same as NVIDIA.
londisteI am not even going to look for a source and just say this is decoding support, not encoding.
It is.
kiriakostYouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?
You do realize people encode videos for all sorts of things besides youtube, right?
#13
mtcn77
kiriakostYouTube has failed to deliver a good income to its content creators.
Why invest in a sector where there is no profit (video encoding)?
YouTube is not profiting from its video services. It could provide a stimulus for them to cut down on expenses.
#14
Berfs1
So uh, Turing NVENC = Ampere NVENC?


I am 90% certain the 960 Ti exists.....
#15
rbgc
Berfs1So uh, Turing NVENC = Ampere NVENC?
Yes. The same NVENC. Like, for example, on the 1660 Ti, the lowest graphics card with the full-featured Turing NVENC and without RTX. NVENC improvements in Ampere will come only through software updates distributed with drivers.
#16
rtwjunkie
PC Gaming Enthusiast
stimpy88nGreedia really dropped the ball on this.
I always have to wonder at the age of the writer, or their experience or exposure to real life business when I see this. It makes me chuckle.
#17
squallheart
Vayra86It is odd to see this shift in leadership, slowly but surely, manifest. I mean it's just a video codec... but the writing's on the wall.

Fascinating, too.
The writing’s on the wall that you are completely oblivious to what’s going on and didn’t realize Big Navi will have AV1 decode, not encode.
#18
R-T-B
mtcn77YouTube is not profiting from its video services.
Yes, they are, I assure you.
#19
BluesFanUK
So a pointless upgrade for anyone with a GPU for media consumption using something like MadVR.
#20
stimpy88
rtwjunkieI always have to wonder at the age of the writer, or their experience or exposure to real life business when I see this. It makes me chuckle.
Maybe if all you do is play Fortnite all day, then quality probably doesn't matter to your addled brain, as it would lack the intelligence to even see the difference.

...Meanwhile people who use their graphics cards for a living would quite like to see progress made on these non-gamer features.
#21
rtwjunkie
PC Gaming Enthusiast
stimpy88Maybe if all you do is play Fortnite all day, then quality probably doesn't matter to your addled brain, as it would lack the intelligence to even see the difference.

...Meanwhile people who use their graphics cards for a living would quite like to see progress made on these non-gamer features.
Excuse me? You know nothing about me (education level, life experiences, successes, accomplishments, etc.) - nothing that would qualify you to comment on my supposedly addled brain or intelligence. Nice try at trolling, though. To properly troll, you actually need to be in a superior position to begin with. But you're not, as evidenced by your support of the childish term "ngreedia".

Now, calling corporations like Nvidia "ngreedia" merely demonstrates your immaturity and complete lack of understanding of economics. You know, the world of grownups. It's nice that you were able to share with us, though.

Oh, and on the "you know nothing about me" point: I don't play Fortnite and never will. I guess that was your go-to for obvious reasons. If you use gaming GPUs for your living, you're doing it wrong... unless you're playing Fortnite professionally. ;)
#22
Berfs1
rtwjunkieExcuse me? You know nothing about me (education level, life experiences, successes, accomplishments, etc.) - nothing that would qualify you to comment on my supposedly addled brain or intelligence. Nice try at trolling, though. To properly troll, you actually need to be in a superior position to begin with. But you're not, as evidenced by your support of the childish term "ngreedia".
You literally said NVENC is awful. That tells us all we need to know about your intelligence level.
#23
rtwjunkie
PC Gaming Enthusiast
Berfs1You literally said NVENC is awful. That tells us all we need to know about your intelligence level.
You are an imaginative, yet dimwitted sort. Look at my quote. The only thing I talked about was the person I quoted using the term "ngreedia", which tells adults all they need to know about the writer's experience baseline. Thanks for playing!
#24
Berfs1
stimpy88nGreedia really dropped the ball on this. No excuse for not improving the hardware encoding of NVENC, as it's pretty awful for any serious use.
Explain how it is awful for professional use? It's better than AMD's encoder in a heartbeat, better than Intel QuickSync, better than EVEN MOST CPU encoding scenarios, and a 1650 Super only costs what, $180? It is EXTREMELY cost effective and is one of the best encoders for streaming (not to mention it takes WAY less energy than CPU encoding).
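For context, the kind of streaming workload being described would look roughly like this (a sketch only, assuming an ffmpeg build with NVENC on the PATH; the input file, bitrate and RTMP URL are placeholders):

# Sketch: a constant-bitrate NVENC stream push, leaving the CPU free for the game.
import subprocess

subprocess.run([
    "ffmpeg", "-re", "-i", "gameplay.mp4",        # -re reads the source at native frame rate
    "-c:v", "h264_nvenc", "-b:v", "6M",           # hardware H.264 encode at ~6 Mbps
    "-maxrate", "6M", "-bufsize", "12M",          # keep the bitrate stream-friendly
    "-c:a", "aac", "-b:a", "160k",
    "-f", "flv", "rtmp://live.example.com/app/streamkey",  # placeholder ingest URL
], check=True)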
#25
Mouth of Sauron
Basically, neither decoding nor encoding absolutely *needs* hardware support - a 3090, with hardware support, still doesn't allow real-time encoding at higher resolutions - perhaps 720p at most; there aren't many tests online.

Same goes for Intel CPUs, which have hardware support for both encoding and decoding.

As for decoding, so far hardware acceleration on the GPU just lessens the burden on the CPU side - virtually anything can decode the file in real time, it's just that the load is bigger without hardware support (but then again, both AMD and NVIDIA have support for decoding, we just don't know how much - the only results I found were for a 3080, and I can't remember which CPU - utilization dropped from ~90% to some 50-70%, but not all cores were utilized).

Perhaps it would be interesting to test whether the 16-core 5950X can decode faster than an Intel offering with HW support, or how many FPS each offering manages (though I doubt anything will allow real-time encoding).

VP9 is considerably faster, and ffmpeg supports both.

I realize this isn't something that everybody needs, but it means something to streamers or to people who compress raw video material - AV1 gives considerably better quality than VP9 at a lower bitrate, and the same goes for VP9 versus x265. Currently, most real-time encoding is done with x264, which is rather old (VP9 is from 2013, I think, and AV1 is from 2018; x264 is ancient).

Just an idea - encoding is a highly CPU-intensive test, and HW does help, but only up to a point.

Back in the day, I did some testing on VP9, but hardware was much less advanced (I mean just CPUs - I don't have a number of CPUs to test, and HW support was non-existent at that time).

I don't know how much ffmpeg is optimized for Intel, AMD, or NVIDIA, either...
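For what it's worth, the decode comparison described above can be approximated like this (a sketch, assuming an ffmpeg build with CUDA/NVDEC support on the PATH; the input file name is a placeholder, and it measures wall-clock throughput rather than CPU utilization):

# Sketch: decode the same file with and without NVDEC and compare wall-clock time.
# "-f null -" discards the decoded frames so only decoding speed is measured.
import subprocess
import time

SRC = "input_av1.mkv"  # placeholder test file

def timed_decode(extra_args):
    start = time.time()
    subprocess.run(["ffmpeg", "-v", "error", *extra_args, "-i", SRC,
                    "-f", "null", "-"], check=True)
    return time.time() - start

print(f"software decode: {timed_decode([]):.1f} s")
print(f"NVDEC decode:    {timed_decode(['-hwaccel', 'cuda']):.1f} s")

To see the CPU-utilization difference the post mentions, one would watch a system monitor while each run is in progress.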