Wednesday, April 24th 2019

NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine

NVIDIA GeForce GTX 1650 has a significantly watered-down multimedia feature-set compared to the other GeForce GTX 16-series GPUs. The card was launched this Tuesday (23 April) without any meaningful technical documentation for reviewers, which led many, including us, to assume that NVIDIA had carried over the "Turing" NVENC encoder, giving you a feature-rich HTPC or streaming card at $150. Apparently that is not the case. According to the full specifications NVIDIA published on its product page hours after launch, the GTX 1650 (and the TU117 silicon) features a multimedia engine carried over from the older "Volta" architecture.

Turing's NVENC is known to offer around a 15 percent performance uplift over Volta's, which means the GTX 1650 will have worse game livestreaming performance than expected. The GTX 1650 has sufficient muscle for playing e-Sports titles such as PUBG at 1080p, and with an up-to-date hardware encoder it could have pulled droves of amateur streamers into the mainstream on Twitch and YouTube Gaming. Alas, the $220 GTX 1660 would be your ticket to that.
Sources: Quickshot_Gaming (Reddit), NVIDIA

92 Comments on NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine

#26
Camm
This feels like such an own goal by Nvidia, sigh.
Posted on Reply
#27
notb
Chloe PriceAh, so I won't bother. Well, Ryzen 5 2600 handles the software encoding fine.

In fact I don't even know how streaming with two PCs works, so no, I don't have another machine for that. :D
Well, the only method that doesn't really affect the main PC's performance is plain video capturing, i.e. you have to buy a capture card (a USB dongle for $100 will do) and you send the signal over a normal video cable (like to a second monitor in clone mode). The second PC captures it and encodes/streams.
There are some other options - like LAN streaming - but they'll always use your CPU in some way. That said, they may use less of it than encoding itself. :)

You can always try the encoder in your Radeon. It will work. It's just not as polished as the competition.
Some people have noticed problems with how GPU resources are reserved, i.e. when your GPU load gets near 100%, AMD cards often prioritize gaming. Basically, instead of losing 10-20% of the fps on your monitor, you lose 95% of the fps in the stream. This may have been fixed since, though.
AMD's encoder adds the most lag, so it's not recommended for scenarios where you play on the receiving end (e.g. I stream to a TV). You'll get less lag encoding with Intel QuickSync, even though the frame has to travel over PCIe to the iGPU first.
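For reference, here's a rough sketch of how an app can pick between those hardware encoders. It assumes an ffmpeg build on the PATH compiled with NVENC/QuickSync/AMF support, and the file names are made up:

import subprocess

def available_encoders():
    # Encoders listed here are merely compiled into this ffmpeg build;
    # whether the GPU/driver actually accepts them is only known at runtime.
    out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                         capture_output=True, text=True).stdout
    return {name for name in ("h264_nvenc", "h264_qsv", "h264_amf", "libx264")
            if name in out}

def transcode(src, dst):
    encoders = available_encoders()
    # Preference order: NVIDIA NVENC, Intel QuickSync, AMD AMF/VCE, CPU x264 fallback.
    for enc in ("h264_nvenc", "h264_qsv", "h264_amf", "libx264"):
        if enc in encoders:
            return subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", enc,
                                   "-b:v", "3000k", "-c:a", "copy", dst]).returncode
    raise RuntimeError("no H.264 encoder found in this ffmpeg build")

transcode("gameplay.mkv", "encoded.mp4")  # hypothetical file names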
Chloe PriceSoftware encoding with the "faster" preset, 720p output @ 3000 kbit/s, and watchers said it works flawlessly.
Yup, 720p is not an issue for sure. Ryzen will do that easily. :)
1080p should work as well. Give it a go.
It gets problematic when you start thinking about 4K. :p
Posted on Reply
#28
xorbe
It seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via drivers.
Posted on Reply
#29
Keullo-e
S.T.A.R.S.
The biggest problem is my interwebz connection; I'm getting by on 4G (phone as a hotspot, dongle in the PC), so performance probably isn't the bottleneck.

But thanks for the tips! In any case I need a capture card for some console gameplay, though - what's the best way to capture component video (PS2)? A PCI-E HDMI input card would handle the PS3.
xorbeIt seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via software drivers.
I have the same feeling. Maybe Turing's NVENC shares features with Volta's, and they just disabled those?
Posted on Reply
#30
R-T-B
notbSo first of all:
NO, 1650 still has the NVENC encoder. The only missing feature is the HEVC B-frame support (introduced in Turing).
Basically, it's the same encoder we had in Pascal.

How solid must the 1650 be if the usual anti-Nvidia nitpicking results in an article about a missing hardware encoder? And one that's not true? :eek:
There is a performance difference. That said, this is still a storm in a teacup. Research what you buy.
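If anyone wants to put a rough number on that difference, a quick sketch like this (assuming ffmpeg with NVENC on the PATH) times how fast the encoder gets through a synthetic 1080p60 clip; run it on different GPU generations and compare:

import subprocess, time

def encode_fps(encoder="h264_nvenc", seconds=10, fps=60):
    # Synthetic 1080p60 test pattern, encoded and discarded (-f null), so we are
    # mostly timing the encoder itself (frame generation adds a little overhead).
    src = f"testsrc2=size=1920x1080:rate={fps}:duration={seconds}"
    cmd = ["ffmpeg", "-hide_banner", "-f", "lavfi", "-i", src,
           "-c:v", encoder, "-b:v", "6M", "-f", "null", "-"]
    start = time.time()
    subprocess.run(cmd, capture_output=True, check=True)
    return seconds * fps / (time.time() - start)

print(f"h264_nvenc: ~{encode_fps():.0f} fps")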
Posted on Reply
#31
notb
xorbeIt seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via drivers.
Yeah, they did it for a reason. There's no way this saves them any money.

Maybe B-frame support is power-hungry? It is actually quite complex: you're not just compressing a single frame, but also comparing it to its neighbours.
They had a 75 W power budget. Idle draw + fan is easily 10 W already. You're left with maybe 50-55 W for game rendering. Even if B-frame support sucked 2 W, it would make a difference.

It would be nice if a Turing owner could test it. :-)
R-T-BResearch what you buy.
100% disagree. This is a simple card that goes into mainstream OEM PCs or gets bought by people who aren't really spending a lot on computers (in either money or time).
The only thing a buyer should be forced to "research" is whether it can run the games he wants.

Which brings us to the issue of... whether the lack of B-frame support is really an issue for the target consumer. It's not, right?
Posted on Reply
#32
R-T-B
notb100% disagree. This is a simple card that goes into mainstream OEM PCs or gets bought by people who aren't really spending a lot on computers (in either money or time).
The only thing a buyer should be forced to "research" is whether it can run the games he wants.
You're going to have a hard time disagreeing when you realize that's 100% what I meant. Use case research is all you should ever expect. :p
Posted on Reply
#33
Keullo-e
S.T.A.R.S.
It is a significant feature to drop from the 1650, since these cards are fine for esports gaming.
Posted on Reply
#34
notb
Chloe PriceIt is a significant feature to drop from the 1650, since these cards are fine for esports gaming.
But the 1650 can still encode and stream! :)
B-frame support simply produces smaller files at the same quality.
People use the 1050 Ti for esports and they can use the 1650 just as well. It has the same NVENC encoding capabilities.

B-frame support matters when you need to lower bandwidth use. I don't think that's a huge problem for 1650 owners. It's not like they'll be streaming 4K@60fps. ;-)
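If someone with a Turing card wants to see the effect, here's a small sketch (assuming ffmpeg with NVENC; the input file name is made up). It encodes at constant QP so quality stays roughly fixed and the size difference comes from B-frames; on pre-Turing NVENC the HEVC B-frame setting will likely fail or be ignored:

import os
import subprocess

def encode(src, dst, bframes):
    # Constant-QP keeps quality roughly constant; -bf sets the number of B-frames
    # (HEVC B-frames need Turing-class NVENC, older GPUs may reject this).
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "hevc_nvenc",
                    "-rc", "constqp", "-qp", "28", "-bf", str(bframes),
                    "-an", dst], check=True)
    return os.path.getsize(dst)

clip = "gameplay.mkv"  # hypothetical input file
print("0 B-frames:", encode(clip, "no_bf.mp4", 0), "bytes")
print("3 B-frames:", encode(clip, "bf3.mp4", 3), "bytes")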
Posted on Reply
#36
danbert2000
I was going to say, this is a non-issue. You're not going to be streaming much beyond 1080p with this card anyway, and somehow 1080 Ti owners are still doing just fine with their NVENC block. I would like the Turing block in a card someday as I enjoy being able to transcode to HEVC but I do notice the lack of efficiency compared to CPU encoding. I heard Turing closed the gap somewhat. But if you're streaming, who cares about the final file size at a certain point. It's a stream. As long as you have a decent upload, you'll be fine. And if you don't, then I doubt Turing NVENC is going to be enough to make it usable in most situations.

This is interesting, though. In the Maxwell days, it was the 960 and 950 that had the better encoding and decoding compared to the 970 and 980 Ti. Now we have the exact opposite. It always did rub me the wrong way that my 980 was in some ways inferior to the cheaper, less performant product. I prefer this situation.

Also, there must be some expansion of the NVENC block for Turing that made this decision make sense in the first place. They must have been trying to use as little silicon as possible, otherwise why not put your best foot forward?
Posted on Reply
#37
GoldenX
danbert2000I was going to say, this is a non-issue. You're not going to be streaming much beyond 1080p with this card anyway, and somehow 1080 Ti owners are still doing just fine with their NVENC block. I would like the Turing block in a card someday as I enjoy being able to transcode to HEVC but I do notice the lack of efficiency compared to CPU encoding. I heard Turing closed the gap somewhat. But if you're streaming, who cares about the final file size at a certain point. It's a stream. As long as you have a decent upload, you'll be fine. And if you don't, then I doubt Turing NVENC is going to be enough to make it usable in most situations.

This is interesting, though. In the Maxwell days, it was the 960 and 950 that had the better encoding and decoding compared to the 970 and 980 Ti. Now we have the exact opposite. It always did rub me the wrong way that my 980 was in some ways inferior to the cheaper and less performant product. I prefer this situation more.

Also, there must be some expansion of the NVENC block for Turing that made this decision make sense in the first place. They must have been trying to use as little silicon as possible, otherwise why not put your best foot forward?
We could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers and the highest market share, have to do stupid shit like this all the time?
Posted on Reply
#38
danbert2000
GoldenXWe could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers and the highest market share, have to do stupid shit like this all the time?
It's an easy way to improve profit margins. They're going to be selling hundreds of thousands of these chips; by reducing silicon area or creating an artificial feature hierarchy with their cheap cards, they improve profits, and 99% of people won't know or won't care that they don't get B-frames in HEVC streams from the NVENC on a $150 card. It would be nice for them to include all of the Turing features on each card, but at the end of the day this only screws over people who were buying this card to transcode video, which is probably a very small portion of the people who will be buying it. This chip will probably be sold mostly in cheap gaming laptops; any serious streamer will want a better card anyway.
Posted on Reply
#39
jabbadap
danbert2000It's an easy way to improve profit margin. They're going to be selling hundreds of thousands of these chips, by reducing silicon area or creating an artificial feature hierarchy with their cheap cards, they are going to improve profits and 99% of people won't know or won't care that they don't get B-frames in HEVC streams from the NVENC on a $150 card. It would be nice for them to include all of the Turing features on each card, but at the end of the day this only screws over people that were buying this card to transcode video, which is probably a very small portion of the people who will be buying this. This chip will probably be sold mostly in cheap gaming laptops, any serious streamer will probably want a better card anyway.
Well, a more serious question: how many end users are encoding videos at all? Not to mention how many of them are really using NVENC rather than software x264 or x265 for superior quality. The more important part for end users, NVDEC, is still the full Turing one, with all the features it can have.
Posted on Reply
#40
Assimilator
TheLostSwedeIn terms of content or comments?
Both.

bta is posting unsourced, inflammatory clickbait articles left, right and centre, and the same braindead trolls are posting "Intel/NVIDIA/MSI/<insert-company-name-here> sucks" comments in response.
Posted on Reply
#41
notb
jabbadapWell, a more serious question: how many end users are encoding videos at all?
Hmm... and how many PC users are gaming on them?
Be careful with arguments like that. ;-)

But seriously, quite a lot. In fact most of us stream. :)
Keep in mind all kinds of remote desktops are basically streaming (RDP, Citrix, VMWare, Teamviewer and so on).
Have you ever used the "Project" function in Windows? How do you think that works? :p
We also stream video between devices we own (NAS to TV, PC/Mac to phone etc) - we just don't think about it. We click "show on TV" and it just magically works - like computers should :).

And yeah, the game streaming phenomenon is also quite significant.

There was also a time when people actually kept movies on drives and consciously encoded them all the time for their phones etc. I think this is a dying skill at the moment - it moved to the cloud mostly.
jabbadapNot to mention how many of them are really using NVENC rather than software x264 or x265 for superior quality. The more important part for end users, NVDEC, is still the full Turing one, with all the features it can have.
Most people stream without realizing it. The computer chooses the technology for them.

Now, the actual support for NVENC is another matter. By default everything looks for a hardware encoder in the CPU/SoC.
But Citrix and VMware remote desktops already support NVENC (datacenter software - makes sense).
GoldenXWe could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers and the highest market share, have to do stupid shit like this all the time?
Really, you'll criticize Nvidia because they've introduced a very advanced encoding feature and it's not available in the cheapest GPU. Oh my...
Posted on Reply
#42
GoldenX
notbReally, you'll criticize Nvidia because they've introduced a very advanced encoding feature and it's not available in the cheapest GPU. Oh my...
Yes, same with Intel: why can't anything lower than an i3 get an 8-year-old extension (AVX) that is widely used?
Why did the GT 1030 have NVENC removed when even the little Nintendo Switch's SoC has the power to use it? Why does the 1050's successor get a gimped version of its NVENC, when the 1050 had the full feature set of its time?
There is zero cost difference in enabling features that are already on the hardware, and it gives you good marketing for free.
Posted on Reply
#43
notb
GoldenXYes, same with Intel: why can't anything lower than an i3 get an 8-year-old extension (AVX) that is widely used?
Why did the GT 1030 have NVENC removed when even the little Nintendo Switch's SoC has the power to use it? Why does the 1050's successor get a gimped version of its NVENC, when the 1050 had the full feature set of its time?
There is zero cost difference in enabling features that are already on the hardware, and it gives you good marketing for free.
So Nvidia and Intel are bad.
What about AMD? Are you sure they're offering the same encoding features in all their GPUs?

BTW: do you even know what features are supported? :-D
I have no idea how to check this. AMD website is a mess. I think it's not there.

I looked at Wikipedia, I've asked Google. Nothing. It seems like all of humanity's knowledge about VCE comes from slides and leaks. :-D
en.wikipedia.org/wiki/Video_Coding_Engine
Not much in general. Nothing about VCE 4.0. And there's even VCE 4.1 now! :-D

Even the name is problematic. Most of the web seems to think the "C" is for either "Coding" or "Codec". AMD says it's "Code" (why haven't they fixed the wiki article?).
The only thing most of the web agrees on is that VCE is rubbish.
Posted on Reply
#44
GoldenX
notbSo Nvidia and Intel are bad.
What about AMD? Are you sure they're offering the same encoding features in all their GPUs?

BTW: do you even know what features are supported? :-D
I have no idea how to check this. AMD website is a mess. I think it's not there.

I looked at wikipedia, I've asked google. Nothing. It seems like all the humanity knowledge about VCE is coming from slides and leaks. :-D
en.wikipedia.org/wiki/Video_Coding_Engine
Not much in general. Nothing about VCE 4.0. And there's even VCE 4.1 now!. :-D

Even the name is problematic. Most of the web seems to think "C" is for either "Coding" or "Codec". AMD says its "Code" (why haven't they fixed the wiki article?)
The only thing most of the web agrees on is that VCE is rubbish.
AMD drivers and their whitepapers are a complete mess; there is a reason why everyone hates coding for AMD. But you get the same encoding capabilities with an RX 550, a Vega 8 (2200G) or a Radeon VII - none of them has VCE "removed" because "reasons". Even Intel GPUs are more consistent in their features than Nvidia's; spec-wise, all of their GPUs are the same.
Defending forced market segmentation on hardware that is perfectly capable of having the full feature set...
The only way to read this is: "Oh, you want to stream at a lower bitrate? Go get anything from a 1660 to a 2080 Ti, even though our low-end hardware is more than capable of doing it".
Posted on Reply
#45
notb
GoldenXAMD drivers and their whitepapers are a complete mess. But you get the same encoding capabilities with an RX 550, a Vega 8 (2200G) or a Radeon VII - none of them has VCE "removed" because "reasons".
But how do you know that? There's no proof, no official statement even.
Have you checked them all personally? Has anyone, ever?

Do you even know what features are supported by your 270X?
According to wikipedia, your GPU has VCE 1.0. I have to base this on wiki, because - obviously - AMD website doesn't mention it. In fact it doesn't mention VCE at all:

www.amd.com/en/support/graphics/amd-radeon-r9-series/amd-radeon-r9-200-series/amd-radeon-r9-270x

Anyway, it came out at roughly the same time Nvidia launched Maxwell.
So thanks to this lovely chart: developer.nvidia.com/video-encode-decode-gpu-support-matrix I know Maxwell introduced lossless H.264.
Does your GPU support lossless H.264?
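For what it's worth, here's a quick way to probe that on the Nvidia side (assuming an ffmpeg build with NVENC; older builds expose it as "-preset lossless", newer ones as "-tune lossless"). I'm not aware of an equivalent toggle for VCE in ffmpeg:

import subprocess

def supports_lossless_h264_nvenc() -> bool:
    # Encode one second of a synthetic test pattern losslessly and discard the
    # output; if the GPU/driver can't do lossless H.264, ffmpeg exits non-zero.
    cmd = ["ffmpeg", "-hide_banner", "-f", "lavfi",
           "-i", "testsrc=duration=1:size=1280x720:rate=30",
           "-c:v", "h264_nvenc", "-preset", "lossless", "-f", "null", "-"]
    return subprocess.run(cmd, capture_output=True).returncode == 0

print("lossless h264_nvenc supported:", supports_lossless_h264_nvenc())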
Posted on Reply
#46
GoldenX
notbBut how do you know that? There's no proof, no official statement even.
Have you checked them all personally? Has anyone, ever?

Do you even know what features are supported by your 270X?
According to wikipedia, your GPU has VCE 1.0. I have to base this on wiki, because - obviously - AMD website doesn't mention it. In fact it doesn't mention VCE at all:

www.amd.com/en/support/graphics/amd-radeon-r9-series/amd-radeon-r9-200-series/amd-radeon-r9-270x

Anyway, it came out at roughly the same time Nvidia launched Maxwell.
So thanks to this lovely chart: developer.nvidia.com/video-encode-decode-gpu-support-matrix I know Maxwell introduced lossless H.264.
Does your GPU support lossless H.264?
Does a 2013 card support it? Does a Kepler card (Maxwell is 2014, and the 270X is in fact a horrible rename of an HD 7800, so 2012)? That's not the discussion.
I can get an AMD CPU, any of them, and I get AVX2. Good luck getting AVX (1, 2, 512) or FMA on any low-end Intel CPU. I can get any AMD or Intel GPU and encode; good luck with that on a 1030 (free, useless GeForce Experience on them).
After some time, when B-frames are standard, guess who will hate having a 1650?
Yes, Nvidia is the first to deploy it. There is absolutely no need to limit it to the most expensive hardware "just because".
Posted on Reply
#47
notb
GoldenXDoes a 2013 card support it? Does a Kepler card (Maxwell is 2014, and the 270X is in fact a horrible rename of an HD 7800, so 2012)? That's not the discussion.
And that wasn't my question either.
Can you say, being 100% sure, whether your GPU supports the feature I mentioned or doesn't?

Because if you can't, how can you be sure that all AMD GPUs in a generation have the same functionality? Where do you get this knowledge from?
Posted on Reply
#48
GoldenX
notbAnd that wasn't my question either.
Can you say, being 100% sure, whether your GPU supports the feature I mentioned or doesn't?

Because if you can't, how can you be sure that all AMD GPUs in a generation have the same functionality? Where do you get this knowledge from?
github.com/obsproject/obs-amd-encoder/wiki/Hardware,-GCN-and-VCE-Limits
As long as the VCE version you are referring to is the same, there are no differences, as it should be. Same with Intel.
Now, if you want official information, too bad: AMD is the worst on that front. Never try their profiler; it doesn't even work.

Oh look, AMD has had B-frames for H.264 since VCE 2.0, and that's from 2013.
Posted on Reply
#49
lexluthermiester
Setting aside all the fanboying and special-snowflaking, this is a budget card. Streamers should not be looking at this card as an option; they should be going for a GTX 1070 (Ti), 1080 (Ti) or an RTX card. The GTX 16xx budget cards are just not made for streamers. All the back-and-forth arguments are kinda pointless.
OSdevrI feel like TPU is slowly becoming WCCFtech.
Seriously? Grow up.
Posted on Reply
#50
GoldenX
lexluthermiesterSetting aside all the fanboying and special-snowflaking, this is a budget card. Streamers should not be looking at this card as an option; they should be going for a GTX 1070 (Ti), 1080 (Ti) or an RTX card. The GTX 16xx budget cards are just not made for streamers. All the back-and-forth arguments are kinda pointless.


Seriously? Grow up.
Wouldn't it be better to have a cheap dedicated card for streaming, and the big boy for gaming? No performance loss.
Posted on Reply