
NVIDIA GTX 1650 Lacks Turing NVENC Encoder, Packs Volta's Multimedia Engine

Chloefile

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
10,878 (2.64/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop (CPU+GPU, 240 & 120 rads)
Memory 32GB Kingston HyperX Fury @ DDR4-3466
Video Card(s) PowerColor RX 6700 XT Fighter
Storage ~4TB SSD + 6TB HDD
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Fractal Design Define Mini C
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 Legendary
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
It's a bit hit and miss. For AMD you should try AMD ReLive; if that doesn't give you acceptable quality, no other software will. (If your audio goes out of sync, enable HPET in the BIOS. There was a bug where the timer wreaked havoc on audio sync for overclocked Ryzens; I don't know if it's been fixed.)

The problem with streaming on GPUs is H.264, which every streaming service uses. It can be done on the GPU, but the quality is poor at lower bitrates, so CPU encoding is usually the way to go. H.265 would work better, but no service provider wants to pay the MPEG LA license fee, and I don't really blame them.
I didn't even install ReLive when I installed the drivers. I just find it to be unnecessary bloatware.

Software encoding with the "faster" preset, 720p output @ 3000 kbit/s, and viewers said it works flawlessly.
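For reference, here's a minimal sketch of that setup as an ffmpeg software encode driven from Python - assuming ffmpeg with libx264 is on the PATH; the input file and RTMP ingest URL are placeholders (OBS does the same thing through its UI).

```python
# Minimal sketch of the settings above: software H.264 ("faster" preset),
# 720p output at 3000 kbit/s, pushed to an RTMP ingest. Assumes ffmpeg with
# libx264 is on the PATH; input file and stream URL are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-re", "-i", "gameplay.mkv",               # placeholder input, read at native rate
    "-c:v", "libx264", "-preset", "faster",    # CPU encode with the "faster" preset
    "-b:v", "3000k", "-maxrate", "3000k",      # 3000 kbit/s target...
    "-bufsize", "6000k",                       # ...with a stream-friendly buffer
    "-vf", "scale=-2:720",                     # scale to 720p, keeping aspect ratio
    "-c:a", "aac", "-b:a", "160k",             # typical stream audio
    "-f", "flv", "rtmp://example.invalid/live/streamkey",  # placeholder ingest URL
]
subprocess.run(cmd, check=True)
```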
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
This feels like such an own goal by Nvidia, sigh.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
Ah, so I won't bother. Well, the Ryzen 5 2600 handles software encoding fine.

In fact, I don't even know how streaming with two PCs works, so no, I don't have another machine for that. :D
Well, the only method that doesn't really affect the main PC's performance is plain video capture, i.e. you buy a capture card (a USB dongle for $100 will do) and send the signal over a normal video cable (like to a second monitor in clone mode). The second PC captures it and encodes/streams.
There are some other options - like LAN streaming - but they'll always use your CPU in some way. That said, they may use less of it than the encoding itself. :)

You can always try the encoder in your Radeon. It will work; it's just not as polished as the competition.
Some people have noticed problems with reserving GPU resources, i.e. when GPU load gets near 100%, AMD cards often prioritize the game. So instead of losing 10-20% of the fps on your monitor, you lose 95% of the fps in the stream. This may have been fixed by now, though.
AMD encoding adds the most lag, so it's not recommended for scenarios where you play on the receiving end (e.g. I stream to a TV). You'll get less lag encoding with Intel Quick Sync, even though the signal has to move over PCIe first.
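If you do want to give the Radeon encoder a quick try before committing to it, something like this rough test works - assuming an ffmpeg build with AMF support (the usual case on Windows); filenames are placeholders:

```python
# Quick quality check of the AMD hardware encoder: transcode the first
# 30 seconds of a clip through ffmpeg's AMF wrapper (h264_amf) and eyeball
# the result. Assumes an ffmpeg build with AMF support; files are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-t", "30", "-i", "gameplay.mkv",  # placeholder input, first 30 s only
    "-c:v", "h264_amf",                # AMD hardware H.264 encoder (VCE)
    "-b:v", "6000k",                   # generous bitrate for the quality check
    "amf_test.mp4",
], check=True)
```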

Software encoding with the "faster" preset, 720p output @ 3000 kbit/s, and viewers said it works flawlessly.
Yup, 720p is not an issue for sure. Ryzen will do that easily. :)
1080p should work as well. Give it a go.
It gets problematic when you start thinking about 4K. :p
 
Joined
Feb 14, 2012
Messages
2,304 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
It seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via drivers.
 

Chloefile

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
10,878 (2.64/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop (CPU+GPU, 240 & 120 rads)
Memory 32GB Kingston HyperX Fury @ DDR4-3466
Video Card(s) PowerColor RX 6700 XT Fighter
Storage ~4TB SSD + 6TB HDD
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Fractal Design Define Mini C
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 Legendary
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
The biggest problem is my interwebz connection; I'm stuck on 4G (phone used as a hotspot, dongle in the PC), so performance probably isn't the bottleneck.

But thanks for the tips! In any case I need a capture card for some console gameplay - what's the best way to capture component video (PS2)? A PCIe HDMI input card would handle the PS3.

It seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via drivers.

I have the same feeling. Maybe Turing's NVENC shares hardware with Volta's, and they just disabled the extra features?
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
So first of all:
NO, the 1650 still has an NVENC encoder. The only missing feature is HEVC B-frame support (introduced with Turing).
Basically, it's the same encoder we had in Pascal.

How solid must the 1650 be if the usual anti-Nvidia nitpicking results in an article about a missing hardware encoder? And one that isn't even true? :eek:

There is a performance difference. That said, this is still a storm in a teacup. Research what you buy.
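For anyone who wants to verify this on their own card, one rough check is to ask ffmpeg what the build, driver, and GPU expose - a sketch, assuming an NVENC-enabled ffmpeg build (the b_ref_mode option name is an assumption for recent ffmpeg versions):

```python
# Rough capability check: list the NVENC encoders ffmpeg knows about, then
# dump hevc_nvenc's options and look for the B-frame-as-reference control.
# This only shows what the ffmpeg build exposes; whether the GPU actually
# accepts B-frames still needs a short test encode.
import subprocess

def ffmpeg_out(*args: str) -> str:
    return subprocess.run(["ffmpeg", "-hide_banner", *args],
                          capture_output=True, text=True).stdout

for line in ffmpeg_out("-encoders").splitlines():
    if "nvenc" in line:
        print(line.strip())

for line in ffmpeg_out("-h", "encoder=hevc_nvenc").splitlines():
    if "b_ref_mode" in line:
        print(line.strip())
```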
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
It seems unlikely to me that they'd go that far out of their way to make a special hardware mix for a lower-end GPU. It's probably just strangulation via drivers.
Yeah, they did it for a reason. There's no way this saves them any money.

Maybe B-frame support is power-hungry? It's actually quite complex: you're not just compressing a single frame, but also comparing it to its neighbours.
They had a 75 W budget. Idle draw plus the fan is easily 10 W, and you're left with 50-55 W for game rendering. Even if B-frame support sucked just 2 W, it would make a difference.

It would be nice if a Turing owner could test it. :)
Research what you buy.
100% disagree. This is a simple card put into mainstream OEM PCs or bought by people who aren't really spending a lot on computers (either money or time).
The only thing a buyer should be forced to "research" is whether it can run the game he wants.

Which brings us to the question of whether the lack of B-frame support is really an issue for the target consumer. It's not, right?
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
100% disagree. This is a simple card put into mainstream OEM PCs or bought by people who aren't really spending a lot on computers (either money or time).
The only thing a buyer should be forced to "research" is whether it can run the game he wants.

You're going to have a hard time disagreeing when you realize that's 100% what I meant. Use case research is all you should ever expect. :p
 

Chloefile

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
10,878 (2.64/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop (CPU+GPU, 240 & 120 rads)
Memory 32GB Kingston HyperX Fury @ DDR4-3466
Video Card(s) PowerColor RX 6700 XT Fighter
Storage ~4TB SSD + 6TB HDD
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Fractal Design Define Mini C
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 Legendary
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
It is a significant feature to drop from the 1650, since these cards are fine for esports gaming.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
It is a significant feature to drop from the 1650, since these cards are fine for esports gaming.
But the 1650 can still encode and stream! :)
B-frame support simply produces smaller files at the same quality.
People use the 1050 Ti for esports and they can use the 1650 just as well. It has the same NVENC encoding capabilities.

B-frame support matters when you need to lower bandwidth use. I don't think that's a huge problem for 1650 owners. It's not like they'll be streaming 4K@60fps. ;-)
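An easy way to see the size effect is to encode the same clip twice at the same quality target, with and without B-frames, and compare file sizes. A sketch using software HEVC (libx265) so it runs on any machine; the input clip is a placeholder:

```python
# Encode the same clip at the same quality (CRF 23) with B-frames disabled
# and enabled, then compare output sizes. Uses the software libx265 encoder
# so no particular GPU is needed; input filename is a placeholder.
import os
import subprocess

def encode(bframes: int, out: str) -> int:
    subprocess.run([
        "ffmpeg", "-y", "-t", "60", "-i", "gameplay.mkv",  # first minute only
        "-c:v", "libx265", "-crf", "23",                   # constant quality target
        "-x265-params", f"bframes={bframes}",              # 0 = no B-frames
        "-an", out,                                        # drop audio for a fair comparison
    ], check=True)
    return os.path.getsize(out)

size_without = encode(0, "no_bframes.mp4")
size_with = encode(4, "with_bframes.mp4")
print(f"without B-frames: {size_without} B, with B-frames: {size_with} B")
```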
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
I was going to say, this is a non-issue. You're not going to be streaming much beyond 1080p with this card anyway, and somehow 1080 Ti owners are still doing just fine with their NVENC block. I would like the Turing block in a card someday, as I enjoy being able to transcode to HEVC, but I do notice the lack of efficiency compared to CPU encoding; I heard Turing closed the gap somewhat. But if you're streaming, who cares about the final file size past a certain point? It's a stream. As long as you have a decent upload, you'll be fine. And if you don't, then I doubt Turing NVENC is going to be enough to make it usable in most situations.

This is interesting, though. In the Maxwell days, it was the 960 and 950 that had the better encoding and decoding compared to the 970 and 980 Ti. Now we have the exact opposite. It always did rub me the wrong way that my 980 was in some ways inferior to the cheaper and less performant product. I prefer this situation.

Also, there must be some expansion of the NVENC block for Turing that made this decision make sense in the first place. They must have been trying to use as little silicon as possible, otherwise why not put your best foot forward?
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
I was going to say, this is a non-issue. You're not going to be streaming much beyond 1080p with this card anyway, and somehow 1080 Ti owners are still doing just fine with their NVENC block. I would like the Turing block in a card someday, as I enjoy being able to transcode to HEVC, but I do notice the lack of efficiency compared to CPU encoding; I heard Turing closed the gap somewhat. But if you're streaming, who cares about the final file size past a certain point? It's a stream. As long as you have a decent upload, you'll be fine. And if you don't, then I doubt Turing NVENC is going to be enough to make it usable in most situations.

This is interesting, though. In the Maxwell days, it was the 960 and 950 that had the better encoding and decoding compared to the 970 and 980 Ti. Now we have the exact opposite. It always did rub me the wrong way that my 980 was in some ways inferior to the cheaper and less performant product. I prefer this situation.

Also, there must be some expansion of the NVENC block for Turing that made this decision make sense in the first place. They must have been trying to use as little silicon as possible, otherwise why not put your best foot forward?
We could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers, and the highest market share, have to do stupid shit like this all the time?
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
We could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers, and the highest market share, have to do stupid shit like this all the time?

It's an easy way to improve profit margin. They're going to be selling hundreds of thousands of these chips; by reducing silicon area or creating an artificial feature hierarchy with their cheap cards, they improve profits, and 99% of people won't know or won't care that they don't get B-frames in HEVC streams from the NVENC on a $150 card. It would be nice for them to include all of the Turing features on each card, but at the end of the day this only screws over people who were buying this card to transcode video, which is probably a very small portion of the people who will be buying it. This chip will probably be sold mostly in cheap gaming laptops; any serious streamer will probably want a better card anyway.
 
Joined
Mar 10, 2014
Messages
1,793 (0.49/day)
It's an easy way to improve profit margin. They're going to be selling hundreds of thousands of these chips; by reducing silicon area or creating an artificial feature hierarchy with their cheap cards, they improve profits, and 99% of people won't know or won't care that they don't get B-frames in HEVC streams from the NVENC on a $150 card. It would be nice for them to include all of the Turing features on each card, but at the end of the day this only screws over people who were buying this card to transcode video, which is probably a very small portion of the people who will be buying it. This chip will probably be sold mostly in cheap gaming laptops; any serious streamer will probably want a better card anyway.

A more serious question: how many end users are encoding video at all? Not to mention how many of them are really using NVENC rather than software x264 or x265 for superior quality. The more important part for end users, NVDEC, is still fully Turing, with all the features it can have.
 
Joined
Feb 18, 2005
Messages
5,239 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
In terms of content or comments?

Both.

bta is posting unsourced, inflammatory clickbait articles left, right, and centre, and the same braindead trolls are posting "Intel/NVIDIA/MSI/<insert-company-name-here> sucks" comments in response.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
A more serious question: how many end users are encoding video at all?
Hmm... and how many PC users are gaming on them?
Be careful with arguments like that. ;-)

But seriously, quite a lot. In fact most of us stream. :)
Keep in mind all kinds of remote desktops are basically streaming (RDP, Citrix, VMware, TeamViewer and so on).
Have you ever used the "Project" function in Windows? How do you think that works? :p
We also stream video between devices we own (NAS to TV, PC/Mac to phone etc) - we just don't think about it. We click "show on TV" and it just magically works - like computers should :).

And yeah, the game streaming phenomenon is also quite significant.

There was also a time when people actually kept movies on drives and consciously encoded them for their phones and so on. I think that's a dying skill at the moment - it has mostly moved to the cloud.
Not to mention how many of them are really using NVENC rather than software x264 or x265 for superior quality. The more important part for end users, NVDEC, is still fully Turing, with all the features it can have.
Most people stream without realizing it. The computer chooses the technology for them.

Now, the actual support for NVENC is another matter. By default everything looks for a hardware encoder in the CPU/SoC.
But Citrix and VMware remote desktops already support NVENC (datacenter software - makes sense).
We could just have the same encoding capabilities on all cards. It's not that hard; every other GPU manufacturer does just that.
Why does Nvidia, which has the best hardware, the best drivers, and the highest market share, have to do stupid shit like this all the time?
Really, you'll criticize Nvidia because they've introduced a very advanced encoding feature and it's not available in the cheapest GPU? Oh my...
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
Really, you'll criticize Nvidia because they've introduced a very advanced encoding feature and it's not available in the cheapest GPU? Oh my...
Yes, same with Intel: why can't anything lower than an i3 get an eight-year-old extension (AVX) that is widely used?
Why did the GT 1030 have NVENC removed when even the little Nintendo Switch's SoC has the power to use it? Why does the 1050's successor get a gimped version of its NVENC, when the 1050 had the full feature set of its time?
There is zero cost to enabling features that are already in the hardware, and it gives you good marketing for free.
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
Yes, same with Intel: why can't anything lower than an i3 get an eight-year-old extension (AVX) that is widely used?
Why did the GT 1030 have NVENC removed when even the little Nintendo Switch's SoC has the power to use it? Why does the 1050's successor get a gimped version of its NVENC, when the 1050 had the full feature set of its time?
There is zero cost to enabling features that are already in the hardware, and it gives you good marketing for free.
So Nvidia and Intel are bad.
What about AMD? Are you sure they're offering the same encoding features in all their GPUs?

BTW: do you even know what features are supported? :-D
I have no idea how to check this. The AMD website is a mess. I don't think it's there.

I've looked at Wikipedia, I've asked Google. Nothing. It seems like all of humanity's knowledge about VCE comes from slides and leaks. :-D
https://en.wikipedia.org/wiki/Video_Coding_Engine
Not much in general. Nothing about VCE 4.0. And there's even VCE 4.1 now! :-D

Even the name is problematic. Most of the web seems to think the "C" is for either "Coding" or "Codec". AMD says it's "Code" (so why haven't they fixed the wiki article?).
The only thing most of the web agrees on is that VCE is rubbish.
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
So Nvidia and Intel are bad.
What about AMD? Are you sure they're offering the same encoding features in all their GPUs?

BTW: do you even know what features are supported? :-D
I have no idea how to check this. The AMD website is a mess. I don't think it's there.

I've looked at Wikipedia, I've asked Google. Nothing. It seems like all of humanity's knowledge about VCE comes from slides and leaks. :-D
https://en.wikipedia.org/wiki/Video_Coding_Engine
Not much in general. Nothing about VCE 4.0. And there's even VCE 4.1 now! :-D

Even the name is problematic. Most of the web seems to think the "C" is for either "Coding" or "Codec". AMD says it's "Code" (so why haven't they fixed the wiki article?).
The only thing most of the web agrees on is that VCE is rubbish.
AMD drivers and their whitepapers are a complete mess; there is a reason why everyone hates coding for AMD. But you get the same encoding capabilities with an RX 550, a Vega 8 (2200G) or a Radeon VII; none of them has VCE "removed" because "reasons". Even Intel GPUs are more consistent in their features than Nvidia's: spec-wise, all of their GPUs are the same.
Defending forced market segmentation on hardware that is perfectly capable of a full feature set...
The only way to read this is: "Oh, you want to stream at a lower bitrate? Go get a 1660 to 2080 Ti, even though our low-end hardware is more than capable of doing it".
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
AMD drivers and their whitepapers are a complete mess. But you get the same encoding capabilities with an RX 550, a Vega 8 (2200G) or a Radeon VII; none of them has VCE "removed" because "reasons".
But how do you know that? There's no proof, no official statement even.
Have you checked them all personally? Has anyone, ever?

Do you even know what features are supported by your 270X?
According to Wikipedia, your GPU has VCE 1.0. I have to base this on the wiki because - obviously - the AMD website doesn't mention it. In fact, it doesn't mention VCE at all:
[screenshot: AMD Radeon R9 270X specifications page - no mention of VCE]

https://www.amd.com/en/support/grap...s/amd-radeon-r9-200-series/amd-radeon-r9-270x

Anyway, it came out at roughly the same time Nvidia launched Maxwell.
So thanks to this lovely chart: https://developer.nvidia.com/video-encode-decode-gpu-support-matrix I know Maxwell introduced lossless H.264.
Does your GPU support lossless H.264?
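For the Nvidia side, at least, that chart can be cross-checked locally by dumping ffmpeg's h264_nvenc help and looking for a lossless preset or tune - a sketch, assuming an NVENC-enabled ffmpeg build; it only shows what the build and driver expose. For VCE there is no equivalent documented check, which is rather the point:

```python
# Look for a lossless mode in ffmpeg's h264_nvenc option help. What it
# prints depends on the ffmpeg version and driver, so treat it as a hint,
# not a datasheet.
import subprocess

out = subprocess.run(["ffmpeg", "-hide_banner", "-h", "encoder=h264_nvenc"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if "lossless" in line.lower():
        print(line.strip())
```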
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
But how do you know that? There's no proof, no official statement even.
Have you checked them all personally? Has anyone, ever?

Do you even know what features are supported by your 270X?
According to Wikipedia, your GPU has VCE 1.0. I have to base this on the wiki because - obviously - the AMD website doesn't mention it. In fact, it doesn't mention VCE at all:
[screenshot: AMD Radeon R9 270X specifications page - no mention of VCE]
https://www.amd.com/en/support/grap...s/amd-radeon-r9-200-series/amd-radeon-r9-270x

Anyway, it came out at roughly the same time Nvidia launched Maxwell.
So thanks to this lovely chart: https://developer.nvidia.com/video-encode-decode-gpu-support-matrix I know Maxwell introduced lossless H.264.
Does your GPU support lossless H.264?
Does a 2013 card support it? Does a Kepler? (Maxwell is 2014, and the 270X is in fact a horrible rename of an HD 7800, so 2012.) That's not the discussion.
I can get an AMD CPU, any of them, and I get AVX2. Good luck getting AVX (1, 2, 512) and FMA on any low-end Intel CPU. I can get any AMD or Intel GPU and encode; good luck with that on a 1030 (you get the free, useless GeForce Experience instead).
In time, when B-frames are standard, guess who will hate having a 1650?
Yes, Nvidia is the first to deploy it. There is absolutely no need to limit it to the most expensive hardware "just because".
 
Joined
Jun 28, 2016
Messages
3,595 (1.27/day)
Does a 2013 card support it? Does a Kepler? (Maxwell is 2014, and the 270X is in fact a horrible rename of an HD 7800, so 2012.) That's not the discussion.
And that wasn't my question either.
Can you say, being 100% sure, whether your GPU supports the feature I mentioned or doesn't?

Because if you can't, how can you be sure that all AMD GPUs in a generation have the same functionality? Where do you get this knowledge from?
 
Joined
Oct 2, 2015
Messages
2,986 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) Gigabyte G27Q + AOC 19'
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse Logitech G203
Keyboard VSG Alnilam
Software Windows 11 x64
And that wasn't my question either.
Can you say, being 100% sure, whether your GPU supports the feature I mentioned or doesn't?

Because if you can't, how can you be sure that all AMD GPUs in a generation have the same functionality? Where do you get this knowledge from?
https://github.com/obsproject/obs-amd-encoder/wiki/Hardware,-GCN-and-VCE-Limits
As long as the VCE version you are referring to is the same, there are no differences - as it should be. Same with Intel.
Now, if you want official information, too bad: AMD is the worst on that front. Never try their profiler; it doesn't even work.

Oh look, AMD has had B-frames for H.264 since VCE 2.0 - and that's from 2013.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
Setting aside all the fanboying and special-snowflaking, this is a budget card. Streamers should not be looking at this card as an option; they should be going for a GTX 1070 (Ti), 1080 (Ti), or an RTX card. The GTX 16xx budget cards are just not made for streamers. All the back-and-forth arguments are kinda pointless.

I feel like TPU is slowly becoming WCCFtech.
Seriously? Grow up.
 