
Nvidia culls 10-bit video support on 50 series.

If people are using GeForce GPUs for video editing, they'd better stick to the 40 series, as 10-bit encode/decode support seems to have been culled from the 50 series.
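For anyone who wants to check rather than take it second-hand: a minimal sketch, assuming ffmpeg is on your PATH, that lists which pixel formats your ffmpeg build exposes for NVENC HEVC. Seeing 10-bit formats (p010le, yuv444p16le) in the list only tells you about the build, not the card, but it's a quick first look.

```python
import subprocess

# Print the pixel formats this ffmpeg build's hevc_nvenc wrapper accepts.
# 10-bit entries (p010le, yuv444p16le) mean the encoder still takes
# 10-bit input at the build level; whether the GPU itself accepts them
# is a separate, runtime question.
out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-h", "encoder=hevc_nvenc"],
    capture_output=True, text=True,
).stdout
for line in out.splitlines():
    if "Supported pixel formats" in line:
        print(line.strip())
```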

 
Things that are culled from the RTX 5070:
  • 32-bit PhysX
  • 256-bit memory bus (and with it, 16 GB capacity)
  • Some bits of the hotspot temperature readout
Very lucky it retains the full set of PCIe 5.0 lanes. I guess the RTX 6070 will be a 128-bit bus with x8 PCIe lanes; a 128-bit bus will enable 16 GB of memory in 2027.

RTX 6070: 128-bit bus, 7,680 cores, 16 GB, increased tensor core performance, PCIe x8 lanes, $2,300.
 
If people are using GeForce GPUs for video editing, they'd better stick to the 40 series, as 10-bit encode/decode support seems to have been culled from the 50 series.

Reading the link you posted, it doesn't seem like 10-bit support was removed at all? It says it was broken in DaVinci Resolve at release, but that was a new feature for NVENC: H.265 4:2:2 chroma at 10-bit, versus the previously supported 4:2:0 and 4:4:4.
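If someone with a 50 series card wants to verify, a test encode per chroma layout separates "Resolve was broken at launch" from "the hardware lost 10-bit". A rough sketch, assuming an ffmpeg build with NVENC; the 4:2:2 pixel format name (yuv422p10le) is an assumption and may differ by build. Note that ffmpeg silently falls back to a supported format when the requested one isn't available, so the stderr check below treats that fallback as a rejection.

```python
import subprocess

SOURCE = "testsrc2=duration=1:size=640x360:rate=30"  # 1 s synthetic clip

def nvenc_accepts(pix_fmt: str) -> bool:
    """Attempt a short NVENC HEVC encode at the given pixel format."""
    proc = subprocess.run(
        ["ffmpeg", "-hide_banner", "-v", "warning",
         "-f", "lavfi", "-i", SOURCE,
         "-c:v", "hevc_nvenc", "-pix_fmt", pix_fmt,
         "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # ffmpeg warns and auto-selects another format when the requested
    # one is unsupported, so count that fallback as a rejection too.
    return proc.returncode == 0 and "Incompatible pixel format" not in proc.stderr

for pix_fmt, label in [("p010le", "4:2:0 10-bit"),
                       ("yuv422p10le", "4:2:2 10-bit"),   # format name assumed
                       ("yuv444p16le", "4:4:4 10-in-16")]:
    print(f"{label:14} {'OK' if nvenc_accepts(pix_fmt) else 'rejected'}")
```

On a 40 series card you'd expect 4:2:0 and 4:4:4 to pass and 4:2:2 to be rejected, since 4:2:2 NVENC is new with the 50 series; 10-bit passing at all already contradicts the thread title.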
 
Reading the link you posted, it doesn't seem like 10-bit support was removed at all? It says it was broken in DaVinci Resolve at release, but that was a new feature for NVENC: H.265 4:2:2 chroma at 10-bit, versus the previously supported 4:2:0 and 4:4:4.
That would've required OP to not be functionally illiterate.
 