
NVIDIA to Support 4K Netflix on GTX 10 Series Cards

bug

Joined
May 22, 2015
Messages
13,239 (4.05/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Hate to break it to you, but nvidia is screwing you. You're not getting 10-bit from your game cards. You have to be using a Quadro. And for Adobe you have to enable 30 bit display in the preferences and advanced settings. And for the control panel driver setting it has to be set at 10-bit. Then there's the fact that your 10-bit monitor is probably 8+FRC, which is fake.

I wonder how they'll handle this scam they've been running when people try to play HDR content on their $2000 HDR monitors that go on sale this month. On the desktop your game cards only do 8-bit. They only do 10-bit in DX games.

Can you make up your mind?
The only thing that requires a Quadro is 10bit for OpenGL (which kinda sucks for Linux, but Linux has more stringent issues anyway).
And yes, 10bit requires an end-to-end solution to work, that's in no way particular to Nvidia.
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
Can you make up your mind?
The only thing that requires a Quadro is 10bit for OpenGL (which kinda sucks for Linux, but Linux has more stringent issues anyway).
And yes, 10bit requires an end-to-end solution to work, that's in no way particular to Nvidia.

That was actually loosened a bit with Pascal: they support full-screen 10-bit games under OpenGL.
Anandtech said:
Meanwhile, with all of this chat of 10bit color support and HDR, I reached out to NVIDIA to see if they’d be changing their policies for professional software support. In brief, NVIDIA has traditionally used 10bit+ color support under OpenGL as a feature differentiator between GeForce and Quadro. GeForce cards can use 10bit color with Direct3D, but only Quadro cards supported 10bit color under OpenGL, and professional applications like Photoshop are usually OpenGL based.

For Pascal, NVIDIA is opening things up a bit more, but they are still going to keep the most important aspect of that feature differentiation in place. 10bit color is being enabled for fullscreen exclusive OpenGL applications – so your typical OpenGL game would be able to tap into deeper colors and HDR – however 10bit OpenGL windowed support is still limited to the Quadro cards. So professional users will still need Quadro cards for 10bit support in their common applications.
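For anyone wondering what that Direct3D path actually looks like, here's a minimal sketch (my own illustration, not NVIDIA's or Anandtech's code) of a D3D11 app asking for a 10-bit-per-channel back buffer; the factory, device and window handle are assumed to be created elsewhere:

Code:
// Sketch only: request a 10-bit-per-channel (R10G10B10A2) back buffer under D3D11.
// Factory, device and hwnd are assumed to be created elsewhere; error handling omitted.
#include <windows.h>
#include <d3d11.h>
#include <dxgi1_2.h>

HRESULT CreateTenBitSwapChain(IDXGIFactory2* factory, ID3D11Device* device,
                              HWND hwnd, IDXGISwapChain1** outSwapChain)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per color channel, 2-bit alpha
    desc.SampleDesc.Count = 1;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD; // flip model (Windows 10)
    // Width/Height left at 0 so DXGI sizes the buffers to the window.
    return factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                           nullptr, nullptr, outSwapChain);
}

Whether the monitor actually receives 10 bits still depends on the driver, OS and display link, which is the "end-to-end" part mentioned above.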
 

test1test2

New Member
Joined
May 14, 2017
Messages
5 (0.00/day)
It doesn't matter if it's from 2009. Movies have been shot and edited in 8-bit for the longest time, since 10-bit in Adobe is only a recent upgrade. And none of you had 10-bit IPS monitors to begin with. They're fairly new still. And yes, 10-bit for Adobe has only been for Quadro. Your game cards only had 10-bit access for DX games. Linux drivers for both cards support 10-bit, but who is going to use it? Adobe dun work for Linux, bruhs.

And it's not just OpenGL. Your pics will have banding in them even with 30 bit display enabled in Adobe unless you have a Quadro.
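If you want to check a pipeline for banding yourself, a smooth ramp is the usual test. Here's a rough sketch (mine, with an arbitrary file name and dimensions) that writes a 16-bit grayscale ramp as a PGM you can open in an image editor: on a true 30-bit path it should look smooth, on an 8-bit path you'll see distinct steps.

Code:
// Sketch: write a horizontal 16-bit grayscale ramp as a binary PGM ("ramp16.pgm").
#include <cstdio>

int main() {
    const int width = 1024, height = 256, maxval = 65535;
    std::FILE* f = std::fopen("ramp16.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n%d\n", width, height, maxval);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            unsigned v = (unsigned)((long)x * maxval / (width - 1));
            std::fputc(v >> 8, f);   // PGM stores 16-bit samples big-endian
            std::fputc(v & 0xff, f);
        }
    }
    std::fclose(f);
    return 0;
}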
 

bug

It doesn't matter if it's from 2009. Movies have been shot and edited in 8-bit for the longest time, since 10-bit in Adobe is only a recent upgrade. And none of you had 10-bit IPS monitors to begin with. They're fairly new still. And yes, 10-bit for Adobe has only been for Quadro. Your game cards only had 10-bit access for DX games. Linux drivers for both cards support 10-bit, but who is going to use it? Adobe dun work for Linux, bruhs.

And it's not just OpenGL. Your pics will have banding in them even with 30 bit display enabled in Adobe unless you have a Quadro.
You've totally lost me. How is this related to the newly added Netflix support?
 
Joined
Sep 15, 2011
Messages
6,476 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
It seems that non-professional cards don't properly support 10 bits per channel, but instead do something like 8+2 bit by dithering, which is in no way the same as native 10-bit.
 

test1test2

Exactly. Nvidia is screwing you. You only get 10-bit on the desktop and in Adobe if you have a Quadro. So don't waste your money on an HDR or even 10-bit monitor unless you have a Quadro. And the companies are lying to you as well. Most 4K monitors under $1000 are not true 10-bit panels. They're 8+FRC.
 

bug

It seems that non-professional cards don't properly support 10 bits per channel, but instead do something like 8+2 bit by dithering, which is in no way the same as native 10-bit.

Nope. Monitors can do 8-bit+FRC; the video card can't. And even then, the eye can't really tell the difference. We've had 8-bit mimicked as 6-bit+FRC before, and it wasn't a big deal. Of course, for critical work you want the real deal, but for most people it doesn't really matter.
For 10-bit support, you need the video card, OS and monitor to work hand in hand. This isn't new, it's been known for ages. Most likely test1test2 made an uninformed buy and is now venting over here.
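As an aside, here's a toy model (my own simplification, not anything vendor-specific) of how 8-bit+FRC approximates a 10-bit level by alternating between adjacent 8-bit levels over a few frames:

Code:
// Sketch: temporal dithering (FRC) averaging two adjacent 8-bit levels over a 4-frame cycle.
#include <cstdio>

int main() {
    const int target10  = 514;            // desired 10-bit level (0..1023)
    const int base8     = target10 / 4;   // nearest-lower 8-bit level (128 here)
    const int remainder = target10 % 4;   // how many frames get the +1 level (2 here)

    long sum = 0;
    for (int frame = 0; frame < 4; ++frame) {
        int shown = base8 + (frame < remainder ? 1 : 0);
        sum += shown;
        std::printf("frame %d: panel shows 8-bit level %d\n", frame, shown);
    }
    std::printf("temporal average %.2f vs. target/4 = %.2f\n", sum / 4.0, target10 / 4.0);
    return 0;
}

The eye integrates the flicker, which is why a decent 8-bit+FRC panel is hard to tell apart from native 10-bit outside of critical work.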
Exactly. Nvidia is screwing you. You only get 10-bit on the desktop and in Adobe if you have a Quadro. So don't waste your money on an HDR or even 10-bit monitor unless you have a Quadro. And the companies are lying to you as well. Most 4K monitors under $1000 are not true 10-bit panels. They're 8+FRC.
Again, how exactly is Nvidia screwing us? If you do professional work, you buy a professional card. If you play games/watch Netflix, a GeForce is enough. Where's the foul play? How is Nvidia different from AMD? And once again, how is this related to the topic at hand?
 

test1test2

Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games?? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.17/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games?? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.

I dunno, my 4K UHD Samsung HDTV quite happily runs HDR content (and yes, it notifies me when I enable it in games like Resident Evil or ME:A).

The limitations are on the TV side with HDMI, not the GPU (HDMI can't do 4:4:4/RGB with 12-bit color, only 4:2:2 12-bit; DP can, but my TV doesn't have that).
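The numbers back that up. A back-of-the-envelope check (simplified: blanking intervals ignored, HDMI 2.0 assumed to carry roughly 14.4 Gbit/s of usable data after 8b/10b encoding):

Code:
// Sketch: rough data-rate check for 4K60 12-bit over HDMI 2.0 (~14.4 Gbit/s usable).
#include <cstdio>

int main() {
    const double pixels_per_sec = 3840.0 * 2160.0 * 60.0;
    const double hdmi20_usable_gbps = 14.4;

    // 4:4:4 carries 3 full-resolution components; 4:2:2 averages out to 2 per pixel.
    const double rate_444 = pixels_per_sec * 3 * 12 / 1e9; // ~17.9 Gbit/s
    const double rate_422 = pixels_per_sec * 2 * 12 / 1e9; // ~11.9 Gbit/s

    std::printf("4K60 12-bit 4:4:4: %.1f Gbit/s (limit %.1f)\n", rate_444, hdmi20_usable_gbps);
    std::printf("4K60 12-bit 4:2:2: %.1f Gbit/s (limit %.1f)\n", rate_422, hdmi20_usable_gbps);
    return 0;
}

So 12-bit 4:4:4 at 4K60 simply doesn't fit down the link, while 12-bit 4:2:2 does, which matches what the TV accepts.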
 

bug

Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games?? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.
Only 5 posts and already got an attitude. Keep it up, dude.
 