Sunday, April 30th 2017

NVIDIA to Support 4K Netflix on GTX 10 Series Cards

Until now, only users with Intel's latest Kaby Lake processors could enjoy 4K Netflix, owing to some strict DRM requirements. Now, NVIDIA is taking it upon itself to let users with one of its GTX 10 series graphics cards (absent the 1050, at least for now) enjoy some 4K Netflix and chillin'. You do have to jump through a few hoops to get there, but none of them should pose a problem.

The requirements to enable Netflix UHD playback, as per NVIDIA, are as follows (a rough code sketch of the checklist appears after the list):
  • NVIDIA driver version exclusively provided via the Microsoft Windows Insider Program (currently 381.74).
  • No other GeForce driver will support this functionality at this time.
  • If you are not currently registered for WIP, follow this link for instructions to join: insider.windows.com/
  • NVIDIA Pascal-based GPU, GeForce GTX 1050 or greater, with a minimum of 3 GB of memory.
  • HDCP 2.2 capable monitor(s). Please see the additional section below if you are using multiple monitors and/or multiple GPUs.
  • Microsoft Edge browser or the Netflix app from the Windows Store.
  • Approximately 25 Mbps (or faster) internet connection.
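Boiled down, NVIDIA's checklist amounts to a handful of yes/no tests. The snippet below is a hypothetical Python sketch of that checklist, not an NVIDIA tool; the card set and helper names are illustrative only (the monitor check is sketched separately in the next section).

# Hypothetical sketch of the checklist above; not an official NVIDIA tool.
PASCAL_3GB_PLUS = {"GTX 1060", "GTX 1070", "GTX 1080", "GTX 1080 Ti", "TITAN X (Pascal)", "TITAN Xp"}  # not exhaustive

def version_tuple(v: str) -> tuple:
    """Turn a dotted driver version like '381.74' into a comparable tuple (381, 74)."""
    return tuple(int(part) for part in v.split("."))

def meets_netflix_uhd_requirements(driver: str, gpu: str, vram_gb: float,
                                   mbps: float, edge_or_store_app: bool) -> bool:
    return (version_tuple(driver) >= version_tuple("381.74")  # Windows Insider Program driver
            and gpu in PASCAL_3GB_PLUS and vram_gb >= 3       # Pascal GTX 10 series with 3 GB or more
            and mbps >= 25                                    # roughly 25 Mbps or faster connection
            and edge_or_store_app)                            # Edge or the Windows Store Netflix app

print(meets_netflix_uhd_requirements("381.74", "GTX 1060", 6, 50, True))   # True
print(meets_netflix_uhd_requirements("378.92", "GTX 1060", 6, 50, True))   # False: older driver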
Single or multi-GPU, multi-monitor configurations
In the case of a multi-monitor configuration on a single GPU, or on multiple GPUs that are not linked together in SLI/LDA mode, 4K UHD streaming will happen only if all of the active monitors are HDCP 2.2 capable. If any of the active monitors is not HDCP 2.2 capable, the quality will be downgraded to FHD.
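In other words, it is an all-or-nothing check across the active displays. A minimal sketch, assuming each monitor exposes a single HDCP 2.2 capability flag:

# All-or-nothing rule from the section above: one non-HDCP 2.2 display downgrades the stream.
def netflix_stream_quality(active_monitors_hdcp22: list) -> str:
    return "4K UHD" if active_monitors_hdcp22 and all(active_monitors_hdcp22) else "FHD"

print(netflix_stream_quality([True, True]))   # 4K UHD
print(netflix_stream_quality([True, False]))  # FHD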
What do you think? Is this enough to sway you over to the green camp? Do you use Netflix on your PC?
Sources: NVIDIA Customer Help Portal, Eteknix

59 Comments on NVIDIA to Support 4K Netflix on GTX 10 Series Cards

#51
jabbadap
bug: Can you make up your mind?
The only thing that requires a Quadro is 10bit for OpenGL (which kinda sucks for Linux, but Linux has more stringent issues anyway).
And yes, 10bit requires an end-to-end solution to work, that's in no way particular to Nvidia.
That was actually loosened a bit with Pascal: they support full-screen 10-bit games under OpenGL.
Anandtech: Meanwhile, with all of this chat of 10bit color support and HDR, I reached out to NVIDIA to see if they’d be changing their policies for professional software support. In brief, NVIDIA has traditionally used 10bit+ color support under OpenGL as a feature differentiator between GeForce and Quadro. GeForce cards can use 10bit color with Direct3D, but only Quadro cards supported 10bit color under OpenGL, and professional applications like Photoshop are usually OpenGL based.

For Pascal, NVIDIA is opening things up a bit more, but they are still going to keep the most important aspect of that feature differentiation in place. 10bit color is being enabled for fullscreen exclusive OpenGL applications – so your typical OpenGL game would be able to tap into deeper colors and HDR – however 10bit OpenGL windowed support is still limited to the Quadro cards. So professional users will still need Quadro cards for 10bit support in their common applications.
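To put the quoted policy in concrete terms, the combinations work out roughly as below. This is only an illustration of the quote for Pascal-era cards, not an NVIDIA API, and the function name is made up.

# Rough illustration of the 10-bit policy quoted above (Pascal-era); not an NVIDIA API.
def supports_10bit_color(card: str, api: str, fullscreen: bool) -> bool:
    if card == "Quadro":
        return True        # Quadro: 10-bit under both Direct3D and OpenGL, windowed or fullscreen
    if api == "Direct3D":
        return True        # GeForce has long been allowed 10-bit under Direct3D
    return fullscreen      # GeForce + OpenGL: fullscreen-exclusive only, new with Pascal

print(supports_10bit_color("GeForce", "OpenGL", fullscreen=True))   # True on Pascal
print(supports_10bit_color("GeForce", "OpenGL", fullscreen=False))  # False: windowed OpenGL stays Quadro-only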
#52
test1test2
It doesn't matter if it's from 2009. Movies have been shot and edited in 8-bit for the longest time, since 10-bit in Adobe is only a recent upgrade. And none of you had 10-bit IPS monitors to begin with. They're fairly new still. And yes, 10-bit for Adobe has only been for Quadro. Your game cards only had 10-bit access for DX games. Linux drivers for both cards support 10-bit, but who is going to use it? Adobe dun work for Linux, bruhs.

And it's not just OpenGL. Your pics will have banding in them even with 30 bit display enabled in Adobe unless you have a Quadro.
#53
bug
test1test2: It doesn't matter if it's from 2009. Movies have been shot and edited in 8-bit for the longest time, since 10-bit in Adobe is only a recent upgrade. And none of you had 10-bit IPS monitors to begin with. They're fairly new still. And yes, 10-bit for Adobe has only been for Quadro. Your game cards only had 10-bit access for DX games. Linux drivers for both cards support 10-bit, but who is going to use it? Adobe dun work for Linux, bruhs.

And it's not just OpenGL. Your pics will have banding in them even with 30 bit display enabled in Adobe unless you have a Quadro.
You've totally lost me. How is this related to the newly added Netflix support?
#54
Prima.Vera
It seems that non-professional cards do not properly support 10-bit per channel; instead they do something like 8+2 bit via dithering, which is in no way the same as native 10-bit.
#55
test1test2
Exactly. Nvidia is screwing you. You only get 10-bit on the desktop and in Adobe if you have a Quadro. So don't waste your money on an HDR or even 10-bit monitor unless you have a Quadro. And the companies are lying to you as well. Most 4K monitors under $1000 are not true 10-bit panels. They're 8+FRC.
#56
bug
Prima.Vera: It seems that non-professional cards do not properly support 10-bit per channel; instead they do something like 8+2 bit via dithering, which is in no way the same as native 10-bit.
Nope. Monitors can do 8-bit+FRC; the video card can't. And even then, the eye can't really tell the difference. We've had 8-bit mimicked with 6-bit+FRC before, and it wasn't a big deal. Of course, for critical work you want the real deal, but for most people it doesn't really matter.
For 10-bit support, you need the video card, OS and monitor to work hand in hand. This isn't new; it's been known for ages. Most likely test1test2 made an uninformed buy and is now venting over here.
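For what it's worth, FRC is just temporal dithering: the panel flickers between the two nearest native levels so the average over a few frames approximates the finer value. A rough numerical sketch (illustrative only, not tied to any real driver or panel):

# Rough sketch of 8-bit+FRC approximating a 10-bit level by temporal dithering.
def frc_frames(level_10bit: int, frames: int = 4) -> list:
    """Return 8-bit values whose average over `frames` approximates level_10bit / 4."""
    base, remainder = divmod(level_10bit, 4)  # a 10-bit scale has 4 steps per 8-bit step
    # Show the higher 8-bit level on `remainder` out of every 4 frames, the lower one otherwise.
    return [min(base + 1, 255) if f < remainder else base for f in range(frames)]

seq = frc_frames(514)              # 10-bit level 514 sits between 8-bit levels 128 and 129
print(seq, sum(seq) / len(seq))    # [129, 129, 128, 128] 128.5, i.e. 514 / 4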
test1test2: Exactly. Nvidia is screwing you. You only get 10-bit on the desktop and in Adobe if you have a Quadro. So don't waste your money on an HDR or even 10-bit monitor unless you have a Quadro. And the companies are lying to you as well. Most 4K monitors under $1000 are not true 10-bit panels. They're 8+FRC.
Again, how exactly is Nvidia screwing us? If you do professional work, you buy a professional card. If you play games/watch Netflix, a GeForce is enough. Where's the foul play? How is Nvidia different from AMD? And once again, how is this related to the topic at hand?
#57
test1test2
Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.
#58
Mussels
Freshwater Moderator
test1test2: Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.
I dunno, my 4K UHD Samsung HDTV quite happily runs HDR content (and yes, it notifies me when I enable it in games like Resident Evil or ME:A).

The limitations are on the TV side with HDMI, not the GPU (HDMI can't do 4:4:4/RGB with 12-bit color, only 4:2:2 at 12-bit; DP can, but my TV doesn't have that).
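That limitation is really a bandwidth budget. A back-of-the-envelope sketch with approximate HDMI 2.0 figures (the constants and helper here are illustrative, not from any real API):

# Approximate check of why 4K60 RGB/4:4:4 at 12 bpc doesn't fit over HDMI 2.0.
HDMI20_MAX_GBPS = 14.4          # ~600 MHz TMDS character rate x 3 channels x 8 data bits
PIXEL_CLOCK_4K60_MHZ = 594.0    # standard 3840x2160 @ 60 Hz timing, blanking included

def fits_hdmi20(bits_per_pixel: float) -> bool:
    gbps = PIXEL_CLOCK_4K60_MHZ * 1e6 * bits_per_pixel / 1e9
    return gbps <= HDMI20_MAX_GBPS

for name, bpp in [("RGB/4:4:4, 8 bpc", 24),
                  ("RGB/4:4:4, 12 bpc", 36),       # what the TV refuses over HDMI 2.0
                  ("YCbCr 4:2:2, 12 bpc", 24)]:    # 12 bits Y + 12 bits shared chroma per pixel
    print(name, "fits" if fits_hdmi20(bpp) else "does not fit")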
#59
bug
test1test2: Because one day you're going to want to watch an HDR MOVIE on your DESKTOP with your GAME card, and you will NOT be allowed to. HDR requires 10-bit and your GAME cards are LOCKED OUT. What do you not understand about the drivers restricting them to ONLY displaying 10-bit and HDR in DX games? On the desktop and in Adobe your game cards only do 8-bit. And NO it has not been known for ages. Nvidia keeps this crap secret. I know for a FACT that you have no idea how to enable 30 bit display in Adobe. So don't give me any crap.

Adobe only added support for 30-bit in 2015.
Only 5 posts and already got an attitude. Keep it up, dude.