
GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

For every mini scandal covered by the media, 99 more slip through the cracks.
True. :(

As far as I am concerned, Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until autumn.

It's why I always wait at least 2 releases now before upgrading drivers. They can't be trusted anymore not to release shit.
Yes, they've been a bit flaky, unfortunately. However, I still install straight away with wild abandon :laugh: and so far have somehow managed to avoid any major issues.
 

Some people used to say that AMD had crappy drivers, but now Nvidia has shown driver problems as well.
I'll also wait before updating my display driver when a new one has been released.
We'll read about it fast enough on the internet if there are any problems with them :D
 
......who buys a $400+ card and doesn't use a monitor to fully experience what they paid for.......step your game up people, display port! This is like saying...Yeah I bought this Porsche/Ferrari/Lamborghini and it just doesn't perform well on 87 octane gas (as opposed to 93 and above). Since some hate car analogies, it's also like when flat screen 1080p TVs first came out and people were still using the red, white and yellow RCA connectors. You still got a picture, but not the quality that you paid for. I guess they have to put warning stickers on graphics cards suggesting the proper monitors. Didn't they eliminate DVI on high end cards and make it so adapters didn't work either......You gotta do your research......
 

People have different standards of what makes a good monitor. For me, image quality is important, followed by screen size for what works on my desk, and then screen material. I prefer glossy for its rich colors. I don't need 144Hz 1440p to have a great gaming experience.

Why? Because that experience is subjective, and no one can say it's inadequate. I'm also very cost-conscious, and I'm not going to replace my near-$300 monitor with something considerably more expensive just to make elitists happy.
 
The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
I bought a nice one with an IPS display not long ago.
I have it connected through DVI-D.
A TV takes the HDMI port.

Not a problem. Leave only HDMI and DP connections on the video card (possibly with some mini variants in there) and let those who still cling to DVI use an HDMI-to-DVI cable (it's cheap, I know, I have one). Or include an HDMI-to-DVI adapter with the card if you feel generous.
 
Thing is, people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.
 
I've been reading this thread, and it usually affects users of 2K monitors imported from Korea that have only a DVI port (no DP port). Monitors which have DP ports are not affected, provided the user is NOT using the DVI port.

I used to have a QNIX myself, and these monitors can easily be overclocked by users up to 120 Hz. So until NVIDIA comes out with a fix, don't overclock QNIX monitors beyond 80 Hz... maybe make it 75 Hz just to be sure.
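For a rough sense of why ~80 Hz is a sane ceiling for these panels on dual-link DVI, here's a back-of-the-envelope sketch; the blanking figures are CVT-RB-style assumptions for illustration, not actual QNIX timings:

```python
# Dual-link DVI carries two 165 MHz TMDS links, so ~330 MHz of pixel clock.
DL_DVI_LIMIT_HZ = 2 * 165e6

# 2560x1440 plus assumed CVT-RB-style blanking (real timings vary).
h_total = 2560 + 160   # active width + horizontal blanking
v_total = 1440 + 41    # active height + vertical blanking

# Highest refresh rate that keeps the pixel clock within spec.
max_refresh = DL_DVI_LIMIT_HZ / (h_total * v_total)
print(f"Max in-spec refresh at 2560x1440: {max_refresh:.1f} Hz")  # ~81.9 Hz
```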
 
I find it funny that it's been two weeks since the story about hand-picked overclocked samples, and now this. Add to it the low stock, and I think we're gonna see a new card from the big N sooner than Christmas.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 601201888 601203818 601204369

I wouldn't be surprised to see a 1060 from them, since they are literally poised to lose the whole mid-tier to AMD. Of course, 1060s would also need to be in stock, LOL!

:roll:
 
Thing is, people made a big deal about the Fury X not having HDMI 2.0, so I guess plenty of people use HDMI.

True.....I edited my rant as soon as higher logic and coffee kicked in.
 
more people are beginning to get on the high refresh rate train. In that world, DVI no longer matters.

Yes, they are. For my part, though, I see no reason to do it, and I have a modern, recent monitor. Earlier I stated which qualities in a monitor are important to me.

I've never been a victim of peer pressure. I don't follow the herd. I've always done what works for me in life.
 
Maybe this will convince Nvidia it's time to put DVI out to pasture. Even dual-link DVI is limited to 2560x1600@60Hz anyway (see https://en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview).

It's not limited to that; that's just the "spec", and it's common to see systems running out of spec. You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.
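To put rough numbers on that (a sketch only; the blanking intervals below are CVT-RB-style assumptions, not any particular monitor's actual timings):

```python
# Pixel clock = (active + blanking) pixels per frame x refresh rate.
def pixel_clock_mhz(h_active, v_active, hz, h_blank, v_blank):
    return (h_active + h_blank) * (v_active + v_blank) * hz / 1e6

DL_DVI_MHZ = 2 * 165.0  # dual-link DVI budget: two 165 MHz TMDS links

# 1080p/144 lands right at the 330 MHz limit; 1440p/120 is far beyond it.
print(pixel_clock_mhz(1920, 1080, 144, 160, 31))  # ~333 MHz
print(pixel_clock_mhz(2560, 1440, 120, 160, 41))  # ~483 MHz
```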

I do agree that it's time to drop the DVI port on high end cards, though. Now that they are using DVI-D instead of DVI-I, the DVI-VGA adapters no longer work, so they may as well just drop the DVI port, or replace it with an extra HDMI port and include an HDMI-DVI adapter; that will save space.
 

should most def drop it if you cannot seem to do it correctly....yeah...
 
There is also a problem with DisplayPort sync above 120 Hz, which causes white line artifacting.
 
It's not limited to that; that's just the "spec", and it's common to see systems running out of spec. You can do 1080p/144Hz and 1440p/120Hz over dual-link DVI no problem.

Well, you can run anything out of spec, but then you no longer expect any predictable result, do you?
 
Dropping that bulky DVI connector can also mean more space for the exhaust on reference cards.
 
I was looking for a GTX 1070 to replace my GTX 970, but with the Korean monitor I have, OC'ed to 96 Hz, I might have problems.
 
Sounds like they have a display controller issue, and it's almost squarely in the market they are laying claim to with their cards: high resolution, high FPS displays.
Perhaps they are lucky they're not able to produce more actual cards yet. A BIOS update for thousands of cards will undoubtedly cause some to be bricked, and no replacements available would be a worse black eye than the already absurd price gouging and limited availability.
 
Just going to throw this out here, but how many monitors are out there that natively support a refresh rate greater than 60 Hz (no overclocking) and only use DVI? I'm going to venture none. Therefore nVidia can't have known about this, because they likely do not test out-of-spec configurations.

This issue isn't present for the majority of GTX 1080 owners, including myself; however, I stopped overclocking my gen 1 Qnix QX2710 when I got my XB270HU.
 

Well, if this happened with DisplayPort monitors, like 1440p@144Hz or 2160p@60Hz, then I would agree. But the "bug" is with running DL-DVI at out-of-spec Mpix/s, and thus not very critical. This is a more troublesome bug:
https://forums.geforce.com/default/...es-not-work-with-the-htc-vive-on-gtx-1080-/1/
 
Just going to throw this out here, but how many monitors are out there that natively support a refresh rate greater than 60 Hz (no overclocking) and only use DVI? I'm going to venture none. Therefore nVidia can't have known about this, because they likely do not test out-of-spec configurations.
Not true; "old" native 75/85/120 Hz monitors use DVI and they seem to work just fine. Heck, even some (if not most) 144 Hz monitors have DVI; it's not the only interface, but it's probably the primary one.
 
I think the takeaway from the problems with the 1070/1080s is not to buy the first batch. I've been impatient in the past, and burned several times by glitchy cards requiring an RMA. No doubt Nvidia will sort out these bugs. Until then, my wallet's staying closed.
 
Not true; "old" native 75/85/120 Hz monitors use DVI and they seem to work just fine. Heck, even some (if not most) 144 Hz monitors have DVI; it's not the only interface, but it's probably the primary one.
"Old" native 75/85/120 Hz monitors were also not running 2560x1600-class resolutions. DVI has no trouble pushing those refresh rates at lower resolutions, and you do not need to push the pixel clock higher to achieve them. As such, said monitors would have no issue with a 1080.

If any of them WERE pushing above 2560x1600@60, then those monitors were being sold outside of spec, and whoever bought them took the risk that something wouldn't work. Any decent monitor that was pushing such specs really should have used DisplayPort.
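As a quick sanity check on that point (the mode totals below are approximate VESA/CVT-RB-style figures, used purely for illustration, not exact monitor timings):

```python
SINGLE_LINK_MHZ = 165.0   # single-link DVI TMDS limit
DUAL_LINK_MHZ = 330.0     # dual-link DVI: two 165 MHz links

# (h_total, v_total, refresh) -- rough totals including blanking
modes = {
    "1280x1024 @ 85 Hz": (1728, 1072, 85),
    "1920x1080 @ 120 Hz": (2080, 1111, 120),
    "2560x1440 @ 120 Hz": (2720, 1481, 120),
}

for name, (h, v, hz) in modes.items():
    clk = h * v * hz / 1e6
    if clk <= SINGLE_LINK_MHZ:
        verdict = "fits single-link"
    elif clk <= DUAL_LINK_MHZ:
        verdict = "needs dual-link"
    else:
        verdict = "beyond dual-link spec"
    print(f"{name}: ~{clk:.0f} MHz -> {verdict}")
```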
 
Just going to throw this out here, but how many monitors are out there that natively support a refresh rate greater than 60 Hz (no overclocking) and only use DVI? I'm going to venture none. Therefore nVidia can't have known about this, because they likely do not test out-of-spec configurations.
Take a look at my ViewSonic V3D245. It has D-Sub, DVI and HDMI.
 
Monitors using dual-link DVI with refresh rates above 81 Hz. I'm sure this will have some slobbering over Nvidia failing again, but really...

Is DisplayPort or HDMI not better? And if your monitor doesn't have those, why buy an expensive gfx card? My 6-year-old Dell has DisplayPort.

Korean 1440p PLS and IPS monitors, the good ones, only have dual-link DVI, and those are the only ones that allow a decent refresh rate overclock, plus very low input lag.

Looks like I'll be waiting for this to hopefully be fixed in drivers or something before I buy a 1070, since this will be an issue for me.
 