Monday, June 27th 2016

GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

The second design flaw to hit the GeForce GTX 1080 and GTX 1070 after the fan-revving bug isn't confined to the reference "Founders Edition" cards, but affects all GTX 1080 and GTX 1070 cards. Users of monitors with dual-link DVI connectors are reporting problems booting into Windows with pixel clocks set higher than 330 MHz. You can boot into Windows at default pixel clocks and, once booted, raise the refresh rate (and consequently the pixel clock) above 330 MHz, and the display works fine; you just can't boot with those settings, and have to revert to defaults each time you shut down or restart your machine.

A user of a custom-design GTX 1070 notes that if the refresh rate of their 1440p monitor is set higher than 81 Hz (the highest refresh rate achievable with the pixel clock staying under 330 MHz) at a resolution of 2560 x 1440, the machine doesn't correctly boot into Windows. The splash screen is replaced with flashing color screens, and nothing beyond. The system BIOS screen appears correctly (because it runs at a low resolution). The problem is also reported on a custom-design GTX 1080, and has been replicated by other users on the GeForce Forums.
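The figures above line up with standard timing math: dual-link DVI carries two TMDS links at 165 MHz each, for a 330 MHz pixel clock ceiling, and at 2560 x 1440 the highest refresh rate that fits under that ceiling with CVT Reduced Blanking timings is indeed 81 Hz. A minimal sketch of the arithmetic (a simplified CVT-RB estimate, not a full timing generator):

```python
import math

def cvt_rb_pixel_clock_mhz(h_active, v_active, refresh_hz):
    """Estimate the CVT Reduced Blanking pixel clock for a display mode.

    Simplified CVT-RB rules: horizontal blanking is fixed at 160 pixels,
    vertical blanking must last at least 460 us, and the resulting clock
    is rounded down to a 0.25 MHz step.
    """
    h_total = h_active + 160                           # fixed RB horizontal blank
    h_period_us = (1e6 / refresh_hz - 460) / v_active  # estimated line time
    v_blank = math.ceil(460 / h_period_us)             # lines needed for >= 460 us
    v_total = v_active + v_blank
    clock_hz = h_total * v_total * refresh_hz
    return math.floor(clock_hz / 250_000) * 0.25       # round down to 0.25 MHz

DUAL_LINK_DVI_MHZ = 2 * 165  # two TMDS links at 165 MHz each

for hz in (60, 81, 82):
    clk = cvt_rb_pixel_clock_mhz(2560, 1440, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MHZ else "exceeds the DVI limit"
    print(f"2560x1440 @ {hz} Hz -> {clk} MHz ({verdict})")
```

With these rules, 2560 x 1440 needs 241.50 MHz at 60 Hz and 329.50 MHz at 81 Hz, while 82 Hz already lands at 333.75 MHz, just over the 330 MHz cap — matching the 81 Hz ceiling users report.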
Source: Reddit
Add your own comment

147 Comments on GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

#26
Eroticus
Vayra86 said:
Of course it can work, but you got the point, I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car: if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck with the same engine under the hood. You only get a faster car with a better engine.
I didn't try to hurt you or anything ... =P

But the technology is old; there's no point using it any more when DP works well and supports more resolutions and frequencies.

You wouldn't buy a used 1993 GT-R if you could get a 2016 GT-R for the same price.
Posted on Reply
#27
Vayra86
Eroticus said:
I didn't try to hurt you or anything ... =P

But the technology is old; there's no point using it any more when DP works well and supports more resolutions and frequencies.

You wouldn't buy a used 1993 GT-R if you could get a 2016 GT-R for the same price.
The point is, you are making a silly argument when Nvidia puts a DVI connector on its newest GPUs. "Bwah, not important, old stuff, it can be shit"

Like... huh?
Posted on Reply
#28
Eroticus
Vayra86 said:
The point is, you are making a silly argument when Nvidia puts a DVI connector on its newest GPUs. "Bwah, not important, old stuff, it can be shit"
Sooner or later some year will take your DVI port too, like 1999 took your optical drive. ;)

Anyway, enjoy your DVI port, have a nice day...
Posted on Reply
#29
Caring1
Vayra86 said:
Of course it can work, but you got the point, I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car: if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck with the same engine under the hood. You only get a faster car with a better engine.
That analogy is incorrect; the vehicle can be driven faster due to better tires alone.
Posted on Reply
#30
Frick
Fishfaced Nincompoop
Vayra86 said:

*note: Nvidia isn't scoring points over here the past year. Fuckup after fuckup, as little as they may be, but it's becoming a real pattern now.
Sadly it doesn't affect them in any way.

Anyway DVI for the win. I wish they kept the connectors anyway.
Posted on Reply
#31
Vayra86
Caring1 said:
That analogy is incorrect; the vehicle can be driven faster due to better tires alone.
Yeah... car analogies never work. Damn it!
Posted on Reply
#32
goodeedidid
I don't care about that at all, I haven't seen a DVI connection in years. What sane person would use this old shit anyway with the likes of a 1080? This is the best video card ever.
Posted on Reply
#33
Vayra86
goodeedidid said:
I don't care about that at all, I haven't seen a DVI connection in years. What sane person would use this old shit anyway with the likes of a 1080? This is the best video card ever.
You made a new account just to say that you don't care?
Posted on Reply
#34
goodeedidid
Vayra86 said:
You made a new account just to say that you don't care?
What do you mean?
Posted on Reply
#35
qubit
Overclocked quantum bit
This is such an obvious problem, how could it not have been found by NVIDIA? Or by reviewers and users alike who have high-refresh-rate monitors?

For example, my setup will default to 144 Hz when I install a new graphics card and then the drivers, as that's what my monitor supports.
Posted on Reply
#36
goodeedidid
qubit said:
This is such an obvious problem, how could it not have been found by NVIDIA? Or by reviewers and users alike who have high-refresh-rate monitors?

For example, my setup will default to 144 Hz when I install a new graphics card and then the drivers, as that's what my monitor supports.
I suppose reviewers didn't do much reviewing with DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
Posted on Reply
#37
okidna
P4-630 said:
Or just lower the Hz every time before shutting down, which is what I would do; I'm uncomfortable flashing my expensive card.
You can do it in one click with a batch file that runs a simple app like Display Changer : http://12noon.com/?page_id=80

I used it on an image-processing project a couple of months ago, to test how the application reacts at different refresh rates, resolutions, and color depths without touching Windows display settings or graphics card driver settings at all.
Posted on Reply
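For anyone wanting to script the one-click revert okidna describes, here is a hedged sketch. The executable name (dc64.exe) and the -width/-height/-refresh switches are assumptions for illustration, not verified against Display Changer's documentation; check the 12noon page for the actual syntax:

```python
import subprocess

# Hypothetical Display Changer invocation: the exe name and switch names
# below are assumptions for illustration, not taken from the tool's docs.
def build_dc_command(exe, width, height, refresh_hz):
    """Build an argument list for a Display Changer-style mode switch."""
    return [exe, f"-width={width}", f"-height={height}", f"-refresh={refresh_hz}"]

def set_safe_mode(exe="dc64.exe"):
    """Drop to 2560x1440 @ 60 Hz before shutdown, keeping the pixel
    clock well under the 330 MHz threshold that trips the boot bug."""
    subprocess.run(build_dc_command(exe, 2560, 1440, 60), check=True)

# Show the command a shutdown batch file would run on the gaming rig.
print(" ".join(build_dc_command("dc64.exe", 2560, 1440, 60)))
```

Dropping the call into a .bat file (or a scheduled task triggered at logoff) gives the same one-click revert without opening the driver control panel.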
#38
qubit
Overclocked quantum bit
goodeedidid said:
I suppose reviewers didn't do much reviewing with DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
DVI is pretty standard and hardly obsolete, so this is an obvious fault that should have been picked up straight away. DisplayPort is only required for 4K and above.
Posted on Reply
#39
jabbadap
Correct me if I'm wrong, but does this only affect people who OC monitors connected by dual-link DVI?
Posted on Reply
#40
puma99dk|
This sounds like a BIOS update is needed; dunno if it can be fixed with a new driver instead, which would be awesome if it were possible.
Posted on Reply
#41
the54thvoid
jabbadap said:
Correct me if I'm wrong, but does this only affect people who OC monitors connected by dual-link DVI?
If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
More tea cups and more storms.
Posted on Reply
#42
Valdas
goodeedidid said:
I suppose reviewers didn't do much reviewing with DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
Imho people don't change their monitors nearly as often as they change their graphics cards or other components, so the assumption that if you have a 1080 in your system you won't have an older monitor is incorrect.
I've got two monitors where one is old, so I should just throw it away? Even when it works just fine?
Posted on Reply
#43
bug
TheGuruStud said:
All of the low-latency Korean monitors came with only DVI (multiple inputs increase latency). Tons of them were sold for refresh OCing.

I have no monitors with DP. DP is a new addition to most monitors.
I have a DP capable monitor I bought like 5 years ago (HP z24something). And it didn't cost an arm and a leg.

Maybe this will convince Nvidia it's time to put DVI to greener pastures. It's limited to 2560x1600@60Hz anyway (see https://en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview)
Posted on Reply
#44
RCoon
Gaming Moderator
the54thvoid said:
If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
More tea cups and more storms.
For every mini scandal covered by the media, 99 more slip through the cracks.
Posted on Reply
#45
P4-630
The Way It's Meant to be Played
bug said:
Maybe this will convince Nvidia it's time to put DVI to greener pastures.
The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
I bought a nice one with an IPS display not long ago.
I have it connected through DVI-D.
A TV takes the HDMI port.
Posted on Reply
#46
rtwjunkie
PC Gaming Enthusiast
goodeedidid said:
Or maybe there are modern DVI connections; I haven't seen anything like this myself.
Yes, exactly, there are. My monitor is new, a recent model. It has DVI and DP. I play at 60 Hz and am very comfortable and pleased with that. DVI is just fine. It is merely the victim of elitist snobbery.
Posted on Reply
#47
john_
Funny.

Nvidia cards are having problems with DVI.
Solution: Throw away your monitors that still use DVI. They are old.
Really? Maybe before telling someone to change his monitor you should send him a few hundred dollars.

Anyway, one more problem with an easy workaround. The question is how many .bat workaround files someone can have on his desktop, especially after paying hundreds of dollars for a Pascal card and a high-refresh-rate G-Sync (or not) monitor. One workaround for high power consumption when sitting on the desktop at 144 Hz, one workaround for booting/shutting down the system, more workarounds in the future?

Probably Nvidia will implement something more automatic in their next driver, like automatically changing refresh rates on shutdown and boot. But they should have seen this in time and fixed it. A few extra frames in the latest title do sell cards, but their reputation for driver quality took years to establish, and people will stop forgiving them for little things like this.
Posted on Reply
#48
rtwjunkie
PC Gaming Enthusiast
john_ said:
but their reputation for driver quality took years to establish, and people will stop forgiving them for little things like this.
As far as I am concerned Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until Autumn.

It's why I always wait at least 2 releases now before upgrading drivers. They can't be trusted anymore not to release shit.
Posted on Reply
#49
TheinsanegamerN
TheGuruStud said:
How was this not found by Nvidia in super early testing, let alone by reviewers? Rush, rush, rush, who cares, launch them all no matter the incompetence. I imagine the delay for users to find this was b/c almost no one can get the cards anyway XD
No kidding. Nvidia has really been dropping the ball in the last year or so, between rushed Pascal and driver problems.

There is some term to describe this, IDK what it is called: when a company is in the lead, they stop caring about their products as much as when they had competition. Nvidia is in that stage. And much like how the same complacency let IE get overtaken by Chrome and Firefox, Nvidia may be in trouble soon if they don't shape up.

I will say I regret not getting the M295X in my Alienware 15 when I had the chance. Nvidia's Optimus has been disappointing lately.
rtwjunkie said:
Yes, exactly, there are. My monitor is new, a recent model. It has DVI and DP. I play at 60 Hz and am very comfortable and pleased with that. DVI is just fine. It is merely the victim of elitist snobbery.
Or technological obsolescence. There is nothing wrong with DVI, it's just old. Technology has moved forward since DVI came onto the market in 1999. Heck, dual-link DVI doesn't officially support over 60 Hz at 1600p. Modern monitors are pushing 144 Hz at 1080p, and 144 Hz at 1440p is coming.

Of course it works fine at 60 Hz, but more people are beginning to get on the high-refresh-rate train. In that world, DVI no longer matters.
Posted on Reply
#50
qubit
Overclocked quantum bit
RCoon said:
For every mini scandal covered by the media, 99 more slip through the cracks.
True. :(

rtwjunkie said:
As far as I am concerned Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until Autumn.

It's why I always wait at least 2 releases now before upgrading drivers. They can't be trusted anymore not to release shit.
Yes, they've been a bit flaky, unfortunately. However, I still install straight away with wild abandon :laugh: and so far have somehow managed to avoid any major issues.
Posted on Reply