Monday, June 27th 2016

GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

The second design flaw to hit the GeForce GTX 1080 and GTX 1070 after the fan-revving bug isn't confined to the reference "Founders Edition" cards, but affects all GTX 1080 and GTX 1070 cards. Users of monitors with dual-link DVI connectors are reporting problems booting into Windows with pixel clocks set higher than 330 MHz. You can boot into Windows at default pixel clocks and, once booted, set the refresh rate (and consequently the pixel clock) higher than 330 MHz, and the display works fine; you just can't boot with those settings, and have to revert to defaults each time you shut down or restart your machine.

A user of a custom-design GTX 1070 notes that if their 1440p monitor's refresh rate is set higher than 81 Hz (the highest refresh rate achievable with the pixel clock staying under 330 MHz) at 2560 x 1440, the machine doesn't boot correctly into Windows. The splash screen is replaced with flashing color screens, and nothing beyond. The system BIOS screen appears correctly (because it runs at low resolutions). The problem is also reported on a custom-design GTX 1080, and has been replicated by other users on the GeForce Forums.
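For context, the 330 MHz figure falls straight out of display timing arithmetic: pixel clock = horizontal total × vertical total × refresh rate. A minimal sketch, assuming CVT reduced-blanking totals of 2720 × 1481 for a 2560 x 1440 mode (actual monitor EDIDs may differ slightly):

```python
# Pixel clock = total horizontal pixels x total vertical lines x refresh rate.
# The totals below are assumed CVT reduced-blanking values for 2560x1440;
# a given monitor's EDID may report slightly different blanking.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz for a given total timing and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

H_TOTAL, V_TOTAL = 2720, 1481  # assumed CVT-RB totals for 2560x1440

for hz in (60, 81, 82, 96, 120):
    clk = pixel_clock_mhz(H_TOTAL, V_TOTAL, hz)
    flag = "over 330 MHz" if clk > 330 else "ok"
    print(f"{hz:3d} Hz -> {clk:6.1f} MHz ({flag})")
```

With these assumed timings, 81 Hz lands at roughly 326 MHz while 82 Hz crosses 330 MHz, which matches the 81 Hz ceiling users are reporting.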
Source: Reddit

147 Comments on GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

#126
newconroer
Slizzo, post: 3483829, member: 97498"
I don't think there are any people at 1440P with 120Hz-144Hz going over a dual link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see display port on monitors that support those specs.
There's a whole sub community of users who have QNIX and Catleap monitors from as far back as last decade. All of which are using Dual link DVI. The limitation is not the cable (provided it's a decent quality/gauge).
The issue is the pixel clock. Pixel clock patchers exist, to exceed the limitations in the graphics drivers.

In fact I am confident that there are more 120Hz 1440p monitor owners running DVI than there are running DisplayPort. Mass-marketed 1440p 120Hz+ isn't even a thing, and it's only in the past several years that consumers have taken to 1440p (at 60Hz, let alone 120/144Hz) over 1080p, which is still the industry standard.
Posted on Reply
#127
Slizzo
newconroer, post: 3483861, member: 40251"
There's a whole sub community of users who have QNIX and Catleap monitors from as far back as last decade. All of which are using Dual link DVI. The limitation is not the cable (provided it's a decent quality/gauge).
The issue is the pixel clock. Pixel clock patchers exist, to exceed the limitations in the graphics drivers.

In fact I am confident that there are more 120Hz 1440p monitor owners running DVI than there are running DisplayPort. Mass-marketed 1440p 120Hz+ isn't even a thing, and it's only in the past several years that consumers have taken to 1440p (at 60Hz, let alone 120/144Hz) over 1080p, which is still the industry standard.
You realize you're talking about people running their monitors and connection over spec right? I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.

Also, the issue is not present on any other connection other than DVI.
Posted on Reply
#128
newconroer
Slizzo, post: 3484219, member: 97498"
You realize you're talking about people running their monitors and connection over spec right?
Like everyone's CPUs and GPUs? Not sure what the point is there.

Slizzo, post: 3484219, member: 97498"
I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.
So there's no possibility that the manufacturers intentionally designed them that way, or later realized people were using them in an enthusiast way and, from that point on, marketed them to feed that enthusiast crowd's interest?
That's like saying a car motor was designed and 'oops, look, I guess you can tune it for more horsepower by turning up the turbo boost pressure... oh dear, look what we've done'.


Slizzo, post: 3484219, member: 97498"
Also, the issue is not present on any other connection other than DVI.
Maybe, but that doesn't change the fact that a lot of people have DVI-D monitors running high frequencies.
Posted on Reply
#129
Slizzo
newconroer, post: 3484257, member: 40251"
Like everyone's CPUs and GPUs? Not sure what the point there is


So there's no possibility that the manufacturers intentionally designed them that way, or later realized people were using them in an enthusiast way and, from that point on, marketed them to feed that enthusiast crowd's interest?
That's like saying a car motor was designed and 'oops, look, I guess you can tune it for more horsepower by turning up the turbo boost pressure... oh dear, look what we've done'.




Maybe, but that doesn't change the fact that a lot of people have DVI-D monitors running high frequencies.
My thing is this: nVidia should in no way be expected to "answer" for this, as cases such as this are outside of the design spec. The fact that some are lucky and able to get their monitors to run at a refresh rate that they are not designed for is just that: luck.

Do you complain to Intel or AMD/nVidia that your overclock on your CPU or GPU is making your computer fail to boot?
Posted on Reply
#130
newconroer
Slizzo, post: 3484335, member: 97498"
My thing is this: nVidia should in no way be expected to "answer" for this, as cases such as this are outside of the design spec. The fact that some are lucky and able to get their monitors to run at a refresh rate that they are not designed for is just that: luck.

Do you complain to Intel or AMD/nVidia that your overclock on your CPU or GPU is making your computer fail to boot?
What could they have possibly overlooked, or dare I say done on purpose, that would prevent the GPU from functioning properly at a high pixel clock, when it worked on previous generations?
It does not make sense.

So yes, it looks to be a mistake - and based on what I've informed you of, one that affects a lot of users.
Posted on Reply
#131
Slizzo
newconroer, post: 3484346, member: 40251"
What could they have possibly overlooked, or dare I say done on purpose, that would prevent the GPU from functioning properly at a high pixel clock, when it worked on previous generations?
It does not make sense.

So yes, it looks to be a mistake - and based on what I've informed you of, one that affects a lot of users.
They removed the analog signal from Pascal altogether. It's completely different than what Maxwell supported.
Posted on Reply
#132
newconroer
Slizzo, post: 3484347, member: 97498"
They removed the analog signal from Pascal altogether. It's completely different than what Maxwell supported.
You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.
Posted on Reply
#133
xorbe
Camm, post: 3483287, member: 110377"
I certainly do. Somewhat annoying that my old Titan X ran better than my new 1080.
Wait, what? What's going on with the 1080 compared to Titan X? Should I hang onto my Titan X? What's the deal?
Posted on Reply
#134
RichF
Dippyskoodlez, post: 3479460, member: 13553"
I don't see how i3-equivalent FX CPUs at $200+ is not 'raping consumers' in late 2016.
1) No one is being forced to buy those.

2) The 8320E, which overclocks to 5 GHz with the right board, is $90 at Microcenter most of the time, with $40 off a board, too. People who buy the more expensive 8 core FX processors are paying a premium for not doing their homework — something extremely common in the world of consumerism and hardly enough to make AMD somehow a bad company. Corporations are out there to sell you less than what you're paying for. That's how profit-seeking works. You don't even need a Microcenter to get a low price on an FX, although it is true that the extra cost of cooling and a robust power supply can make it less of a value option.

If I were in the market for a budget gaming system I'd get a 5675C Broadwell and overclock it some. The CPU is more expensive than an FX but its gaming efficiency is quite a bit higher. Skylake doesn't really offer anything since Intel refused to put EDRAM on even one SKU.

3) i3 CPUs don't equal or outperform an overclocked 8 core FX in some workloads. Even though the FX design is old and it's still on 32nm it has 8 integer cores. Many integer-heavy workloads that are highly threaded will be good enough on an FX even today. If your workload is heavily floating point then look elsewhere.
Posted on Reply
#135
Slizzo
newconroer, post: 3484355, member: 40251"
You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.
I'm saying that the port is not unchanged; they made changes to their GPU and now this "issue" comes up. An issue that no parts manufacturer should be expected to support, since it is out of standard. The fact that it worked before and doesn't now is just luck.

Intel does not support overclocking. If your CPU doesn't run outside of its specified voltage and clock speed, you can't really be angry at them about it, can you?
Posted on Reply
#136
newconroer
Slizzo, post: 3484786, member: 97498"
I'm saying that the port is not unchanged; they made changes to their GPU and now this "issue" comes up. An issue that no parts manufacturer should be expected to support, since it is out of standard. The fact that it worked before and doesn't now is just luck.

Intel does not support overclocking. If your CPU doesn't run outside of its specified voltage and clock speed, you can't really be angry at them about it, can you?
How is it out of standard?
Posted on Reply
#137
Dippyskoodlez
RichF, post: 3484478, member: 154826"
1) No one is being forced to buy those.
So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but NVIDIA is forcing GPUs down your throat now?

It's clear your statement was bullshit and now you're playing mental gymnastics to avoid admitting your statement was stupid.
Posted on Reply
#138
Circa_Survivor
Holy crap... I have been dealing with this boot problem for 3 days after I installed my 1080. I tried everything under the sun and had a feeling it could be my monitor, which is a QNIX I overclocked to 96 Hz. Luckily, I bought a new monitor because I was sick of using DVI. I'm so damn relieved.
Posted on Reply
#139
RichF
Dippyskoodlez, post: 3487205, member: 13553"
So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but NVIDIA is forcing GPUs down your throat now?
Let me know where I said anything of the sort.

Dippyskoodlez, post: 3487205, member: 13553"
It's clear your statement was bullshit and now you're playing mental gymnastics to avoid admitting your statement was stupid.
What's clear is that you're using arguments ostensibly from me, but not actually from me, to try to discredit things I actually did say.

It's bad enough to post with an abusive tone without resorting to fabrication.
Posted on Reply
#140
newconroer
Covert_Death, post: 3488686, member: 97721"

Is there a comment missing from this?
Posted on Reply
#141
Covert_Death
newconroer, post: 3488751, member: 40251"
Is there a comment missing from this?
Yes, that was weird; Chrome wouldn't let me type anything, the text box was missing from the HTML....

Anywho, what I was trying to say was:

Slizzo, post: 3483829, member: 97498"
I don't think there are any people at 1440P with 120Hz-144Hz going over a dual link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see display port on monitors that support those specs.
While the number may be small, the whole Korean monitor shebang was a thing because those monitors reached over 120 Hz on dual-link DVI. In fact, I have a Crossover 27" 1440p monitor that I run daily at 120 Hz for gaming over a single dual-link DVI cable.

Currently this issue sucks, as every time I boot, my screen shows four mirrored screens, one in each corner, instead of one display. I can edit in a screenshot later if anyone's interested, but imagine the GPU treating the four quadrants as separate displays in mirror mode.

It's quite strange and I've never had this issue before; now it happens on every reboot, and I have to launch the NVIDIA Control Panel and refresh my screen settings to get back to a single display so I can use it.

I really hope this is fixed soon.
Posted on Reply
#142
Slizzo
I believe 368.69 resolved the issue with pixel clocks. That, or the new 368.81.
Posted on Reply
#143
Covert_Death
For me, .69 made the issue happen on every single boot-up (before, it was intermittent), but .81 resolved it as you suggested!

All is well and back to 120hz
Posted on Reply
#144
newconroer
Yes, it's fixed now in .81... guess it was just a mistake on their part.
Posted on Reply
#145
Camm
xorbe, post: 3484435, member: 102945"
Wait, what? What's going on with the 1080 compared to Titan X? Should I hang onto my Titan X? What's the deal?
There's a power management bug with high bandwidth displays, which causes artifacting. Think 144hz+ panels, OC 4k screens, etc.
Posted on Reply
#146
cheddle
Came here just to say I don't get this issue. EVGA 1080 FE, booting at 2560x1440 @ 100 Hz over DVI on a Korean OC monitor.
Posted on Reply
#147
puma99dk|
cheddle, post: 3510432, member: 151893"
Came here just to say I don't get this issue. EVGA 1080 FE, booting at 2560x1440 @ 100 Hz over DVI on a Korean OC monitor.
Nvidia already fixed this with a driver update some time ago :laugh:
Posted on Reply