
GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

Small number? ...there are a lot of 1440p 120/144 Hz users who are still using monitors with dual-link DVI...

Man, every time I invest in an Nvidia product, right afterwards... something like this comes out... wtf


I wonder what happens when you use a DVI-D to DisplayPort adapter.

I don't think there are any people running 1440p at 120-144 Hz over dual-link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see DisplayPort on monitors that support those specs.
 
There's a whole sub-community of users who have QNIX and Catleap monitors from as far back as last decade, all of them running over dual-link DVI. The limitation is not the cable (provided it's a decent quality/gauge).
The issue is the pixel clock. Pixel clock patchers exist to get past the limits built into the graphics drivers.

In fact I'm confident there are more 120 Hz 1440p monitor owners running DVI than there are running DisplayPort. Mass-marketed 1440p at 120 Hz+ isn't even a thing yet, and it's only in the past several years that consumers have taken to 1440p (at 60 Hz, let alone 120/144 Hz) over 1080p, which is still the industry standard.
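
For anyone who wants the numbers behind that, here's a rough back-of-the-envelope sketch in Python. It assumes CVT reduced-blanking style timings (about 160 pixels of horizontal blanking and roughly 460 µs of vertical blanking), which is only an approximation; the Korean panels usually ship with even tighter custom timings, so treat the figures as ballpark.

# Rough pixel-clock estimate for a video mode, assuming CVT reduced-blanking
# style timings. These are approximations, not the exact timings any
# particular monitor actually uses.

DL_DVI_SPEC_MHZ = 330.0  # dual-link DVI limit per the DVI spec (2 x 165 MHz TMDS)

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank_s=460e-6):
    h_total = h_active + h_blank
    # vertical blanking is a fixed slice of time, so it costs more lines at high refresh
    v_total = v_active / (1.0 - v_blank_s * refresh_hz)
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 96, 120):
    clk = pixel_clock_mhz(2560, 1440, hz)
    verdict = "within" if clk <= DL_DVI_SPEC_MHZ else "OVER"
    print(f"2560x1440 @ {hz:3d} Hz -> ~{clk:.0f} MHz ({verdict} the 330 MHz DL-DVI spec)")

That works out to roughly 240 MHz at 60 Hz, ~390 MHz at 96 Hz and ~500 MHz at 120 Hz, so 60 Hz fits comfortably inside the dual-link spec while 96/120 Hz land well past 330 MHz, which is exactly the territory where the drivers' default pixel clock limit (and therefore the patchers) come into play.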
 

You realize you're talking about people running their monitors and connection over spec, right? I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.

Also, the issue is not present on any connection other than DVI.
 
You realize you're talking about people running their monitors and connection over spec, right?
Like everyone's CPUs and GPUs? Not sure what the point is there.

I own one of those QNIX monitors. I ran it at 100Hz all the time on my 780. I then bought a TRUE 144Hz 1440P screen, and the QNIX is now running as it should, at 60Hz.
So there's no possibility that the manufacturers either designed them that way intentionally, or later realized people were using them in an enthusiast way and, from that point on, marketed them to feed that enthusiast crowd's interest?
That's like saying a car motor was designed and 'oops, look, I guess you can tune it for more horsepower by turning up the turbo boost pressure... oh dear, look what we've done.'


Also, the issue is not present on any connection other than DVI.

Maybe, but that doesn't change the fact that a lot of people have DVI-D monitors running high frequencies.
 

My thing is this: nVidia should in no way be expected to "answer" for this, as cases such as this are outside of the design spec. The fact that some are lucky and able to get their monitors to run at a refresh rate that they are not designed for is just that: luck.

Do you complain to Intel or AMD/nVidia that your overclock on your CPU or GPU is making your computer fail to boot?
 

What could they have possibly overlooked, or dare I say done on purpose, that would stop the GPU from functioning properly at a high pixel clock that worked on previous generations?
It does not make sense.

So yes, it looks to be a mistake - and based on what I've informed you of, one that affects a lot of users.
 

They removed the analog signal from Pascal altogether. It's completely different than what Maxwell supported.
 
You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.
 
I certainly do. Somewhat annoying that my old Titan X ran better than my new 1080.

Wait, what? What's going on with the 1080 compared to Titan X? Should I hang onto my Titan X? What's the deal?
 
I don't see how i3-equivalent FX CPUs at $200+ aren't 'raping consumers' in late 2016.
1) No one is being forced to buy those.

2) The 8320E, which overclocks to 5 GHz with the right board, is $90 at Microcenter most of the time, with $40 off a board, too. People who buy the more expensive 8 core FX processors are paying a premium for not doing their homework — something extremely common in the world of consumerism and hardly enough to make AMD somehow a bad company. Corporations are out there to sell you less than what you're paying for. That's how profit-seeking works. You don't even need a Microcenter to get a low price on an FX, although it is true that the extra cost of cooling and a robust power supply can make it less of a value option.

If I were in the market for a budget gaming system I'd get a 5675C Broadwell and overclock it some. The CPU is more expensive than an FX but its gaming efficiency is quite a bit higher. Skylake doesn't really offer anything since Intel refused to put EDRAM on even one SKU.

3) i3 CPUs don't equal or outperform an overclocked 8 core FX in some workloads. Even though the FX design is old and it's still on 32nm it has 8 integer cores. Many integer-heavy workloads that are highly threaded will be good enough on an FX even today. If your workload is heavily floating point then look elsewhere.
 
You'll have to educate me on where this factors in. DVI-D is not DVI-I. It ignores the analog signal.

I'm saying that the port is not unchanged; they made changes to their GPU and now this "issue" comes up. It's an issue that no parts manufacturer should be expected to support, since it is out of standard. The fact that it worked before and doesn't now is just luck.

Intel does not support overclocking. If your CPU won't run outside of its specified voltage and clock speed, you can't really be angry at them about it, can you?
 

How is it out of standard?
 
1) No one is being forced to buy those.


So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but Nvidia is forcing GPUs down your throat now?

It's clear your statement was bullshit, and now you're playing mental gymnastics to avoid admitting it was stupid.
 
Holy crap... I have been dealing with this boot problem for 3 days after I installed my 1080. I tried everything under the sun and had a feeling it could be my monitor, which is a QNIX I overclocked to 96 Hz. Luckily, I bought a new monitor because I was sick of using DVI. I'm so damn relieved.
 
So it's okay for AMD to make a poor product and sell it to consumers because no one is forced to buy it, but Nvidia is forcing GPUs down your throat now?
Let me know where I said anything of the sort.

It's clear your statement was bullshit, and now you're playing mental gymnastics to avoid admitting it was stupid.
What's clear is that you're using arguments ostensibly from me, but not actually from me, to try to discredit things I actually did say.

It's bad enough to post with an abusive tone without resorting to fabrication.
 

Is there a comment missing from this?
 
Yes, that was weird. Chrome wouldn't let me type anything; the text box was missing from the HTML...

Anywho, what I was trying to say was:

I don't think there are any people running 1440p at 120-144 Hz over dual-link DVI, as that connection does not have enough bandwidth to support what you speak of. This is why you see DisplayPort on monitors that support those specs.

While the number may be small, the whole Korean monitor shebang was a thing because those monitors could reach 120 Hz and beyond on dual-link DVI. In fact I have a Crossover 27" 1440p monitor that I run daily at 120 Hz for gaming over a single dual-link DVI cable.

Currently this issue sucks: every time I boot, my screen shows 4 mirrored copies of the display, one in each corner, instead of a single image. I can edit in a screenshot later if anyone's interested, but imagine the GPU treating the 4 quadrants as separate displays in mirror mode.

It's quite strange and I've never had this issue before. Now it happens on every reboot; I have to launch the Nvidia Control Panel and refresh my screen settings to get back to a single display so I can use it.

I really hope this is fixed soon.
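
Flipping that arithmetic around shows why these boot problems cluster around 96 Hz and up. A hedged sketch (same reduced-blanking approximations as the earlier snippet, so ballpark only) of the highest refresh rate that fits under a given pixel clock cap at 2560x1440:

# Roughly the highest refresh rate that stays under a given pixel clock cap
# at a fixed resolution, using the same CVT-RB-style blanking approximations.

def max_refresh_hz(h_active, v_active, cap_mhz, h_blank=160, v_blank_s=460e-6):
    # pixel_clock = (h_active + h_blank) * v_active * f / (1 - v_blank_s * f)
    # solving pixel_clock = cap for f gives:
    cap_hz = cap_mhz * 1e6
    return cap_hz / ((h_active + h_blank) * v_active + cap_hz * v_blank_s)

for cap in (330, 400, 500):  # the spec limit plus two illustrative raised caps
    print(f"{cap} MHz cap -> about {max_refresh_hz(2560, 1440, cap):.0f} Hz at 2560x1440")

With these assumptions the stock 330 MHz limit tops out around 80 Hz at 2560x1440, a ~400 MHz cap gets you to roughly 96-97 Hz, and you need somewhere near 500 MHz for 120 Hz, so every one of these overclocked-monitor setups depends on the driver tolerating an out-of-spec pixel clock.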
 
I believe 368.69 resolved the issue with pixel clocks. That, or the new 368.81.
 
For me, .69 turned the issue into a problem on every single boot-up (before it was intermittent), but .81 resolved it as you suggested!

All is well and back to 120hz
 
Yes it's fixed now in .81 ...guess it was just a mistake on their part.
 
Wait, what? What's going on with the 1080 compared to Titan X? Should I hang onto my Titan X? What's the deal?

There's a power management bug with high-bandwidth displays, which causes artifacting. Think 144 Hz+ panels, overclocked 4K screens, etc.
 
Came here just to say I don't get this issue. EVGA 1080 FE - boots at 2560x1440 @ 100 Hz over DVI on a Korean OC monitor.
 
Nvidia already fixed this with a driver update some time ago :laugh:
 