
GeForce GTX "Pascal" Faces High DVI Pixel Clock Booting Problems

Most shooters run very well, even on a 290X. Not everyone plays with 8x MSAA like you do.

Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.
 
Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.
I wasn't trying to hurt you or anything ... =P

But the technology is old; there's no point using it any more when DP works well and supports more resolutions and frequencies.

You wouldn't buy a used 1993 GT-R if you could get a 2016 GT-R for the same price.
 
I wasn't trying to hurt you or anything ... =P

But the technology is old; there's no point using it any more when DP works well and supports more resolutions and frequencies.

You wouldn't buy a used 1993 GT-R if you could get a 2016 GT-R for the same price.

The point is, you are making a silly argument when Nvidia puts a DVI connector on its newest GPUs. "Bwah, not important, old stuff, it can be shit."

Like... huh?
 
The point is, you are making a silly argument when Nvidia puts a DVI connector on its newest GPUs. "Bwah, not important, old stuff, it can be shit."

Sooner or later some year will take your DVI port too, like 1999 took your optical drive. ;)

Anyway, enjoy your DVI port, have a nice day....
 
Of course it can work, but you got the point I reckon - nothing wrong with DVI-D if it suits your monitor and rig, and there is no relation to it being 'low or high end'. It's like tires on a car, if they're the right size, it's fine, but you *can* get super exotic tires too and still be stuck to the same engine under the hood. You only get a faster car with a better engine.
That analogy is incorrect; the vehicle can be driven faster on better tires alone.
 
*Note: Nvidia isn't scoring points over here this past year. Fuckup after fuckup, as small as they may be, but it's becoming a real pattern now.

Sadly it doesn't affect them in any way.

Anyway, DVI for the win. I wish they'd keep the connectors.
 
That analogy is incorrect; the vehicle can be driven faster on better tires alone.

Yeah... car analogies never work. Damn it!
 
I don't care about that at all; I haven't seen a DVI connection in years. What sane person would use this old shit anyway with the likes of the 1080? This is the best video card ever.
 
I don't care about that at all; I haven't seen a DVI connection in years. What sane person would use this old shit anyway with the likes of the 1080? This is the best video card ever.

You made a new account just to say that you don't care?
 
This is such an obvious problem, how could it not have been found by NVIDIA? Or by reviewers and users alike who have high refresh rate monitors?

For example, my setup will default to 144 Hz when I install a new graphics card and then the drivers, as that's what my monitor supports.
 
This is such an obvious problem, how could it not have been found by NVIDIA? Or by reviewers and users alike who have high refresh rate monitors?

For example, my setup will default to 144 Hz when I install a new graphics card and then the drivers, as that's what my monitor supports.
I suppose reviewers didn't do much reviewing over DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
 
Or just lower the Hz every time before shutting down, which is what I would do; I'm uncomfortable flashing my expensive card.

You can do it with just one click, with a batch file that runs a simple app like Display Changer: http://12noon.com/?page_id=80

I used it on an image processing project a couple of months ago, to test how the application reacts to different refresh rates, resolutions, and color depths without touching the Windows display settings or even the graphics card driver settings at all.
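If you'd rather not depend on a third-party tool, here is a minimal sketch of the same idea calling the Win32 display API directly. My own assumptions, not anything from this thread: primary monitor only, 60 Hz as the safe fall-back rate, plain C on Windows with user32 linked in.

#include <windows.h>
#include <stdio.h>

/* Sketch: drop the primary display to 60 Hz, keeping the current
   resolution and color depth untouched. */
int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Read the mode the desktop is currently using. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "EnumDisplaySettings failed\n");
        return 1;
    }

    /* Only touch the refresh rate; 60 Hz is an assumed safe value. */
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY) != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "Could not switch to 60 Hz\n");
        return 1;
    }

    printf("Refresh rate set to 60 Hz\n");
    return 0;
}

Build it once (e.g. gcc lowhz.c -o lowhz.exe -luser32, where lowhz is just a made-up name) and the shutdown .bat only has to call that exe, same one-click idea as Display Changer.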
 
I suppose reviewers didn't do much reviewing over DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
DVI is pretty standard and hardly obsolete, so this is an obvious fault that should have been picked up straight away. DisplayPort is only required for 4K and above.
 
Correct me if I'm wrong, but does this only affect people who OC their monitors connected by dual-link DVI?
 
This sounds like a BIOS update is needed; dunno if it can be fixed with a new driver, which would be awesome if it were possible.
 
Correct me if I'm wrong, but does this only affect people who OC their monitors connected by dual-link DVI?

If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
More tea cups and more storms.
 
I suppose reviewers didn't do much reviewing over DVI connections.

I mean, who games on such expensive new tech with old monitors? Or maybe there are modern DVI connections; I haven't seen anything like this myself.
Imho people don't change their monitors nearly as often as they change their graphics cards or other components, so the assumption that if you have a 1080 in your system you won't have an older monitor is incorrect.
I've got two monitors and one of them is old, so I should just throw it away? Even when it works just fine?
 
All of the low-latency Korean monitors came with only DVI (multiple inputs increase latency). Tons of them were sold for refresh OCing.

I have no monitors with DP. DP is a new addition to most monitors.

I have a DP capable monitor I bought like 5 years ago (HP z24something). And it didn't cost an arm and a leg.

Maybe this will convince Nvidia it's time to put DVI to greener pastures. It's limited to 2560x1600@60Hz anyway (see https://en.wikipedia.org/wiki/Digital_Visual_Interface#Technical_overview)
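For anyone wondering where that 2560x1600@60 ceiling comes from, a quick back-of-envelope of my own (assuming CVT reduced-blanking timings, not something taken from the linked article): dual-link DVI is specced for at most a 2 x 165 MHz = 330 MHz pixel clock, and 2560x1600 at 60 Hz already needs most of it:

\[
f_{\text{pixel}} \approx 2720 \times 1646 \times 60\,\text{Hz} \approx 268.6\,\text{MHz} \;(<\,330\,\text{MHz})
\]

Push a 1440p/1600p panel to 96-120 Hz and the required pixel clock lands well past 330 MHz, which is exactly the out-of-spec "high pixel clock" territory this boot problem is about.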
 
If so, it's laughable. Truly. For all the fairy protests from the usual sources, for a problem they probably don't have. For an issue that affects so few who overclock monitors not really designed for it in most cases.
More tea cups and more storms.

For every mini scandal covered by the media, 99 more slip through the cracks.
 
Maybe this will convince Nvidia it's time to put DVI to greener pastures.

The market is still flooded with monitors that offer only HDMI, VGA and DVI-D.
I bought a nice one with an IPS display not long ago.
I have it connected through DVI-D.
A TV takes the HDMI port.
 
Or maybe there are modern DVI connections; I haven't seen anything like this myself.

Yes, exactly, there are. My monitor is new, a recent model. It has DVI and DP. I play at 60 Hz and am very comfortable and pleased with that. DVI is just fine; it is merely the victim of elitist snobbery.
 
Funny.

Nvidia cards are having problems with DVI.
Solution: Throw away your monitors that still use DVI. They are old.
Really? Maybe before telling someone to change his monitor you should send him a few hundred dollars.

Anyway, one more problem where there is an easy workaround. The question is how many .bat workaround files someone can have on his desktop, especially after paying hundreds of dollars for a Pascal card and a high refresh rate G-Sync (or not) monitor. One workaround for high power consumption when sitting on the desktop at 144 Hz, one workaround for booting/shutting down the system, more workarounds in the future?

Probably Nvidia will implement something more automatic in their next driver, like auto-changing refresh rates when shutting down and booting up. But they should have caught this in time and fixed it. A few extra frames in the latest title do sell cards, but their reputation for driver quality took years to establish, and people will stop forgiving them for little things like this.
 
but their reputation for driver quality took years to establish, and people will stop forgiving them for little things like this.

As far as I am concerned, Nvidia already dashed their driver reputation last summer with a string of bad drivers, leaving many to hang onto older drivers until autumn.

It's why I always wait at least 2 releases now before upgrading drivers. They can't be trusted anymore not to release shit.
 
How was this not found by Nvidia in super early testing, let alone by reviewers? Rush, rush, rush, who cares, launch them all no matter the incompetence. I imagine the delay for users to find this was b/c almost no one can get the cards anyway XD
No kidding. Nvidia has really been dropping the ball in the last year or so, between rushed Pascal and driver problems.

There is some term to describe this, IDK what it is called: when a company is in the lead, they stop caring about their products as much as when they had competition. Nvidia is in that stage. And much like how that same stage let IE get overtaken by Chrome and Firefox, Nvidia may be in trouble here soon if they don't shape up.

I will say I regret not getting the M295X in my Alienware 15 when I had the chance. Nvidia's Optimus has been disappointing lately.
Yes, exactly, there are. My monitor is new, a recent model. It has DVI and DP. I play at 60 Hz and am very comfortable and pleased with that. DVI is just fine; it is merely the victim of elitist snobbery.
Or technological obsolescence. There is nothing wrong with DVI; it's just old. Technology has moved forward since DVI came onto the market in 1999. Heck, dual-link DVI doesn't officially support over 60 Hz at 1600p. Modern monitors are pushing 144 Hz at 1080p, and 144 Hz at 1440p is coming.

Of course it works fine at 60 Hz, but more people are beginning to get on the high refresh rate train. In that world, DVI no longer matters.
 