
DVI-D? to VGA, no video

  • Thread starter: Deleted member 24505 (Guest)
I just connected an old monitor to my ASUS GTX 560 Ti TOP as a second screen using a DVI-to-VGA adapter, but I'm getting no video. I think the card has 2x DVI-D + mini-HDMI.

My main monitor is a Dell 24" connected DVI-to-DVI.

What's the problem?

Thanks
 
Does it work as a single monitor (the old one)?
 
What does the Nvidia software say when you try to enable it? Is it even being seen? Do you have the resolution set within its bounds?
 
Only DVI-I and DVI-A carry analog. Your card does have DVI-I connectors, so that isn't the problem. All I can suggest is a process of elimination:
1) make sure the monitor's source input is set to VGA.
2) try the DVI-I to D-Sub adapter on both DVI-I ports on the card.
3) try a different VGA cable.
4) try a different DVI-I to D-Sub adapter.
5) try a different monitor.
If all of the above fails, I suspect the RAMDAC in the GPU is defective.
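To make that compatibility rule concrete, here is a small illustrative Python sketch; the table below is just the DVI spec summarized, not anything read from this particular card:

```python
# Which DVI connector variants carry which signal types (per the DVI spec).
DVI_SIGNALS = {
    "DVI-I": {"digital", "analog"},  # integrated: TMDS plus VGA-style analog pins
    "DVI-D": {"digital"},            # digital only: no analog pins at all
    "DVI-A": {"analog"},             # analog only: essentially VGA on a DVI shell
}

def passive_vga_adapter_works(connector: str) -> bool:
    """A passive DVI-to-VGA adapter only reroutes pins, so the source
    connector must already carry an analog signal for it to work."""
    return "analog" in DVI_SIGNALS[connector]

for connector in DVI_SIGNALS:
    verdict = "works" if passive_vga_adapter_works(connector) else "needs an active converter"
    print(f"{connector} -> VGA via passive adapter: {verdict}")
```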
 
Only DVI-I and DVI-A carry analog. Your card does have DVI-I connectors, so that isn't the problem. [...]

Are you sure my card has DVI-I and not DVI-D, which is digital only?

This is my card-
http://www.asus.com/Graphics_Cards/ENGTX560_Ti_DCII_TOP2DI1GD5/

I am using a DVI-I to VGA adapter, then a normal VGA cable to the monitor.

And yes the second monitor does work.

What does the Nvidia software say when you try to enable it? Is it even being seen? Do you have the resolution set within its bounds?

It does not show up in either the Nvidia Control Panel or Windows display settings.
 
Are you sure the adapter is good? Does the monitor work with the adapter in a single-monitor setup?
 
Are you sure the adapter is good? Does the monitor work with the adapter in a single-monitor setup?

I haven't tried it in a single-monitor setup.

I am kinda thinking my card is DVI-D, which is digital only, so the DVI-I adapter I am using is getting no analog video signal.

Would a DVI-D to VGA adapter work?
 
Are you sure my card has DVI-I and not DVI-D, which is digital only?

This is my card-
http://www.asus.com/Graphics_Cards/ENGTX560_Ti_DCII_TOP2DI1GD5/
Asus Specs said:
DVI Output : Yes x 2 (DVI-I)

And yes the second monitor does work.
Did you swap the two to see if the issue follows the adapter/cable/monitor, or if it stays with the DVI-I port on the GPU?

Would a DVI-D to VGA adapter work?
It would have to be an active DVI-D to VGA converter. A passive DVI-D to VGA adapter doesn't exist, because DVI-D carries no analog signal for the adapter to pass through.
 
Did you swap the two to see if the issue follows the adapter/cable/monitor, or if it stays with the DVI-I port on the GPU?


It would have to be an active DVI-D to VGA converter. A passive DVI-D to VGA adapter doesn't exist, because DVI-D carries no analog signal for the adapter to pass through.

I will try switching them to see if the DVI-cabled Dell monitor works on the right-hand (from the back) DVI connector, and if the second monitor on the DVI-I adapter works on the left DVI connector.

So is this item mis-advertised?
http://www.amazon.co.uk/dp/B003K3060I/?tag=tec053-21

Edit-

Switched them around with Windows running: nothing. Switched them back and rebooted, and it is working now. Funny, I did reboot before, but hey ho, it is working now.

Thanks guys
 
Switched them around with Windows running: nothing. Switched them back and rebooted, and it is working now. Funny, I did reboot before, but hey ho, it is working now.
Never switch them with the PC and Windows running. You risk damaging your monitor. Monitor connections are NOT "hot-swappable". It's not just a cable: the technical handshake between card and monitor only completes after a reboot.
 
The pictures show that it is a DVI-D connector, but DVI-D is not compatible with VGA. It is missing the four pins around the flat blade that would make it DVI-A. It clearly isn't a converter and it clearly isn't DVI-A to VGA, so it's a piece of junk. The reviews confirm that.

Switched them around with Windows running: nothing. Switched them back and rebooted, and it is working now. Funny, I did reboot before, but hey ho, it is working now.
If it malfunctions again, I'd take a good look at all the pins to make sure none are damaged/defective.


As Blín D'ñero said, neither DVI nor VGA is hot-swappable. It is always best practice to plug/unplug when the computer is off.
 
The pictures show that it is a DVI-D connector, but DVI-D is not compatible with VGA. It is missing the four pins around the flat blade that would make it DVI-A. It clearly isn't a converter and it clearly isn't DVI-A to VGA, so it's a piece of junk. The reviews confirm that.

The existence of these is news to me. I was originally going to reply, like the second post, that DVI-D doesn't have analog pins and would be incompatible, but I assumed it was a typo and the OP meant DVI-I. I didn't expect that such an adapter actually existed. I guess there are unscrupulous sellers who prey on people who don't know enough about the specification to understand that it would never work.
 
Same here. I had to look up the DVI-A specs to make sure the picture was DVI-D, and it was. I can't think of any situation where that thing would be useful, because it is nonstandard. It would make a good prank though, because you have to look pretty closely at it to realize it isn't DVI-A.
 
Never switch them with the PC and Windows running. You risk damaging your monitor. Monitor connections are NOT "hot-swappable". It's not just a cable: the technical handshake between card and monitor only completes after a reboot.

You got info to back up this claim? I've plugged and unplugged many monitor cables while machines were running and have never seen any "damage".

HDMI at least has HPD (hot-plug detect), which wouldn't make sense if you couldn't hot-plug it.

HPD
The HPD (Hot-Plug-Detect) feature is a communication mechanism between a source and a sink device that makes the source device aware that it has been connected/disconnected to/from the sink device. When an HDMI cable is inserted between the two devices, the resulting hot-plug detection instantiates a start-up communication sequence. The EDID information stored in the sink device gets read by the source device through the DDC bus, and the source device typically presents itself on the CEC link and requests basic status information from the sink device such as its power status as well as other devices on the HDMI chain.

Source
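As an illustration of that handshake, on a Linux box the kernel exposes the EDID each sink reported over the DDC bus in sysfs; a minimal sketch, assuming the standard DRM sysfs layout (nothing here is from the thread itself):

```python
import glob

# Each DRM connector exposes the raw EDID that the sink reported over DDC.
for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128:
        print(f"{path}: no EDID (nothing connected, or the handshake failed)")
        continue
    # A valid EDID base block starts with the fixed 8-byte header below.
    valid = edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"
    print(f"{path}: {len(edid)} bytes, header {'valid' if valid else 'INVALID'}")
```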
 
I will always shut down before switching monitors from now on. I have always done it with the rig running and never saw any damage, but I won't risk it any more.
 
You got info to back up this claim? I've plugged and unplugged many monitor cables while machines were running and have never seen any "damage".

HDMI at least has HPD (hot-plug detect), which wouldn't make sense if you couldn't hot-plug it. [...]
Sorry, I should have mentioned that I meant DVI = not hot-swappable.
1. It won't work under Windows anyway.
2. Possible damage: I base that on the (ink-on-paper) manuals of monitors and graphics cards in the past. I'm not responsible for that info.
 
Sorry, I should have mentioned that I meant DVI = not hot-swappable.
1. It won't work under Windows anyway.
2. Possible damage: I base that on the (ink-on-paper) manuals of monitors and graphics cards in the past. I'm not responsible for that info.

DVI, HDMI, and DisplayPort all support HPD, which means they are hot-swappable. VGA does not.
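For what it's worth, HPD is easy to observe on a Linux system: each DRM connector has a status node that flips between connected and disconnected. A rough polling sketch (connector names vary per machine):

```python
import time
from pathlib import Path

def read_states():
    # HPD updates each connector's status node: "connected"/"disconnected".
    return {p: p.read_text().strip()
            for p in Path("/sys/class/drm").glob("card*-*/status")}

last = read_states()
print(last)
while True:
    time.sleep(1)
    now = read_states()
    for path, state in now.items():
        if last.get(path) != state:
            print(f"hot-plug event: {path.parent.name} -> {state}")
    last = now
```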
 
I guess it is pretty easy to understand that when hot-swapping a DVI plug one risks not connecting all the pins at the same time. Same with VGA. Probably that is of some concern.

You seem to want the risk to be OK. I am in the better-safe-than-sorry club. ;)
 
The second monitor is a Samsung SyncMaster 923NW, which is native at 1440x900. Neither Windows display properties nor the Nvidia Control Panel shows that resolution; if I force it to 1440x900 using the Nvidia Control Panel, I end up with a 1/2" black line across the top and the picture seems too big (wider than the screen's display area).
 
Try setting the correct resolution, then use the monitor's controls to finish off the adjustment (through the menu on the monitor). Should come out perfect.
 
The second monitor is a Samsung SyncMaster 923NW, which is native at 1440x900. Neither Windows display properties nor the Nvidia Control Panel shows that resolution; if I force it to 1440x900 using the Nvidia Control Panel, I end up with a 1/2" black line across the top and the picture seems too big (wider than the screen's display area).
Are there any over-/underscan options enabled in the control panel? That could do it.
 
Have you enabled EDID?
There should be an option for it somewhere in the Nvidia Control Panel.
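One way to check what the monitor's EDID actually advertises (it should report 1440x900 as the preferred mode for a 923NW) is to parse the first detailed timing descriptor of the base block. A minimal sketch, assuming a Linux sysfs EDID path rather than the Nvidia tool:

```python
import glob

def native_mode(edid: bytes):
    # The first Detailed Timing Descriptor starts at byte 54 of EDID block 0
    # and normally describes the panel's preferred (native) mode.
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) >= 128:
        w, h = native_mode(edid)
        print(f"{path}: preferred mode {w}x{h}")
```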
 
Try setting the correct resolution, then use the monitor's controls to finish off the adjustment (through the menu on the monitor). Should come out perfect.

Tried; it just does not seem to work. I can't move the picture up or down.

Are there any over-/underscan options enabled in the control panel? That could do it.

Not that I can see, and my main monitor's display is fine.

Have you enabled EDID?
There should be an option for it somewhere in the Nvidia Control Panel.

No idea; I looked but can't find it.

I have the monitor set at 1152x864, as that is the closest setting without a big bar at the top.

I will take some pics this afternoon to show you.
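As an aside: on Linux, the usual workaround when a driver refuses to list a native mode is to add it by hand from cvt timings via xrandr. A sketch of that approach, where the output name VGA-1 is only a guess (run plain xrandr first to find yours):

```python
import subprocess

OUTPUT = "VGA-1"  # assumed connector name; check the output of plain `xrandr`

# Ask cvt for standard 1440x900 timings; its second output line is an X modeline.
cvt = subprocess.run(["cvt", "1440", "900"],
                     capture_output=True, text=True, check=True)
modeline = cvt.stdout.splitlines()[1].split()[1:]  # drop the leading "Modeline"
name = modeline[0].strip('"')

# Register the mode, attach it to the output, then switch to it.
subprocess.run(["xrandr", "--newmode", name, *modeline[1:]], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)
```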
 
I just connected an old monitor to my ASUS GTX 560 Ti TOP as a second screen using a DVI-to-VGA adapter, but I'm getting no video. [...]
Does the old monitor have an OSD (on-screen display) for input selection etc., and is it set to VGA in?
 