
No Display With DVI to VGA Adapter

Discussion in 'AMD / ATI' started by MikeTyson, Jun 28, 2010.

  1. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    Basically I have a monitor which accepts both VGA and DVI input, but I only have a VGA cable... and I only have a card with DVI slots (this is a lol situation)

    And I have an adapter which converts DVI to VGA... so I plug this onto the graphics card and put my VGA cable into it, and plug the other end of the VGA cable into the monitor.

But the monitor does not detect any active display and goes straight into power save mode.

    But if I put in an old VGA card and use the VGA cable with the same monitor, it gives a display.

The card is not the issue (X1300), as it also has a DVI output which I have used the VGA adapter on, and again it gives no display on the monitor. And it's not the VGA adapter that's the problem either, because I've used different ones and they all give no display.

    I'm trying to get this sorted so I can put in my spare 7950GT and use that instead... but it only has 2 DVI outputs. (yes I have tried both of them before you ask)

    So what exactly is the problem here? Thanks in advance :)
  2. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,457 (6.26/day)
    Thanks Received:
    3,423
    Location:
    IA, USA
Did you use the same DVI-A to VGA adapter on both cards? Also, make sure the monitor is configured for analog/VGA input. If it is auto-searching, they might not be able to find each other.
    Crunching for Team TPU
  3. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    Yes I used the same adapter on both :)

The monitor's input select has a picture of a VGA slot titled "RGB", and that works fine when I'm using the X1300's VGA output with just the regular cable.

But when I put the adapter on the DVI slot on the X1300, with the other end still plugged into the monitor via VGA, it doesn't find any display and goes to power save :(

It should be configured for analogue or VGA, because it works fine with a normal VGA-to-VGA cable.
  4. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,457 (6.26/day)
    Thanks Received:
    3,423
    Location:
    IA, USA
Have you recently confirmed the 7950GT works at all? Does it display on a DVI monitor? If not, I'd assume it is dead or incompatible with your motherboard (rare, but it can happen). If it doesn't work in another system with different hardware, it's a pretty sure bet the card is done for.
    Crunching for Team TPU
  5. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
No, the card works fine, I literally just removed it from another system to put into this one :)

But it's not the card's problem... because I don't get a display with the DVI from the X1300 either

    I've also tried using my old X850XT Crossfire Edition to test the DVI adapter and that didn't give a display output either

    So what could be the problem? Because I have 3 different adapters and have tried them all and none of them work :(

And I'm not prepared to shell out nearly £30 (or any money, in fact) for a new DVI cable when I have a perfectly fine working VGA cable
  6. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    This could be why (I had no idea all this rubbish existed)

    http://www.hisdigital.com/UserFiles/news/200902261616114965.png

    My adapter has the socket titled DVI-I (Dual Link)

I do, however, have an adapter with some pins missing, which looks like the DVI-A one at the bottom of the picture

    However I've tried this before as well and it gave no luck, so what the helllll bitch!

    A website said something about "your DVI graphics card needs to be capable of analogue output" but surely almost all cards are?

    Is my X850XT or 7950GT capable of analogue output?
  7. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,457 (6.26/day)
    Thanks Received:
    3,423
    Location:
    IA, USA
    So...

    You have three cards:
    -Radeon X1300
    -Radeon X850 XT
    -GeForce 7950 GT


    What works:
    -Radeon X1300: VGA works

What doesn't work:
    -Radeon X1300: DVI-A to VGA adapter
    -Radeon X850 XT: DVI-A to VGA adapter
    -GeForce 7950 GT: DVI-A to VGA adapter

That obviously makes DVI-A to VGA adapters look like the guilty party, but no, you said you used 3 different ones with the same results.


I guess that leaves only one question: how many monitors are you trying this on? Just one? Can you try a different monitor to see if it yields the same results?


    If they are functioning correctly, yes. DVI-I (what is on the cards) includes DVI-D (digital) and DVI-A (analog). The adapter you are using should be a DVI-A to VGA adapter which basically means it is only pulling the analog signal out.


    DVI-D will look better on a digital screen (like LCDs) than VGA. Make sure the cable is DVI-D though. Some monitors won't accept DVI-A/DVI-I.
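
    If it helps to see it spelled out, the whole relationship boils down to a few lines. A throwaway Python sketch (the table and function names are mine, not any real spec or API):

    ```python
    # Which DVI flavour carries which signal, and why a passive DVI-to-VGA
    # adapter only works on ports that actually carry an analog signal.

    DVI_SIGNALS = {
        "DVI-D": {"digital"},            # digital only
        "DVI-A": {"analog"},             # analog only
        "DVI-I": {"digital", "analog"},  # integrated: carries both
    }

    def vga_adapter_can_work(port_type: str) -> bool:
        # A passive adapter just reroutes the analog pins to a VGA socket,
        # so the port itself has to be putting out an analog signal.
        return "analog" in DVI_SIGNALS[port_type]

    for port in DVI_SIGNALS:
        status = "VGA adapter OK" if vga_adapter_can_work(port) else "no analog signal"
        print(f"{port}: {status}")
    ```

    So on paper, all three of your cards (DVI-I ports) should drive the adapter fine.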
    Crunching for Team TPU
  8. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    Think I just accidentally erased my entire last post LOL

    Awwhhh shit and all my ice cream has melted before I got to eat it :(

I see, but VGA cables are all the same, yes? So this means that whilst I'm using a VGA cable, as long as the adapter is correct I should see a result, correct?
  9. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,457 (6.26/day)
    Thanks Received:
    3,423
    Location:
    IA, USA
    If it has VGA on one end, it can be DVI-I (with unused pins) or DVI-A on the other end. I've never met a broken adapter so the chances of any of them working are pretty good.

    Just for clarity, you have (correct me if I'm wrong):

    DVI-I Female (vid card) -> DVI-I or DVI-A Male (adapter) -> VGA Female (other end of adapter) -> VGA Male (cable) -> VGA Male (other end of cable) -> VGA Female (monitor)



I have to assume the monitor really, really doesn't like VGA adapters. I have no idea how or why, but that's really the only conclusion I'm drawing. The cheapest thing to try (and a better picture to boot) is a DVI-D cable. Again, make sure it is DVI-D. Some monitors won't accept DVI-I.


Yes, D-Sub 15 (15 pins) with male or female ends. DVI-A, apart from the arrangement of pins, is exactly the same as VGA--at least in theory. I've encountered situations where DVI-D worked but DVI-A didn't, but as I said, that's rare and the odds of it happening across three cards at once are non-existent. Your hookup chain checks out too; see the sketch below.
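
    To double-check that hookup chain, another quick Python sketch (purely my own modelling of the junctions, nothing official):

    ```python
    # Each junction in the chain above: (male plug, female socket).
    HOPS = [
        ("DVI-I male (adapter)",   "DVI-I female (video card)"),
        ("VGA male (cable end 1)", "VGA female (adapter)"),
        ("VGA male (cable end 2)", "VGA female (monitor)"),
    ]

    # A DVI-I female socket accepts DVI-I and DVI-A male plugs, because
    # it has holes for the full analog + digital pin set.
    ACCEPTS = {
        "DVI-I female": {"DVI-I male", "DVI-A male"},
        "VGA female":   {"VGA male"},
    }

    def mates(plug: str, socket: str) -> bool:
        # Strip the "(where)" note and look the pairing up.
        plug_kind = plug.split(" (")[0]
        socket_kind = socket.split(" (")[0]
        return plug_kind in ACCEPTS.get(socket_kind, set())

    for plug, socket in HOPS:
        print(f"{plug:24} -> {socket:28} {'fits' if mates(plug, socket) else 'NO FIT'}")
    ```

    Every junction fits, which is why I keep coming back to the monitor as the odd one out.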
    Crunching for Team TPU
  10. Necrofire

    Necrofire New Member

    Joined:
    Nov 1, 2007
    Messages:
    586 (0.23/day)
    Thanks Received:
    45
The adapter isn't the problem: if the card didn't output VGA, its DVI port would be missing the holes for the 4 analog pins on the left of that picture (like the DVI-D single link or dual link ones), and thus the adapter wouldn't go in at all.

    Sucks about the ice cream.

    Also, I agree with the assumption that the monitor isn't liking the adapter.
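
    If you want that pin logic spelled out, a tiny Python sketch (the contact names follow the DVI spec, the rest is my own shorthand):

    ```python
    # The analog contacts C1-C4 plus the C5 ground blade only exist on
    # DVI-A and DVI-I connectors; a DVI-D socket has no holes for them,
    # so a passive VGA adapter physically cannot be inserted.

    ANALOG_CONTACTS = {"C1", "C2", "C3", "C4", "C5"}

    PORT_ANALOG = {
        "DVI-D": set(),             # no analog contacts at all
        "DVI-A": ANALOG_CONTACTS,   # analog only
        "DVI-I": ANALOG_CONTACTS,   # analog contacts alongside the digital ones
    }

    def vga_adapter_fits(port: str) -> bool:
        # The adapter needs every analog contact to have somewhere to go.
        return ANALOG_CONTACTS <= PORT_ANALOG[port]

    for port in PORT_ANALOG:
        print(port, "accepts a VGA adapter:", vga_adapter_fits(port))
    ```

    Since your adapters all plug in fine, the cards must be exposing the analog pins.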
  11. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    yeahhh bruv, you KNOW DAT!

    Do you think if I put another adapter on the TV's DVI input (so there is now an adapter on both ends of the VGA cable) that it would stand a better chance of working?
  12. MikeTyson New Member

    Joined:
    Dec 9, 2009
    Messages:
    645 (0.37/day)
    Thanks Received:
    20
    Location:
    South East, United Kingdom
    I seeeeeeeeee! Well I'm currently at home and the PC is at the GYALDEMZZZ YARD!

    I'll be there in a few hours and I'll check back with you mandemz on the flipside and tell you what the monitor is saying :)
  13. Necrofire

    Necrofire New Member

    Joined:
    Nov 1, 2007
    Messages:
    586 (0.23/day)
    Thanks Received:
    45
If the TV takes VGA input via DVI, then maybe, but I doubt it.
    At that point, only the analog signal (VGA) would be going over the wire.
    I really don't know if monitors/TVs can take analog input through DVI.

FordGT90Concept probably knows more than I do about VGA input via DVI.

    EDIT: thanks for the definitions, I was very confused.
  14. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,457 (6.26/day)
    Thanks Received:
    3,423
    Location:
    IA, USA
I highly doubt that will work. Most monitor DVI inputs are DVI-D because they only accept a digital signal. If it is a DVI-I input, you could certainly try it, but my money is on it not working. I think most monitors have something like a DVI-I port internally and then split it into VGA and DVI-D, which cuts costs.
    Crunching for Team TPU
  15. TYCentury New Member

    Joined:
    Jan 31, 2013
    Messages:
    1 (0.00/day)
    Thanks Received:
    0
    Video card power?

I realize that this topic is a skeleton by now, but for anyone who cares: another issue could be that the new video card may be too demanding for your PSU, and could take it over its wattage limit. Anyone who finds this issue with an adapter of this sort should check power demands. It seems rather silly, but it's actually a very common mistake! See the rough sketch below.
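
    As a rough back-of-the-envelope check in Python (the wattage figures here are illustrative guesses, not measured numbers for these exact cards):

    ```python
    # Hypothetical PSU headroom check; every number below is a placeholder.
    psu_watts = 350                            # assumed PSU rating
    rest_of_system = 180                       # assumed draw of CPU, board, drives, fans
    card_draw = {"X1300": 30, "7950GT": 80}    # rough TDP-class guesses

    for card, watts in card_draw.items():
        total = rest_of_system + watts
        headroom = psu_watts - total
        verdict = "OK" if headroom > 0 else "over budget"
        print(f"{card}: ~{total} W total, {headroom} W headroom ({verdict})")
    ```

    If the headroom comes out negative (or very thin), the card can brown out and give exactly this kind of no-display behaviour.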
