Advice needed on which monitor cable to use

Discussion in 'General Hardware' started by keakar, Jan 10, 2008.

  1. keakar

    Joined:
    Mar 27, 2007
    Messages:
    2,376 (0.89/day)
    Thanks Received:
    292
    Location:
    just outside of new orleans
    I need advice on which monitor cable to use.

    I don't know the difference between the analog and DVI cables, so can someone explain it to me in simple terms?

    My monitor only came with the analog cable with the blue connectors on it, so what will give me the best results?

    The motherboard is the Gigabyte P35-DS3L with the Q6600 CPU.

    The monitor I have is this one: SAMSUNG 2220WM Black 22" 5ms DVI Widescreen LCD Monitor 300 cd/m2 1000:1
    http://www.newegg.com/Product/Product.aspx?Item=N82E16824001240

    The video card I'm using is this one: EVGA 512-P3-N802-AR GeForce 8800GT Superclocked 512MB 256-bit GDDR3 PCI Express 2.0
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814130319

    Thanks in advance for sharing your knowledge. :toast:
  2. ntdouglas New Member

    Joined:
    Jan 19, 2007
    Messages:
    1,026 (0.37/day)
    Thanks Received:
    44
    Location:
    Chicago
    They all come with DVI. DVI looks a lot like analog, but it's not. Yours should have come with a DVI cord and a DVI-to-analog (VGA) adapter plug.
  3. Kreij Senior Monkey Moderator Staff Member

    Joined:
    Feb 6, 2007
    Messages:
    13,881 (5.09/day)
    Thanks Received:
    5,615
    Location:
    Cheeseland (Wisconsin, USA)
    If your monitor only has an analog input (15-pin), it won't matter which output you use from your video card. You can convert from DVI to analog with an adapter, but the results will be the same as far as image quality goes.

    If both the video card and monitor have DVI ports, use them. Much better.
    I believe that both of the devices you listed have DVI connections, so get a DVI cable if you do not have one.
  4. keakar

    Joined:
    Mar 27, 2007
    Messages:
    2,376 (0.89/day)
    Thanks Received:
    292
    Location:
    just outside of new orleans
    Yes, my monitor has an analog input (15-pin) and a DVI input. It came with a DVI cable with white plugs and also a VGA cable with blue plugs, which I believe is referred to as the analog cable.

    My video card has DVI connections and a converter to connect the blue VGA (analog) cable.

    I assume DVI is better, but I figured someone here could explain the difference between the two and whether I should upgrade somehow, you know, like maybe getting better cables like the ones that have gold plating or something.
  5. Kreij Senior Monkey Moderator Staff Member

    Joined:
    Feb 6, 2007
    Messages:
    13,881 (5.09/day)
    Thanks Received:
    5,615
    Location:
    Cheeseland (Wisconsin, USA)
    If both the monitor and the video card have DVI, you don't need any fancy, expensive cables, as you are not going to be transmitting more than a couple of feet between the monitor and the computer.

    All the "super connection" sales pitch is crap, as most cables have almost zero loss at lengths of less than 2 meters (about six feet).

    Gold connectors will not tarnish the way copper will, but I have yet to see a tarnished cable connection in an environmentally controlled room (like your home).
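
    To put rough numbers on that, here is a minimal Python sketch; the 0.5 dB/m attenuation figure is an assumption chosen for illustration, not a spec for any real cable:

        # Hypothetical attenuation figure, chosen only to illustrate the scale.
        LOSS_DB_PER_M = 0.5

        def remaining_signal(length_m):
            """Fraction of the original signal amplitude left after length_m of cable."""
            loss_db = LOSS_DB_PER_M * length_m
            return 10 ** (-loss_db / 20)  # convert dB of loss to an amplitude ratio

        for length in (2, 5, 15):
            print(f"{length} m: {remaining_signal(length):.0%} of the amplitude remains")

        # 2 m: 89% of the amplitude remains -> plenty of margin at desktop lengths.
        # 15 m: 42% remains -> long runs are where cable quality starts to matter.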
    keakar says thanks.
  6. Kreij Senior Monkey Moderator Staff Member

    Joined:
    Feb 6, 2007
    Messages:
    13,881 (5.09/day)
    Thanks Received:
    5,615
    Location:
    Cheeseland (Wisconsin, USA)
    Let me share a bit more....

    In audio, the idea of high-quality cables has some merit.
    When sending analog signals over a cable, they are much more susceptible to interference and to noise introduced through poor coupling (connections). This launched the market for Monster Cable and the like.

    With the advent of digital signal transmission, getting a clean signal to the destination is still important, but the transmission protocols and high-speed signal processors can make sure the digital transmissions are accurate and timely without relying on high-end transmission components. A plain copper signal wire will do just fine for everything we have today in the digital world at short transmission lengths.
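
    As a toy illustration of why that works (this is not how DVI's actual TMDS encoding operates, just the basic idea of a threshold decision), a digital receiver only has to tell whether each sample is above or below a threshold, so moderate noise simply disappears on arrival:

        import random

        def send_bits(bits, noise):
            """Transmit bits as 0/1 levels with additive noise, then re-decide."""
            received = [b + random.uniform(-noise, noise) for b in bits]
            return [1 if level > 0.5 else 0 for level in received]  # threshold decision

        random.seed(1)
        original = [random.randint(0, 1) for _ in range(10_000)]
        decoded = send_bits(original, noise=0.4)  # noise stays inside the 0.5 margin
        errors = sum(o != d for o, d in zip(original, decoded))
        print(f"bit errors with noise amplitude 0.4: {errors}")  # prints 0: a perfect copy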

    All in all, you are much better off spending money on high-end components than you are on overpriced cables.

    As always, this is my opinion, and someone out there will swear they got an extra pixel to light up with a $100 cable.
    keakar says thanks.
  7. keakar

    Joined:
    Mar 27, 2007
    Messages:
    2,376 (0.89/day)
    Thanks Received:
    292
    Location:
    just outside of new orleans
    OK, so it's not like cable TV wiring, where you get a better picture out of a better cable; signal loss is not an issue here.

    OK then, what's the difference between an analog signal and a digital signal? Isn't it converted anyway? Is the difference about speed, picture quality, or response times?


    :roll: :roll: :roll: :toast:
    Last edited: Jan 10, 2008
  8. Graogrim New Member

    Joined:
    Jan 1, 2008
    Messages:
    308 (0.13/day)
    Thanks Received:
    31
    Location:
    East Coast US
    Agreed 100%.

    A digital signal basically consists of 0s and 1s, defined at precisely calibrated levels. An analog signal, on the other hand, consists of a varying waveform which is analogous to the values being represented. The tradeoff between the two with regard to signal quality lies in the way they degrade. An analog signal degrades by gradually becoming a worse approximation of the data it's intended to represent, getting "blurrier" and "muddier" until it reaches a state of illegibility.

    A digital signal is more absolute. It will tolerate signal degradation without any loss of quality all the way up to the point where the 0s and 1s start to become indistinguishable, at which point (barring error-correction techniques beyond the scope of this discussion) the information is irretrievable. If you've watched digital satellite TV, you've probably noticed occasional hiccups and bursts of blockiness. Those are the manifestations of digital data loss: a more radical change in picture quality when the signal is bad, but as long as the signal remains above a certain minimum level, it's a perfect reproduction of the original data.

    It's for this reason that you don't need super-expensive digital cables. As long as the 1s and 0s make it across the connection in a state that's "good enough" to be distinguished, you can't improve on the original data, and no amount of gold or platinum or unobtanium in the connection will make it any better.
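
    For anyone who wants to see both failure modes side by side, here is a minimal Python sketch (the uniform noise model and the 0.5 decision threshold are illustrative assumptions, not anything specific to DVI):

        import random

        def analog_hop(levels, noise):
            """Pass analog samples through one noisy stage; the noise just accumulates."""
            return [v + random.uniform(-noise, noise) for v in levels]

        def digital_hop(bits, noise):
            """Pass bits through one noisy stage, then re-decide against a 0.5 threshold."""
            return [1 if b + random.uniform(-noise, noise) > 0.5 else 0 for b in bits]

        random.seed(1)
        signal = [random.randint(0, 1) for _ in range(10_000)]

        analog = [float(b) for b in signal]
        digital = list(signal)
        for _ in range(10):  # ten cascaded stages of mild noise
            analog = analog_hop(analog, 0.1)
            digital = digital_hop(digital, 0.1)

        drift = sum(abs(a - s) for a, s in zip(analog, signal)) / len(signal)
        errors = sum(d != s for d, s in zip(digital, signal))
        print(f"analog: average drift of {drift:.3f} per sample")  # gets muddier every stage
        print(f"digital: {errors} bit errors")                     # still a perfect copy

        # Push the noise past the 0.5 decision margin and digital fails all at once:
        bad = digital_hop(signal, 0.7)
        print(f"digital with 0.7 noise: {sum(d != s for d, s in zip(bad, signal))} errors")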
    Last edited: Jan 10, 2008
    keakar says thanks.
  9. Darknova

    Joined:
    Nov 8, 2006
    Messages:
    5,037 (1.79/day)
    Thanks Received:
    535
    Location:
    Manchester, United Kingdom
    Put simply: DVI is a digital signal, hence you get a clearer, crisper picture. I do think the colour reproduction is slightly lower on DVI, but it's not really that noticeable. I'm just picky :p
    keakar says thanks.
