
What monitor connection do you use?

  • VGA: 1,405 votes (15.6%)
  • DVI: 5,055 votes (56.0%)
  • HDMI: 1,892 votes (21.0%)
  • DisplayPort: 594 votes (6.6%)
  • Other: 78 votes (0.9%)

  • Total voters: 9,024
  • Poll closed.

W1zzard

Administrator
Staff member
Joined: May 14, 2004
Messages: 28,834 (3.74/day)
Processor: Ryzen 7 5700X
Memory: 48 GB
Video Card(s): RTX 4080
Storage: 2x HDD RAID 1, 3x M.2 NVMe
Display(s): 30" 2560x1600 + 19" 1280x1024
Software: Windows 10 64-bit
What monitor connection do you use?
 
For my LG IPS I use DVI-D, and my 40" LCD connects to my 6870 via HDMI.
 
What, no BNC option? :cool:

It's DVI across all the monitors at my house. Any new monitors I buy will have DisplayPort, and I'll use that when possible.
 
DVI-I for my main rig, HDMI for HTPC.
 
DVI for all of mine
 
Got a shiny new Dell U3011 this week (replacing a Dell 2001FP). Using the new monitor with DVI, like the old one.
 
Using whatever monitors I can dig up, and they haven't reached the digital interface yet. :S

Edit: So VGA for me.
 
Got a shiny new Dell U3011 this week (replacing a Dell 2001FP). Using the new monitor with DVI, like the old one.

That's the only interface, IIRC, that works at full res. DP and HDMI are a no-go here.

Using DVI on my 3008WFP, DP on 3x U2412M and 3x P2311H.
 
Got a shiny new Dell U3011 this week (replacing a Dell 2001FP). Using the new monitor with DVI, like the old one.

I'll take that 2001FP off your hands... I've been looking for another.
 
I use DVI on my main rig because the 2405FPW has no HDMI.

I use HDMI on my second rig, which is connected to a 26" Samsung TV/monitor with HD audio included, which is very convenient. The Samsung source select also has connection detection: it only lists inputs that are actually plugged in.

I prefer the HDMI with audio, if only for the fact that I don't have to worry about audio cables...
 
BenQ XL2420T 120 Hz here. DVI FTW :)
 
I use DVI on my main monitor and an active Mini DisplayPort to DVI adapter for my second monitor.
 
I used to use HDMI, but now I use DVI for PC and HDMI for Xbox 360
 
Depending on the monitor and what it has, I'll use the highest connection available. Four monitors still use VGA, two use HDMI.
 
DVI on my desktop, HDMI on my laptop.


HDMI still has too many issues to be the primary connection if you aren't using the audio.
(no 'standby' mode when the computer is idle/asleep, AMD has overscan on by default, DX10 issues with 1080i being preferred over 1080p on some monitors... bleh)
 
Well, my monitor only supports VGA, so I use VGA.
EDIT: @W1zzard, thanks for updating the front-page poll question.
 
Main PC: HDMI on monitor 1 and DVI on monitor 2
Second PC: VGA
PlayStation 3: HDMI
PlayStation 2: SCART

So yeah, good question.
 
DVI on my desktop, HDMI on my laptop.


HDMI still has too many issues to be the primary connection if you aren't using the audio.
(no 'standby' mode when the computer is idle/asleep, AMD has overscan on by default, DX10 issues with 1080i being preferred over 1080p on some monitors... bleh)

So what you're saying is it hasn't become fully standardized?
 
My setup is like this:

PC ---(DVI)---> LG 22" 1680*1050 TFT
 |
 L---(HDMI, TOS)---\
                    Yamaha Amp ---(HDMI)---> BenQ 120Hz 720p Projector
Xbox ---(HDMI)-----/

:D

For whatever reason, my HD5870 won't output audio over HDMI, so I had to use an extra TOSlink connection for that.
 
I use a DVI-D to HDMI cable
 
If a digital connection is available, then it's a no-brainer to use it. I use DVI. Mind you, it's mainly older CRT monitors that don't have a digital connection.

I wonder how long it will be until the analog VGA connection is dropped?
 
Everything except VGA and Other. :laugh:
 
DVI. HDMI always ends up looking messed up because either the monitor or the card enters this weird "I'm a TV!" mode, which for some reason means everything has to look like shit and be blurry. Seriously, time and time again I run into this. It doesn't make any sense.
 
So what you're saying is it hasn't become fully standardized?

It's standardized; it's just that some features were never implemented.


Since it was made for TVs, the 'device can power off the screen when idle' feature was never implemented. Instead, there is a mode (PS3s use it) that powers the TV on and off with the external device (proprietary?).

Same with the 1080i issues I experienced: HDMI assumes that your device is going to force a resolution, like consoles and set-top boxes do. They either don't include (or include poorly) EDID info a lot of the time.
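
For anyone wondering what that 'EDID info' actually is: it's a small binary block the display hands to the source describing its identity and supported modes. A minimal Python sketch of sanity-checking the 128-byte base block (the file path is just an example; where the blob lives varies by OS and driver):

import sys

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(block: bytes) -> str:
    """Return the 3-letter manufacturer ID if the block looks valid."""
    if len(block) != 128:
        raise ValueError("base EDID block must be exactly 128 bytes")
    if block[:8] != EDID_HEADER:
        raise ValueError("missing EDID header magic")
    # Byte 127 is a checksum: all 128 bytes must sum to 0 mod 256.
    if sum(block) % 256 != 0:
        raise ValueError("EDID checksum mismatch")
    # Bytes 8-9 pack three letters, 5 bits each (1 = 'A' ... 26 = 'Z').
    word = (block[8] << 8) | block[9]
    return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

if __name__ == "__main__":
    # Example path on Linux; pass your own dump file as an argument instead.
    path = sys.argv[1] if len(sys.argv) > 1 else "/sys/class/drm/card0-HDMI-A-1/edid"
    with open(path, "rb") as f:
        print(check_edid(f.read(128)))

A device that 'includes EDID poorly' typically fails one of these basic checks, or lists 1080i ahead of 1080p as its preferred mode, which is the kind of thing behind the 1080i issue mentioned above.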


DVI. HDMI always ends up looking messed up because either the monitor or the card enters this weird "I'm a TV!" mode, which for some reason means everything has to look like shit and be blurry. Seriously, time and time again I run into this. It doesn't make any sense.

Just find all the overscan and 'enhancement' settings and turn them off on both ends. It's annoying, but 99% of the time you can just set all that crap to 'off' and it works a lot better.
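
To see why leaving overscan on looks so bad: the driver rescales the whole desktop into a smaller centered rectangle, so no pixel maps 1:1 to the panel and everything gets resampled. A rough sketch (the 10% figure is just an illustrative driver default, not something from the spec):

def underscanned(width: int, height: int, percent: float) -> tuple[int, int]:
    """Resolution the desktop actually occupies after `percent` underscan."""
    scale = 1.0 - percent / 100.0
    return round(width * scale), round(height * scale)

# A 1080p desktop with 10% underscan applied by the driver:
print(underscanned(1920, 1080, 10))  # (1728, 972): scaled, centered, blurry

Turning the setting off (0% underscan on the card, and the 1:1 mode often called 'Just Scan' on the TV side) restores the pixel-for-pixel mapping.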
 