
DVI vs HDMI?

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.07/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
I was curious about a couple of things, after debating the benefits of both display standards with a buddy the other day . . .

How exactly do the two display standards measure up to each other, especially in regards to PC monitors (not so much large-display TVs)?


Now, I know HDMI is pretty much a necessity for displays larger than 1920x1200, but, in regards to PC monitors, the vast majority of our display cards don't support a native HDMI output . . . instead, one must use a DVI=>HDMI adapter - and I was always under the impression that such adapters added a small amount of latency to the actual display, which could result in poor display performance with fast-paced subject matter (i.e. games).

I was also under the impression that such adapters don't allow for the full bandwidth of the display type . . . that is, the HDMI output bandwidth through a DVI=>HDMI adapter would only be the max that the native DVI output is capable of?


I know for sure, though, that DVI cannot support an audio pass-through, where HDMI can (although, in defense of DVI, even though HDMI can support up to 8-channel pass-through, nearly 95% of all products on the market that support HDMI connectivity only support 2-channel I/O, necessitating a separate audio connection for multi-channel support) . . . as well, DVI is not capable of supporting HDCP content (although, if one is using their rig primarily for games, such might not be much of a concern) . . . but are such drawbacks truly a hindrance of DVI for standard PC setups?

So, I guess my real question is . . . which standard should be the preferred method of display connection with a "jack of all trades" PC?
 
HDMI. It's slim, easier to plug in, and it's the way of the future.
DVI cables are stiff and thick, which makes them easy to knock loose if you don't tighten the screws.
Since I only have one monitor in my room, I usually switch it between my PS3 and my PC, and HDMI cables make that so much easier.
 
Used both types and noticed 0 difference, though my v cards have always been DVI out. Maybe they will put 3+ HDMI outputs on a single v card later :P.
 
Used both types and noticed 0 difference

I agree. I've switched between both, and don't notice a difference. If you have the ability to use HDMI, then use it.
I'm using DVI though, only because I think my card doesn't like the port converter. My picture can take on a reddish hue
when running HDMI :shadedshu which is why I'm using DVI lol
 
DVI is pretty much the same as HDMI; the only real difference is that DVI doesn't carry the audio signal like HDMI does. ... the connector is different, of course...

You won't see any difference between the two.
 
DVI is pretty much the same as HDMI; the only real difference is that DVI doesn't carry the audio signal like HDMI does. ... the connector is different, of course...

You won't see any difference between the two.


DVI and HDMI both carry a digital signal, the ONLY difference being that HDMI will also carry digital audio with it.

There is no real-world limit to the resolution each can carry either... at least not right now, anyway.

Personally I hate HDMI, what it does, and what it stands for, but that's marketing for you.
 
DVI is a digital signal with backwards compatibility for VGA.

HDMI is based on DVI and in fact 100% compatible - they just stripped the VGA compatibility and put audio in its place.
There is no difference at all between HDMI and DVI as far as video signals are concerned. no quality changes, no resolution changes, NOTHING.

All ATI video cards since the 2K series support 5.1 audio (even the onboard cards) and the 3K series and up support 7.1 - that's built in.

Nvidia cards need an SPDIF cable for input, so they can do up to 7.1, but it all depends on the sound card you use.

Personally I like HDMI. I don't give a damn about Sony and its constantly changing specs, but the cable is thinner, longer, lighter and cheaper than DVI - and it doesn't require annoying screws to hold it in place. With my ATI cards, I have DVI, VGA, and HDMI with audio from the one port with the adaptors that came with the card.
 
I use vga on my HTPC. For some reason both DVI and HDMI make the desktop look terrible and hard to read.

But in essence they are both exactly the same signal.
 
I use vga on my HTPC. For some reason both DVI and HDMI make the desktop look terrible and hard to read.

But in essence they are both exactly the same signal.

Some Samsung screens have a "game mode" in the HDMI options which makes things look nasty. Mine had it on by default (2494HS). Should be fine over DVI, unless of course it defaults to the wrong res/"movie mode" etc.
 
DVI and HDMI both carry a digital signal, the ONLY difference being that HDMI will also carry digital audio with it.

There is no real-world limit to the resolution each can carry either... at least not right now, anyway.

Personally I hate HDMI, what it does, and what it stands for, but that's marketing for you.

Actually there are some current limitations on resolution, I believe; 25xx is the max as I understand it (certainly for DVI). Refresh rates between DVI and HDMI differ as well, of course (DVI @ 60Hz, HDMI at 75Hz, though I think you can do something about that on HDMI). But I agree, visually there is no difference (to my old eyes anyway). My monitor has only D-Sub or HDMI input; fortunately my vid card has HDMI output, so obviously I go for that.
 
Some Samsung screens have a "game mode" in the HDMI options which makes things look nasty. Mine had it on by default (2494HS). Should be fine over DVI, unless of course it defaults to the wrong res/"movie mode" etc.

I have a P2270HD and it has lots of options that I can't access, and I think it's because I'm using DVI. I also have that "game mode" option but can't access it... I was wondering if the HDMI adapter was the same thing as having an actual HDMI cable.
 
I use vga on my HTPC. For some reason both DVI and HDMI make the desktop look terrible and hard to read.

But in essence they are both exactly the same signal.


Into an HDTV rather than a monitor?

I found the same, actually; putting the sharpness up on the TV helped.
 
DVI dual-link (7.92 Gbit/s) is faster than HDMI 1.2a and older (3.96 Gbit/s). As of HDMI 1.3, HDMI supports the same maximum resolution (2560×1600) as dual-link DVI, but DVI can't handle as high a refresh rate at that resolution (60Hz versus 75Hz for HDMI). HDMI 1.4 supports 4096x2160 at 24 Hz.
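
Since we keep throwing bandwidth numbers around, here's a rough back-of-the-envelope check of which modes fit which link. Just a sketch: the ~10% blanking overhead approximates reduced-blanking timings, and the link figures are the usual 8b/10b-decoded payload rates (165 MHz per TMDS link for single-link DVI / HDMI 1.2, 340 MHz for HDMI 1.3).

[CODE]
# Rough check of which display modes fit which link, assuming 24 bpp RGB,
# ~10% blanking overhead (roughly CVT reduced-blanking timings), and the
# 8b/10b-decoded payload rate of each link type.

LINK_GBPS = {
    "single-link DVI / HDMI 1.2": 1 * 165e6 * 24 / 1e9,  # ~3.96 Gbit/s
    "dual-link DVI":              2 * 165e6 * 24 / 1e9,  # ~7.92 Gbit/s
    "HDMI 1.3":                   1 * 340e6 * 24 / 1e9,  # ~8.16 Gbit/s
}

def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.10):
    """Approximate data rate a mode needs, including blanking overhead."""
    return width * height * blanking * refresh_hz * bpp / 1e9

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (2560, 1600, 75)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINK_GBPS.items() if cap >= need]
    print(f"{w}x{h}@{hz}Hz needs ~{need:.2f} Gbit/s -> fits: {', '.join(fits)}")
[/CODE]

Which lines up with the figures above: 2560×1600 @ 60Hz (~6.5 Gbit/s) fits dual-link DVI and HDMI 1.3 but not a single link, and 75Hz at that resolution (~8.1 Gbit/s) only squeaks into HDMI 1.3.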

Most DVI devices do support HDCP but HDCP is a bad thing so you don't want support for that anyway.

VGA -> DVI -> HDMI -> DisplayPort
VGA is included in DVI, DVI is the basis for HDMI, and DisplayPort can fall back to DVI/HDMI-style signalling through dual-mode adapters.

DVI and DisplayPort are royalty free--HDMI is not. Like HDMI, DisplayPort supports content protection (its DPCP scheme uses 128-bit AES encryption of the signal). In effect, both suck and I stick to DVI with non-HDCP devices or VGA.


Jack of all trades PC? VGA. Splitters, switches, etc. won't cost you a thousand smacks on VGA but they will on DVI and HDMI. Pretty much every monitor works fine with quality VGA cables. You also won't run the risk of HDCP hatin' you.
 
HDMI = DVI + audio
 
Most DVI devices do support HDCP but HDCP is a bad thing so you don't want support for that anyway.

wtf is this bullshit?
HDCP is just copy protection, mostly used for Blu-ray.
 
HDCP guarantees only one display per image (disc or graphics card). It does not benefit the consumer at all (it actually wastes electricity encrypting/decrypting the signal); it only benefits (barely) Hollywood--the result of extensive lobbying. It completely ignores fair use laws and is a PITA to everyone that doesn't profit from it. HDCP is also not free, requiring a license.

HDCP applies to the data on an interface, not data on the medium. Inherently, Blu-ray discs do not have HDCP, but Blu-ray players may be HDCP compliant.


For the most part, HDCP has not caught on (yay!).
 
HDCP guarantees only one display per image (disc or graphics card). It does not benefit the consumer at all (it actually wastes electricity encrypting/decrypting the signal); it only benefits (barely) Hollywood--the result of extensive lobbying. It completely ignores fair use laws and is a PITA to everyone that doesn't profit from it. HDCP is also not free, requiring a license.

HDCP applies to the data on an interface, not data on the medium. Inherently, Blu-ray discs do not have HDCP, but Blu-ray players may be HDCP compliant.


For the most part, HDCP has not caught on (yay!).

It has indeed caught on; you can't watch Blu-ray without it - unless you use dubiously legal software.

HDCP is copy protection and nothing more. Macrovision, anyone? It's not like the idea behind it is new.
 
Now, I know HDMI is pretty much a necessity for displays larger than 1920x1200, but, in regards to PC monitors, the vast majority of our display cards don't support a native HDMI output . . . instead, one must use a DVI=>HDMI adapter - and I was always under the impression that such adapters added a small amount of latency to the actual display, which could result in poor display performance with fast-paced subject matter (i.e. games).

DVI and HDMI are the same display standard, just with a different connector type, so the adapters do no conversion and hence add no latency.
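
To put that "no conversion" point another way: a passive DVI=>HDMI adapter is just a re-wiring of the same TMDS pairs onto a different plug; nothing gets re-encoded. A rough sketch of the mapping (signal names only - the exact pin numbers are in the specs):

[CODE]
# Signal-level view of a passive DVI-D -> HDMI adapter: the same differential
# pairs and DDC lines are routed straight through, so there is nothing that
# could add latency or limit bandwidth.
DVI_TO_HDMI = {
    "TMDS Data2+/-":   "TMDS Data2+/-",
    "TMDS Data1+/-":   "TMDS Data1+/-",
    "TMDS Data0+/-":   "TMDS Data0+/-",
    "TMDS Clock+/-":   "TMDS Clock+/-",
    "DDC SCL/SDA":     "DDC SCL/SDA",      # EDID reads and the HDCP handshake
    "Hot Plug Detect": "Hot Plug Detect",
    "+5V":             "+5V",
}
# HDMI-only lines (CEC, and the utility pin added in HDMI 1.4) have no DVI
# counterpart and are simply left unconnected by the adapter.
[/CODE]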

Also, HDMI is not needed for anything larger than 1920x1200; both will do up to 2560×1600 @ 60Hz.

I was also under the impression that such adapters don't allow for the full bandwidth of the display type . . . that is, the HDMI output bandwidth through a DVI=>HDMI adapter would only be the max that the native DVI output is capable of?

Both have pretty much the same bandwidth, so this isn't true at all. The adapters do not limit the resolution/bandwidth.


I know for sure, though, that DVI cannot support an audio pass-through, where HDMI can (although, in defense of DVI, even though HDMI can support up to 8-channel pass-through, nearly 95% of all products on the market that support HDMI connectivity only support 2-channel I/O, necessitating a separate audio connection for multi-channel support) . . . as well, DVI is not capable of supporting HDCP content (although, if one is using their rig primarily for games, such might not be much of a concern) . . . but are such drawbacks truly a hindrance of DVI for standard PC setups?

DVI is definitely capable of carrying HDCP-protected content.

And audio pass-through of 7.1 is definitely common today.

So, I guess my real question is . . . which standard should be the preferred method of display connection with a "jack of all trades" PC?

I think the way it is done today is generally best: DVI as the standard, with the option for HDMI (with audio) through an adapter.
 
I have a female DVI to male HDMI cable (monitor has only VGA and HDMI - no DVI). It looks perfect - as it should, since both are digital.
 
The only reason I dislike HDMI ATM is compatibility...
 
HDMI was developed to bring stricter enforcement of HDCP compliance. Along with it came the ability to transmit both audio and video signals. If your only use for HDMI is an LCD monitor that doesn't have speakers, HDMI offers no real tangible benefit.

In its simplest form, HDMI lets you hook your HDTV up to your receiver, which is then hooked up to your DVD player. Or you can hook your HDTV straight up to your DVD player with just one cable - all without separate audio and video cables (for example).

HDMI for the PC user offers no tangible benefit (that a user can actually use) that cannot be obtained through a traditional DVI setup, which is why there has been no mass migration from DVI to HDMI the way there was from VGA to DVI. Sure, you can buy it because it's "something new", but that's all you will ever get out of it: "something new". i.e. a different plug type, a new look, etc...
 
It should be mentioned that most DVI cables that aren't cheap freebies packaged with something are huge in diameter and rather rigid. HDMI cables bend more easily, but they don't screw in like DVI, so they are more likely to come loose.
 
The standard change from DVI to DisplayPort will be far more widely accepted in the PC world, in my opinion, because of the daisy-chaining ability and because DVI has too many versions. DisplayPort will simplify some of the randomness found in the DVI standards (DVI-I, DVI-D, DVI-A, dual-link or single-link and all that mess).

DVI to HDMI on a PC monitor has nothing to gain.

P.S. While you can find this to be true most of the time, my DVI cable is thinner and easier to bend than the HDMI cable I have.
 
HDMI has too many different revisions floating around; some support this, others don't... and the cables all look the same :shadedshu

DVI ftw
 
HDMI has only two connector types: full-size HDMI and mini HDMI. The version numbers are just updates to the standard.
 