I was curious about a couple of things after debating the benefits of both display standards with a buddy the other day. How exactly do the two measure up against each other, especially with regard to PC monitors (not so much large-screen TVs)?

I was under the impression that HDMI is pretty much a necessity for displays above 1920x1200, but, when it comes to PC monitors, the vast majority of our display cards don't have a native HDMI output; instead, you have to use a DVI-to-HDMI adapter. I'd always assumed such adapters add a small amount of latency to the actual display, which could result in poor performance with fast-paced subject matter (i.e., games). I was also under the impression that such adapters can't pass the full bandwidth of the display type; that is, the HDMI output bandwidth through a DVI-to-HDMI adapter would only be whatever the native DVI output is capable of. Is that right?

I do know that DVI can't carry an audio pass-through, whereas HDMI can (although, in DVI's defense, even though HDMI supports up to 8-channel pass-through, nearly 95% of the HDMI-equipped products on the market only handle 2-channel I/O, which necessitates a separate audio connection for multi-channel sound anyway). I was also under the impression that DVI can't handle HDCP-protected content (although, if one is using their rig primarily for games, that might not be much of a concern). But are such drawbacks truly a hindrance of DVI for standard PC setups?

So, I guess my real question is: which standard should be the preferred method of display connection for a "jack of all trades" PC?
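On the bandwidth question, a rough way to reason about it is pixel clock: single-link DVI (and early HDMI) tops out around a 165 MHz pixel clock, so whether an adapter "limits" you depends on whether your resolution fits under that ceiling. Here's a back-of-the-envelope sketch; the blanking-interval figures are my own rough assumptions (reduced-blanking-style overhead), not exact CVT timings:

```python
# Rough single-link pixel-clock check. The 165 MHz figure is the
# single-link DVI TMDS limit; h_blank/v_blank below are assumed
# reduced-blanking overheads, good enough for a ballpark estimate.

SINGLE_LINK_MAX_MHZ = 165.0

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
    """Approximate pixel clock (MHz) for a mode, with assumed blanking."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h, 60)
    fits = clk <= SINGLE_LINK_MAX_MHZ
    print(f"{w}x{h}@60Hz ~ {clk:.0f} MHz pixel clock -> single link OK: {fits}")
```

By this estimate, 1920x1200@60 (about 154 MHz) squeaks under the single-link limit, while 2560x1600@60 (about 267 MHz) does not and needs dual-link DVI or a newer HDMI revision, which is consistent with the idea that a passive DVI-to-HDMI adapter only ever gives you what the DVI port itself can carry.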