To my knowledge HD quality is just a certain resolution
Not exactly. HD discs (whether Blu-ray or HD DVD) use a much shorter-wavelength laser to burn (and read) the pits and lands (highs and lows - 1s and 0s) in the media. Physics says that T = 1/F (period = 1/frequency), and time is a constant (1 second = 1 second), so if the wavelength (the physical length of one cycle) is shorter, you can jam more cycles into the space that passes under the laser during 1 second (as determined by the rotation of the disc and other factors). In other words, the shorter the wavelength, the higher the frequency - and the higher the frequency, the more information is contained in each second of play.
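As a rough illustration of that wavelength-to-frequency relationship, here is a small Python sketch. The ~650 nm (standard DVD red laser) and ~405 nm (Blu-ray / HD DVD blue-violet laser) figures are approximate, and the squared "density gain" at the end is only a back-of-the-envelope estimate, not a spec.

    # Illustrative only: ~650 nm red laser for standard DVD,
    # ~405 nm blue-violet laser for Blu-ray / HD DVD (approximate).
    C = 3.0e8  # speed of light, m/s

    def frequency(wavelength_m):
        # f = c / wavelength: a shorter wavelength gives a higher frequency
        return C / wavelength_m

    def period(wavelength_m):
        # T = 1 / f: the time taken by one cycle
        return 1.0 / frequency(wavelength_m)

    dvd, bluray = 650e-9, 405e-9
    print(f"DVD laser:     f = {frequency(dvd):.3e} Hz, T = {period(dvd):.3e} s")
    print(f"Blu-ray laser: f = {frequency(bluray):.3e} Hz, T = {period(bluray):.3e} s")

    # A smaller spot means smaller pits; areal density improves very roughly
    # with the square of the wavelength ratio (about 2.6x here).
    print(f"Rough density gain: {(dvd / bluray) ** 2:.1f}x")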
In terms of video, if you have more information available in each second, you can draw more horizontal lines of resolution (if the hardware is able) - and the number of lines that can be drawn is what determines the picture resolution. Standard TV in the US, NTSC, is capable of 525 lines. However, the signals transmitted over the air or through cable only draw about 250 lines at about 30 FPS. Standard DVD uses 480 lines, which is why DVD looks better than broadcast or cable TV.
HDTV, on the other hand, goes up to 1080 lines, and that is possible because of the "density" of the data stored on the disc - a shorter wavelength means greater density. But it also means you must have a player capable of reading that shorter wavelength, and a display with electronics fast enough to process and draw that many lines.
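To put rough numbers on that, here is a quick sketch comparing the line counts mentioned above. The ~250-line broadcast figure is taken from the explanation as-is, and the 30 FPS frame rate is a nominal assumption.

    # Line counts as quoted above (~250 for broadcast/cable, 480 for DVD,
    # 1080 for HDTV) at a nominal 30 frames per second.
    FPS = 30

    formats = {
        "broadcast/cable": 250,
        "standard DVD": 480,
        "HDTV (1080)": 1080,
    }

    baseline = formats["broadcast/cable"] * FPS
    for name, lines in formats.items():
        per_second = lines * FPS
        print(f"{name:15s}: {lines:4d} lines/frame -> {per_second:5d} lines/s "
              f"({per_second / baseline:.1f}x broadcast)")

The absolute numbers matter less than the ratios: the HD stream has to carry several times as many lines in the same one second of play.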
So, if your standard DVD player is not capable of reading the shorter wavelength, the disc might as well be blank, and your HDTV will be showing a polar bear eating marshmallows during a snowstorm.
if your friends tv/monitor can support one of these resolutions while using a DVI-input in theory it should be HD quality
Sorry, this is not correct either.
HD does not equal digital. DVI is just a physical interface - a connector for "digital" signals. A digital signal may or may not be HD, just as a disc may or may not be HD; that depends on the original content, how it was created, and the media on which it is stored. For example, the vast majority of video cards, and virtually all but the budget LCD monitors, support a digital interface - DVI. But only a very small handful of cards support HD.
Don't confuse digital with analog, or digital with high definition. Technically speaking, you can certainly have analog HD - but since the RF bandwidth and storage requirements for analog HD signals are HUGE, digital is used instead, with the added advantages of less noise, less distortion, and easier delivery.
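To give a feel for "HUGE", here is a back-of-the-envelope sketch of what a raw, uncompressed 1080-line picture would need. The 1920x1080 raster, 24 bits per pixel, and 30 FPS are my own assumed round numbers for illustration, not figures from any particular standard.

    # Assumed round numbers for illustration: full 1920x1080 raster,
    # 24 bits per pixel (8 per colour channel), 30 frames per second.
    width, height = 1920, 1080
    bits_per_pixel = 24
    fps = 30

    bits_per_second = width * height * bits_per_pixel * fps
    gigabits = bits_per_second / 1e9
    gigabytes_per_hour = bits_per_second / 8 / 1e9 * 3600
    print(f"Uncompressed 1080 video: ~{gigabits:.1f} Gbit/s, "
          f"roughly {gigabytes_per_hour:.0f} GB per hour")

That works out to well over a gigabit per second and hundreds of gigabytes per hour, which is why HD is delivered digitally and compressed rather than carried raw.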
Clear as mud, right?