DVI vs HDMI

Slurpak

Silver Level Poster
I'm sure this question has been asked many times before, but after a quick search, I couldn't find anything in the forums.

I will be getting my new PCS computer soon and will be using it mainly for gaming. The monitor I have ordered with it is the 'IIYAMA E2472HD 24" LED WIDESCREEN'.
This monitor comes with a DVI cable; however, I ordered an HDMI cable with it anyway.

Has anyone got any recommendations on which cable I should use to connect my monitor?
I have heard that they are both exactly the same, but I was just wondering if anyone had noticed any differences.
I would assume the HDMI would be better, but I figured it would be best to ask the experts, and people who have experience of both.

Thanks in advance,

Slurpak
 

Slurpak

Silver Level Poster
Thanks for the quick reply, vanthus. Do you mean that the monitor is more likely to last longer if I use the DVI cable?
 

vanthus

Member Resting in Peace
No, there have been lots of reports that monitors don't work all that well with HDMI. You can always try both & judge for yourself; it might just depend on the make of monitor. I know I prefer DVI, as I've tried both, & the DVI connection is a lot more stable.
 

pengipete

Rising Star
You'll get exactly the same picture quality with DVI or HDMI. HDMI can carry sound along the same lead; DVI needs a separate lead for sound. DVI plugs are held in place with threaded bolts - HDMI just slots into place. DVI plugs should never be unplugged with the PC and monitor switched on; HDMI can be.

In short, it makes no difference - unless it makes a difference to you. I prefer to use DVI and leave the monitor's HDMI input free for other devices that don't have DVI - like my laptop and camera.
 

Slurpak

Silver Level Poster
Thanks for all your help.
I have to say, I do love these forums. I had a question, I asked it, and half an hour later I have some good information on the pros and cons of DVI and HDMI.
I'll probably use the DVI and save the HDMI slot for other things, like pengipete says. I wanted an HDMI cable for my Xbox anyway, so I'll just use it for that.
Thanks again, and +rep to both of you for your help and speed :)
 

pengipete

Rising Star
I leave an HDMI lead attached to the monitor - it makes it much easier when I want to connect another device (especially as the monitor is wall-mounted).
 

vanthus

Member Resting in Peace
pengipete said:
You'll get exactly the same picture quality with DVI or HDMI. HDMI can carry sound along the same lead; DVI needs a separate lead for sound. DVI plugs are held in place with threaded bolts - HDMI just slots into place. DVI plugs should never be unplugged with the PC and monitor switched on; HDMI can be.

In short, it makes no difference - unless it makes a difference to you. I prefer to use DVI and leave the monitor's HDMI input free for other devices that don't have DVI - like my laptop and camera.
Don't mean to be offensive, but I'm talking from experience, not just quoting technical theory. I have tried both HDMI & DVI with my monitor & other monitors, & DVI is far superior. This subject has appeared before on this forum, & that was the consensus of opinion as far as I remember.
 

pengipete

Rising Star
You're wrong to assume that I was "just quoting technical theory" - in fact, I ran a whole series of tests with three monitors and a full HD TV, using every possible combination of DVI, HDMI and VGA, just a few days ago to help someone on this forum. Regardless, saying there's a visible difference between DVI-D and HDMI is the same as buying two identical copies of the same DVD and saying that one of them has better picture quality - it's impossible. It's not as if the monitor has two decoders and two completely separate signal paths between the ports and the screen. Don't forget that DVI-A and DVI-I are capable of carrying an analog-only signal - it's possible that people are not actually comparing like with like.

Regarding DVI audio, it was tried on some older cards - purely to allow the use of DVI-to-HDMI adaptors - but it didn't catch on and has been dropped. In fact, both Nvidia and ATI - along with a number of other major players - announced in December 2010 that they would be phasing out DVI and VGA entirely by 2013 - 2015 at the latest, in favour of HDMI. DVI audio was a flop - much like the notion of sending USB data over DVI - a few cards supported it but no monitors did.

Strictly speaking, DVI was designed to be hot-pluggable, but I wouldn't recommend it - despite the "technical theory".
 

vanthus

Member Resting in Peace
I'm impressed by your knowledge & enthusiasm, pengipete, & meant no disrespect, but I have definitely had problems with HDMI on my monitor & others.
 

pengipete

Rising Star
What sort of "problems"?

As an amateur photographer, I did a fair bit of homework before choosing a monitor, and one thing came up repeatedly across the various websites and forums - pretty well all HD TVs and many HD-resolution monitors have default settings that are designed to make an image "pop": overly sharpened, too much contrast and too much colour vibrance. Basically, they are designed to be viewed from several feet away, or in a shop, and to look better than reality. That makes them completely unsuitable - at least at the default settings - for PCs, which are usually viewed from no more than a couple of feet - often less - and have static images or parts of the screen that are already high in contrast. Even so, many of those sets store different settings for each connection - in fact, our TV has different settings for each individual HDMI port as well as the PC ports, SCART etc. Unless you have everything set to the same values on each connection, you will not be comparing like with like, because you will be judging the settings - not the picture coming from the PC.

An individual TV or monitor may have poor picture quality due to processing of the image after it has been received, but beyond detecting the signal source, the monitor's decoder cannot distinguish between uncompressed data coming from an HDMI lead and uncompressed data coming from a DVI-D lead - they are identical, with the possible exceptions of HDCP and audio, neither of which has any bearing on the content of the data stream. In fact, the signals are so alike that you only need a "dumb" adaptor to use one with the other - they are completely electrically compatible. If you think about it, the fact that you can use an adaptor that merely alters the connector - there's no conversion or circuitry required - is proof that the signals are identical.

As with all digital transfers, any errors that can't be corrected on the fly - FEC-style - cause a break-up. It is impossible for a clean digital error to reduce the quality of a displayed image or to distort it - no snow, no lines, no ghosts and no wobbles - those are entirely analog problems. Any drop-outs or corruption are either blocked or corrected by the decoder, so the worst that can happen is a "glitch" or a frozen image. Even then, it would be an actual fault rather than a difference in quality.
 

pengipete

Rising Star
I don't know, because you haven't said what the "problem" is. If you are simply unhappy with the quality of the image on your monitor, you could have a bad monitor, or it could simply be poorly set up - but don't forget that your graphics card's own settings and the colour profiles in Windows (and some apps like Photoshop) will all affect the end result. Ultimately, for most people, you're dealing with a subjective element, but there are completely objective ways of calibrating a monitor - they just cost money. If you want a cheap and cheerful alternative, Google "test cards" - the BBC one for HD is out there somewhere and it's perfect - then go through making sure that colours are natural, blacks are black, there's a good range of greys etc. (there are loads of sites dedicated to setting up HD TVs - most of that also applies to monitors).
 