I've never been able to figure out exactly what the difference is between VGA and DVI. I've plugged my 22" Acer into both and I've never seen any difference.
From what I understand, DVI is supposed to handle pixels on a flatscreen better.
CDN_Merlin wrote: DVI uses digital signals, so everything is sharper.
VGA is the digital signal converted to analog and back to digital. Your quality degrades.
Actually, no: it's an analog signal from the video card's DAC. All the adapter (not a converter) does is route the analog signal to the correct pins for a VGA cable, unless the port on the video card is DVI-D, which has no analog pins at all, or DVI-A, which is all analog.
For all practical purposes, a digital signal means your monitor doesn't need to sync to the signal.
On true DVI-D you never need to auto-adjust or otherwise tweak the screen (the adjust functions are usually disabled in digital modes, since they're meaningless there).
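If it helps, here's a tiny sketch of the adapter point above (the names and structure are mine, not anything from this thread): which DVI connector variants expose the analog pins that a passive DVI-to-VGA adapter relies on.

```python
# Hypothetical illustration: a passive DVI-to-VGA adapter only reroutes the
# card's analog (DAC) output to the VGA pinout; it cannot convert a digital
# signal, so a DVI-D port can't drive it.
from enum import Enum


class DviConnector(Enum):
    DVI_I = "DVI-I"  # integrated: digital + analog pins
    DVI_D = "DVI-D"  # digital only, no analog pins
    DVI_A = "DVI-A"  # analog only


def passive_vga_adapter_works(connector: DviConnector) -> bool:
    """Return True if the connector carries the analog pins the adapter needs."""
    return connector in (DviConnector.DVI_I, DviConnector.DVI_A)


if __name__ == "__main__":
    for c in DviConnector:
        status = "works" if passive_vga_adapter_works(c) else "does not work"
        print(f"{c.value}: passive VGA adapter {status}")
```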
I'm thinking about switching up my video cabling at home. Right now I can't use DVI because the KVM switch I'm running everything through is VGA (plus, I can't go over 60Hz through the KVM). DVI is definitely noticeably sharper; it's what I have at work.