DVI and VGA?
Posted: Sun May 18, 2008 3:01 pm
by []V[]essenjah
I've never been able to figure out what the difference is between VGA and DVI exactly. I've plugged my 22" Acer into both and I've never seen any difference.
From what I understand, DVI is supposed to handle pixels on a flatscreen better.
Posted: Sun May 18, 2008 4:50 pm
by CDN_Merlin
DVI uses digital signals so everything is sharper.
VGA is the digital signal converted to analog and back to digital. Your quality degrades.
Posted: Sun May 18, 2008 5:11 pm
by heftig
You used a DVI cable too?
Re:
Posted: Sun May 18, 2008 6:02 pm
by fliptw
CDN_Merlin wrote:DVI uses digital signals so everything is sharper.
VGA is the digital signal converted to analog and back to digital. Your quality degrades.
Actually no, it's an analog signal from the video card's DAC. All the adapter (not a converter) does is send the analog signal to the correct pins for a VGA cable, unless the port on the video card is DVI-D, which has no analog pins, or is DVI-A, which is all analog.
For all practical purposes, a digital signal means your monitor doesn't need to sync the signal.
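The adapter-vs-converter point above can be sketched as a tiny lookup (illustrative only; the connector names are the real DVI variants, but the function itself is a hypothetical helper, not anything from an actual driver or API). A passive DVI-to-VGA adapter only re-routes pins, so it works only when the port actually carries the analog signal:

```python
# Which DVI connector variants carry the analog (VGA-compatible) pins.
# A passive DVI->VGA adapter just re-routes these pins; it converts nothing.
ANALOG_PINS = {
    "DVI-A": True,   # analog only
    "DVI-I": True,   # integrated: digital + analog on the same connector
    "DVI-D": False,  # digital only, so a passive VGA adapter cannot work
}

def passive_vga_adapter_works(connector: str) -> bool:
    """True if the DVI port carries the analog signal a passive adapter needs."""
    return ANALOG_PINS.get(connector, False)

print(passive_vga_adapter_works("DVI-I"))  # True
print(passive_vga_adapter_works("DVI-D"))  # False
```

So a card with a DVI-I port can feed a VGA monitor through a cheap passive adapter, while a DVI-D-only port would need an actual active converter box.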
Posted: Sun May 18, 2008 8:17 pm
by Krom
On true DVI-D you never need to auto or otherwise adjust the screen (usually adjust functionality is disabled in digital modes since it is meaningless).
Posted: Sun May 18, 2008 9:35 pm
by Grendel
Posted: Mon May 19, 2008 7:33 am
by JMEaT
One cool thing I like about DVI is that I never have to adjust the V and H position to get a perfect fit on the screen.
Posted: Mon May 19, 2008 8:07 am
by Foil
I'm thinking about switching up my video cabling at home. Right now I can't use DVI, because the KVM switch I'm running everything through is VGA-only (plus, I can't go over 60 Hz through the KVM). DVI is definitely noticeably sharper; it's what I have at work.