
VGA or DVI .. why is it better?

2007-02-24 07:27:01 · 4 answers · asked by WDDA c 1 in Computers & Internet > Hardware > Monitors

4 answers

The facts

VGA, which stands for Video Graphics Array, has been the standard method for connecting monitors to Macs since the late 1990s and to PCs for some years before that. The newer DVI format, which stands for Digital Visual Interface, is the preferred connection method for most non-budget LCD displays, a number of higher-end CRTs, and even a small number of expensive video displays. VGA is being phased out in favour of DVI, although we’re still in the transition period where support for both is common. Some monitors have sockets for both formats, and the Mac mini comes with a DVI-to-VGA adaptor dongle to help users with older displays connect to its built-in DVI port.

Much of the DVI specification, and how it differs from VGA, isn’t widely understood. VGA is based on analog signals, at least in terms of what travels across the cable itself, although the display signal always starts its life as digital. With VGA it is converted to analog as it leaves the graphics card so it can be sent down the cable, then (in the case of an LCD) converted back to digital at the other end, inside the monitor. One drawback of this is the inability to address the monitor’s picture elements - the individual pixels - precisely. This isn’t a problem with CRT displays, as they simply draw out the image as it arrives by passing an electron beam across the phosphor-coated inside surface of the glass screen. It becomes an issue when working with LCD displays, as their pixels are fixed, individual entities which need to have the incoming image pixels matched up to them to produce a clean result.
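To make that round trip concrete, here is a tiny Python sketch (my own illustration, not part of the original answer): it quantises an 8-bit pixel value to a voltage the way a VGA output stage would, adds some assumed cable noise, and re-digitises it at the LCD end. The 0.7 V full-scale level is the nominal VGA video range; the noise figure is purely an assumption for demonstration.

```python
import random

# Toy model of the VGA path: digital pixel -> analog voltage -> digital pixel.
FULL_SCALE_V = 0.7   # nominal VGA video signal range
NOISE_V = 0.01       # assumed noise picked up along the cable (illustrative only)

def dac(pixel: int) -> float:
    """Convert an 8-bit pixel value (0-255) into an analog voltage."""
    return pixel / 255 * FULL_SCALE_V

def adc(voltage: float) -> int:
    """Re-digitise the received voltage back into an 8-bit pixel value."""
    level = round(voltage / FULL_SCALE_V * 255)
    return max(0, min(255, level))

def vga_round_trip(pixel: int) -> int:
    """Send one pixel across a noisy analog link and recover it."""
    voltage = dac(pixel) + random.uniform(-NOISE_V, NOISE_V)
    return adc(voltage)

if __name__ == "__main__":
    for value in (0, 64, 128, 200, 255):
        print(value, "->", vga_round_trip(value))
```

Run it a few times and the recovered values wander by a few levels; a DVI-D link keeps the value digital end to end, so there is nothing equivalent to lose.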

The problem this causes with VGA connections is the need to apply phase and clock corrections to synchronise the signal with the display’s physical pixel array. When these are out of adjustment the result can be banding and other artefacts, for reasons not dissimilar to what causes moiré patterning in printed work: the two patterned arrays (the virtual pixel grid of the display signal and the physical pixel grid of the LCD itself) don’t line up, leading to regular bands of blurred detail and similar problems. Unlike the potentially imprecise way VGA works with LCD displays, a DVI signal is mapped directly onto the monitor’s physical pixels.
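A toy sketch of why the mismatch causes blur (again, just an illustration, not something from the answer above): sample a one-pixel-wide black/white stripe pattern onto a grid whose sampling points are offset by half a pixel, and each physical pixel ends up averaging two source pixels, smearing the crisp edges exactly as a misadjusted clock/phase setting does.

```python
# Illustrative only: a stripe pattern sampled onto an LCD grid whose sampling
# positions are offset by a fraction of a pixel (a "phase" error).

def source_pixel(x: int) -> int:
    """Ideal incoming signal: alternating black (0) and white (255) columns."""
    return 255 if x % 2 else 0

def sample_with_phase(width: int, phase: float) -> list:
    """Sample the stripes at positions x + phase, interpolating linearly
    between the two source pixels each physical pixel straddles."""
    samples = []
    for x in range(width):
        left, right = source_pixel(x), source_pixel(x + 1)
        samples.append(left * (1 - phase) + right * phase)
    return samples

if __name__ == "__main__":
    print("phase 0.0:", sample_with_phase(8, 0.0))  # stripes preserved
    print("phase 0.5:", sample_with_phase(8, 0.5))  # everything collapses to grey
```

With a DVI-D connection each pixel value is delivered digitally to exactly one physical pixel, so there is no phase or clock to adjust in the first place.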

The display signal can also degrade over improperly shielded VGA cables, leading to poorer-quality results at the display end. This isn’t a problem in the same way with DVI, although there is still a maximum recommended cable length (5 metres) beyond which the signal may not be transmitted properly.

In short, DVI provides a cleaner, faster, more precise display with hardware that supports it properly. This is all very well, but there is the small matter of the different DVI formats to contend with. Fortunately, these are designed to complement each other rather than being competing standards; the differences essentially come down to handling displays with digital or analog inputs.

There are three main kinds of DVI connection available: DVI-D, DVI-A, and DVI-I. DVI-D is the ‘true’ digital format, and the normal one used for connecting digital LCD monitors to DVI graphics cards. DVI-A is the analog version of DVI; it is used to carry a signal from a DVI graphics card to an analog display, for example a CRT monitor. A digital-to-analog conversion is applied here, but this still gives higher-quality results than a standard VGA cable. Finally, there’s DVI-I, the integrated format which caters for both digital and analog equipment. It doesn’t convert a pure DVI-D output into something a DVI-A device can use, but it will act as either a DVI-D or a DVI-A cable according to your needs. The real benefit is that you don’t have to use two different cables if you work with both digital and analog displays.
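To keep the combinations straight, here is a small summary sketch (my own, not part of the original answer) that encodes which signal type each connector variant carries and checks whether a given cable can join a source and a display:

```python
# Which signal types each cable variant carries, per the description above.
# VGA is included for comparison. Note that a cable never converts between
# signal types; DVI-I simply passes through whichever kind both ends share.
CARRIES = {
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
    "DVI-I": {"digital", "analog"},
    "VGA":   {"analog"},
}

def can_connect(cable: str, source_out: str, display_in: str) -> bool:
    """True if the cable carries the signal type both ends expect."""
    return source_out == display_in and source_out in CARRIES[cable]

if __name__ == "__main__":
    print(can_connect("DVI-I", "digital", "digital"))  # True: acts as a DVI-D cable
    print(can_connect("DVI-I", "analog", "analog"))    # True: acts as a DVI-A cable
    print(can_connect("DVI-I", "digital", "analog"))   # False: no conversion happens
```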

Having said that, you’re pretty unlikely to run into trouble if you stick with the cables that come with whatever new display you buy. Where you need to take care is when buying cables separately. Although you’re unlikely to need anything other than a DVI-D cable when dealing with LCD screens, it could still be wise to consider buying a DVI-I cable to cover both eventualities. However, do be aware that some manufacturers have been making the blade part of the pin set in DVI-I cables larger than normal; this could affect how it fits in some equipment.

There is, unfortunately, still more to the DVI connection format. DVI-D and DVI-I connections can come in single-link and dual-link forms. Fortunately, again, this isn’t as complex as it sounds at first. The dual-link varieties carry twice the bandwidth of the single-link kind, delivering pixel data more rapidly. This has a practical benefit with larger monitors, in that it allows a higher maximum resolution to be transmitted. In compatibility terms, the physical difference between single link and dual link is purely a matter of the absence or presence of extra pins in the middle section of the plug; any DVI-D- or DVI-I-equipped graphics card can accept either kind of cable.
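As a rough back-of-the-envelope check on why dual link matters for big screens, the sketch below estimates the pixel clock a few common resolutions need at 60 Hz and compares it against the 165 MHz single-link limit from the DVI specification. The 12% blanking overhead is a simplifying assumption loosely modelled on reduced-blanking timings, not an exact figure.

```python
# Estimate the pixel clock needed for a resolution at a given refresh rate,
# then check whether it fits within single-link DVI's 165 MHz limit.
SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.12  # assumed extra pixels for blanking intervals

def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int = 60) -> float:
    """Very rough pixel-clock estimate in MHz, including assumed blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

if __name__ == "__main__":
    for w, h in [(1280, 1024), (1920, 1200), (2560, 1600)]:
        clock = required_pixel_clock_mhz(w, h)
        verdict = "single link is enough" if clock <= SINGLE_LINK_MHZ else "dual link needed"
        print(f"{w}x{h} @ 60 Hz: ~{clock:.0f} MHz -> {verdict}")
```

By this estimate, typical desktop resolutions sit comfortably within a single link, while a 2560x1600 panel (the 30-inch class displays of the time) needs dual link, which matches how those monitors were sold.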

2007-02-24 07:40:53 · answer #1 · answered by Paultech 7 · 0 0

Monitors connected over DVI use the better technology and will give you a better picture. VGA is the older technology, and eventually won't be as widely supported.

2007-02-24 15:32:35 · answer #2 · answered by Santa Barbara 7 · 0 0

dvi

2007-02-24 15:38:30 · answer #3 · answered by basketball freak 2 · 0 0

you won't notice much difference imo ..

2007-02-24 15:29:58 · answer #4 · answered by Anonymous · 0 1
