Hi Robby,
Although there isn't a dramatic performance difference between VGA and DVI, DVI (Digital Visual Interface) is the newer standard and produces a clearer image. This is due to the digital protocol, in which the desired illumination of pixels is transmitted as binary data. When the display is driven at its native resolution, it reads each number and applies that brightness to the appropriate pixel. In this way, each pixel in the output buffer of the source device corresponds directly to one pixel in the display device, whereas with an analog signal the appearance of each pixel may be affected by its adjacent pixels as well as by electrical noise and other forms of analog distortion.
Previous standards such as analog VGA were designed for CRT-based devices and thus did not use discrete-time display addressing. As the analog source transmits each horizontal line of the image, it varies its output voltage to represent the desired brightness. In a CRT, this voltage varies the intensity of the scanning beam as it moves across the screen.
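To make that digital-versus-analog contrast concrete, here is a minimal Python sketch that "transmits" one scanline of 8-bit brightness values both ways. The function names, the 0-1 V encoding, and the noise level are illustrative assumptions, not part of either specification:

```python
import random

def send_digital(scanline):
    # DVI-style: each pixel's brightness travels as binary data,
    # so the receiver recovers every value exactly.
    return list(scanline)

def send_analog(scanline, noise=2.5):
    # VGA-style: brightness is encoded as a voltage that picks up
    # electrical noise in transit, so the display sees perturbed values.
    received = []
    for level in scanline:
        volts = level / 255.0                      # encode 0..255 as 0..1 V
        volts += random.gauss(0.0, noise / 255.0)  # additive analog noise
        received.append(round(min(max(volts, 0.0), 1.0) * 255))
    return received

random.seed(1)
line = [random.randrange(256) for _ in range(8)]   # one short scanline
print("sent:   ", line)
print("digital:", send_digital(line))              # identical to the source
print("analog: ", send_analog(line))               # close, but not exact
```

The digital path hands back the framebuffer values untouched, while the analog path comes back slightly wrong, which is the practical difference described above.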
With a single DVI link, the largest resolution possible at 60 Hz is 2.75 megapixels (including the blanking interval). For practical purposes, this allows a maximum screen resolution at 60 Hz of 1915 x 1436 pixels (standard 1.33 ratio), 1854 x 1483 pixels (1.25 ratio) or 2098 x 1311 pixels (widescreen 1.6 ratio). The DVI connector therefore has provision for a second link, containing another set of red, green, and blue twisted pairs. When more bandwidth is required than a single link can provide, the second link is enabled, and alternate pixels may be transmitted on each, allowing resolutions up to 4 megapixels at 60 Hz. The DVI specification mandates a fixed single-link maximum pixel clock frequency of 165 MHz: all display modes that require less than this must use single-link mode, and all those that require more must switch to dual-link mode. When both links are in use, the pixel rate on each may exceed 165 MHz. The second link can also be used when more than 24 bits per pixel are required, in which case it carries the least significant bits. The data pairs carry binary data at ten times the pixel clock reference frequency, for a maximum of 1.65 Gbit/s on each of the three data pairs in a single DVI link.
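A short Python sketch of the arithmetic in that paragraph. The helper names and the example blanking totals are assumptions; only the 165 MHz cap, the 10x data rate, and the three pairs come from the text:

```python
# Figures from the paragraph above: a fixed 165 MHz single-link pixel
# clock cap, data pairs running at 10x the pixel clock, and three pairs
# (red, green, blue) per link.
SINGLE_LINK_MAX_HZ = 165_000_000
BITS_PER_PIXEL_CLOCK = 10
DATA_PAIRS = 3

def pixel_clock(h_total, v_total, refresh_hz):
    # h_total and v_total must include the blanking intervals, not just
    # the visible pixels; real totals come from a timing standard such
    # as CVT, so the values used below are an assumed example.
    return h_total * v_total * refresh_hz

def link_mode(clock_hz):
    # The spec's rule: modes at or below 165 MHz use a single link,
    # anything above must switch to dual link.
    return "single link" if clock_hz <= SINGLE_LINK_MAX_HZ else "dual link"

# 1920x1200 visible at 60 Hz with assumed reduced-blanking totals:
clock = pixel_clock(h_total=2080, v_total=1235, refresh_hz=60)
print(f"pixel clock: {clock / 1e6:.1f} MHz -> {link_mode(clock)}")

per_pair = SINGLE_LINK_MAX_HZ * BITS_PER_PIXEL_CLOCK
print(f"per data pair: {per_pair / 1e9:.2f} Gbit/s, "
      f"link total: {per_pair * DATA_PAIRS / 1e9:.2f} Gbit/s")
```

With those totals the example mode needs about 154 MHz, so it squeezes into a single link; anything that works out above 165 MHz would have to go dual link.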
Like modern analog VGA connectors, the DVI connector includes pins for the display data channel. DDC2 (a newer version of DDC) allows the graphics adapter to read the monitor's extended display identification data (EDID). If a display supports both analog and digital signals on one input, each signal type can host a distinct EDID. If both receivers are active, the analog EDID is used.
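The EDID the adapter reads over DDC2 is a 128-byte block with a fixed signature and a checksum. Here is a sketch of decoding its first fields; parse_edid_header and the fake block are illustrative, and a real EDID arrives over the DDC wires rather than being built by hand:

```python
def parse_edid_header(edid: bytes) -> str:
    # The EDID base block is 128 bytes; bytes 0-7 hold a fixed signature
    # and the whole block must sum to 0 modulo 256.
    if len(edid) != 128:
        raise ValueError("EDID base block must be 128 bytes")
    if edid[:8] != bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]):
        raise ValueError("missing EDID signature")
    if sum(edid) % 256 != 0:
        raise ValueError("bad EDID checksum")
    # Bytes 8-9 pack the manufacturer's three-letter PNP ID,
    # 5 bits per letter, where 1 means 'A'.
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(ord("A") - 1 + ((word >> s) & 0x1F)) for s in (10, 5, 0))

# A fake block for demonstration only:
edid = bytearray(128)
edid[:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
edid[8], edid[9] = 0x04, 0x43              # packs the PNP ID "ABC"
edid[127] = (256 - sum(edid) % 256) % 256  # fix up the checksum byte
print(parse_edid_header(bytes(edid)))      # -> ABC
```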
The maximum length of DVI cables is not included in the specification, since it depends on bandwidth requirements (the resolution of the image being transmitted). In general, cable lengths up to 15 feet will work for displays at resolutions of 1920 x 1200. Cable lengths up to 50 feet can be used with displays at resolutions up to 1280 x 1024. For longer distances, the use of a DVI booster is recommended to avoid video degradation. DVI boosters may or may not use an external power supply.
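Those guidelines reduce to a tiny lookup; a sketch, where the function is hypothetical and encodes only the two rule-of-thumb figures just given:

```python
def dvi_cable_advice(length_ft, h_pixels, v_pixels):
    # Encodes only the rules of thumb from the paragraph above; real
    # reach also depends on cable quality and the transmitter hardware.
    pixels = h_pixels * v_pixels
    if length_ft <= 15 and pixels <= 1920 * 1200:
        return "a plain cable should work"
    if length_ft <= 50 and pixels <= 1280 * 1024:
        return "a plain cable should work"
    return "consider a DVI booster"

print(dvi_cable_advice(10, 1920, 1200))  # -> a plain cable should work
print(dvi_cable_advice(40, 1280, 1024))  # -> a plain cable should work
print(dvi_cable_advice(40, 1920, 1200))  # -> consider a DVI booster
```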
Good luck, and happy New Year, Robby!
2007-12-31 02:08:25 · answer #1 · answered by brianthesnail123 7 · 1⤊ 0⤋
DVI gives you a better picture, and it's the newest...
2007-12-31 11:59:40 · answer #2 · answered by Anonymous · 0⤊ 0⤋
DVI is probably better.
However, whether that is significant on a graphics card with both a 15-pin D plug and a DVI connector is arguable.
I suspect that a pure DVI-to-DVI connection would be better, but if either device has both then I doubt you will see any improvement, because if it has both then there is some additional switching or processing going on, or the design is a hybrid rather than one optimised for DVI.
Mind you, I doubt you will see any significant improvement switching from the 15-pin D plug to DVI.
2007-12-31 10:03:30 · answer #3 · answered by Mark J 7 · 0⤊ 0⤋
DVI is the better connection. DVI (Digital Visual Interface) is newer than VGA (Video Graphics Array).
VGA is the old standard, while DVI is newer, but not the newest. VGA is analog, while DVI is digital.
Here are the Wikipedia explanations of what they are, and what type of picture you can expect from them:
http://en.wikipedia.org/wiki/VGA
http://en.wikipedia.org/wiki/DVI
Of course, high definition is by far better than DVI too.
I would go with DVI any day over VGA. I would go with a high-definition monitor if they were not so expensive, so I will wait for prices to go down and then enjoy high def.
Good luck and enjoy your new monitor.
2007-12-31 10:03:26 · answer #4 · answered by Serenity 7 · 0⤊ 0⤋
DVI is the newer of the two...
2007-12-31 10:01:24 · answer #5 · answered by Zippy 2 · 0⤊ 0⤋