2007-04-28 07:10:38 · 2 answers · asked by Anonymous in Computers & Internet > Hardware > Monitors

2 answers

I have a dual-output video card and a dual-input monitor. Each of them has one digital connector (DVI) and one analog connector (15-pin D-sub/VGA).

Video: ATI Radeon 9250 (Sapphire)
Monitor: Samsung SyncMaster 731b

2007-04-28 07:20:49 · answer #1 · answered by Christian Soldier 7 · 1 0

A blue VGA connector is analog.

A white DVI connector can be analog, digital, or both, depending on the variant:

DVI-D is digital.
DVI-A is analog.
DVI-I is both.
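
To sum up that mapping, here is a tiny illustrative sketch; the dictionary and helper names are mine, not anything from the DVI spec:

```python
# Illustrative sketch: which signal types each DVI connector variant carries.
# The mapping just restates the list above in code form.

DVI_SIGNALS = {
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
    "DVI-I": {"digital", "analog"},
}

def can_link(source: str, sink: str, signal: str) -> bool:
    """Return True if both connector variants can carry the given signal type."""
    return signal in DVI_SIGNALS[source] and signal in DVI_SIGNALS[sink]

# A DVI-I card output can feed a DVI-D monitor input digitally...
print(can_link("DVI-I", "DVI-D", "digital"))   # True
# ...but a DVI-D port has no analog pins at all.
print(can_link("DVI-D", "DVI-A", "analog"))    # False
```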

A lot of video cards use the DVI-I connector and ship with an adapter that takes the analog signals from the DVI-A half of the port and puts them out on a 15-pin D-sub (VGA) connector. The signals are exactly the same; only the connector is different.

Cables should be DVI-A or DVI-D. A DVI-I cable is a bad idea.


Since the signal starts out digital, gets converted to analog by the video card, travels over the cable to the monitor, and (in an LCD) gets converted back to digital, it is better to keep it digital end to end if your monitor has a DVI-D input.
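
If you want to confirm which input the system is actually driving, one quick way on Linux is to list the active outputs with xrandr. This is only a rough sketch: it assumes xrandr is installed and that the driver names its outputs something like DVI-D-0 or VGA-0, which varies between drivers.

```python
# Rough sketch: report whether the connected monitor is on a DVI or VGA output.
# Assumes a Linux desktop with xrandr in PATH; output names depend on the driver.
import subprocess

out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout

for line in out.splitlines():
    if " connected " in line:                # only outputs with a monitor attached
        name = line.split()[0]               # e.g. "DVI-D-0" or "VGA-0"
        kind = "digital (DVI)" if name.startswith("DVI") else "analog (VGA or other)"
        print(f"{name}: {kind}")
```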


HDMI is the consumer version of DVI-D. The DVI-D connector has room for two digital links where HDMI has only one, and HDMI adds HDCP (High-bandwidth Digital Content Protection) to stop you from copying movies by tapping into the cable and reading the signals.
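
To get a feel for why the second link exists: a single DVI (or early HDMI) link tops out at a 165 MHz pixel clock, so higher resolutions need dual-link. Here is a back-of-the-envelope sketch; the 20% blanking overhead is my assumption, since the real figure comes from the exact video timing used.

```python
# Rough sketch: does a mode fit in one 165 MHz TMDS link, or does it need dual-link DVI?
SINGLE_LINK_MHZ = 165.0       # single-link DVI pixel clock limit
BLANKING_OVERHEAD = 1.20      # assumed ~20% on top of active pixels; real timings vary

def needs_dual_link(width: int, height: int, refresh_hz: float) -> bool:
    pixel_clock_mhz = width * height * refresh_hz * BLANKING_OVERHEAD / 1e6
    return pixel_clock_mhz > SINGLE_LINK_MHZ

print(needs_dual_link(1280, 1024, 60))   # False: well within a single link
print(needs_dual_link(2560, 1600, 60))   # True: classic dual-link territory
```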

2007-04-28 15:47:43 · answer #2 · answered by Simon T 6 · 0 0
