
If I want to use the DVI connection from my LCD monitor to my video card, do I have to change anything in my video card setup? How do I get the most out of it?

What is the difference between DVI and VGA in terms of quality? How do I get the most out of it?

What resolution is ideal for a 19" widescreen LCD?

Sorry for so many questions. Please help.

Thanks in advance.

2007-10-18 12:13:13 · 3 answers · asked by solsticeph 1 in Computers & Internet > Hardware > Monitors

Do I need to adjust anything in my video card settings to get the most out of it?

I have set the resolution to 1280 x 1024 and the desktop looks elongated and stretched. Why is that?

2007-10-18 12:58:43 · update #1

3 answers

If you have a DVI output on the video card and a DVI input on the monitor, just connect them with a DVI-D cable. If you are buying one, a single-link cable is all you need.
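A quick back-of-the-envelope check of why single link is enough (the 165 MHz limit is the single-link DVI spec figure; the 25% blanking overhead below is an assumed round number, not exact CVT timing):

```python
# Rough check that a single-link DVI cable is enough for a 19" widescreen.
# Single-link DVI tops out at a 165 MHz pixel clock; the 25% blanking
# overhead is an assumed round figure, not an exact CVT timing.

h_active, v_active, refresh_hz = 1440, 900, 60
blanking_overhead = 1.25  # assumed extra pixels for horizontal/vertical blanking

pixel_clock_mhz = h_active * v_active * refresh_hz * blanking_overhead / 1e6
print(f"Approximate pixel clock needed: {pixel_clock_mhz:.0f} MHz")  # ~97 MHz
print("Fits single-link DVI (165 MHz limit):", pixel_clock_mhz < 165)
```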

The ideal resolution for a 19" widescreen is 1440 x 900, which is the panel's native resolution. 1280 x 1024 is a 5:4 image, so the monitor has to stretch it horizontally to fill its 16:10 panel, which is why your desktop looks elongated.
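A minimal sketch of the aspect-ratio arithmetic behind that, using Python as a calculator:

```python
from math import gcd

def aspect(width, height):
    # Reduce a resolution to its simplest width:height ratio.
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print("Native 1440 x 900  ->", aspect(1440, 900))    # 8:5, i.e. 16:10
print("Set to 1280 x 1024 ->", aspect(1280, 1024))   # 5:4

# Stretching a 5:4 image across a 16:10 panel widens everything by
# (16/10) / (5/4) = 1.28x, which is the elongation you are seeing.
print("Horizontal stretch factor:", (16 / 10) / (5 / 4))  # 1.28
```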

You do not need to do anything else to get the system to work.

DVI-D should be better than VGA because it is a direct digital link.

With VGA, the video card converts the digital signal to analog and sends it out, clocked by the pixel clock. However, that pixel clock is not sent along with the data.

When the monitor receives the signal, it has to sample it and convert it back to a digital format to display on the screen. But since there is no pixel clock information, it does not know exactly where to sample the signal. Monitors usually do a good job of estimating the sampling points, but sometimes the signal drifts or the estimate is wrong, and you end up with a blurry or jittery image.
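To get a feel for how tight that sampling timing is, here is an illustrative calculation (using the same assumed blanking overhead as the earlier sketch):

```python
# Illustrative numbers only: how long one pixel lasts in an analog VGA
# signal at 1440 x 900 @ 60 Hz, with an assumed 25% blanking overhead.

h_active, v_active, refresh_hz = 1440, 900, 60
pixel_clock_hz = h_active * v_active * refresh_hz * 1.25

pixel_period_ns = 1e9 / pixel_clock_hz
print(f"Each pixel lasts roughly {pixel_period_ns:.1f} ns")  # ~10 ns

# A sampling clock that drifts by even a few nanoseconds lands part-way
# into the neighbouring pixel, which shows up as the blur or jitter above.
```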

DVI-D is digital in the PC, digital on the cable, and digital in the monitor: (usually) a perfect signal transfer. No sampling errors, no extra noise, no ghosting or ringing from the cable, and no conversion errors in the D-to-A and A-to-D converters.

2007-10-18 15:28:22 · answer #1 · answered by Simon T 6 · 0 0

If both the display and the graphics card have DVI, use it; it gives much better image quality because it is digital instead of analog like VGA. Good luck, hope this helps.

2016-10-04 03:05:03 · answer #2 · answered by ? 4 · 0 0

Your monitor or your graphics card should have come with a removable adapter you can take off for DVI. Look at the back of your PC and see whether the monitor cable runs straight into the video card or whether there is an inch-long adapter on it; remove the adapter and you'll have DVI.

DVI is like high definition and VGA is like standard definition, and your monitor supports the higher-quality connection.

I use 1280x1024 as it looks the best and is well spaced out.

2007-10-18 12:19:38 · answer #3 · answered by whiteness 3 · 0 1
