I just bought a Samsung SyncMaster 226BW. It came with a digital (DVI) cable. I don't see a quality difference between digital and analog. Can someone tell me?

2007-04-26 15:54:42 · 2 answers · asked by Anonymous in Computers & Internet Hardware Monitors

2 answers

DVI actually has lower raw bandwidth in some cases, since the single-link spec is limited to 165 megapixels per second (a 165 MHz pixel clock).

However, with an analog signal there is no transmission of the pixel clock, so the monitor has to make a best guess about when to sample the analog signal to convert it back to digital. Monitors are a lot better at this today than, say, five years ago, but still not perfect. At lower resolutions and refresh rates this is usually not a problem, and if it is, pressing the auto-setup button on the monitor while there is a lot of text on the screen will normally fix it.

However, at higher resolutions it gets harder for the monitor to sample the changing analog signal in the shrinking window during which the signal is at the correct level: more pixels in the same 60 Hz refresh time means less time per pixel.
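
To put rough numbers on that point (this is my own back-of-the-envelope sketch, not from the original answer, and it ignores blanking intervals, which leave even less time per pixel in practice):

    # Python sketch: approximate time available per analog pixel at 60 Hz.
    # Blanking intervals are ignored, so real figures are somewhat tighter.
    REFRESH_HZ = 60
    for width, height in [(1024, 768), (1680, 1050), (1920, 1200)]:
        pixel_rate = width * height * REFRESH_HZ       # pixels per second
        time_per_pixel_ns = 1e9 / pixel_rate           # nanoseconds per pixel
        print(f"{width}x{height} @ {REFRESH_HZ} Hz: "
              f"~{pixel_rate / 1e6:.0f} Mpixels/s, ~{time_per_pixel_ns:.1f} ns per pixel")

At 1024x768 the monitor's converter has roughly 21 ns per pixel to sample; at 1920x1200 that shrinks to about 7 ns, so any error in the guessed sampling point matters much more.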

Since DVI is digital, all of these problems go away.

If you are under the 165 Mpixel/s limit, then DVI is better.

If you are over 165 Mpixel/s, then you have a problem. A single DVI link does not cover operation at these rates, so you need a dual-link (dual-channel) graphics card and a dual-link monitor that implement it compatibly. If you go analog instead, it is hard to sample the fast-moving video and get a stable picture.
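
As a rough illustration of where that limit falls (my own sketch, not from the answer; the ~20% blanking overhead is an assumed average, and exact DVI timings vary by mode):

    # Python sketch: estimate whether a mode fits within the ~165 Mpixel/s
    # (165 MHz pixel clock) single-link DVI limit. The 20% blanking overhead
    # is an assumption; actual CVT / reduced-blanking timings differ per mode.
    SINGLE_LINK_LIMIT_MPPS = 165.0
    BLANKING_OVERHEAD = 1.20

    def estimated_pixel_clock_mpps(width, height, refresh_hz):
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for width, height, hz in [(1680, 1050, 60), (1920, 1080, 60), (2560, 1600, 60)]:
        mpps = estimated_pixel_clock_mpps(width, height, hz)
        verdict = "fits single-link DVI" if mpps <= SINGLE_LINK_LIMIT_MPPS else "needs dual-link"
        print(f"{width}x{height} @ {hz} Hz: ~{mpps:.0f} Mpixels/s -> {verdict}")

The 226BW's native 1680x1050 at 60 Hz sits comfortably inside the single-link limit, which is why the single DVI cable that came with the monitor is all it needs.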

2007-04-27 04:28:39 · answer #1 · answered by Simon T 6 · 0 0

DVI just has more usable bandwidth at higher resolutions, so until you're at resolutions around 1920x1200 you probably won't see a huge difference. There is a difference before you get to that point, but unless you are a graphics professional, you probably wouldn't notice.

2007-04-26 16:01:09 · answer #2 · answered by tigerkitty2 5 · 0 0
