I have learned a lot about coaxial, composite (RCA), VGA (D-sub 15-pin, analog), S-Video (analog), DVI (digital, computer), component (YPbPr analog), and HDMI (digital, sometimes with HDCP encryption), but I'm missing the big picture: how all of this relates to the practical choices I have to make among equipment and interfaces.
But when does each standard become helpful? Display size, resolution, response time? In other words, how do the acronyms, PAL, 720p (progressive), 1080i (interlaced), etc., translate to the video sources and TVs or monitors that I am considering purchasing?
And how do some standards carry both analog and digital signals under one name, while DVI requires separate variants (DVI-D, DVI-I, DVI-A)?
Do adapters care whether the signal is digital or analog, and if not, are there any losses when using adapters?
And what is this talk about "upconverting" a signal? Is this specific to digital, or does it work for analog too?
Lastly, can a capture card receive a digital signal without any loss?
asked by Andy · 2007-08-16 04:26:05 · 1 answer