
3 answers

The way the picture is displayed is fairly complex and is designed to fool your vision so that it appears not to flicker. The screen does flicker, though, as you can prove by looking at it through a rotating fan.
The video is interlaced, so alternate lines are filled in by successive scans, which reduces flicker (see the sketch below).
LCDs and CRTs work differently. Also, high-definition TV takes more bandwidth (spectrum space) than standard definition.
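To make the interlacing idea concrete, here is a minimal Python sketch, assuming a 625-line PAL frame: the frame is sent as two fields, one with the odd-numbered lines and one with the even-numbered lines, so the screen is refreshed 50 times a second even though only 25 complete frames are transmitted. The function name and line count are illustrative, not from the original answer.

```python
def split_into_fields(frame_lines):
    """Split a full frame (a list of scan lines) into odd and even fields."""
    odd_field = frame_lines[0::2]   # lines 1, 3, 5, ... drawn on the first pass
    even_field = frame_lines[1::2]  # lines 2, 4, 6, ... drawn on the next pass
    return odd_field, even_field

# Illustrative 625-line PAL frame
frame = [f"line {n}" for n in range(1, 626)]
odd, even = split_into_fields(frame)
print(len(odd), len(even))  # 313 and 312 lines per field
```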

2006-12-12 23:42:43 · answer #1 · answered by jerry_davis71 1 · 0 0

More books have been written on the theory of the TV signal than on almost any other topic in electronics; everyone claims to know how it works, but there are many conflicting theories. If that doesn't help, try the website called How Things Work.

2006-12-13 07:25:59 · answer #2 · answered by Shadow_Dancer 2 · 0 0

Lambda is so old-fashioned. Hertz is the new hotness.

wavelength (lambda) = c / frequency (Hz), where c = the speed of light = 300,000,000 m/s
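As a quick worked check of that formula, here is a small Python sketch. The 600 MHz carrier used below is purely an example frequency (a typical UHF value); actual channel frequencies vary, as noted further down.

```python
C = 300_000_000  # speed of light in m/s (approximate)

def wavelength_m(frequency_hz):
    """Return the wavelength in metres for a given frequency in hertz."""
    return C / frequency_hz

print(wavelength_m(600e6))  # 0.5 m for an example 600 MHz UHF carrier
print(wavelength_m(50e6))   # 6.0 m for an example 50 MHz VHF carrier
```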

An image on a TV screen is made up of scan lines, so we talk about the rate of the horizontal scan, 15625 Hz, and the rate of the vertical scan, 50 Hz.
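Those two scan rates fit together neatly; the short sketch below shows the arithmetic (it assumes the interlaced 625-line PAL system implied by the figures above).

```python
horizontal_scan_hz = 15625  # line rate
vertical_scan_hz = 50       # field rate

lines_per_field = horizontal_scan_hz / vertical_scan_hz  # 312.5 lines per vertical sweep
lines_per_frame = lines_per_field * 2                    # 625 lines in two interlaced fields
frames_per_second = vertical_scan_hz / 2                 # 25 complete frames per second

print(lines_per_field, lines_per_frame, frames_per_second)
```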

A full-colour terrestrial broadcast needs a bandwidth of 5.5 MHz.

The actual frequency at which a TV signal is broadcast depends on which channel you're watching and what country you're in.
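As one illustration of that channel/country dependence, here is a hedged sketch of the UK UHF (PAL System I) plan, where channels 21 to 68 are spaced 8 MHz apart and channel 21 has its vision carrier at 471.25 MHz. Other countries use different plans, so treat this function as an example only, not a general rule.

```python
def uk_uhf_vision_carrier_mhz(channel):
    """Return the vision carrier frequency in MHz for a UK UHF channel (PAL-I plan)."""
    if not 21 <= channel <= 68:
        raise ValueError("UK UHF channels run from 21 to 68")
    return 471.25 + 8 * (channel - 21)

print(uk_uhf_vision_carrier_mhz(21))  # 471.25 MHz
print(uk_uhf_vision_carrier_mhz(36))  # 591.25 MHz
```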

2006-12-13 07:27:14 · answer #3 · answered by Vinni and beer 7 · 1 0
