Here's a question...
Does interlaced video take up the same bandwidth as progressive?
It seems like (with 1080i) that would be 1920 x 540 = 1,036,800 pixels per field, at 60 fields per second = 62,208,000 pixels per second (right?).
Then with 1080p that would be 1920 x 1080 = 2,073,600 pixels per frame, at 30 frames per second = 62,208,000 pixels per second.
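Just to sanity-check those numbers, here's a minimal sketch of the same arithmetic (a hypothetical Python snippet; note this counts raw pixel throughput only, not compressed broadcast bandwidth):

```python
# Raw pixel rates implied by the arithmetic in the question.
# 1080i: 1920 x 540 pixels per field, 60 fields per second.
# 1080p: 1920 x 1080 pixels per frame, 30 frames per second.

def pixel_rate(width, height, rate):
    """Pixels delivered per second at a given resolution and field/frame rate."""
    return width * height * rate

rate_1080i = pixel_rate(1920, 540, 60)   # 62,208,000 pixels/s
rate_1080p = pixel_rate(1920, 1080, 30)  # 62,208,000 pixels/s

print(rate_1080i, rate_1080p)  # both 62208000 -- the raw pixel rates match
```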
If this is the case, why is everything broadcast in 1080i? Interlaced video seems like a clear loss in quality (and non-CRT displays have to deinterlace it, right?), and it's often not faithful to how a program was originally produced. Wasn't the advantage of interlacing supposed to be something to do with CRTs (and no HDTV I've seen uses one)?
I've heard many people clamoring for 720p broadcasts, but wouldn't that take up much less bandwidth than either of the 1080 formats? I'm confused... thanks for your consideration.
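For comparison, the same back-of-the-envelope arithmetic for 720p, assuming the usual 60 frames per second for 720p broadcasts:

```python
# 720p at 60 frames/s, compared with the ~62.2M pixels/s of the 1080 modes above
rate_720p = 1280 * 720 * 60
print(rate_720p)  # 55,296,000 pixels/s -- roughly 11% below the 1080i/1080p figure
```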
2006-12-22 12:37:22 · 1 answer · asked by bug m in TVs