
Here's a question...
Does interlaced video take up the same bandwidth as progressive?

It seems like (with 1080i) that would be 1920 x 540 = 1,036,800 pixels per field, at 60 fields per second = 62,208,000 pixels per second. (Right?)

Then with 1080p that would be 1920 x 1080 = 2,073,600 pixels at 30 frames per second = 62,208,000 pixels per second.
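
A quick sanity check of that arithmetic, sketched in Python (the 1080p60 and 720p60 rows are added here for comparison and aren't part of the original question):

    # Raw (uncompressed) pixel rates for the formats being compared.
    formats = {
        "1080i (60 fields/s)": (1920, 540, 60),
        "1080p30": (1920, 1080, 30),
        "1080p60": (1920, 1080, 60),
        "720p60": (1280, 720, 60),
    }
    for name, (w, h, rate) in formats.items():
        print(f"{name}: {w * h * rate:,} pixels/s")

    # 1080i (60 fields/s):  62,208,000 pixels/s
    # 1080p30:              62,208,000 pixels/s  (same raw rate, as noted above)
    # 1080p60:             124,416,000 pixels/s  (double)
    # 720p60:               55,296,000 pixels/s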

If this is the case, why is everything broadcast in 1080i? Interlaced video seems like a clear loss in quality (and non-CRT displays have to deinterlace it, right?), and it's often not true to the original look of a program. Wasn't the advantage of interlaced video supposed to have something to do with CRTs (and no HDTV I've seen uses a CRT)?

I've heard many people clamoring for 720p broadcasts, but wouldn't that take up much less bandwidth than either of the 1080 formats? I'm confused... thanks for your consideration.

2006-12-22 12:37:22 · 1 answer · asked by bug m 2 in Consumer Electronics > TVs

So here's a follow-up question...

Why are Blu-ray Discs/HD DVDs marketed as 1080p? Does that mean their frame rate is stretched (from 24 fps to 60 fps) like they do with movies on TV?
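
For reference, the "24 fps to 60" stretching the follow-up alludes to is 3:2 pulldown, where film frames are alternately repeated for 2 and 3 video fields. A minimal sketch of the field-count arithmetic (Python, illustrative only):

    # 3:2 pulldown: fit 24 film frames into 60 video fields per second
    # by showing film frames for 2 and 3 fields alternately.
    film_fps = 24
    pattern = [2, 3]  # fields per film frame, alternating
    fields_per_second = sum(pattern[i % 2] for i in range(film_fps))
    print(fields_per_second)  # 60 -- frames are repeated, not newly created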

2006-12-22 13:08:35 · update #1

1 answer

1080p is 1920x1080 at 60 frames/s (not 30), so it's double the bandwidth of 1080i.

720p is 1280x720 at 60 frames/s, so it takes less bandwidth than 1080p.

To tell the truth, unless you have a TV 42" or larger, you will not be able to see the difference between 1080i and 1080p.

Now for broadcasters: they send a compressed bit stream, and they have about 19 Mbit/s available per channel. They can use that bandwidth any way they want, so naturally they want to send as many programs as possible.
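
To put that 19 Mbit/s figure next to the raw numbers from the question, here's a rough sketch. The 12 bits/pixel (8-bit 4:2:0 chroma subsampling) and the 12 Mbit/s example HD encode rate are assumptions for illustration, not figures from the answer:

    # Rough illustration of why compression and multiplexing matter.
    CHANNEL_BITS_PER_SEC = 19_000_000            # ~19 Mbit/s channel, per the answer

    pixels_per_sec_1080i = 1920 * 540 * 60       # 62,208,000 pixels/s (from the question)
    raw_bits_per_sec = pixels_per_sec_1080i * 12 # assume 8-bit 4:2:0 -> ~746 Mbit/s raw

    print(f"Compression needed: about {raw_bits_per_sec / CHANNEL_BITS_PER_SEC:.0f}:1")

    # If the HD program were encoded at, say, 12 Mbit/s, the remaining
    # ~7 Mbit/s could carry extra standard-definition subchannels --
    # one reason broadcasters squeeze in as many programs as they can.
    print(f"Left over for extra programs: {(CHANNEL_BITS_PER_SEC - 12_000_000) / 1e6:.0f} Mbit/s")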

2006-12-22 12:56:54 · answer #1 · answered by TV guy 7 · 1 0
