
2007-01-17 15:20:12 · 8 answers · asked by reuben_andres 1 in Consumer Electronics Home Theater

8 answers

If you get any fixed-pixel TV (LCD, DLP, or plasma) with 1080-line resolution, you will be watching 1080p regardless of what the input is. (There is only one plasma exception to that.) These sets convert whatever is input (480i, 480p, 720p, 1080i, or 1080p) to 1080p for display. You will not get any more resolution from lower-resolution sources, but they will suffer less deterioration. The best input for a set that displays 1080p is 1080p, since that requires no additional processing and will be free of any interlace artifacts. There is no question that a 1080p set driven with a 1080p signal will give the best picture. Although they are becoming more numerous, most TVs, even those with 1080p screens, will not accept 1080p as a source. One reason is that no 1080p is being broadcast, whether over the air, by cable, or by satellite. The only 1080p sources are the new high-def DVD players and some video games. This is likely to be the situation for some time. However, in anticipation of these new players, sets are beginning to come with 1080p inputs.

2007-01-17 18:37:27 · answer #1 · answered by gp4rts 7 · 2 1
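The conversion behavior described in the answer above can be sketched as a toy model in Python. This is an illustrative sketch of the idea (the format table and function names are assumptions for the example, not any TV's actual firmware):

```python
# Illustrative sketch: a fixed-pixel 1080p display converts every input
# format to its native panel resolution before showing it.
NATIVE = (1920, 1080)  # native resolution of a 1080p panel

# Common source formats and their pixel dimensions.
INPUT_FORMATS = {
    "480i": (720, 480),
    "480p": (720, 480),
    "720p": (1280, 720),
    "1080i": (1920, 1080),
    "1080p": (1920, 1080),
}

def display(source_format):
    """Return (resolution shown, was scaled, was deinterlaced)."""
    width, height = INPUT_FORMATS[source_format]
    # Lower-resolution sources are upscaled to the panel; interlaced
    # sources are also deinterlaced. Neither step adds real detail.
    needs_scaling = (width, height) != NATIVE
    needs_deinterlace = source_format.endswith("i")
    return NATIVE, needs_scaling, needs_deinterlace

for fmt in INPUT_FORMATS:
    shown, scaled, deint = display(fmt)
    print(f"{fmt}: shown at {shown}, scaled={scaled}, deinterlaced={deint}")
```

Note that only 1080p needs neither scaling nor deinterlacing, which is why it is the cleanest input for a 1080p panel.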

And they wonder why people say that people who work in electronics don't know what they're talking about. Now we know.

1080p is better because 1080i is interlaced, which means that of its 1,080 horizontal lines, only every other line is drawn in each pass, so at any instant you see only 540 lines. 1080p is progressive, which means all 1,080 lines are drawn at the same time, so the picture is much clearer than a 1080i picture.

2007-01-17 17:27:32 · answer #2 · answered by Anonymous · 2 0

HD resolutions go 720p, 1080i, and then the newest and best resolution, for now: 1080p.

The i stands for "interlaced" and the p stands for "progressive scan." In progressive scan, every line of each frame is drawn in a single pass. In interlaced scan, only every other line is drawn in each pass, and the alternating passes are "interlaced" to fill in the gaps. This works okay for still images but can cause visible distortion with motion on most TVs. Regardless, interlacing dates back to the 1930s and has since been superseded by progressive-scan technology.

The next step will be 2160p.

2007-01-17 15:28:01 · answer #3 · answered by cam 4 · 0 1
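The interlaced-vs-progressive explanation above can be sketched as a toy model. This is a simplified illustration (a hypothetical 6-line frame, not a real broadcast standard), including the simplest "weave" deinterlacer that stitches two fields back into one frame:

```python
# Toy model of interlaced vs. progressive scan for a 6-line frame.
LINES = 6

def progressive_frame():
    """Progressive: every line of the frame is drawn in one pass."""
    return list(range(LINES))

def interlaced_fields():
    """Interlaced: the frame is split into two fields, drawn alternately.
    Motion between the two fields is what causes combing artifacts."""
    top = list(range(0, LINES, 2))     # lines 0, 2, 4
    bottom = list(range(1, LINES, 2))  # lines 1, 3, 5
    return top, bottom

def weave(top, bottom):
    """Simplest deinterlacer: weave the two fields into one full frame."""
    frame = [None] * LINES
    for line in top:
        frame[line] = line
    for line in bottom:
        frame[line] = line
    return frame

top, bottom = interlaced_fields()
print(weave(top, bottom))  # the woven frame matches progressive_frame()
```

With a static image, weaving reconstructs the full frame perfectly; with motion, the two fields were captured at different instants, which is where interlace artifacts come from.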

That first guy must work at Best Buy or Circuit City, because he's completely wrong. (Typical) But he'll sell you some killer Monster Cables at a low, low price!

1080p is a better resolution.

2007-01-17 19:29:13 · answer #4 · answered by Anonymous · 0 0

1080p brother!

2007-01-17 18:30:41 · answer #5 · answered by geno 3 · 1 1

1080p is better.

2007-01-18 04:03:27 · answer #6 · answered by redjetta 4 · 1 0

1080p, because it's progressive rather than interlaced

2007-01-17 21:46:06 · answer #7 · answered by Anonymous · 0 0

1080i. I work with electronics.

2007-01-17 15:26:09 · answer #8 · answered by master_alucard0 1 · 0 4
