
I understand that 1280 x 720i/p and 1920 x 1080i/p indicate the number of horizontal and vertical pixels on an HDTV set, but when I check out manufacturers' specs for their HDTVs, you see specs like 1024 x 768, 1365 x 768, and 1280 x 1080. As you can see, nothing matches. So how, for example, can a manufacturer advertising a 1024 x 768 TV claim to show 1080i when the TV physically has only 768 horizontal lines?

2007-08-11 06:07:34 · 4 answers · asked by Simon C 1 in Consumer Electronics TVs

4 answers

As far as the Consumer Electronics Association (CEA) and most of the display manufacturers are concerned ANY display that has a ‘native’ display resolution of, or is capable of displaying, 1024×768 (XGA) or better qualifies as an “HDTV” display. The current CEA definitions are as follows:

________ QUOTE ________

HIGH-DEFINITION TELEVISION (HDTV):
HDTV refers to a complete product/system with the following minimum performance attributes:
   • Receiver—Receives ATSC terrestrial digital transmissions and decodes all ATSC Table 3 video formats;
   • Display Scanning Format—Has active vertical scanning lines of 720 progressive (720p), 1080 interlaced (1080i), or higher;
   • Aspect Ratio—Capable of displaying a 16:9 image¹;
   • Audio—Receives and reproduces, and/or outputs Dolby Digital audio.

HIGH-DEFINITION TELEVISION (HDTV) MONITOR:
HDTV Monitor refers to a monitor or display with the following minimum performance attributes:
   • Display Scanning Format—Has active vertical scanning lines of 720 progressive (720p), 1080 interlaced (1080i), or higher;
   • Aspect Ratio—Capable of displaying a 16:9 image¹

¹(In specifications found on product literature and in owner’s manuals, manufacturers are required to disclose the number of vertical scanning lines in the 16:9 viewable area, which must be 540p, 810i or higher to meet the definition of HDTV.)

______ END QUOTE ______

Nowhere in the CEA’s definition is the horizontal resolution a criterion. In part this appears to have been done intentionally to allow manufacturers to market 4:3, 1024×768 (XGA) displays as an “HDTV” or “HDTV monitor.” Note: according to the definition the display must also be capable of displaying a 16:9 “image,” which of course nearly every 4:3 display can. Also, “HDTV” monitors do not require an ATSC compliant tuner or the ability to process or output Dolby Digital (AC-3) audio. I also suspect that the practice of specifying only a minimum number of active vertical scanning lines, as a method of characterizing a standardized category of displays, has its roots in the legacy of analog video where (in North America and a few other parts of the world) the well-established term “525-line” was omnipresent; unfortunately that was in a more simplistic analog era where CRTs predominated.

The current state of affairs is undeniably deceptive for many consumers, as it allows display manufacturers enormous latitude when it comes to marketing their (purported) "HDTV" displays. Add to this the ability of modern digital video processing to mask most of the deleterious shortcomings that often accompany such displays, as well as poor-quality source material.

Using the CEA definition (or industry guidelines), a 16×720 pixel (W × H) native display, if such a thing were to exist, would meet their minimalist definition of an "HDTV monitor."
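
To make the point concrete, here is a minimal Python sketch of the two quoted "HDTV monitor" bullet points. It is my own illustration, not an official CEA tool; the function and parameter names are assumptions of mine, and only the 720-line threshold and the 16:9 capability requirement come from the quoted definition. Notice that the horizontal pixel count never enters the test.

________ CODE (Python) ________

def meets_cea_hdtv_monitor(width_px: int, height_px: int,
                           can_show_16x9_image: bool = True) -> bool:
    """True if a fixed-pixel panel meets the quoted 'HDTV monitor' minimums."""
    # Display Scanning Format: 720 progressive / 1080 interlaced active
    # vertical lines "or higher"; for a progressive fixed-pixel panel this
    # reduces to at least 720 rows of pixels.
    enough_scan_lines = height_px >= 720
    # Aspect Ratio: capable of displaying a 16:9 image. Letterboxing counts,
    # so nearly every 4:3 panel passes by default.
    return enough_scan_lines and can_show_16x9_image

if __name__ == "__main__":
    # The width is irrelevant, so even an absurd 16 x 720 panel "qualifies."
    for w, h in [(1024, 768), (1365, 768), (1280, 1080), (1920, 1080), (16, 720)]:
        verdict = "qualifies" if meets_cea_hdtv_monitor(w, h) else "does not qualify"
        print(f"{w} x {h}: {verdict}")

______ END CODE ______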

I have always advocated that, when shopping for a new high-definition fixed-pixel display (system), consumers should choose those that utilize native 1280×720 or, better yet, 1920×1080 panels. Native 1920×1080 "Full HD" displays are very often the best place to start one's journey into the world of HDTV.

I highly encourage anyone who is interested in arming themselves with the knowledge required to purchase the best display for their money to read the references I've listed below.
 

2007-08-14 09:40:42 · answer #1 · answered by ? 5 · 0 0

It doesn't actually show 1080i. A 1024x768 HDTV downconverts the 1080i signal to the native resolution of the TV; that is how it displays a 1080i picture. Conversely, if you had a 1920x1080p HDTV and it was receiving a 720p signal, the TV would upconvert the 720p signal to 1080p.
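
To illustrate the idea, here is a toy Python sketch of that resampling step. It is my own simplification, not the scaler any real TV uses (real sets apply far better filters than nearest-neighbor); the set simply maps whatever it receives onto its fixed native pixel grid.

________ CODE (Python) ________

def rescale(frame, out_w, out_h):
    """Nearest-neighbor resample of a frame given as a list of rows of pixel values."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# 1080 -> 768: "downconversion" on a 1024x768 panel (fake luminance ramp as input).
hd_frame = [[(x + y) % 256 for x in range(1920)] for y in range(1080)]
panel_image = rescale(hd_frame, 1024, 768)

# 720 -> 1080: "upconversion" on a 1920x1080 panel.
frame_720p = [[(x + 2 * y) % 256 for x in range(1280)] for y in range(720)]
full_hd_image = rescale(frame_720p, 1920, 1080)

print(len(panel_image), len(panel_image[0]))      # 768 1024
print(len(full_hd_image), len(full_hd_image[0]))  # 1080 1920

______ END CODE ______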

2007-08-13 08:50:24 · answer #2 · answered by gkk_72 7 · 0 0

I think you're probably reading that it can accept a 1080 signal, but the TV downconverts the signal to its native resolution. I'm sure someone here can give you a more detailed answer, but that basically sums it up.

2007-08-11 06:44:52 · answer #3 · answered by mrhan1 3 · 0 0

1080i means the signal consists of TWO fields, each with 540 lines.

TVs with 768 lines can do one of two things:
(cheap TVs) drop one field and then scale the remaining 540-line field up to 768 lines, or
(better TVs) combine the two fields into a full 1080-line frame and then scale it down to 768 lines (both approaches are sketched below).
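
A rough Python sketch of both approaches, purely my own illustration rather than any manufacturer's algorithm; the field handling and the nearest-neighbor scaling are simplifying assumptions.

________ CODE (Python) ________

def scale_rows(rows, out_lines):
    """Nearest-neighbor vertical resample to out_lines rows."""
    in_lines = len(rows)
    return [rows[y * in_lines // out_lines] for y in range(out_lines)]

def cheap_tv(field_a, field_b):
    """Drop one 540-line field, scale the other up to 768 lines ('bob')."""
    return scale_rows(field_a, 768)          # field_b is simply discarded

def better_tv(field_a, field_b):
    """Interleave both fields into a 1080-line frame, then scale down to 768 ('weave')."""
    frame = [row for pair in zip(field_a, field_b) for row in pair]  # 1080 lines
    return scale_rows(frame, 768)

# Two fake 540-line fields of one 1080i frame (even and odd scan lines).
top_field = [[y] * 1920 for y in range(0, 1080, 2)]
bottom_field = [[y] * 1920 for y in range(1, 1080, 2)]

print(len(cheap_tv(top_field, bottom_field)))   # 768
print(len(better_tv(top_field, bottom_field)))  # 768

______ END CODE ______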

2007-08-11 07:12:40 · answer #4 · answered by TV guy 7 · 0 1
