
I've seen the term 'true' used for Hz ratings, the same way it's used for HD. So can someone cut through the BS... thanks.

2007-11-19 20:35:15 · 2 answers · asked by mick rogers 2 in Consumer Electronics TVs

2 answers

The normal refresh rate for UK TVs is 50 Hz. A 100 Hz TV creates additional frames between each pair of received frames and displays them all at twice the normal rate. Done well, this can smooth out some artifacts in the 50 Hz display of TV video. True 100 Hz uses a more complex screen that can also reduce imperfections in the display of filmed material: film is shot at a slower rate than 50 Hz, so 50 Hz displays can show an imperfection called judder, and true 100 Hz processing can reduce this effect.
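If it helps to see the "creating additional frames" idea spelled out, here is a minimal Python sketch (assuming numpy) that doubles a 50 Hz sequence by inserting a simple blend between each pair of neighbouring frames. The function name and frame sizes are made up for the example, and real 100 Hz sets use motion-compensated interpolation rather than a plain blend, so treat it only as an illustration of the concept.

import numpy as np

def double_frame_rate(frames):
    # Insert a 50/50 blend between each pair of neighbouring frames,
    # roughly doubling the frame rate. Real sets estimate motion instead
    # of blending, so this is only a toy version of "creating frames".
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        blend = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
        out.append(blend)
    out.append(frames[-1])
    return out

# 50 dummy PAL-sized frames in, roughly twice as many out (50 -> 99).
fifty_hz = [np.zeros((576, 720, 3), dtype=np.uint8) for _ in range(50)]
hundred_hz = double_frame_rate(fifty_hz)
print(len(fifty_hz), "->", len(hundred_hz))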

Whether any of this matters when you're considering a purchase depends on how well the techniques were implemented. If either feature adds a significant amount to the price, I recommend looking for professional reviews of the sets to see whether there really is an improvement in performance, and seeing the sets in action yourself.

2007-11-20 14:18:53 · answer #1 · answered by jjki_11738 7 · 0 0

Are you sure you mean "Hz", which is short for hertz, or cycles per second?

Could you mean 720, 1080i vs 1080p?

Here might be the problem:

All HDTVs must accept the following signals:

480, 720, 1080

(There are 'p' and 'i' versions, but let's ignore this.)

My television is 720 internally. If I feed it standard video, it up-converts the video to 720. If I feed it 1080, it down-converts it to 720.
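As a rough sketch of what "up-converts" and "down-converts" means, here is a small Python example (assuming numpy; the function name is made up) that rescales a frame to 720 lines with nearest-neighbour sampling. A real TV's scaler is far more sophisticated, so this only shows the idea of mapping whatever resolution comes in onto the panel's native line count.

import numpy as np

def rescale_to_lines(frame, target_lines):
    # Nearest-neighbour resize to a given number of lines, keeping the
    # aspect ratio. A crude stand-in for the scaler inside the set.
    src_lines, src_cols = frame.shape[:2]
    target_cols = round(src_cols * target_lines / src_lines)
    rows = np.arange(target_lines) * src_lines // target_lines
    cols = np.arange(target_cols) * src_cols // target_cols
    return frame[rows][:, cols]

sd_frame = np.zeros((576, 1024, 3), dtype=np.uint8)   # standard-definition input
hd_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # 1080-line input
print(rescale_to_lines(sd_frame, 720).shape)  # up-converted to 720 lines
print(rescale_to_lines(hd_frame, 720).shape)  # down-converted to 720 lines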

1080i televisions have been available for a while, and now many sets are 'true 1080p' internally. Does this make sense?

2007-11-20 14:50:46 · answer #2 · answered by Grumpy Mac 7 · 0 2
