
Has anyone noticed the flickering nature of televised computer screens on TV or in other motion pictures? Also, has anyone noticed that the picture on a TV screen sometimes does not appear on developed Polaroid prints when snapped? I was wondering what causes it, because I noticed it again yesterday. Destructive and constructive interference, bla bla? Oh, save me!

2007-02-23 02:52:29 · 7 answers · asked by origen2g 1 in Computers & Internet Hardware Monitors

7 answers

It's called the refresh rate. Believe it or not, your whole computer screen image is drawn by ONE (1) teeny tiny line of pixels that goes across the whole screen (left to right). Picture holding a piece of thread across the screen: that small line sweeps up the screen at an incredibly fast pace to make up the image that you see. The rate at which it travels is measured in hertz (that's another story). The faster the hertz, the faster the line sweeps the screen, giving a crisper picture.

Knowing this, you can imagine that taking a snapshot with a camera (depending on the f-stop and shutter speed) may get you a blank, semi-blank, or full computer screen. The same goes for TV cameras that don't roll tape as fast as the screen scans.

The human eye can perceive a computer refresh rate of roughly 60 hertz, so if your monitor is set to 60 or less, your screen may have a slight flicker and give you headaches. You can see the flicker better from a distance, so if you want to test this for yourself: right-click on the desktop, go to Properties, then Settings, then Advanced (at the bottom), then Monitor, change your refresh rate to 60 or less, and press Apply. Then walk far away from your screen and look at the monitor; you should be able to see it flickering a bit (depending on your eyes). Make sure you set this back when you are done, following the same steps. One last note: don't set it too high either, because that can wear out your monitor a bit faster; 75-80 is a good number to keep it at.
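To make the snapshot part concrete, here is a minimal sketch (not from this answer; the rates and shutter speeds are illustrative assumptions) of how much of one full screen repaint a single camera exposure can catch:

```python
# Rough model: a CRT repaints the whole screen once per refresh period.
# A camera exposure shorter than that period catches only part of the scan,
# which is why a snapshot can come out blank or half-drawn.

def captured_fraction(refresh_hz: float, shutter_s: float) -> float:
    """Fraction of one complete screen repaint that falls inside the exposure."""
    repaint_s = 1.0 / refresh_hz          # time to draw one complete frame
    return min(1.0, shutter_s / repaint_s)

print(captured_fraction(60, 1 / 250))   # ~0.24 -> roughly a quarter of the scan
print(captured_fraction(60, 1 / 30))    # 1.0   -> at least one full repaint
```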

2007-02-23 02:56:09 · answer #1 · answered by Twigward 3 · 0 1

Well, the flickering is due to the frame rate at which images are projected. There is no such thing as an actual moving picture.

Just like the old slide-show movies, TV just flashes about 20 to 30 images per second, making you think the image is moving.

When a TV is playing and a camera videotapes it, the two sets of frames overlap, so instead of seeing 20 to 30 images per second you see about 10 to 15, and thus the individual frames become apparent.

About the Polaroid question I'm not quite sure, but I would assume it's because the film is designed to capture ultraviolet rays, not small LCD rays.

2007-02-23 03:00:53 · answer #2 · answered by dragongml 3 · 0 0

It's down to frame rates.

Computer monitors and TVs don't actually show a static image. They constantly update and refresh, multiple times a second. It's fast enough that in normal use we don't notice.

However, if you display a screen on a screen, you have a flickering image displayed on a flickering image. This enhances the flickering effect because the two flickering screens will not be in sync. It works on much the same principle as when car wheels look like they are spinning backwards on TV.
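As an aside (my sketch, not the answerer's; the rates are illustrative), the backwards-wheel illusion is temporal aliasing, and the apparent rate can be computed like this:

```python
# Temporal aliasing (the "wagon wheel" effect): a repeating signal at
# f_signal, sampled at f_sample frames per second, appears at the folded
# frequency below. A negative result reads as backwards motion.

def apparent_hz(f_signal: float, f_sample: float) -> float:
    """Aliased frequency folded into [-f_sample/2, +f_sample/2]."""
    return (f_signal + f_sample / 2) % f_sample - f_sample / 2

# A spoke pattern passing at 23 Hz filmed at 24 fps seems to rotate
# backwards at 1 Hz; at exactly 24 Hz it seems to stand still.
print(apparent_hz(23, 24))   # -1.0
print(apparent_hz(24, 24))   #  0.0
```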

It happens in photos when the photo is taken at the moment the screen blanks, so the light emitted does not register on the sensor.

2007-02-23 03:04:02 · answer #3 · answered by ? 3 · 0 0

TV screens and computer screens refresh at different rates; e.g., a British TV operates at 25 Hz and a US one at 30 Hz (50 Hz/60 Hz interlaced). It's like when you see wagon wheels in a western movie: sometimes they go forwards, then backwards, stopping in the middle. That's the strobe effect of the wheels versus the frames per second of the film; likewise the computer screen versus the TV refresh (strobe) rate. There is also a brief period of time when a TV screen is totally blank (though not to the human eye), but your camera's shutter speed must be REALLY fast to catch it.
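To put rough numbers on that mismatch (a back-of-the-envelope sketch; the example rates below are assumptions, not from the answer):

```python
# When a camera and a screen run at different rates, the mismatch shows up
# as a dark bar rolling through the picture at the beat (difference)
# frequency between the two rates.

screen_hz = 60.0   # e.g. a US (NTSC) television field rate
camera_hz = 50.0   # e.g. a European (PAL) camera field rate

beat_hz = abs(screen_hz - camera_hz)
print(f"rolling bar repeats about {beat_hz:.0f} times per second")  # ~10
```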

2007-02-23 03:02:34 · answer #4 · answered by Del Piero 10 7 · 0 0

Yeah, if you have ever seen footage of another camera filming a TV or a video screen, you will notice the same thing. It is just the camera picking up the screen refreshing itself, something we cannot normally see with our own eyes.

2016-05-24 02:12:00 · answer #5 · answered by Anonymous · 0 0

Frame rate, or frame frequency, is the measurement of how quickly an imaging device produces unique consecutive images, called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems. Frame rate is most often expressed in frames per second (fps) or simply hertz (Hz).
The frame rate is related to but not identical to a physiological concept called the flicker fusion threshold or flicker fusion rate. Light that is pulsating below this rate is perceived by humans as flickering; light that is pulsating above this rate is perceived by humans as being continuous. The exact rate varies depending upon the person, their level of fatigue, the brightness of the light source, and the area of the retina that is being used to observe the light source. Few people perceive flicker above about 75 hertz.

These rates would be impractical for the actual frame rate of most film mechanisms so the shutter in the projection devices is actually arranged to interrupt the light two or three times for every film frame. In this fashion, the common frame rate of 24 fps (frames per second) produces 48 or 72 pulses of light per second, the latter rate being above the flicker fusion rate for most people most of the time.
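The arithmetic in that paragraph, spelled out (a trivial sketch using only the figures quoted above):

```python
# A 24 fps film projected with a two- or three-bladed shutter flashes each
# frame two or three times, multiplying the light-pulse rate.

frames_per_second = 24
for blades in (2, 3):
    print(blades, "blades ->", frames_per_second * blades, "pulses per second")
# 2 blades -> 48, 3 blades -> 72; 72 Hz sits above most people's
# flicker fusion rate, so the projected image looks continuous.
```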

Video systems frequently use a more complex approach referred to as interlaced video. Broadcast television systems such as NTSC, PAL, and SECAM produce an image using two passes called fields. Each field contains half of the lines in a complete frame (the odd-numbered lines or the even-numbered lines). Thus, while only using the bandwidth of 25 or 30 complete frames per second, they achieve a flicker fusion frequency of 50 or 60 Hz, at the expense of some vertical judder and additional system complexity. The "frame rate" of interlaced systems is usually defined as the number of complete frames (pairs of fields) transmitted each second (25 or 30 in most broadcast systems). However, since a conventional television camera will scan the scene again for each field, in many circumstances it may be useful to think of the frame rate as being equal to the field rate.
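A small sketch of the field/frame bookkeeping just described (system names and rates as quoted above):

```python
# Interlaced video sends two fields (odd lines, then even lines) per frame,
# so the flicker rate is twice the complete-frame rate.

systems = {"PAL/SECAM": 25, "NTSC": 30}   # complete frames per second
for name, frames in systems.items():
    fields = frames * 2
    print(f"{name}: {frames} frames/s -> {fields} fields/s (~{fields} Hz flicker)")
```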

In contrast to televisions, computer monitors generally use progressive scan, and therefore internet video formats generally do also. The "P" versions of HDTV (i.e., 720p or 1080p) also support progressive scan, as do modern DVD players.

2007-02-23 02:59:29 · answer #6 · answered by Geejay 1 · 1 2

Yes. It's because the shutter rate of the camera doing the filming is out of sync with the refresh rate of the screen being filmed.

2007-02-23 02:59:43 · answer #7 · answered by Robin the Electrocuted 5 · 1 1
