Here's a fairly simple explanation:
First of all, a computer monitor like the one you're viewing now "refreshes" or "repaints" itself, typically 60 times a second.
In a CRT monitor, an electron gun at the back of the tube literally redraws the picture on the inside of the screen at 60Hz, or 60 times a second.
When a movie is "filmed" the film passes in front of the lens at the rate of 24fps (frames per second).
So, the flickering you see comes from the mismatch between the film speed and the monitor's refresh rate. Because the two are not exactly the same, the screen appears to flicker on film.
When viewing a monitor with just your naked eye, you can actually detect a flicker when the screen redraws at 60Hz. At a refresh rate of 70Hz, the screen will appear to flicker less and the picture will seem more stable.
So, if your monitor is flickering, check your refresh setting and set the rate as high as possible - the higher the rate, the smoother and more stable the picture (but don't exceed your monitor's recommended refresh rate).
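The mismatch described above can be sketched numerically. This is just an illustrative aliasing calculation, not something from the answer itself; the 24fps and 60Hz figures come from the explanation above:

```python
# Illustrative sketch: when a camera samples a flickering screen more
# slowly than the screen refreshes, the flicker "aliases" down to a
# slow, visible beat. Rates (24 fps film, 60 Hz refresh) per the answer.

def aliased_frequency(signal_hz: float, sample_hz: float) -> float:
    """Apparent frequency of a periodic signal sampled at sample_hz."""
    folded = signal_hz % sample_hz
    return min(folded, sample_hz - folded)

# A 24 fps film camera pointed at a 60 Hz screen:
print(aliased_frequency(60.0, 24.0))  # 12.0 -> a visible 12 Hz flicker

# A 30 fps camera locked to the same 60 Hz rate:
print(aliased_frequency(60.0, 30.0))  # 0.0 -> no visible beat
```

Note how a camera rate that divides evenly into the refresh rate produces no beat at all, which is why matched rates look stable.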
DID YOU KNOW?
A similar thing happens when you photograph a TV screen: the finished photo will usually show a scan line (a dark horizontal band across some part of the TV picture).
If, however, you change your shutter speed to 1/25 or 1/30 sec., you will not get the scan line, and the TV picture will come out as a normal, full image with no scan lines.
That's because you've set the camera's shutter to stay open long enough for the scanned TV image to complete itself before the shutter closes, so you capture the entire picture.
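The shutter-speed rule above can be written as a one-line timing check. A minimal sketch, assuming NTSC-style numbers (a full interlaced picture takes 1/30 s, built from two 1/60 s fields); the specific shutter speeds are the ones mentioned in the answer:

```python
# Sketch (assumed NTSC-style timing): a complete interlaced TV picture
# takes 1/30 s. If the shutter stays open at least that long, every scan
# line gets drawn during the exposure and no dark band appears in the photo.

FULL_FRAME_S = 1 / 30  # one complete picture: two interlaced 1/60 s fields

def captures_full_frame(shutter_s: float) -> bool:
    """True if the exposure is long enough to cover a full TV frame."""
    return shutter_s >= FULL_FRAME_S

for shutter in (1 / 250, 1 / 60, 1 / 30, 1 / 25):
    label = "no scan line" if captures_full_frame(shutter) else "scan line visible"
    print(f"1/{round(1 / shutter)} s exposure: {label}")
```

This matches the answer's advice: 1/25 and 1/30 sec. exposures cover a whole frame, while faster shutters freeze the scan mid-draw.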
2006-08-17 22:09:00
·
answer #1
·
answered by GeneL 7
·
1⤊
0⤋
OK, here we go. A TV or computer monitor runs at a specific rate - normally a 60Hz vertical refresh. The camera also runs off 60Hz power, and no matter how much filtering you do going from AC to DC, some ripple remains - it may be down around a thousandth of a volt, but it's still there.

Now, let's skip ahead a bit. Your house wiring is actually a split-phase system: 240 volts come into the circuit panel and get split into two 120-volt legs, which are joined back together for heating, air conditioning, the stove, the dryer, and other heavy-current items. If the TV or computer is on one leg and the camera is on the other, they're about 180 degrees out of phase; throw in the ripple on the DC, and that creates what we used to call "hum bars" - I don't know what they call them now. In an industrial building you'll have what's called 3-phase power - each phase about 122 volts, 120 degrees apart - and you'll see 2 or 3 bars on the screen instead of 1 or 2.
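The bar counts this answer mentions can be sketched with a back-of-envelope calculation. This is my own illustration, not from the answer: leftover ripple after rectification sits at a multiple of the mains frequency, and the number of stationary hum bars is roughly the ripple frequency divided by the display's vertical rate:

```python
# Sketch (illustrative assumption, not from the answer): rectifier ripple
# shows up on screen as horizontal hum bars. When the ripple frequency is
# a multiple of the display's vertical rate, the bars sit still, and the
# bar count is roughly ripple_hz / vertical_hz.

def hum_bar_count(ripple_hz: float, vertical_hz: float) -> float:
    """Approximate number of stationary hum bars on the screen."""
    return ripple_hz / vertical_hz

# Full-wave rectified 60 Hz single-phase mains ripples at 120 Hz:
print(hum_bar_count(120.0, 60.0))  # 2.0 bars

# Rectified 3-phase power ripples at 180 Hz (3 x 60 Hz):
print(hum_bar_count(180.0, 60.0))  # 3.0 bars
```

That lines up with the answer's observation of "2 or 3 bars" on 3-phase industrial power versus 1 or 2 at home.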
Good Luck! :)
2006-08-18 05:13:06
·
answer #2
·
answered by Anonymous
·
0⤊
1⤋
That is because the horizontal and vertical scanning rates of the monitor shown in the movie are very high. Moreover, the cameras used do not have a matching scanning rate as such. Hence the flicker.
2006-08-18 05:38:29
·
answer #3
·
answered by Siddharth Moghe 2
·
0⤊
0⤋
A TV, just like a light bulb, turns the picture on and off fifty to sixty times a second. Your eye cannot catch this because it happens so fast, but when a camera films another TV or monitor, the on/off cycles overlap, and the moments when the screen turns on and off become just slow enough for your eye to see.
2006-08-19 19:41:58
·
answer #4
·
answered by tvman 2
·
0⤊
0⤋
The frame rate of the film can be quicker than the refresh rate of your eyes. So the film captures what you can't see with the naked eye - like the flickering of the screen as it refreshes - which you wouldn't normally notice because your eyes aren't fast enough.
2006-08-18 05:09:04
·
answer #5
·
answered by Anonymous
·
0⤊
0⤋
Yes, you are right, and this happens because of the frequency rates.
A TV's frequency is lower than that of newer gadgets like digital cameras....
2006-08-18 05:09:44
·
answer #6
·
answered by Vins 2
·
0⤊
0⤋