Your TV will always display at its native resolution (720p), so the real question is: who does the better deinterlacing, your PC (or gaming machine) or your TV? A lot depends on the graphics card you use.
For a 32" TV, it should really make little difference, but you can try both options and see for yourself.
2007-06-08 05:07:41 · answer #1 · answered by TV guy 7 · 1⤊ 0⤋
I set my box to 720p because 1080i gets scaled down anyway on a TV with a 768-line native resolution. But in the end, I don't think it makes much of a difference.
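To see why, here is the rough arithmetic. The 1366x768 panel size is a common assumption for a "768 native res." set, not something stated in the question:

```python
# Whatever you feed a fixed-pixel TV, it gets rescaled to the panel height.
panel_h = 768  # assumed 1366x768 panel
for name, src_h in [("720p", 720), ("1080i", 1080)]:
    print(f"{name}: {src_h} -> {panel_h} lines (scale {panel_h / src_h:.2f}x)")
# 720p: 720 -> 768 lines (scale 1.07x)   (slight upscale)
# 1080i: 1080 -> 768 lines (scale 0.71x) (deinterlace plus downscale)
```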
2007-06-08 09:39:11 · answer #2 · answered by Georgia Bulldogs #1 2 · 0⤊ 0⤋
You'll almost always have a better experience using 720p rather than 1080i for gaming. The difference isn't the resolution; it's the way the TV set draws the lines to form the picture.
The "p" stands for progressive. The "i" stands for interlaced.
When a TV set draws a frame (30 frames per second for an interlaced signal), it starts at line 1 and draws successive lines until the picture is complete.
If you tell the TV to draw an interlaced frame, the entire picture is actually made up of two fields. The TV draws line 1, then 3, then 5, and so on. After it gets to the bottom, it starts over with the even-numbered lines.
A progressive-scan image is drawn in order: 1, 2, 3, 4, and so on. A rough sketch of the two scan orders follows below.
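A minimal sketch of the two scan orders, just to make the line ordering concrete (the 1080-line frame height is an illustrative assumption):

```python
LINES = 1080  # illustrative frame height

# Progressive scan: every line of the frame, top to bottom, each refresh.
progressive_order = list(range(LINES))

# Interlaced scan: odd field first, then even field, on alternating refreshes.
odd_field = list(range(0, LINES, 2))   # lines 1, 3, 5, ... (0-indexed: 0, 2, 4, ...)
even_field = list(range(1, LINES, 2))  # lines 2, 4, 6, ... (0-indexed: 1, 3, 5, ...)
interlaced_order = odd_field + even_field

print(progressive_order[:5])  # [0, 1, 2, 3, 4]
print(interlaced_order[:5])   # [0, 2, 4, 6, 8]
```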
The disadvantage of interlaced scanning is that it creates motion artifacts.
If a football player is running from the left of your screen to the right and the TV draws him with interlaced scanning, he will appear to flicker or "comb," because the odd and even lines come from two slightly different moments in time.
If, instead, you draw the same player with progressive scanning, you see a smooth image.
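A toy illustration of that combing effect, assuming a simple object that shifts a few pixels between the two field captures (the sizes and motion are made up for the example):

```python
import numpy as np

H, W = 8, 16  # tiny frame, just for illustration

def frame_at(x):
    """A 3-pixel-wide 'player' at horizontal position x."""
    f = np.zeros((H, W), dtype=int)
    f[:, x:x + 3] = 1
    return f

# The two fields of one interlaced frame are captured a moment apart,
# so a moving object has shifted between them.
field_early = frame_at(4)  # odd lines captured here
field_late = frame_at(7)   # even lines captured a bit later

interlaced = np.empty((H, W), dtype=int)
interlaced[0::2] = field_early[0::2]  # odd lines from the earlier moment
interlaced[1::2] = field_late[1::2]   # even lines from the later moment

progressive = frame_at(4)             # all lines from the same moment

print(interlaced)    # edges look "combed": alternating rows disagree
print(progressive)   # clean, solid edges
```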
The key difference is motion. Therefore, I propose that video gaming will be best experienced by using 720p.
2007-06-08 09:22:54 · answer #3 · answered by Jdanforth 2 · 2⤊ 1⤋