The distance to the horizon depends on your height of eye. For example, if your eyes are, say, six feet above the ground, the horizon will be about 2.8 miles away (not the 20 miles quoted in another answer).
No need to trust me; check it out in the Sailing Almanac. Navigators need this kind of information to work out how far they are from land, for example by referring to the 'dipping distance' of a lighthouse.
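If you want to check that figure yourself, the geometry is just d ≈ √(2Rh) for an observer at height h above a sphere of radius R. Here is a rough Python sketch (my own illustration, not taken from any almanac; the 7/6 effective-radius factor is a common rule-of-thumb allowance for atmospheric refraction):

```python
import math

EARTH_RADIUS_M = 6_371_000   # mean Earth radius in metres
FT_TO_M = 0.3048

def horizon_distance_m(eye_height_ft, refraction=True):
    """Approximate distance to the sea horizon in metres: d = sqrt(2*R*h + h^2)."""
    h = eye_height_ft * FT_TO_M
    r = EARTH_RADIUS_M * (7 / 6 if refraction else 1)   # refraction acts like a larger effective Earth
    return math.sqrt(2 * r * h + h * h)

d = horizon_distance_m(6)     # six-foot height of eye, as above
print(d / 1852)               # ~2.8 nautical miles
print(d / 1609.344)           # ~3.2 statute miles
```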
Hope this helps
Steve
2006-09-26 10:00:01 · answer #1 · answered by Anonymous
Seen from 6 feet above the surface of the Earth, the horizon is about 3 miles away, so the east horizon is 6 miles from the west horizon.
2006-09-26 10:02:28 · answer #2 · answered by campbelp2002 7
Photograf's explanation is correct, although I thought the coefficient was 1.23, not 1.17. Of course, the difference isn't that significant.
To get the distance from the east horizon to the west horizon, you double this, naturally.
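For what it's worth, the two coefficients probably differ because of units and refraction rather than a mistake. Here is a quick check (my own sketch, assuming the usual √(2Rh) geometry and the same 7/6 effective-radius refraction allowance; almanac tables round to about 1.17 for nautical miles):

```python
import math

R_NMI, R_MI = 3440.1, 3958.8            # Earth radius in nautical and statute miles
FT_PER_NMI, FT_PER_MI = 6076.1, 5280.0  # feet per nautical and statute mile

def coefficient(radius, feet_per_unit, refraction=1.0):
    # k such that distance = k * sqrt(height_of_eye_in_feet)
    return math.sqrt(2 * radius * refraction / feet_per_unit)

print(coefficient(R_NMI, FT_PER_NMI))          # ~1.06  geometric, nautical miles
print(coefficient(R_NMI, FT_PER_NMI, 7 / 6))   # ~1.15  refracted, nautical miles (~1.17 in tables)
print(coefficient(R_MI, FT_PER_MI))            # ~1.22  geometric, statute miles
```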
2006-09-26 10:41:15 · answer #3 · answered by Anonymous
Once you know your height of eye, you simply plug it into the following formula:
distance to the horizon (nautical miles) = 1.17 × √(height of eye in feet)
For example, say you are on the water in a friend's sport-fishing boat and your height of eye is 9 feet above the surface of the water. Then:
1.17 × √9 = 1.17 × 3 = 3.51 nautical miles
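In code, the same calculation might look like this (a minimal sketch; only the 1.17 constant and the 9-foot example come from the answer above, the function name is mine):

```python
import math

def horizon_nm(height_of_eye_ft):
    """Distance to the horizon in nautical miles, for height of eye in feet."""
    return 1.17 * math.sqrt(height_of_eye_ft)

print(horizon_nm(9))   # 3.51 -- the sport-fishing-boat example
print(horizon_nm(6))   # ~2.87 -- close to the 2.8 miles quoted in answer #1
```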
2006-09-26 09:53:29 · answer #4 · answered by Anonymous
40 miles.
Anytime you are on a flat surface on Earth, the horizon is defined as 20 miles away. Thus you have two horizons, 40 miles apart.
2006-09-26 09:52:15 · answer #5 · answered by ohmneo 3
28 miles. Standing in one spot, the furthest you can see in any direction is 14 miles, due to the curvature of the earth.
2006-10-02 20:20:52 · answer #6 · answered by highway35stomper 2