It's a geometry problem. Suppose you're high up in a building on the beach overlooking the ocean. You're at height h above sea level.
Now draw a circle representing the earth, with radius r. Your distance from the center of the earth is r+h. The horizon is some distance away, x. Your line of sight to the horizon is tangent to the circle, so it meets the earth's radius at a right angle, forming a right triangle whose hypotenuse is r+h.
Using the Pythagorean Theorem,
r^2 + x^2 = (r+h)^2 = r^2 + 2rh + h^2
x^2 = 2rh + h^2
x = sqrt(2rh + h^2)
That's the exact answer, but for practical purposes, 2rh is much bigger than h^2. To see this, suppose you're 500 feet in the air. That's about a tenth of a mile, so working in miles, h^2 = 0.1^2 = 0.01. On the other hand, the earth's radius is about 4000 miles, so 2rh = 2 x 4000 x 0.1 = 800. That's 80,000 times bigger than h^2.
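If you want to check both of those claims numerically, here's a short Python sketch (the function name horizon_exact is just for illustration):

    from math import sqrt

    def horizon_exact(r, h):
        # exact distance to the horizon: x = sqrt(2rh + h^2)
        # r and h must be in the same units; x comes back in those units
        return sqrt(2*r*h + h*h)

    r, h = 4000, 0.1                 # miles (500 feet is about 0.1 mile)
    print(2*r*h, h**2)               # 800.0 vs 0.01 -- a factor of 80,000
    print(horizon_exact(r, h))       # about 28.3 miles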
So we can safely ignore the h^2, and say x = sqrt(2rh). That can be broken down further. Again using miles (you can also do this in kilometers),
x = sqrt(2rh) = sqrt(2r) sqrt h = sqrt(8000) sqrt h = 90 sqrt h
(rounding off, since sqrt 8000 is about 89.44; also, h must be in miles).
Using the formula x = 90 sqrt h, suppose you're in an airplane 4 miles up (about 21,000 feet). Then the horizon is 90 sqrt 4 = 180 miles away.
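In code, the rule of thumb sits right next to the exact formula, so you can see how little the rounding costs (same kind of illustrative sketch as above):

    from math import sqrt

    def horizon_miles(h_miles):
        # rule of thumb: x = 90 sqrt h, with h and x both in miles
        return 90 * sqrt(h_miles)

    print(horizon_miles(4))              # 180.0 miles
    print(sqrt(2*3960*4 + 4**2))         # exact: about 178.0 miles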
Okay, now let's work it out for h in feet instead of miles. We'll go back to the formula x = sqrt(2rh), and use r = 3960 miles (a more accurate figure). Then
x = sqrt(2 x 3960 x h / 5280) = sqrt(3h/2) = 1.225 sqrt h
where h is in feet and x is in miles.
Two examples: (1) For h = 6 feet, x = sqrt(3h/2) = sqrt 9 = 3 miles. (2) For h = 100 feet, x = 1.225 sqrt h = 12.25 miles.
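A sketch of the feet version, reproducing both examples:

    from math import sqrt

    def horizon_from_feet(h_feet):
        # x = sqrt(3h/2) = 1.225 sqrt h, with h in feet and x in miles
        return sqrt(1.5 * h_feet)

    print(horizon_from_feet(6))      # 3.0 miles
    print(horizon_from_feet(100))    # about 12.25 miles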
To recap: For h in miles, use x = 90 sqrt h. For h in feet, either use x = sqrt(3h/2) or x = 1.225 sqrt h.
In all cases, x will be in miles. And of course, all of this can be done using meters and kilometers.
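For the metric version, the same derivation with r = 6371 km (the earth's mean radius) gives x = sqrt(2r) sqrt h, or about 3.57 sqrt h kilometers when h is in meters. A sketch under that assumption:

    from math import sqrt

    def horizon_km(h_meters):
        # x = sqrt(2 * 6371 * h / 1000), about 3.57 sqrt h
        # h in meters, x in kilometers
        return sqrt(2 * 6371 * h_meters / 1000)

    print(horizon_km(2))      # eye level, about 2 m: roughly 5.0 km
    print(horizon_km(100))    # about 35.7 km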
answered by bpiguy · 2006-11-23