
In case my question does not make sense, here is a detailed explanation: say I am at exactly sea level; the horizon is a certain distance away from me. Now if I elevate myself to an arbitrarily higher altitude, the horizon will obviously shift further back and my so-called viewing sphere will increase. What I need is a function that defines this shift in horizon, in other words: the relationship between altitude above sea level and the distance to the horizon. Obviously we assume that there are no viewing restrictions like objects or fog.

2006-07-24 14:41:16 · 5 answers · asked by shams_shafiq2000 1 in Science & Mathematics Physics

5 answers

1.17 times the square root of your height of eye (in feet) = distance to the horizon in nautical miles

A nautical mile is about 2,025 yards (2,000 yards is the usual rough figure). I am guessing you can convert it from there.
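
Here is a minimal Python sketch of that rule of thumb, assuming the height of eye is given in feet (the units the 1.17 rule expects) and converting with 1.15078 statute miles per nautical mile:

import math

def horizon_nautical_miles(height_of_eye_ft):
    """Rule of thumb: 1.17 * sqrt(height of eye in feet), in nautical miles."""
    return 1.17 * math.sqrt(height_of_eye_ft)

def nautical_to_statute_miles(nm):
    """1 nautical mile = 1.15078 statute miles."""
    return nm * 1.15078

# Example: a 6-foot height of eye.
d_nm = horizon_nautical_miles(6)              # about 2.87 nautical miles
print(d_nm, nautical_to_statute_miles(d_nm))  # about 2.87 nm, or roughly 3.3 statute miles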

2006-07-24 14:46:19 · answer #1 · answered by Anonymous · 1 1

This is a simple geometry problem addressed using similar triangles and the following assumptions: 1) the surface in question is part of a sphere, and 2) the altitude of the observer is very small.

I then define the distance to the horizon (d) as the length of the half chord that runs from the tangent point (where the line of sight from the observer grazes the surface) to the radius line connecting the observer to the center of the sphere.

Draw the picture. Notice that the tangent line of sight (LOS) from the observer makes a right angle with the radius to the tangent point. The triangle formed by the LOS, that radius, and the line from the observer to the center of the sphere has an angle theta at the center, between the two radius lines. This is the same angle as the angle between the LOS and the half chord I defined as the distance to the horizon.

Using similar triangles, it is then easy to see that the distance to the horizon is the square root of the height of the observer (h) multiplied by the radius of the sphere (R).

Less ambiguously, the equation is d = SQRT(h*R)

Now for a reality check. A 6-foot-tall person at the beach would then be able to see about 2.1 miles out into the ocean. From the top of a 1,000-foot skyscraper, you should be able to see about 27 miles. Sounds reasonable to me.

(Note that the choice of the name "viewing sphere" is not appropriate since you are seeking to define a viewing circle on the surface of a sphere.)

fun problem, thanks.

(I am off by a factor of SQRT of 2 from mathiesm's answer. I am not sure what I missed, if anything. . . )
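
For what it's worth, a quick numerical sketch (assuming R = 4000 miles and heights given in feet) comparing SQRT(h*R) against the exact tangent-line distance SQRT((R+h)^2 - R^2), which expands to SQRT(2*R*h + h^2):

import math

R_MILES = 4000.0          # approximate radius of the Earth
FT_PER_MILE = 5280.0

def horizon_sqrt_hR(h_ft):
    """Half-chord estimate from this answer: sqrt(h * R)."""
    return math.sqrt((h_ft / FT_PER_MILE) * R_MILES)

def horizon_exact(h_ft):
    """Exact tangent-line distance: sqrt((R + h)^2 - R^2) = sqrt(2*R*h + h^2)."""
    h = h_ft / FT_PER_MILE
    return math.sqrt((R_MILES + h) ** 2 - R_MILES ** 2)

for h_ft in (6, 1000):
    print(h_ft, horizon_sqrt_hR(h_ft), horizon_exact(h_ft))
# 6 ft    -> about 2.1 vs 3.0 miles
# 1000 ft -> about 27.5 vs 38.9 miles (the ratio is SQRT(2), as noted above)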

2006-07-24 22:07:30 · answer #2 · answered by Mr. Quark 5 · 0 0

The distance to the horizon can be calculated using the Pythagorean Theorem. Let "r" be the radius of the earth. The distance to the horizon "d" as a function of the altitude "h" is:

d(h) = sqrt((r+h)^2 - r^2)

Where "sqrt" is the square root function and "^2" is the power of 2.

The radius of the Earth is approximately 4000 miles, so plugging that into the equation gives:

d(h) = sqrt((4000+h)^2 - 16000000)

So just plug in the value for your altitude, "h".
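
A minimal Python sketch of this formula, assuming r = 4000 miles and with a small helper to convert an altitude given in feet:

import math

R_MILES = 4000.0  # approximate radius of the Earth

def horizon_distance_miles(h_miles):
    """d(h) = sqrt((r + h)^2 - r^2), with h and d in miles."""
    return math.sqrt((R_MILES + h_miles) ** 2 - R_MILES ** 2)

def horizon_distance_from_feet(h_feet):
    """Convenience wrapper: altitude in feet, distance in miles."""
    return horizon_distance_miles(h_feet / 5280.0)

print(horizon_distance_from_feet(1000))  # about 38.9 miles from a 1000-foot tower
print(horizon_distance_miles(6.0))       # an observer 6 miles up: about 219 miles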

2006-07-24 21:50:47 · answer #3 · answered by mathiesm 2 · 0 0

In fact, this latter result (for English units) is nicely summed up by the old rule that 7 times the height in feet is 4 times the square of the distance to the horizon in miles; i.e.,

distance to horizon (miles) = sqrt [ 7 × h (feet) / 4 ]

http://mintaka.sdsu.edu/GF/explain/atmos_refr/horizon.html
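
A minimal sketch of that rule of thumb (height in feet, distance in statute miles):

import math

def horizon_miles_rule_of_thumb(height_ft):
    """distance (miles) = sqrt(7 * height (feet) / 4)."""
    return math.sqrt(7.0 * height_ft / 4.0)

print(horizon_miles_rule_of_thumb(6))     # about 3.2 miles
print(horizon_miles_rule_of_thumb(1000))  # about 41.8 miles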

2006-07-24 22:39:52 · answer #4 · answered by ideaquest 7 · 0 0

From simple trigonometry, your distance from the center of the Earth is r*sec(theta), so your altitude = r*(sec(theta) - 1), where r = the radius of the Earth and theta is the number of degrees that you can see, i.e. degrees of longitude, for instance. One degree of arc along the surface = 2 x pi x r/360. I will let you do the rest of the algebra.
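
To finish the algebra, here is a minimal sketch assuming r = 4000 miles: since the hypotenuse from the center of the Earth to the observer is r + altitude, cos(theta) = r / (r + altitude), and the distance to the horizon measured along the surface is r x theta (with theta in radians).

import math

R_MILES = 4000.0  # approximate radius of the Earth

def horizon_arc_distance_miles(altitude_miles):
    """Distance to the horizon measured along the surface (great-circle arc)."""
    theta = math.acos(R_MILES / (R_MILES + altitude_miles))  # radians
    return R_MILES * theta

def degrees_visible(altitude_miles):
    """Degrees of arc (e.g. of longitude along the equator) visible to one side."""
    return math.degrees(math.acos(R_MILES / (R_MILES + altitude_miles)))

print(horizon_arc_distance_miles(1000 / 5280.0))  # about 38.9 miles from 1000 feet up
print(degrees_visible(250.0))                     # roughly 19.7 degrees from 250 miles up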

2006-07-24 21:54:28 · answer #5 · answered by Sciencenut 7 · 0 0
