It depends on how high your eyes are relative to the surface of the sea. The rule of thumb is: Distance to horizon in nautical miles = 1.17 times the square root of the height of your eye in feet.
You can then use Google to convert the result to whatever unit of distance you wish. Ex: enter "convert 3 nautical miles to kilometers" in a Google search and it will tell you: "3 nautical miles = 5.55600 kilometers".
A tall object beyond the horizon can stick up so that it is partially visible. To find the maximum distance at which its tippy-top would be just visible, use the same formula with the object's height above sea level and add the result to the distance from your eye to the horizon. Ex: a 100-foot-high tower would be visible up to 11.7 nautical miles beyond your own horizon.
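If you'd rather let a computer do the arithmetic, here's a minimal Python sketch of that rule of thumb (the function names are just mine for illustration):

    import math

    def horizon_nm(eye_height_ft):
        # Rule-of-thumb distance to the horizon, in nautical miles,
        # for an eye height given in feet.
        return 1.17 * math.sqrt(eye_height_ft)

    def max_sighting_nm(eye_height_ft, object_height_ft):
        # Farthest range at which a tall object's top is just visible:
        # your horizon distance plus the object's own horizon distance.
        return horizon_nm(eye_height_ft) + horizon_nm(object_height_ft)

    print(horizon_nm(9))            # eye 9 ft up: about 3.5 nautical miles
    print(max_sighting_nm(9, 100))  # 100-ft tower: about 3.5 + 11.7 = 15.2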
Now, that formula works on Earth, but what if that sea you're looking out over is on a different planet? For that sort of case we need a more general solution.
Since your line of sight to the horizon runs at a tangent to the surface, touching it at the horizon, it forms one side of a giant right triangle. The other side forming the right angle runs from that tangent point to the center of the planet, and the hypotenuse runs from the center back up to your eye. If you know the radius (r) of the planet you know one of the sides, and the hypotenuse is that same distance plus the height of your eye (h) above the surface. Since we know from the Pythagorean Theorem that in a right triangle the square of the hypotenuse is equal to the sum of the squares of the other two sides, from that you can calculate: Distance to horizon = sqr((r+h)^2 - r^2)
Notes...
Sqr(X) = square root of X
X^2 = X squared
If weather/atmospheric conditions limit visibility to less than the calculated distance to the horizon, that *does not* change the distance to the horizon. It just means you can't see all the way to it.
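If you want to try that general formula out, here's a short Python sketch (the Earth radius constant and the feet-per-nautical-mile conversion are values I've plugged in myself, roughly 6,371 km expressed in feet):

    import math

    EARTH_RADIUS_FT = 20_902_231  # ~6371 km mean radius, in feet (assumed)
    FT_PER_NM = 6076.12           # feet per nautical mile

    def horizon_distance(radius, eye_height):
        # Straight-line distance to the horizon on a sphere;
        # radius and eye height must be in the same unit.
        return math.sqrt((radius + eye_height) ** 2 - radius ** 2)

    d = horizon_distance(EARTH_RADIUS_FT, 9)
    print(d / FT_PER_NM)  # about 3.2 nautical miles for an eye 9 ft up

Note that this purely geometric answer comes out a bit shorter than the 1.17 rule of thumb above, since that rule of thumb folds in typical atmospheric refraction.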
2006-12-26 12:06:34 · answer #1 · answered by John's Secret Identity™ 6
For heights close to the surface of the Earth, a handy approximation is: d = 3.86 sqrt(h), with the height given in meters and the distance to the horizon in kilometers. Even standing on flat ground, your eye would be somewhere around 1.5 m above the ground. Once you're going up in a rocket, different formulas are needed.
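A quick check of that approximation in Python (the 1.5 m figure is the eye height mentioned above; as far as I know, the 3.86 coefficient includes typical atmospheric refraction, with the purely geometric value being closer to 3.57):

    import math

    def horizon_km(eye_height_m):
        # Approximate horizon distance in kilometers
        # for an eye height given in meters.
        return 3.86 * math.sqrt(eye_height_m)

    print(horizon_km(1.5))  # about 4.7 km for someone standing at the shore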
2016-10-19 00:32:20 · answer #2 · answered by Anonymous
3 miles
2006-12-30 09:06:53 · answer #3 · answered by Sam's jam 2
The horizon would only be as far as you could see at a 90-degree angle. Once the angle increases (or decreases), you wouldn't be able to see any farther.
I'm sure there's a mathematical equation out there for the specific distance.
2006-12-26 11:58:10 · answer #4 · answered by G 6
It probably depends on how far you see, but to me it never ends.
2006-12-30 11:22:14 · answer #5 · answered by BEYONCE 4EVA 1
As I remember, it's something close to 30 miles.
2006-12-26 16:02:54 · answer #6 · answered by badabingbob 3
Based on the curve of the Earth, about 9 miles.
2006-12-26 11:58:34 · answer #7 · answered by Anonymous
11 km on earth.
2006-12-26 19:20:26 · answer #8 · answered by The madman who makes people fly 2
Like really, way far out there man.
2006-12-26 11:58:17 · answer #9 · answered by Anonymous
Depends on visibility... usually only a couple of miles, tho.
2006-12-26 11:57:07 · answer #10 · answered by Anonymous