
4 answers

Assuming you are off by one degree (out of the 360 degrees in a circle), you will be off by 11,059.5291 feet at a distance of 120 miles.

That is 2.0946 miles!

2007-09-06 07:52:08 · answer #1 · answered by Brian 2 · 0 0
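The figure above can be reproduced with a short Python sketch. It corresponds to taking the tangent of 1 degree in a flat right-triangle model (a rough check that ignores the Earth's curvature; variable names are illustrative):

```python
import math

distance_ft = 120 * 5280      # 120 miles expressed in feet
angle_rad = math.radians(1)   # a 1-degree heading error, in radians

# Perpendicular offset after walking distance_ft along the wrong heading,
# treating the error as the opposite side of a right triangle:
offset_ft = distance_ft * math.tan(angle_rad)
offset_mi = offset_ft / 5280

print(round(offset_ft, 4))  # 11059.5291
print(round(offset_mi, 4))  # 2.0946
```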

Let's assume you're traveling on a flat surface, otherwise the curvature of the Earth makes the problem complicated.

We can use properties of isosceles triangles to find the distance. If we make an isosceles triangle with two sides each 120 miles long, and the angle between those two sides 1 degree, then we're looking for the length of the third side, L.

It is given by this formula:
L = 2*S*sin(θ/2)
where S is the long side (120 miles) and θ is the angle (1 degree).

L = 2*S*sin(θ/2)
L = 2*S*sin((1 degree)/2)
L = 2*S*sin(0.5 degree)
L = 2*(120 miles)*0.00872654
L = 2.09437 miles

So you'd be off by about 2.1 miles.

2007-09-06 07:55:21 · answer #2 · answered by lithiumdeuteride 7 · 0 0
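The chord formula is easy to verify numerically. A minimal Python sketch of the same calculation (the variable names are mine):

```python
import math

S = 120.0                  # length of the two equal sides, in miles
theta = math.radians(1.0)  # apex angle of 1 degree, converted to radians

# Length of the third side (the chord): L = 2*S*sin(theta/2)
L = 2 * S * math.sin(theta / 2)

print(round(L, 5))  # 2.09437
```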

This is most easily done by assuming small angles. In planar geometry (assuming you walked 120 miles in a direction off by 1 degree), the distance would be

d = 2*L*sin(0.5 degree) (L = 120 miles)

but for small angles, sin(t) ~ t (with t in radians, not degrees!),

so we can approximate this by
d ~ 2*L*0.5*pi/180 (pi/180 converts 0.5 degrees to radians)
so d = L*pi/180 = 120*pi/180 = 2.1 miles.

2007-09-06 08:32:44 · answer #3 · answered by Anonymous · 0 0

If you are using a BB gun... a long way.

2007-09-06 18:59:57 · answer #4 · answered by DugM 1 · 0 0
