I heard that you can find out what latitude you're at if you measure the angle between the horizon and the pole star. So, say I'm at the North Pole: I would have to measure between the north horizon and Polaris.

Now, how do I do that?

2006-07-23 15:18:42 · 4 answers · asked by astro_dude 1 in Science & Mathematics Astronomy & Space

4 answers

You don't actually measure the angle with respect to the horizon. You measure it with respect to straight down.
You take your sextant (a glorified protractor), look along its edge while pointing it at the star, and note what angle a suspended mass makes with the protractor it's attached to.
In your case, at the North Pole, down will read -90 degrees on the protractor (the star is always at +90 degrees), meaning you're at a latitude of 90 degrees.
If you're at the equator, the mass dangles straight down, giving a reading of zero degrees and a corresponding latitude of zero degrees.
And there you go... no need for a horizon, which would be inaccurate anyway, especially if you stood at the base of a mountain.
Now, for those of us in the southern hemisphere it's a bit more complicated, as we can't see the pole star. We've got to either make do with measurements of several stars or with one of those handy GPS units.

edit: with the fist approximation method given in answer #2 you can probably figure out whether you're in Mexico, the US, or Canada, for example, but you wouldn't be able to be more accurate than that. If you're good, that will give you an uncertainty of plus or minus five degrees, which is about 500 km (as opposed to the plus or minus half a degree of the protractor, about 50 km; with a good sextant you can be even more precise).
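As a rough sketch of the arithmetic in this answer (an illustration only, not part of the original: the function names are invented, and it uses the angle between the sight line and the plumb line rather than the answer's signed protractor scale):

```python
# Sketch of the plumb-line protractor arithmetic (invented names).

def latitude_from_plumb(sight_to_plumb_deg):
    """Angle between the line of sight to Polaris and the plumb line:
    90 deg means the sight line is horizontal (equator), 180 deg means
    it points straight up (North Pole)."""
    return sight_to_plumb_deg - 90.0

def ground_error_km(angle_error_deg):
    # One degree of latitude spans roughly 111 km north-south, which is
    # where the "five degrees is about 500 km" figure above comes from.
    return angle_error_deg * 111.0

print(latitude_from_plumb(180.0))  # North Pole -> 90.0
print(latitude_from_plumb(90.0))   # equator    -> 0.0
print(ground_error_km(5.0))        # fist method      -> ~555 km
print(ground_error_km(0.5))        # plumb protractor -> ~55 km
```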

2006-07-23 15:33:36 · answer #1 · answered by Paul C 4 · 2 1

Really, you mostly use approximation. Here is the procedure:

Extend your arm in front of you toward the horizon, with your hand in a fist. Place your other arm on top of it, also making a fist. Keep stacking fists until you reach the point directly above you, counting how many 'fists' it took. Then divide 90 by that count (since directly overhead is 90 degrees from the horizon). You will get a number somewhere around 9: for me it takes ten fists, and 90 divided by ten is nine, but it's different for different people.

That number tells you how many degrees each 'fist' is worth. So when you're looking at the night sky, you do the same process, except you count how many fists it takes to get to the star. Then just multiply the number of fists by the number of degrees per fist (the number you got in the first step).

This should give you the angle between the horizon and the star. For Polaris, that angle is approximately your latitude.
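As a rough sketch of the fist arithmetic (an illustration only; the function names are invented):

```python
# Sketch of the fist-calibration method (invented names).

def degrees_per_fist(fists_horizon_to_zenith):
    """Calibrate: the horizon-to-zenith arc is 90 degrees, so each
    fist is worth 90 divided by the number of fists spanning it."""
    return 90.0 / fists_horizon_to_zenith

def star_altitude_deg(fists_to_star, fists_horizon_to_zenith):
    return fists_to_star * degrees_per_fist(fists_horizon_to_zenith)

# Example: 10 fists from horizon to zenith -> 9 degrees per fist;
# 5 fists up to Polaris -> about 45 degrees, i.e. roughly 45 N latitude.
print(star_altitude_deg(5, 10))  # 45.0
```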

2006-07-23 15:28:20 · answer #2 · answered by Michael G 2 · 1 0

Look up the predicted declination (-23° 26.3') of the Sun on the date of the winter solstice. The observer (at latitude 38° N in this example) and the Sun's subpoint are then separated by 38° plus 23° 26.3' of curvature along the Earth's surface, which is 61° 26.3'. So the Sun would appear to be about 61° 26.3' down from your zenith. Subtract that from 90° (or from 89° 60'): that gives a prediction of roughly 28° 33.7' above the horizon. Compensating for parallax in altitude and for refraction would give a more precise predicted angle.
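A sketch of that calculation (an illustration only; it assumes the 38° N latitude used above and ignores refraction and parallax):

```python
# Noon Sun altitude at the winter solstice for a 38 deg N observer.

def noon_sun_altitude_deg(latitude_deg, declination_deg):
    """At local noon, zenith distance = latitude - declination,
    and altitude = 90 - zenith distance (refraction ignored)."""
    return 90.0 - (latitude_deg - declination_deg)

dec_winter = -(23 + 26.3 / 60)   # -23 deg 26.3' in decimal degrees
print(round(noon_sun_altitude_deg(38.0, dec_winter), 3))
# ~28.562 deg, i.e. about 28 deg 33.7' above the horizon
```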

2016-12-10 13:07:51 · answer #3 · answered by ? 4 · 0 0

Lots of ways; the easiest is with a sextant, but a sextant is just a fancy combination of a "level" and a "protractor".

If you have a carpenter's level and a protractor, you can use the level to give you a horizontal reference, then put your protractor on it and sight the star along the protractor so that it measures your line of sight.

Voila.
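In the same spirit, a tiny sketch (an illustration only; the reading is hypothetical). Polaris sits about 0.7° from the true celestial pole, so the raw altitude reading approximates latitude to within roughly that:

```python
# Level-and-protractor reading (hypothetical numbers).

def latitude_from_altitude(polaris_altitude_deg):
    # The altitude of the celestial pole equals your latitude;
    # Polaris is within ~0.7 deg of the pole.
    return polaris_altitude_deg

print(latitude_from_altitude(43.5))  # ~43.5 N, +/- ~0.7 deg
```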

2006-07-23 15:23:34 · answer #4 · answered by enginerd 6 · 0 0
