
The magnitude scale:

a. originated just after the telescope was invented.
b. can be used to indicate the apparent intensity of a celestial object.
c. is used to measure the temperature of a star.
d. was used to determine the rate of precession.

2007-02-23 16:29:13 · 4 answers · asked by rui h 1 in Science & Mathematics Astronomy & Space

4 answers

It is not (a): the Greek astronomer Hipparchus devised such a scale roughly 1,700 years before the telescope was invented.

(b) is nearest, though the quantity it indicates is usually called apparent brightness rather than intensity.

(c) is only true indirectly and approximately, since surface temperature is only roughly related to brightness, and brightness is what is actually being measured here.

(d) is wrong: precession is measured from the slow drift of star positions over time, not from a magnitude scale. (Red shift, the Doppler effect, is what is used to measure recession directly and, via Hubble's Law and Hubble's Constant, the distance of galaxies and other celestial sources of light from the observer.)

It was Norman Robert Pogson (March 23, 1829 – June 23, 1891), a Nottingham-born astronomer who became director of the Madras Observatory, who put Hipparchus' intuitive system on a scientific footing in 1856:

His most notable contribution was to note that in the stellar magnitude system introduced by the Greek astronomer Hipparchus, stars of the first magnitude were about a hundred times as bright as stars of the sixth magnitude. His suggestion in 1856 was to make this a standard, so that each step of one magnitude corresponds to a brightness ratio of the fifth root of 100 (about 2.512). The Pogson Ratio became the standard method of assigning magnitudes.

Pogson discovered 8 asteroids and 21 variable stars, and produced a catalogue of 11,000 stars in the southern hemisphere during his time in India.

Essentially his scale is logarithmic, i.e., if magnitude 3 is 100 times brighter than magnitude 8, then magnitude 4 is 100 times brighter than magnitude 9, and so on.
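
A quick sketch of that arithmetic in Python (the function name is just for illustration, not from any library):

def brightness_ratio(m1, m2):
    # How many times brighter a magnitude-m1 object appears than a
    # magnitude-m2 object, using the Pogson step of 100 ** (1/5) ~ 2.512.
    return 100 ** ((m2 - m1) / 5)

print(brightness_ratio(1, 6))   # 100.0  (five magnitudes = exactly 100x)
print(brightness_ratio(3, 8))   # 100.0  (same difference, same ratio)
print(brightness_ratio(1, 2))   # ~2.512 (one magnitude step)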

2007-02-23 22:05:37 · answer #1 · answered by Anonymous · 0 0

I've always seen magnitude used as a way to describe the brightness of a star. A star of magnitude 1 is about 2.5 times as bright as a star of magnitude 2, and so on through the scale, so a first-magnitude star is roughly 100 times as bright as a star of magnitude 6 (about 97 times if you use exactly 2.5 per step; the defined step of 2.512 gives exactly 100).

There can even be stars of negative magnitude; these are simply stars brighter than magnitude 1. Using the scale I just described, a star of magnitude -4 would be about 10,000 times as bright as a star of magnitude 6.
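
To check that with the Pogson relation: a difference of 10 magnitudes corresponds to a brightness ratio of 100^(10/5) = 100^2 = 10,000, which is what brightness_ratio(-4, 6) in the sketch above returns.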

As a few examples, the brightest star in the night sky, Sirius, has an apparent magnitude of about -1.4. Our own Sun has an apparent magnitude of about -27 (I guess that's why they always say not to look at it!) : )
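
Plugging those rounded figures into the same relation gives 100^((-1.4 - (-27))/5) = 100^(25.6/5), which is about 1.7 × 10^10, so the Sun appears roughly seventeen billion times brighter than Sirius in our sky.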

2007-02-25 09:21:10 · answer #2 · answered by motron 1 · 0 0

The answer is "C".

BTW, O B A F G K M is the spectral classification of stars, running from hottest to coolest, and is memorized as:

"Oh, be a fine girl (guy), kiss me".

2007-02-24 02:40:06 · answer #3 · answered by stargazergurl22 4 · 0 0

B

2007-02-23 17:28:02 · answer #4 · answered by Arkalius 5 · 0 0
