

2006-12-14 18:06:27 · 3 answers · asked by Anonymous in Science & Mathematics Astronomy & Space

3 answers

Hipparchus of Nicaea was the first to come up with a magnitude system to measure the brightness of the stars.

2006-12-14 18:15:58 · answer #1 · answered by phsgmo 2 · 0 1

There are two possible answers, depending on exactly what you had in mind:

1. The ancient Greeks, such as Hipparchus, looked at the stars over 2000 years ago and called them stars of the "first rank" or "first magnitude" if they were among the very brightest, of the "second magnitude" if they seemed somewhat fainter, and so on. This classification went down to the sixth magnitude, the faintest stars they could see with the naked eye.

(Hipparchus was truly one of the astronomical greats. By comparing the positions of his stars (measured with respect to the ecliptic and the Vernal Equinox) against those of stars recorded about 150 years earlier, he noticed a systematic shift in the position of the Vernal Equinox. From this he deduced what we now call the "Precession of the Equinoxes", caused by the Earth's axis precessing in space. Hipparchus is appropriately remembered in the name of an ESA satellite that measured the positions of stars with unprecedented accuracy in the early 1990s: HIPPARCOS, an acronym formed from "HIgh Precision PARallax COllecting Satellite.")

2. We owe the modern magnitude system, with its fine gradations, to the English astronomer Norman Robert Pogson. In the middle of the 19th century he became interested in measuring the relative brightnesses of stars with the very earliest examples of what today would be called bolometers. When he compared his brightness measurements with the Greek whole-number classifications, he noticed that theirs was essentially a logarithmic system, with their range of five magnitudes corresponding to a factor of roughly 100 in his own measured relative brightnesses.

In 1856 he finally published a proposal that this "magnitude scale" be formalised, such that a change of 5 in magnitude should correspond to a change of exactly 2 in log(brightness). Because the Greek system had been a ranking system (small numbers correspond to bright stars, large numbers to faint ones), Pogson's proposed scale went the same way: SMALL (or even NEGATIVE!) magnitudes are BRIGHT, and LARGE magnitudes are FAINT. This system was sometimes referred to as "the Pogson scale," but it is now so universal that the concept of "magnitude" is used without any attribution.
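In modern notation (the symbols b1 and b2 for measured brightnesses are my own, not part of the original answer), Pogson's definition for two stars can be written as:

```latex
% Pogson's relation: 5 magnitudes correspond to a brightness factor of exactly 100
\[
  m_1 - m_2 = -2.5\,\log_{10}\!\left(\frac{b_1}{b_2}\right)
  \qquad\Longleftrightarrow\qquad
  \frac{b_1}{b_2} = 100^{(m_2 - m_1)/5} \approx 2.512^{\,(m_2 - m_1)}
\]
```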

This brightness system, "the wrong way round" as it first appears to those unused to it, has caused endless trouble, not only for generations of astronomy students but also for professional physicists moving into astronomical areas, now that they have realised that the Universe itself is a far more powerful accelerator than any they could ever hope to build.

Live long and prosper.

2006-12-14 18:12:53 · answer #2 · answered by Dr Spock 6 · 2 1

Magnitude of an Astronomical Object
"Visual magnitude" is a scale used by astronomers to measure the brightness of a star. The term "visual" means the brightness is being measured in the visible part of the spectrum, the part you can see with your eye (usually around 5500 angstroms).

The first known catalogue of stars was made by the Greek astronomer Hipparchus in about 120 B.C. and contained 1080 stars. It was later edited and expanded to 1022 stars by Ptolemy in a famous catalogue known as the "Almagest". Hipparchus listed the stars that could be seen in each constellation, described their positions, and rated their brightness on a scale of 1 to 6, the brightest being 1. This method of describing the brightness of a star survives today. Of course, Hipparchus had no telescope, and so could only see stars as dim as 6th magnitude, but today we can see stars with ground-based telescopes down to about 22nd magnitude.

When astronomers began to accurately measure the brightness of stars using instruments, it was found that each magnitude is about 2.5 times brighter than the next greater magnitude (more precisely, the fifth root of 100, or about 2.512). This means a difference of 5 magnitudes (from magnitude 1 to magnitude 6, for example) corresponds to a change in brightness of exactly 100 times. With equipment to make more accurate measurements, astronomers were able to assign stars decimal values, like 2.75, rather than rounding off to magnitude 2 or 3.
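As a quick sketch of that arithmetic (the function names below are my own, for illustration only; they are not from the quoted page):

```python
import math

def brightness_ratio(delta_mag: float) -> float:
    """Brightness ratio for a given magnitude difference.

    A difference of 5 magnitudes is defined as a factor of exactly 100,
    so one magnitude step is a factor of 100 ** (1/5), about 2.512.
    """
    return 100 ** (delta_mag / 5)

def magnitude_difference(ratio: float) -> float:
    """Magnitude difference for a given brightness ratio."""
    return 2.5 * math.log10(ratio)

print(brightness_ratio(1))          # ~2.512  (one magnitude step)
print(brightness_ratio(5))          # 100.0   (magnitude 1 vs. magnitude 6)
print(magnitude_difference(100.0))  # 5.0
```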

There are stars brighter than magnitude 1. The star Vega (alpha Lyrae) has a visual magnitude of about 0, and the few stars brighter than Vega have negative magnitudes.

Astronomers usually refer to "apparent magnitudes", that is, how bright a star appears to us here at Earth. Apparent magnitudes are often written with a lower case "m" (like 3.24m).

The apparent brightness of a star depends not only on how luminous it actually is, but also on how far away it is. For example, a street light appears very bright when you stand directly underneath it, but much fainter from half a mile down the road. Astronomers therefore developed the "absolute" magnitude scale. Absolute magnitude is defined as how bright a star would appear if it were exactly 10 parsecs (about 33 light years) away from Earth. For example, the Sun has an apparent magnitude of -26.7 (because it is very, very close) and an absolute magnitude of +4.8. Absolute magnitudes are often written with a capital (upper case) "M".
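The conversion from apparent to absolute magnitude is the standard distance-modulus relation; here is a minimal sketch (the function and constant names are my own, and the Sun's distance is taken as 1 AU):

```python
import math

AU_IN_PARSECS = 1 / 206_264.8  # 1 astronomical unit expressed in parsecs

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Absolute magnitude M from apparent magnitude m and distance in parsecs.

    M = m - 5 * log10(d / 10): the magnitude the object would have if it
    were placed exactly 10 parsecs from Earth.
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The Sun: apparent magnitude -26.7 at a distance of 1 AU.
print(round(absolute_magnitude(-26.7, AU_IN_PARSECS), 1))
# ~4.9, consistent with the quoted +4.8 given the rounded inputs
```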

http://liftoff.msfc.nasa.gov/Academy/UNIVERSE/MAG.HTML

2006-12-14 18:13:49 · answer #3 · answered by AdamKadmon 7 · 1 1
