If the apparent magnitude of Star A is +2.0 and the apparent magnitude of Star B is +5.0, then Star A will look brighter than Star B.
But maybe Star B is further away and, if the two stars were close together, things would look different.
2007-03-16 01:19:41 · answer #1 · answered by Anonymous · 1⤊ 0⤋
The Greeks came up with a way to measure the apparent brightness of stars: a scale from 1 to 6, with 1 being the brightest. Then, after a long time, people realized that there are stars fainter than a 6 and objects brighter than a 1, so they had to go negative to cover the brighter objects, such as the Sun (which is a star) and the planets. That is how we get the Sun at about -26.7 and the full Moon at about -12.7.
2007-03-16 09:00:28 · answer #2 · answered by Bacchus 5 · 0⤊ 0⤋
Star 'A' is the brighter of the two. The stellar magnitude scale is goofy because the larger the number, the dimmer the star. Another example: a star with a magnitude of minus 3 is brighter than a star with a magnitude of 1. Our Sun has a magnitude of about minus 26!
2007-03-16 06:31:32 · answer #3 · answered by Chug-a-Lug 7 · 0⤊ 0⤋
The star with the lower magnitude is brighter.
Magnitude is the degree of brightness of a star. In 1856, the British astronomer Norman Pogson proposed a quantitative scale of stellar magnitudes, which was adopted by the astronomical community. He noted that we receive 100 times more light from a first-magnitude star than from a sixth-magnitude one; thus a difference of five magnitudes corresponds to a 100:1 ratio of incoming light energy, which is called luminous flux.
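That relation is easy to check numerically. Here is a minimal Python sketch (the flux_ratio helper is just an illustrative name, not something from the answer): a 5-magnitude difference gives a factor of 100, and the +2.0 versus +5.0 pair from the question gives a factor of about 16.

# Minimal sketch of the magnitude-to-flux relation described above:
# a difference of 5 magnitudes corresponds to a 100:1 flux ratio.
def flux_ratio(m_faint, m_bright):
    """Flux of the brighter star divided by flux of the fainter one."""
    return 100 ** ((m_faint - m_bright) / 5)

print(flux_ratio(6.0, 1.0))  # 100.0  (first vs. sixth magnitude)
print(flux_ratio(5.0, 2.0))  # ~15.85 (Star B at +5.0 vs. Star A at +2.0)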
2007-03-16 05:03:20 · answer #4 · answered by momus2k7 2 · 0⤊ 1⤋
The Greek system of apparent magnitudes was invented by Hipparchus and popularised by Ptolemy in his Almagest.
They did not have telescopes, and it is only in the past 400 years that a need to extend the system beyond 6th magnitude arose, as instruments revealed objects dimmer than the limit of naked-eye visibility. The Hubble and Spitzer Space Telescopes can now see objects of 30th magnitude, about one trillionth (10^-12) as bright as Vega.
It was Norman Robert Pogson (March 23, 1829–June 23, 1891), a Nottingham-born astronomer who spent most of his working life in India as director of the Madras Observatory, who in 1856 put Hipparchus' system on a more scientific basis ...
Wikipedia reports ...
"His most notable contribution was to note that in the stellar magnitude system introduced by the Greek astronomer Hipparchus, stars of the first magnitude were about a hundred times as bright as stars of the sixth magnitude.
His suggestion in 1856 was to make this a standard, so each increase of one magnitude represented a decrease in brightness equal to the fifth root of 100 (or about 2.512). The Pogson Ratio became the standard method of assigning magnitudes."
The scale is logarithmic.
Thus a difference of 3 magnitudes corresponds to a brightness ratio of (2.512)^3 ≈ 15.85, so Star A looks nearly 16 times as bright as Star B.
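The same arithmetic in a short Python sketch (the helper name magnitude_difference is mine, and the logarithmic form m1 - m2 = -2.5 * log10(F1/F2) is the standard one implied by the Pogson ratio):

import math

# Sketch of the logarithmic form of the magnitude scale:
#   m1 - m2 = -2.5 * log10(F1 / F2)
def magnitude_difference(flux1_over_flux2):
    """Magnitude difference m1 - m2 for a given flux ratio F1 / F2."""
    return -2.5 * math.log10(flux1_over_flux2)

print(round(2.512 ** 3, 2))                    # 15.85, the factor quoted above
print(round(magnitude_difference(15.85), 2))   # -3.0, i.e. 3 magnitudes brighter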
2007-03-16 10:19:31 · answer #5 · answered by brucebirchall 7 · 0⤊ 0⤋