
How many times brighter is the sun than the full moon?

2007-02-24 12:39:48 · 4 answers · asked by kiraz77cherry 1 in Science & Mathematics Astronomy & Space

4 answers

Every 5 magnitudes brighter = 100 times brighter (the Pogson scale). The Pogson ratio, 2.512 : 1 (the fifth root of 100), tells us what difference 1 magnitude makes.

Hence 15 magnitudes brighter = 1,000,000 times brighter and 14 magnitudes brighter = approx 400,000 times brighter.

SUN apparent magnitude: -26.8
MOON apparent magnitude: -12.74

Difference in magnitudes: 14.06
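
A quick Python sketch of that arithmetic (my own illustration, not part of the original answer; the variable names are assumptions):

sun_mag = -26.8     # apparent magnitude of the sun
moon_mag = -12.74   # apparent magnitude of the full moon

# Pogson relation: a difference of dm magnitudes = a brightness factor of 100**(dm/5)
dm = moon_mag - sun_mag        # 14.06
print(100 ** (15 / 5))         # 1,000,000 for a 15-magnitude difference
print(100 ** (14 / 5))         # ~398,000, the "approx 400,000" above
print(100 ** (dm / 5))         # ~421,000 for the actual 14.06-magnitude gap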

2007-02-24 14:17:13 · answer #1 · answered by Anonymous · 1 0

The sun is always brighter than the moon. Astronomers measure brightness by an object's magnitude: the lower the number, the brighter the object. The moon's magnitude at full moon is -12.5, while the sun's magnitude is -26.5.
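
In code form, that convention looks like this (a trivial sketch of my own):

def is_brighter(mag_a, mag_b):
    # On the magnitude scale, a lower number means a brighter object.
    return mag_a < mag_b

print(is_brighter(-26.5, -12.5))   # True: the sun outshines the full moon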

2007-02-24 12:54:41 · answer #2 · answered by Twizard113 5 · 0 0

Hi, isn't it true that the moon simply reflects light from the sun? If so, the ratio should be roughly constant whether the moon is full or not. Of course the total reflection from a full moon is more than from a crescent, but the brightness is the same. Exactly how many times, I don't know. Why are you asking such a question?

2007-02-24 12:59:23 · answer #3 · answered by Astiquer 2 · 0 0

The apparent magnitude of the full moon is about -12.5, and that of the sun is -26.73.

100^{ [(-12.5) - (-26.73)] / 5 }
= 100^(14.23 / 5)
= 100^2.846
≈ 492,040

The sun is about as bright as 492,040 full moons.
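
That figure is easy to reproduce; here is a Python sketch (mine, with assumed variable names):

full_moon_mag = -12.5
sun_mag = -26.73
# Brightness ratio from the Pogson relation: 100**(magnitude difference / 5)
ratio = 100 ** ((full_moon_mag - sun_mag) / 5)
print(f"{ratio:,.0f}")   # ~492,040, matching the answer above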

2007-02-24 12:54:09 · answer #4 · answered by Anonymous · 2 0
