
Imagine a star A whose diameter is 10 times that of a Sun-like star B and whose surface temperature is 2900 K. Suppose that both stars were located at the same distance from Earth. Which star would be brighter, and by what factor?

2007-02-03 10:24:42 · 2 answers · asked by Troy 1 in Science & Mathematics Astronomy & Space

2 answers

Assuming
(a) both stars are perfect blackbody radiators;
(b) star B has a surface temperature of 5785 K;
(c) we compare brightness only at the center of the visible band (550 nanometers);

then

Star A is brighter, but only by 10%. That's about 0.1 magnitudes.
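The 10% figure can be checked with a short script, assuming ideal blackbody emission and comparing only the monochromatic intensity at 550 nm via Planck's law (the constant values and variable names here are my own, not from the answer):

```python
import math

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T), up to a constant factor that
    cancels when taking a ratio between two stars."""
    x = h * c / (wavelength_m * k * temp_k)
    return 1.0 / (wavelength_m**5 * (math.exp(x) - 1.0))

lam = 550e-9          # center of the visible band, m
t_a, t_b = 2900.0, 5785.0
area_ratio = 10.0**2  # A has 10x the diameter, hence 100x the surface area

ratio = area_ratio * planck(lam, t_a) / planck(lam, t_b)
mags = 2.5 * math.log10(ratio)
print(f"A/B brightness at 550 nm: {ratio:.2f} ({mags:.2f} mag)")
```

The ratio comes out close to 1.1, i.e. star A is roughly 10% (about 0.1 magnitudes) brighter at 550 nm, matching the answer: the 100x surface area almost exactly offsets how much dimmer each square meter of a 2900 K surface is at that wavelength.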

2007-02-03 12:23:29 · answer #1 · answered by Keith P 7 · 0 0

The absolute value of the brightness depends on the scale used to measure it. The larger star would likely have a different surface temperature than the smaller star, so the frequency of highest intensity from each star would fall in a different portion of the electromagnetic spectrum. I say this to point out that no single number can describe the difference in intensity of the light coming from the two stars: the ratio of their intensities is different at each point in the spectrum.
All that being said, the larger star is so much larger that it would be significantly brighter: it has 100 times the surface area emitting light toward the Earth. If the unit of measure is related to the number of photons, the larger star would be roughly 100 times brighter than the smaller star. I am not sure what the answer would be if the unit of measure is the total amount of energy delivered to the Earth over the entire spectrum.
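The total-energy comparison left open above can be sketched with the Stefan-Boltzmann law, assuming both stars radiate as blackbodies: L = 4*pi*R^2*sigma*T^4, so the bolometric luminosity ratio reduces to (R_A/R_B)^2 * (T_A/T_B)^4 (the variable names below are my own):

```python
# Bolometric (all-wavelengths) luminosity ratio of star A to star B,
# assuming ideal blackbody radiators.
radius_ratio = 10.0        # A's diameter (and radius) is 10x B's
t_a, t_b = 2900.0, 5785.0  # surface temperatures, kelvin

# Constant factors (4*pi*sigma) cancel in the ratio.
lum_ratio = radius_ratio**2 * (t_a / t_b)**4
print(f"bolometric L_A / L_B = {lum_ratio:.1f}")  # -> about 6.3
```

So by total energy star A is only about 6 times brighter, not 100 times: the fourth-power temperature dependence strongly penalizes the cooler star.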

2007-02-03 18:49:54 · answer #2 · answered by anonimous 6 · 0 0
