
You are looking at two stars that have identical absolute magnitudes but one is 10 times farther away than the other. How many times brighter does the nearer star look than the more distant one?

A. 10
B. 20
C. 100
D. 1000

2007-12-03 15:14:47 · 7 answers · asked by tdoby3 2 in Science & Mathematics Astronomy & Space

7 answers

The correct answer is 'A'.
The stars' intrinsic luminosity values are not functions of logarithms, so the absolute value is simple to figure.

2007-12-03 15:24:38 · answer #1 · answered by Bobby 6 · 0 1

If the stars have the same absolute magnitudes, the only other factor (within reason based on your question) is distance.

The intensity of radiation falls off with distance from the source according to the inverse square law:

i ∝ 1/d^2

Plug in d = 10 and you get 1/100 of the brightness at 10 times the distance.

Therefore the closer star appears 100 times as bright as the farther one, all things other than distance being equal.

2007-12-03 15:40:01 · answer #2 · answered by Justin 5 · 0 0
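
A quick numerical check of the inverse-square argument above (a minimal sketch in Python; the distances are placeholder values, only their 10:1 ratio matters):

# Brightness falls off as 1/d^2, so two otherwise identical stars
# differ in apparent brightness by a factor of (d_far / d_near)^2.
d_near = 1.0              # distance to the nearer star (arbitrary units)
d_far = 10.0 * d_near     # the other star is 10 times farther away
ratio = (d_far / d_near) ** 2
print(ratio)              # 100.0 -> the nearer star looks 100 times brighter (answer C)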

Light propagates in all directions from the star, so at distance x the light is spread out over the surface of a sphere of radius x (area 4*pi*x^2); brightness therefore falls off as 1/x^2. This is known as the inverse square law: http://hyperphysics.phy-astr.gsu.edu/hbase/forces/isq.html. The correct answer is C: 100.

2007-12-04 02:15:45 · answer #3 · answered by Meatball 2 · 0 0
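
The same factor of 100 also falls out of the magnitude system. A short Python sketch, assuming the standard distance modulus m = M + 5*log10(d / 10 pc) and placeholder distances of 10 pc and 100 pc (any pair in a 10:1 ratio gives the same result):

import math

M = 0.0                              # common absolute magnitude (its exact value does not matter)
d_near, d_far = 10.0, 100.0          # parsecs; assumed placeholder distances in a 10:1 ratio

m_near = M + 5 * math.log10(d_near / 10.0)
m_far = M + 5 * math.log10(d_far / 10.0)

delta_m = m_far - m_near             # 5 magnitudes fainter
flux_ratio = 10 ** (0.4 * delta_m)   # each magnitude is a factor of 10^0.4 ~ 2.512
print(delta_m, flux_ratio)           # 5.0 100.0 -> again answer C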

the answer is D. Luminosity varies as the inverse cube of the distance.

2007-12-03 17:00:04 · answer #4 · answered by Lorenzo Steed 7 · 0 0

A

2007-12-03 15:18:46 · answer #5 · answered by ProArtWork 4 · 0 0

c. obviously

2007-12-03 15:18:15 · answer #6 · answered by Joey 2 · 0 0

NO IDEA......

2007-12-03 15:18:14 · answer #7 · answered by xsplodeit 4 · 0 1
