
If two identical stars are observed, but one is ten times farther away than the other, how many times fainter would the more distant one appear to be? Explain your reasoning.

2007-11-02 01:38:23 · 4 answers · asked by Anonymous in Science & Mathematics Astronomy & Space

4 answers

Light follows an inverse square law. That is to say, light intensity drops with the inverse square of the distance. If you double your distance from the light source it appears a quarter as bright (1/2 squared = 1/4). If you treble your distance it appears 1/9 as bright (1/3 squared = 1/9).

So if one star is ten times further away it would appear 1/100 as bright as the nearer star.

2007-11-02 02:02:26 · answer #1 · answered by Jason T 7 · 0 0
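To make the arithmetic in answer #1 concrete, here is a minimal Python sketch of the inverse-square relationship; the function name relative_brightness and the sample distance ratios are illustrative assumptions, not part of the original thread.

# Minimal sketch of the inverse-square law for apparent brightness.
# Assumes two identical stars (same luminosity), so the distance ratio is all that matters.

def relative_brightness(distance_ratio: float) -> float:
    """Apparent brightness of the farther star relative to the nearer one,
    given how many times farther away it is (inverse-square law)."""
    return 1.0 / distance_ratio ** 2

if __name__ == "__main__":
    for ratio in (2, 3, 10):
        print(f"{ratio}x farther -> {relative_brightness(ratio):.4f} as bright")
    # 2x  farther -> 0.2500 (1/4 as bright)
    # 3x  farther -> 0.1111 (1/9 as bright)
    # 10x farther -> 0.0100 (1/100 as bright, i.e. 100 times fainter)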

Neither would be fainter, because if you can see them and they are identical, that means they emit the same amount of light; for you to see light, it needs to reach your eyes, so they would be the same brightness.

2007-11-02 02:17:13 · answer #2 · answered by Anonymous · 0 1

Probably less than 1/10 as bright, as you have 10 times more space crud in between.

2007-11-02 01:47:43 · answer #3 · answered by Anonymous · 0 1

they are all just balls of gas.......and I don't watch the stars enough to answer this

2007-11-02 01:41:56 · answer #4 · answered by intensity92000 2 · 0 1
