
6 answers

Neither - the term is not "brightness", the term is "magnitude".

Apparent magnitude is how bright a star appears to us on Earth.
Absolute magnitude is how bright the star would appear at a distance of 10 parsecs.

So it's the apparent magnitude to us on Earth.

2007-04-15 14:32:53 · answer #1 · answered by Anonymous · 3 0

1. False; it's about 2.5 (more precisely, 2.512).
2. True, but only under certain conditions, such as those in space.
3. True, around 31 quadrillion (31,000,000,000,000,000) kilometres.
4. True.
5. False; it will nova and become a planetary nebula, but not supernova.
6. True, but it is a Sun-like star between the nebula and red-giant stages.
7. False; a variable star has already been through these processes.
8. True; we can see an edge-on view of the Milky Way from within it.
9. False; as is the nature of a black hole, it can never be seen.
10. False; quasars are not necessarily luminous, and distance is a matter of perspective, but they are all relatively far away.
11. True; scientists can only theorise and calculate to some degree when and how this happened, but technically the big bang happened at every point in the universe.

2016-05-21 00:40:39 · answer #2 · answered by ? 3 · 0 0

There are three words that are closely related:
Magnitude
Brightness
Luminosity.

In astronomy, they each have a specific meaning, but we can easily go from one to the other.

The magnitude scale started off as a scale of how bright stars appear, compared to each other. The brightest stars were said to be of first magnitude (Vega, for example); dimmer stars were of second magnitude (the stars of the Big Dipper), down to sixth magnitude for the faintest stars visible to most people (some people can see even dimmer stars). It was meant as a comparative scale.

Today, the scale is based on the amount of visible light that we receive from the star. The scale is adjusted such that a difference of 5 magnitudes means exactly 100 times brighter. (A difference of 1 magnitude is, therefore, the fifth root of 100, approx. 2.512 times). The scale, though, is still backwards (a small number means a bright star).
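To make that arithmetic concrete, here is a minimal Python sketch of the relation (the helper name brightness_ratio is just for illustration):

# A difference of 5 magnitudes is exactly a factor of 100 in
# brightness, so 1 magnitude is a factor of 100**(1/5) ~= 2.512.
def brightness_ratio(m1, m2):
    """How many times brighter a star of magnitude m1 appears
    than one of magnitude m2 (smaller magnitude = brighter)."""
    return 100.0 ** ((m2 - m1) / 5.0)

print(brightness_ratio(1.0, 2.0))  # ~2.512: one magnitude step
print(brightness_ratio(0.0, 6.0))  # ~251: Vega vs. the naked-eye limit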

Magnitude normally measures the whole of visible light (however, there is such a thing as 'Blue' magnitude: the magnitude measured through a very specific blue filter). You can also define magnitude through any filter (it depends on the purpose of your project).

Apparent magnitude measures how bright the star appears. A big star, if it is very far, will still look faint (big number magnitude) while a smaller star could look brighter because it is closer.

Absolute Magnitude (often written with a capital M) is how bright the stars would appear if they were all placed at the same distance (10 parsecs = 32.6 light-years). The Sun's Absolute Magnitude is a very dim 4.8.
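In symbols, this is the standard distance-modulus relation, M = m - 5*log10(d / 10 pc). A quick Python sketch, using the Sun's apparent magnitude of about -26.74 (1 AU is 1/206265 of a parsec):

import math

# Distance modulus: M = m - 5*log10(d / 10), with d in parsecs.
def absolute_magnitude(m, distance_pc):
    return m - 5.0 * math.log10(distance_pc / 10.0)

# The Sun, seen from a distance of 1 AU:
print(absolute_magnitude(-26.74, 1.0 / 206265.0))  # ~ +4.8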

There is a measure called "brightness": the amount of energy per unit area (Watts per square metre). It normally includes all energy (visible light, ultraviolet, infrared, radio, etc.). Measured brightness is normally apparent: the energy received per square metre here at Earth. We can also talk of "surface brightness", meaning the energy emitted per square metre at the surface of the star; the surface brightness is directly linked to the surface temperature.

However, I've seen some papers where astronomers use the phrase "absolute brightness" (defined as above, at the standard 10-parsec distance). This is done when you need to compare the luminosity of many stars that are at different distances.

At Earth, the Sun's brightness is 1370 W/m^2 (of which about 37% is reflected back into space by clouds, snow, etc.).

The Sun's surface brightness is about 6.33x10^7 Watts per square metre (corresponding to a surface temperature of 5780 K, roughly 9950 °F). You will hear astronomers say "the color temperature of the Sun is 5780 kelvin".
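That link between surface brightness and temperature is the Stefan-Boltzmann law, F = sigma*T^4. A minimal sketch, using the standard value of the constant:

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Energy radiated per square metre of surface at temperature T.
def surface_flux(temperature_k):
    return SIGMA * temperature_k ** 4

print(surface_flux(5780.0))  # ~6.33e7 W/m^2 for the Sun's surface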

Photographers have been using this language for a long time: "What is the color temperature of your flash?" "6500 K." "Wow, that is hotter than usual!" (Hotter means bluer on this scale.)

The modern apparent magnitude scale is based on the brightness that we receive (Watts per square metre) from stars: very small numbers (0.0000001 W/m^2 and less).

The total output is called "Luminosity"; this value is always absolute: The Sun puts out a total of 3.85x10^26 Watts. All of it is generated at the Sun's centre, almost all of it from the fusion of Hydrogen into Helium.
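Luminosity and apparent brightness are tied together by the inverse-square law, b = L / (4*pi*d^2). A small Python sketch that recovers the 1370 W/m^2 figure from above:

import math

L_SUN = 3.85e26      # the Sun's luminosity, watts
AU = 1.496e11        # Earth-Sun distance, metres

# Apparent brightness: luminosity spread over a sphere of radius d.
def apparent_brightness(luminosity_w, distance_m):
    return luminosity_w / (4.0 * math.pi * distance_m ** 2)

print(apparent_brightness(L_SUN, AU))  # ~1370 W/m^2 at Earth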

---
apparent magnitude: how bright it appears from Earth

Absolute Magnitude: how bright it should appear if it were at 10 parsecs (32.6 light-years)

(apparent) Brightness: total energy per area (Watts per square metre) received at Earth
surface Brightness: Watts per square metre at the star's surface (gives us the surface temperature)

Luminosity: total energy output.

2007-04-15 15:50:59 · answer #3 · answered by Raymond 7 · 0 0

False. The Sun's (or any star's) apparent brightness is how bright it appears to us. Its brightness (or absolute brightness) is how bright it would appear from a certain distance, 10 parsecs, I believe. 1 parsec = 3.26 light-years.

2007-04-15 14:35:13 · answer #4 · answered by Anonymous · 0 1

True, but the key word is "magnitude".

2007-04-19 08:25:49 · answer #5 · answered by Anonymous · 0 0

I don't know, but I sure know the Sun is bright!!!

2007-04-15 14:35:54 · answer #6 · answered by Anonymous · 0 1
