
I'd prefer the answer in depth please, thanks ;)

2006-08-03 11:02:40 · 7 answers · asked by jivelikedat 2 in Science & Mathematics Astronomy & Space

7 answers

It doesn't look like anyone has given an "in depth" answer, so...

Magnitude is a measure of the brightness of a celestial object. There are two kinds: apparent magnitude and absolute magnitude.

Apparent magnitude is the brightness of a celestial object as it appears to an observer on Earth. Originally, people divided all visible stars into six magnitudes: the brightest were first magnitude, the dimmest were sixth, and each magnitude was considered twice as bright as the next dimmer one. In modern times the scale was defined precisely: a first magnitude star was defined to be exactly one hundred times brighter than a sixth magnitude star, so each magnitude is now about 2.512 times brighter than the next dimmer one. The figure 2.512 follows from that choice, because the five-magnitude difference between first and sixth magnitude corresponds to a factor of one hundred in brightness, and the fifth root of one hundred is about 2.512 (equivalently, 2.512 to the fifth power is about one hundred).
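
A quick way to check that figure, as a minimal Python sketch (the variable name is mine, not from the answer):

    # One hundred times brighter, spread over five magnitude steps:
    # each step is the fifth root of 100.
    step_ratio = 100 ** (1 / 5)
    print(step_ratio)        # about 2.5118864
    print(step_ratio ** 5)   # 100.0, up to floating-point rounding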

Absolute magnitude is the magnitude a celestial object would have if it were ten parsecs away.

look here:
http://en.wikipedia.org/wiki/Apparent_magnitude
http://en.wikipedia.org/wiki/Absolute_magnitude

2006-08-03 13:13:41 · answer #1 · answered by warm soapy water 5 · 5 0

It is a scale that relates the apparent brightness of objects in the sky. A long time ago, before digital light meters, it was somewhat arbitrary: it gave astronomers a way to say which star they meant, and the scale assigned the brightest stars in the sky the number 1, while the second brightest group of stars were 2nd magnitude.

Later, when light intensity became something measurable, the scale was made more rigorous, but it kept the same starting points.

Magnitude of stars is sorta like pH of acids. Really concentrated acids have negative pH, and the Moon, the Sun, etc., have magnitudes that are negative. And, like pH, the scale is logarithmic: the ratio of the intensity of a 2nd magnitude star to the intensity of a 1st magnitude star is the same as that ratio between a 3rd and a 2nd, or a 5th and a 4th.
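
To make that last point concrete, here is a minimal Python sketch (the function name is mine) showing that equal magnitude differences always correspond to the same intensity ratio:

    # Intensity relative to a magnitude-0 star: each magnitude step
    # divides the intensity by the same factor, 100 ** (1/5), about 2.512.
    def intensity(magnitude):
        return 100 ** (-magnitude / 5)

    print(intensity(2) / intensity(1))   # about 0.3981, i.e. 1 / 2.512
    print(intensity(5) / intensity(4))   # the same ratio again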

2006-08-03 18:21:57 · answer #2 · answered by tbolling2 4 · 0 0

1. the magnitude of a vector
Two-dim example:
magnitude(5, 12) = |(5, 12)| = √(5^2 + 12^2) = 13
Three-dim example:
|(3, 4, 12)| = √(3^2 + 4^2 + 12^2) = 13

2. the magnitude of a complex number
An example:
magnitude(5 + 12i) = |5 + 12i| = √(5^2 + 12^2) = 13.

3. The magnitude of a star describes its brightness.
Example: Polaris (the Pole Star) and most stars of the Big Dipper have magnitude about 2.
Stars of magnitude 5 or 6 are very difficult to see with the naked eye.
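
Cases 1 and 2 are easy to reproduce in code; here is a minimal Python sketch (the function name is mine):

    import math

    # Magnitude (Euclidean length) of a vector of any dimension.
    def vector_magnitude(components):
        return math.sqrt(sum(c ** 2 for c in components))

    print(vector_magnitude((5, 12)))      # 13.0
    print(vector_magnitude((3, 4, 12)))   # 13.0

    # Magnitude (modulus) of a complex number, via the built-in abs().
    print(abs(5 + 12j))                   # 13.0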

Th

2006-08-03 18:14:14 · answer #3 · answered by Thermo 6 · 0 0

Magnitude is basically the amount or strength of something. For example, if I have 2 tons of something, then the magnitude is two.

It also applies to vectors. A vector has a magnitude and a direction. For example, I could say the velocity is 20 m/s in the x-direction: the magnitude is 20 and the direction is the x-direction.

Get the drift???

2006-08-03 18:12:07 · answer #4 · answered by ObliqueShock_Aerospace_Eng 2 · 0 0

A number assigned to the ratio of two quantities; two quantities are of the same order of magnitude if one is less than 10 times as large as the other; the number of orders of magnitude by which two quantities differ is specified to within a power of 10.
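
A minimal Python sketch of that idea (the helper name is my own, not a standard function):

    import math

    # Rough count of how many powers of ten separate two positive
    # quantities, assuming a >= b; 0 means "same order of magnitude".
    def orders_of_magnitude_apart(a, b):
        return math.floor(math.log10(a / b))

    print(orders_of_magnitude_apart(3000, 2500))   # 0 -> same order of magnitude
    print(orders_of_magnitude_apart(5000, 3))      # 3 -> about three powers of ten apart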

2006-08-03 18:15:51 · answer #5 · answered by Lindy357 3 · 0 0

Magnitude is the brightness of a star. It ranges from negative to positive numbers, usually between -10 and +10, because that's where most of the stars we know fall.

2006-08-03 18:06:46 · answer #6 · answered by Anonymous · 0 0

The apparent magnitude (m) of a star, planet or other celestial body is a measure of its apparent brightness as seen by an observer on Earth. The brighter the object appears, the lower the numerical value of its magnitude.

Explanation
The scale upon which magnitude is measured has its origin in the Hellenistic practice of dividing those stars visible to the naked eye into six magnitudes. The brightest stars were said to be of first magnitude (m = 1), while the faintest were of sixth magnitude (m = 6), the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered to be twice the brightness of the following grade (a logarithmic scale). This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to have originated with Hipparchus. This original system did not measure the magnitude of the Sun.

In 1856, Pogson formalized the system by defining a typical first magnitude star as a star that is 100 times as bright as a typical sixth magnitude star; thus, a first magnitude star is about 2.512 times as bright as a second magnitude star. The fifth root of 100, an irrational number approximately equal to 2.512, is known as Pogson's Ratio[1]. Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they first switched to Vega as the standard reference star, and then switched to using tabulated zero points for the measured fluxes[2]. The magnitude depends on the wavelength band (see below).

The modern system is no longer limited to 6 magnitudes or only to visible light. Very bright objects have negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has an apparent magnitude of −1.46. The modern scale includes the Moon and the Sun; the full Moon has an apparent magnitude of −12.6 and the Sun has an apparent magnitude of −26.8. The Hubble Space Telescope has located stars with magnitudes of 30 at visible wavelengths and the Keck telescopes have located similarly faint stars in the infrared.

Scale of apparent magnitudes

App. Mag.   Celestial object
−26.73      Sun
−12.6       Full Moon
−8.0        Maximum brightness of an Iridium Flare
−4.4        Maximum brightness of Venus
−4.0        Faintest objects observable during the day with the naked eye
−2.8        Maximum brightness of Mars
−1.5        Brightest star (except for the Sun) at visible wavelengths: Sirius
−0.7        Second brightest star: Canopus
0           The zero point by definition: this used to be Vega
            (see references for the modern zero point)
3           Faintest stars visible in an urban neighborhood
6           Faintest stars observable with the naked eye
12.6        Brightest quasar
27          Faintest objects observable in visible light with 8 m ground-based telescopes
30          Faintest objects observable in visible light with the Hubble Space Telescope
38          Faintest objects observable in visible light with the planned OWL (2020)

(see also List of brightest stars)

These are only approximate values at visible wavelengths (in reality the values depend on the precise bandpass used) — see airglow for more details of telescope sensitivity.

As the amount of light received actually depends on the thickness of the atmosphere in the line of sight to the object, apparent magnitudes are normalized to the value the object would have outside the atmosphere. The dimmer an object appears, the higher its apparent magnitude. Note that apparent brightness is not equal to actual brightness: an extremely bright object may appear quite dim if it is far away. The rate at which apparent brightness falls off as the distance to an object increases is given by the inverse-square law (at cosmological distance scales this is no longer quite true, because of the curvature of space).

The absolute magnitude, M, of a star or galaxy is the apparent magnitude it would have if it were 10 parsecs (about 32.6 light-years) away; that of a planet (or other solar system body) is the apparent magnitude it would have if it were 1 astronomical unit away from both the Sun and the Earth. The absolute magnitude of the Sun is 4.83 in the V band (yellow) and 5.48 in the B band (blue).

The apparent magnitude in the band x can be defined as

m_x = −2.5 log10(F_x) + C

where F_x is the observed flux in the band x, and C is a constant that depends on the units of the flux and the band. The constant is defined in Aller et al. 1982 for the most commonly used system.

The variation in brightness between two luminous objects can be calculated another way, by subtracting the magnitude number of the brighter object from the magnitude number of the fainter object, then using the difference as an exponent for the base number 2.512; that is to say, m_f − m_b = x, and 2.512^x = variation in brightness.
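
As a minimal Python sketch of that rule (the function name is my own), which also reproduces the two worked examples that follow:

    # Brightness ratio implied by a magnitude difference;
    # the fainter object has the larger magnitude number.
    def brightness_ratio(m_faint, m_bright):
        return 2.512 ** (m_faint - m_bright)

    print(brightness_ratio(-12.6, -26.73))   # Sun vs. full Moon: about 449,000
    print(brightness_ratio(1.97, -1.44))     # Sirius vs. Polaris: about 23.1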


Example 1

In terms of apparent magnitude (m), what is the difference in brightness between the Sun and the full moon?


(m_f − m_b = x)

(2.512^x = variation in brightness)


The apparent magnitude of the Sun is -26.73, and the apparent magnitude of the full moon is -12.6. The full moon is the fainter of the two objects, while Sun is the brighter.


(m_f − m_b = x)

(−12.6 − (−26.73) = x)

(−12.6 − (−26.73) = 14.13)

(x = 14.13)

(2.512^x = variation in brightness)

(2.512^14.13 = variation in brightness)

(2.512^14.13 = 449,032.16)

(variation in brightness = 449,032.16)


In terms of apparent magnitude, the Sun is more than 449,032 times brighter than the full moon. This is a good reason to avoid looking directly at the Sun, even during a solar eclipse.

Example 2

In terms of apparent magnitude (m), what is the difference in brightness between Sirius and Polaris?

(m_f − m_b = x)

(2.512^x = variation in brightness)

The apparent magnitude of Sirius is -1.44, and the apparent magnitude of Polaris is 1.97. Polaris is the fainter of the two stars, while Sirius is the brighter.

(m_f − m_b = x)

(1.97 − (−1.44) = x)

(1.97 − (−1.44) = 3.41)

(x = 3.41)

(2.512^x = variation in brightness)

(2.512^3.41 = variation in brightness)

(2.512^3.41 = 23.124)

(variation in brightness = 23.124)

In terms of apparent magnitude, Sirius is 23.124 times brighter than Polaris, the North Star.

Another thing to notice is that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... A common misconception is that the logarithmic nature of the scale is due to the fact that the human eye itself has a logarithmic response. In Pogson's time this was thought to be true (see Weber-Fechner law), but it is now believed that the response is a power law (see Stevens' power law)[3].
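
Going the other way, from a brightness ratio back to a magnitude difference, is just a base-2.512 logarithm; here is a minimal Python sketch with names of my own choosing:

    import math

    POGSON_RATIO = 100 ** (1 / 5)   # about 2.5118864

    # Magnitude difference corresponding to a given brightness ratio;
    # equivalent to 2.5 * log10(ratio).
    def magnitude_difference(ratio):
        return math.log(ratio, POGSON_RATIO)

    print(POGSON_RATIO ** 3.2)          # about 19.05, as stated above
    print(magnitude_difference(19.05))  # about 3.2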

Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range in daylight). The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the light-adapted human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, more or less the same as visual magnitude.

Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, since they emit extremely little visible light but are strongest in the infrared.

Measures of magnitude need cautious treatment, and it is extremely important to measure like with like. On early 20th-century and older orthochromatic (blue-sensitive) photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant Betelgeuse (an irregular variable, at maximum) are reversed compared to what our eyes see, since this archaic film is more sensitive to blue light than to red light. Magnitudes obtained from this method are known as photographic magnitudes, and are now considered obsolete.

For objects within our Galaxy with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object. This relationship does not apply for objects at very great distances (far beyond our galaxy), since a correction for General Relativity must then be taken into account due to the non-Euclidean nature of space.
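
That rule is usually written as the distance modulus, m − M = 5 log10(d / 10 pc), which is not spelled out above; a minimal Python sketch of it (the function name is mine, interstellar extinction ignored):

    import math

    # Apparent magnitude of an object with a given absolute magnitude,
    # seen from a distance given in parsecs.
    def apparent_magnitude(absolute_mag, distance_parsecs):
        return absolute_mag + 5 * math.log10(distance_parsecs / 10)

    print(apparent_magnitude(4.83, 10))    # 4.83: at 10 pc, apparent equals absolute
    print(apparent_magnitude(4.83, 100))   # 9.83: ten times farther, 5 magnitudes fainter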

Source: http://en.wikipedia.org/wiki/Apparent_magnitude

2006-08-03 20:31:23 · answer #7 · answered by Thuy Nguyen 2 · 0 0
