Apparent magnitude

The apparent magnitude (m) of a celestial body is a measure of its brightness as seen by an observer on Earth, normalized to the value it would have in the absence of the atmosphere. The brighter the object appears, the lower the value of its magnitude.

Explanation

The scale upon which magnitude is now measured has its origin in the Hellenistic practice of dividing those stars visible to the naked eye into six magnitudes. The brightest stars were said to be of first magnitude (m = 1), while the faintest were of sixth magnitude (m = 6), the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered to be twice the brightness of the following grade (a logarithmic scale). This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to have originated with Hipparchus. This original system did not measure the magnitude of the Sun.

In 1856, Norman Robert Pogson formalized the system by defining a typical first magnitude star as a star that is 100 times as bright as a typical sixth magnitude star; thus, a first magnitude star is about 2.512 times as bright as a second magnitude star. The fifth root of 100 is known as Pogson's ratio.[1] Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they first switched to Vega as the standard reference star, and then switched to using tabulated zero points for the measured fluxes.[2] The magnitude depends on the wavelength band (see below).
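
Written out, the defining relation is

\sqrt[5]{100} = 100^{1/5} \approx 2.512

so each step of one magnitude corresponds to a brightness factor of about 2.512, and a difference of five magnitudes corresponds to a brightness factor of exactly 100 (since (100^{1/5})^5 = 100).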

The modern system is no longer limited to six magnitudes or to visible light. Very bright objects have negative magnitudes. For example, Sirius, the brightest star in the night sky, has an apparent magnitude of −1.4. The modern scale includes the Moon and the Sun; the full Moon has an apparent magnitude of −12.6 and the Sun has an apparent magnitude of −26.73. The Hubble Space Telescope has detected stars as faint as magnitude 30 at visible wavelengths, and the Keck telescopes have detected similarly faint stars in the infrared.

Apparent magnitudes of known celestial objects
App. Mag. Celestial object
−26.73 Sun (449,000 times brighter than the full Moon)
−20 Sun (as seen from Neptune)
−12.6 Full Moon
−8.0 Maximum brightness of an Iridium satellite flare
−6.0 The Crab supernova (SN 1054) of AD 1054 (6,500 light-years away)
−4.7 Maximum brightness of Venus and of the International Space Station (when the ISS is at its perigee and fully lit by the Sun)[3]
−3.9 Faintest objects observable during the day with naked eye
−3.8 Minimum brightness of Venus when it is on the far side of the Sun
−3.0 Maximum brightness of Mars
−2.8 Maximum brightness of Jupiter
−1.9 Maximum brightness of Mercury
−1.47 Brightest star (except for the sun) at visible wavelengths: Sirius
−0.7 Second-brightest star: Canopus
−0.24 Maximum brightness of Saturn
0 The zero point of the scale by definition: historically, this was set by Vega
3 Faintest stars visible in an urban neighborhood with naked eye
3.2 Maximum brightness of Vesta
4.13 Maximum brightness of Pallas
4.6 Maximum brightness of Ganymede
5.5 Maximum brightness of Uranus
6.5 Faintest stars observable with naked eye under perfect conditions
6.7 Maximum brightness of Ceres
7.7 Maximum brightness of Neptune
9.1 Maximum brightness of 10 Hygiea
9.5 Faintest objects visible with binoculars
10.2 Maximum brightness of Iapetus
12.9 Brightest quasar, 3C 273 (2.4 billion light-years away)
13.65 Maximum brightness of Pluto (about 725 times fainter than the naked-eye limit of magnitude 6.5)
18.7 Current opposition brightness of Eris
23 Maximum brightness of Pluto's smallest moons Hydra and Nix
27 Faintest objects observable in visible light with 8m ground-based telescopes
30 Faintest objects observable in visible light with Hubble Space Telescope
35 Sedna at aphelion (900 AU)[4]
(see also List of brightest stars)

These are only approximate values at visible wavelengths; in reality the values depend on the precise bandpass used. See airglow for more details on the limits of telescope sensitivity.

Because the amount of light actually received depends on the thickness of the Earth's atmosphere along the line of sight to the object, apparent magnitudes are normalized to the values they would have in the absence of the atmosphere. The dimmer an object appears, the higher its apparent magnitude. Note that apparent brightness varies with distance: an extremely luminous object may appear quite dim if it is far away, since received brightness falls off with the square of the distance. The absolute magnitude, M, of a celestial body (outside the Solar System) is the apparent magnitude it would have if it were 10 parsecs (about 32.6 light-years) away; that of a planet (or other Solar System body) is the apparent magnitude it would have if it were 1 astronomical unit away from both the Sun and the Earth. The absolute magnitude of the Sun is 4.83 in the V band (yellow) and 5.48 in the B band (blue).
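
As a consistency check of these definitions, the standard distance-modulus relation (a direct consequence of the inverse-square law) recovers the Sun's absolute magnitude from its apparent magnitude. Since 1 astronomical unit is about 4.848 \times 10^{-6} parsecs,

M = m - 5(\log_{10} d_{\mathrm{pc}} - 1) = -26.73 - 5(\log_{10}(4.848 \times 10^{-6}) - 1) \approx 4.8

in agreement with the V-band value quoted above.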

The apparent magnitude in the band x can be defined as

m_x = -2.5 \log_{10} (F_x) + C

where F_x is the observed flux in the band x, and C is a constant that depends on the units of the flux and on the band. (The factor of 2.5 arises because \log_{\sqrt[5]{100}} F = \frac{\log_{10} F}{\log_{10} 100^{1/5}} = 2.5 \log_{10} F.) The constant C is defined in Aller et al. 1982 for the most commonly used system.
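
As a minimal illustration of this definition in code (a Python sketch; the zero point used here is an arbitrary placeholder, not the calibrated constant from Aller et al.):

    import math

    def apparent_magnitude(flux, zero_point=0.0):
        """Apparent magnitude m_x = -2.5 * log10(F_x) + C."""
        return -2.5 * math.log10(flux) + zero_point

    # With C = 0, a flux ratio of 100 spans exactly 5 magnitudes,
    # as the definition requires.
    print(apparent_magnitude(1.0) - apparent_magnitude(100.0))   # 5.0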

The variation in brightness between two luminous objects can be calculated another way: by subtracting the magnitude number of the brighter object from the magnitude number of the fainter object, then using the difference as an exponent for the base number 2.512; that is to say, m_f - m_b = x and 2.512^x = variation in brightness.

Example 1

What is the ratio in brightness between the Sun and the full Moon?

m_f - m_b = x

2.512^x = variation in brightness

The apparent magnitude of the Sun is −26.73, and the apparent magnitude of the full Moon is −12.6. The full Moon is the fainter of the two objects, while the Sun is the brighter.

Difference in magnitude:

x = m_f - m_b = (-12.6) - (-26.73) = 14.13

Variation in brightness:

v_b = 2.512^x = 2.512^{14.13} \approx 449{,}032

In terms of apparent magnitude, the Sun is about 449,000 times brighter than the full Moon.

Example 2

What is the ratio in brightness between Sirius and Polaris?

m_f - m_b = x

2.512^x = variation in brightness

The apparent magnitude of Sirius is −1.44, and the apparent magnitude of Polaris is 1.97. Polaris is the fainter of the two stars, while Sirius is the brighter.

Difference in magnitude:

x = m_f - m_b = 1.97 - (-1.44) = 3.41

Variation in brightness:

v_b = 2.512^x = 2.512^{3.41} \approx 23.12

In terms of apparent magnitude, Sirius is approximately 23 times brighter than Polaris, the North Star.
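
Both worked examples can be reproduced with a short script (a sketch using only Python's standard library, with the rounded ratio 2.512 as in the text above):

    # Ratio in brightness implied by two apparent magnitudes
    def brightness_ratio(m_faint, m_bright, pogson=2.512):
        return pogson ** (m_faint - m_bright)

    # Example 1: Sun vs. full Moon
    print(brightness_ratio(-12.6, -26.73))   # approx. 449,032

    # Example 2: Polaris vs. Sirius
    print(brightness_ratio(1.97, -1.44))     # approx. 23.12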

Note that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... A common misconception is that the logarithmic nature of the scale arises because the human eye itself has a logarithmic response. In Pogson's time this was thought to be true (see Weber–Fechner law), but it is now believed that the response is a power law (see Stevens' power law).[5]
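
This relationship can be written compactly as a rearrangement of the definition given earlier: for two objects with magnitudes m_1 and m_2 and fluxes F_1 and F_2,

\frac{F_2}{F_1} = 100^{(m_1 - m_2)/5} \approx 2.512^{\,m_1 - m_2}

so, for instance, 100^{3.2/5} = 10^{1.28} \approx 19.05, matching the figure above.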

Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range in daylight). The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the light-adapted human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, more or less the same as visual magnitude.

Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and ultraviolet regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, because they emit extremely little visible light but are brightest in the infrared.

Measures of magnitude need cautious treatment, and it is extremely important to compare like with like. On early 20th-century and older orthochromatic (blue-sensitive) photographic film, the relative brightnesses of the blue supergiant Rigel and the red irregular variable supergiant Betelgeuse (at maximum) are reversed compared with what our eyes perceive, because this archaic film is more sensitive to blue light than to red light. Magnitudes obtained from this method are known as photographic magnitudes and are now considered obsolete.

For objects within our Galaxy with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object. This relationship does not apply for objects at very great distances (far beyond our galaxy), since a correction for General Relativity must then be taken into account due to the non-Euclidean nature of space.
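
A minimal sketch of this rule in code, assuming the standard distance-modulus relation m = M + 5(\log_{10} d_{\mathrm{pc}} - 1) with the distance in parsecs:

    import math

    def apparent_from_absolute(abs_mag, distance_pc):
        """Apparent magnitude of a body with absolute magnitude M at d parsecs."""
        return abs_mag + 5 * (math.log10(distance_pc) - 1)

    # Each tenfold increase in distance adds exactly 5 magnitudes:
    for d_pc in (10, 100, 1000):
        print(d_pc, apparent_from_absolute(0.0, d_pc))   # 0.0, 5.0, 10.0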
