Stellar magnitude is a measure of the brightness of a star or other celestial body. The brighter the object, the lower the number assigned as a magnitude. In ancient times, stars were ranked in six magnitude classes, the first magnitude class containing the brightest stars. In 1856 the English astronomer Norman Robert Pogson proposed the system presently in use. One magnitude is defined as a ratio of brightness of 2.512 times; e.g., a star of magnitude 5.0 is 2.512 times as bright as one of magnitude 6.0. Thus, a difference of five magnitudes corresponds to a brightness ratio of 100 to 1. After standardization and assignment of the zero point, the brightest class was found to contain too great a range of luminosities, and negative magnitudes were introduced to spread the range.
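The Pogson scale described above can be sketched in a few lines of code. This is a minimal illustration (the function name is invented for the example): a difference of 5 magnitudes is defined as a brightness factor of exactly 100, so one magnitude corresponds to a factor of 100^(1/5) ≈ 2.512.

```python
def brightness_ratio(mag_a, mag_b):
    """Ratio of the brightness of object A to object B, given their magnitudes.

    Pogson's definition: 5 magnitudes = a factor of exactly 100 in brightness,
    so 1 magnitude = 100 ** (1/5), approximately 2.512.
    Brighter objects have lower magnitudes, hence (mag_b - mag_a) in the exponent.
    """
    return 100 ** ((mag_b - mag_a) / 5)

# A magnitude-5.0 star compared with a magnitude-6.0 star: about 2.512 times as bright.
print(round(brightness_ratio(5.0, 6.0), 3))
# A five-magnitude difference gives exactly a factor of 100.
print(brightness_ratio(1.0, 6.0))
```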
Apparent magnitude is the brightness of an object as it appears to an observer on Earth. The Sun's apparent magnitude is -26.7, that of the full Moon is about -11, and that of the bright star Sirius, -1.5. The faintest stars visible through the largest telescopes are of approximately apparent magnitude 20. Absolute magnitude is the brightness an object would exhibit if viewed from a distance of 10 parsecs (32.6 light-years). The Sun's absolute magnitude is 4.8.
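The relation between apparent and absolute magnitude can be made concrete with the standard distance-modulus formula, m - M = 5 log10(d / 10 pc), which follows from the Pogson scale combined with the inverse-square dimming of light. The formula itself is not stated in the text above, so this is a supplementary sketch; the function name is invented for the example.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance d in parsecs,
    using the distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# An object already 10 parsecs away has equal apparent and absolute magnitude.
print(absolute_magnitude(5.0, 10.0))

# The Sun: apparent magnitude -26.7 at 1 AU (~4.848e-6 pc) gives a value
# close to the 4.8 quoted above.
print(round(absolute_magnitude(-26.7, 4.848e-6), 2))
```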
Bolometric magnitude is that measured by including a star's entire radiation, not just the portion visible as light. Monochromatic magnitude is that measured only in some very narrow segment of the spectrum. Narrow-band magnitudes are based on slightly wider segments of the spectrum, and broad-band magnitudes on areas wider still. Because ordinary photographic plates are more sensitive to blue light than is the eye, photographic magnitude is sometimes called blue magnitude. Visual magnitude may be called yellow magnitude, because the eye is most sensitive to light of that color.
Excerpt from the Encyclopedia Britannica without permission.