magnitude


Quick Reference

A measure of the brightness of a star. Ancient Greek astronomers defined the brightest stars as being of the first magnitude because they were the first to appear after sunset. The magnitude scale continued in steps of decreasing brightness down to sixth magnitude, for the faintest stars visible to the naked eye under completely dark skies. From these crude beginnings the magnitude scale has been extended and placed on a strictly defined footing (see Pogson Scale), so that a difference of one magnitude corresponds to a brightness ratio of 2.512 (the fifth root of 100), and a difference of 5 magnitudes corresponds to a brightness ratio of exactly one hundred. Ancient magnitude estimates depended solely on the human eye and correspond roughly to the modern V magnitude. The apparent magnitude of a star is its brightness as seen from Earth, whereas the absolute magnitude is a measure of its actual (i.e. intrinsic) brightness; the two differ because the intensity of light falls off with distance, and because of interstellar absorption. When the brightness is measured over all wavelengths, rather than just visible wavelengths, it is known as the bolometric magnitude.
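As a minimal illustrative sketch (assuming the standard Pogson relation m1 - m2 = -2.5 log10(F1/F2) and the usual definition of absolute magnitude as the apparent magnitude a star would have at a distance of 10 parsecs, with interstellar absorption neglected), the Python fragment below converts a magnitude difference into a brightness ratio, and an apparent magnitude plus distance into an absolute magnitude:

import math

def brightness_ratio(m1, m2):
    # Pogson relation: how many times brighter a star of magnitude m1 is
    # than one of magnitude m2 (lower magnitude means brighter).
    return 10 ** (-0.4 * (m1 - m2))

def absolute_magnitude(apparent_mag, distance_pc):
    # Apparent magnitude the star would have at 10 parsecs,
    # ignoring interstellar absorption.
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

print(brightness_ratio(1.0, 2.0))       # one magnitude: ~2.512, the fifth root of 100
print(brightness_ratio(1.0, 6.0))       # five magnitudes: exactly 100.0
print(absolute_magnitude(-1.46, 2.64))  # Sirius (approx. published values): ~ +1.4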

Subjects: Astronomy and Astrophysics.

