By Phil Harrington

Reproduced from the Stargazer - Volume 6 - Winter, 1990

It should be apparent to even the most casual observer that not all stars shine with the same intensity in the night sky. Some appear quite bright, while others are barely discernible. The ancient Greek astronomer Hipparchus was the first to devise a magnitude system for categorizing stars according to their brightness. His method was quite simple. The brightest stars were labeled first magnitude. Those that appeared half as bright were second magnitude. Others half as bright as second magnitude became third magnitude, and so on. The dimmest stars visible to the eye were labeled sixth magnitude.

The magnitude system we use today is far more precise but still strongly reminiscent of that of Hipparchus. By contemporary standards, a magnitude 1.0 star is exactly 100 times brighter than one of magnitude 6.0, a five-magnitude difference. A one-magnitude step therefore corresponds to a brightness factor of about 2.512, since

2.512 × 2.512 × 2.512 × 2.512 × 2.512 ≈ 100

Thus, a first magnitude star is 2.512 times brighter than a second magnitude star, which in turn is 2.512 times brighter than a third magnitude star, and so on. From this we can see that a first magnitude star is 2.512 × 2.512, or about 6.31, times brighter than a third magnitude one.
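The arithmetic above can be checked with a short calculation. The helper below is a sketch (the function name is ours); it turns a magnitude difference into a brightness ratio using the defining relation that five magnitudes equal a factor of exactly 100:

```python
def brightness_ratio(m1, m2):
    """Ratio of a magnitude-m1 star's brightness to a magnitude-m2 star's.

    By definition, a 5-magnitude difference is exactly a factor of 100,
    so each magnitude is a factor of 100 ** (1/5), about 2.512.
    """
    return 100 ** ((m2 - m1) / 5)

print(round(brightness_ratio(1.0, 6.0)))       # five magnitudes -> 100
print(round(brightness_ratio(1.0, 2.0), 3))    # one magnitude  -> 2.512
print(round(brightness_ratio(1.0, 3.0), 2))    # two magnitudes -> 6.31
```

Note that the formula works just as well for negative magnitudes, such as Sirius at -1.4.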

Unlike in Hipparchus' system, the brightest stars are no longer considered first magnitude. In redefining magnitudes during the 19th century, astronomers realized that some planets and bright stars shine with even greater brilliance. The top end of the scale was extended to include magnitude 0 stars, as well as objects with even brighter negative magnitudes. For instance, Sirius, the brightest nighttime star, is rated at magnitude -1.4, while the sun is -26.7. The table below itemizes the brightest objects in our sky. (The magnitudes shown for the planets are for their maximum brightness.)

Object            Magnitude
Sun               -26.7
Alpha Centauri    -0.3

How bright an object appears from earth says nothing about its true, intrinsic luminosity. For this, we must speak of a star's "absolute magnitude." Quite simply, the absolute magnitude of a star is just its apparent magnitude as seen from a certain yardstick distance away. In this case, a distance of 32.6 light years or 10 parsecs was chosen. From that far-off point in space, the sun would be only magnitude +4.8, indicating a relatively average luminosity. Deneb, the brightest star in Cygnus at apparent magnitude +1.3, beams with an absolute magnitude of -7.1. We see that many stars are brighter than the sun, but many are fainter as well. The sun is, after all, just an average star in the scheme of the Universe.
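The definition above is equivalent to the standard distance-modulus relation, M = m − 5 log₁₀(d / 10 pc), which rescales an apparent magnitude m at distance d to the 10-parsec yardstick. The sketch below (function names are ours) recovers the sun's absolute magnitude from its distance of one astronomical unit (about 4.848 × 10⁻⁶ parsecs), and inverts the same relation to find the distance implied by Deneb's two magnitudes quoted above:

```python
import math

def absolute_magnitude(m, distance_pc):
    """Apparent magnitude m rescaled to the standard 10-parsec distance."""
    return m - 5 * math.log10(distance_pc / 10)

def distance_parsecs(m, M):
    """Distance implied by apparent magnitude m and absolute magnitude M."""
    return 10 ** ((m - M) / 5 + 1)

# Sun: m = -26.7 seen from 1 AU (about 4.848e-6 pc) -> roughly +4.9,
# matching the article's +4.8 to within the rounding of the inputs.
print(round(absolute_magnitude(-26.7, 4.848e-6), 1))

# Deneb: m = +1.3 and M = -7.1 imply a distance of roughly 480 parsecs.
print(round(distance_parsecs(1.3, -7.1)))
```

Dividing the result for Deneb by the 10-parsec yardstick shows why it still ranks among the brightest stars despite lying dozens of times farther away than that standard distance.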