
Measurement of Magnitude

People sometimes ask us why some stars and planets have minus signs in front of their magnitudes, and what the magnitude system is that astronomers use to classify the brightness of stars. Well, it all started back in 129 BC, when the Greek astronomer Hipparchus developed the very first star catalogue. He ranked his stars simply by calling the brightest ones "first magnitude"; stars which were not quite as bright he called "second magnitude", and so on down to the faintest stars he could see, which he called "sixth magnitude". Ptolemy, another Greek, copied the system in AD 140 when he created his own star catalogue, and for the next 1,400 years this system remained the basic one used in astronomy texts. The unfortunate thing about the way this developed is that the brighter a star is, the lower its magnitude number, which is the opposite of what you might logically expect, but that's the way it happened, and so that's what we have to live with.

Then along came the Italians, and as you might expect, they weren't happy simply to accept something the Greeks had invented. Galileo Galilei, the Italian astronomer who first observed the moons of Jupiter using a home-made telescope, discovered that through his telescope he could see vastly more stars than could be seen with the naked eye. He denoted the brightest of the stars which could only be seen telescopically as "seventh magnitude", with fainter and fainter telescopic stars as eighth or ninth magnitude, and so on. As telescopes kept getting bigger and better, astronomers added more and more magnitudes to the list, until today we have equipment like the Hubble Space Telescope, which is capable of seeing all the way down to 30th magnitude. To give you a guide, with our unaided eyes we can see stars down to about 6th magnitude in moderately dark skies. A pair of 50mm binoculars will take you down to about 9th magnitude, and a 6-inch reflecting telescope (a typical smaller-budget amateur instrument) can go as far as 13th magnitude.

So, that's OK for an introduction, but what about the actual relationship between magnitudes? Is it any more scientifically based these days, and if so, what scale is used? Well, by the middle of the 19th century astronomers realised that the whole magnitude scale needed redefining, and a gentleman by the name of Norman R. Pogson proposed that a difference of five magnitudes be defined as a brightness ratio of exactly 100 to 1. This was very close to the values approximated by the existing scale, and no major changes would be needed, so Pogson's proposal was quickly accepted. One magnitude therefore corresponds to a brightness ratio of exactly the fifth root of 100, or about 2.512, a value which has become known as the Pogson ratio. What this also means is that we have a logarithmic scale: each step of one magnitude multiplies the brightness ratio by 2.512. So a first magnitude star is 2.512 times fainter than a zero magnitude star, a second magnitude star is about 6.3 (2.512 x 2.512) times fainter than zero magnitude, a third is about 16 (6.3 x 2.512) times fainter, and so on. The consequences of this are best explained using a table:

Magnitude difference      Brightness ratio
         1                        2.5 : 1
         2                        6.3 : 1
         3                         16 : 1
         4                         40 : 1
         5                        100 : 1
        10                     10,000 : 1
        15                  1,000,000 : 1
        20                100,000,000 : 1
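If you'd like to check this arithmetic for yourself, it only takes a couple of lines of Python. The little sketch below (the function name is just mine, for illustration) simply raises the Pogson ratio to the power of the magnitude difference:

    POGSON_RATIO = 100 ** (1 / 5)   # the fifth root of 100, about 2.512

    def brightness_ratio(fainter_mag, brighter_mag):
        # How many times brighter the lower-magnitude object appears.
        return POGSON_RATIO ** (fainter_mag - brighter_mag)

    print(brightness_ratio(1, 0))   # one magnitude step: about 2.512
    print(brightness_ratio(6, 1))   # five magnitudes: 100, give or take floating-point rounding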

One of the problems with having a precise scale is that some "first magnitude" stars are actually a good bit brighter than others, and so the only thing to do was to extend the scale past zero into negative values. Therefore, stars like Rigel in Orion, Capella in Auriga, Arcturus in Bootes and Vega in Lyra are all of roughly zero magnitude, and Sirius in Canis Major (the brightest star in the night sky) is minus 1.5. There are even brighter objects: Venus can be as bright as minus 4.4, the Full Moon is about minus 12.5, and the Sun is minus 26.7.

Hopefully this explains why objects seem to get very faint very quickly as we go down the magnitude scale, and why a magnitude 7 star is over 600 times fainter than Rigel.
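If you run the same brightness_ratio sketch from above on the figures quoted in this article (taking Rigel as magnitude zero), the numbers come out like this:

    print(round(brightness_ratio(7, 0)))          # a magnitude 7 star vs Rigel: about 630 times fainter
    print(round(brightness_ratio(-12.5, -26.7)))  # Full Moon vs the Sun: the Sun is roughly 480,000 times brighter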

Absolute Magnitude
One additional piece of useful information is a measure of an object's real brightness, a value called the absolute magnitude. We don't know how bright an object really is until we take into account its distance from us, so in order to measure this astronomers created the absolute magnitude scale. An object's absolute magnitude is simply how bright it would appear if it were placed at a standard distance of 10 parsecs (32.6 light years). By this measurement our Sun would shine at the very unimpressive level of magnitude 4.85, but Rigel would be a magnificent minus 8, nearly as bright as the quarter Moon. At the other extreme our nearest stellar neighbour, the red dwarf Proxima Centauri, would only glow at magnitude 15.6 and be just visible in a 16 inch telescope.
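The conversion from apparent to absolute magnitude is also easy to try for yourself. This sketch uses the standard relation M = m - 5 log10(d / 10 parsecs); the apparent magnitudes and distances in the comments are approximate values I have assumed for illustration:

    import math

    def absolute_magnitude(apparent_mag, distance_parsecs):
        # Shift the apparent magnitude to the standard distance of 10 parsecs.
        return apparent_mag - 5 * math.log10(distance_parsecs / 10)

    # The Sun: apparent magnitude -26.7 at 1 AU (about 4.85e-6 parsecs)
    print(absolute_magnitude(-26.7, 4.85e-6))   # about 4.9, close to the 4.85 quoted above

    # Proxima Centauri: apparent magnitude about 11.1 at about 1.3 parsecs
    print(absolute_magnitude(11.1, 1.3))        # about 15.5, close to the 15.6 quoted above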
