Magnitude and Brightness | Starlight | Space FM
Magnitude 13.1 - Understand the astronomical magnitude scale and how apparent magnitude relates to the brightness of stars as viewed from Earth
We measure the brightness of a star by its magnitude. There are two types of magnitude: apparent magnitude is how bright an object appears to us on Earth; absolute magnitude is how bright a star would appear if viewed from a standard distance.
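The standard distance used for absolute magnitude is 10 parsecs. Assuming that convention, the two magnitudes are linked by the distance modulus, M = m − 5·log₁₀(d/10 pc). A minimal Python sketch (the function name is illustrative, and the Sun's figures are approximate):

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Absolute magnitude M from apparent magnitude m and distance in parsecs:
    M = m - 5 * log10(d / 10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The Sun: apparent magnitude m ~ -26.74, distance 1 AU ~ 4.848e-6 pc
print(round(absolute_magnitude(-26.74, 4.848e-6), 2))  # ~ 4.83
```

Moving the Sun out to 10 parsecs would dim it from the brightest object in our sky to a modest naked-eye star of about magnitude +4.8.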
Magnitude, in astronomy, is a measure of the brightness of a star or other celestial body. The brighter the object, the lower the number assigned as its magnitude. In ancient times, stars were ranked in six magnitude classes, with the first magnitude class containing the brightest stars.
In 1856 the English astronomer Norman Robert Pogson proposed the system presently in use. One magnitude is defined as a brightness ratio of 2.512 (the fifth root of 100); e.g., a star of magnitude 5.0 is 2.512 times as bright as one of magnitude 6.0. Thus, a difference of five magnitudes corresponds to a brightness ratio of exactly 100 to 1.
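Pogson's definition can be sketched directly: a magnitude difference Δm corresponds to a brightness ratio of 100^(Δm/5) ≈ 2.512^Δm. A short Python illustration (the helper name is ours, not from the article):

```python
def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """Factor by which object A (magnitude mag_a) outshines object B (mag_b).
    One magnitude step = 100 ** (1/5), approximately 2.512."""
    return 100 ** ((mag_b - mag_a) / 5)

print(round(brightness_ratio(5.0, 6.0), 3))  # 2.512
print(round(brightness_ratio(1.0, 6.0), 1))  # 100.0
```

Note that because brighter objects get lower numbers, the ratio uses mag_b − mag_a: the fainter (higher-numbered) object goes second.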
The method we use today to compare the apparent brightness of stars is rooted in antiquity. Hipparchus, a Greek astronomer who lived in the second century BC, is usually credited with formulating a system to classify the brightness of stars. He called the brightest stars "first magnitude."
One of the most fundamental properties of a star is its brightness. Astronomers measure stellar brightness in units called magnitudes, a scale that can seem counterintuitive and confusing at first.
A star with a magnitude of +5.0 is about 2.512 times fainter than a star with a magnitude of +4.0. Two stars that differ by 5.0 magnitudes differ in brightness by a factor of 100.
For centuries, astronomers have used and refined a method of determining the brightness of stars and other celestial objects: the magnitude system.
Bolometric magnitude is measured by including a star's entire radiation, not just the portion visible as light. Monochromatic magnitude is measured only in a very narrow segment of the spectrum.
The brightest stars, those traditionally referred to as first-magnitude stars, actually turned out (when measured accurately) not to be identical in brightness.
In other words, it would take about 15 stars with the brightness of Sirius, the brightest star of the winter sky, concentrated in one spot to equal the brightness of Venus.
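The factor of roughly 15 follows directly from the magnitude difference. Using approximate apparent magnitudes (Venus near its brightest at about −4.4, Sirius at about −1.46; both figures are assumptions here, and Venus in particular varies):

```python
def brightness_ratio(mag_a, mag_b):
    # How many times brighter object A (magnitude mag_a) is than object B.
    return 100 ** ((mag_b - mag_a) / 5)

m_venus = -4.4    # approximate apparent magnitude near greatest brilliancy
m_sirius = -1.46  # approximate apparent magnitude
print(round(brightness_ratio(m_venus, m_sirius)))  # ~ 15
```

A difference of about 2.9 magnitudes gives 100^(2.9/5) ≈ 15, matching the comparison in the text.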