the brightness of astronomical objects such as stars is quantified on a logarithmic scale whose unit is the “stellar magnitude”:
a difference of 5 stellar magnitudes equates to 100x brightness = ~6.6 f stops
a difference of 7.5 magnitudes equates to 1000x brightness = ~10 f stops
a difference of 10 magnitudes equates to 10,000x brightness = ~13.3 f stops
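the magnitude-to-f-stop equivalences above follow from two facts: each 2.5 magnitudes is a factor of 10 in brightness, and each f stop is a factor of 2. a minimal sketch (function names are mine, not from the source):

```python
import math

def magnitude_to_ratio(delta_mag):
    # each 2.5 magnitudes = one factor of 10 in brightness
    return 10 ** (delta_mag / 2.5)

def ratio_to_fstops(ratio):
    # each f stop is a doubling of light, so stops = log2(ratio)
    return math.log2(ratio)

for dm in (5, 7.5, 10):
    r = magnitude_to_ratio(dm)
    print(f"{dm} magnitudes = {r:,.0f}x brightness = ~{ratio_to_fstops(r):.1f} f stops")
```

running this reproduces the three rows above (100x/~6.6, 1000x/~10.0, 10,000x/~13.3).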
midday sun = -26.7 which results in ~130,000 lumens/sq.m (lux) illuminance onto earth's surface = ~48,000,000,000 x brighter than zero magnitude
full moon overhead = -12.5 which results in 0.267 lumens/sq.m illuminance onto earth's surface = 100,000 x brighter than zero magnitude
venus at brightest = -4.3 which results in 0.000139 lumens/sq.m illuminance onto earth's surface
sirius = -1.4 which results in 0.0000098 lumens/sq.m illuminance onto earth's surface
zero magnitude results in 0.00000265 lumens/sq.m illuminance onto earth's surface
1st magnitude results in 0.000000105 lumens/sq.m illuminance onto earth's surface = 0.398 x as bright as zero magnitude
5th magnitude results in 0.0000000265 lumens/sq.m illuminance onto earth's surface = 0.01 x as bright as zero magnitude
6th magnitude results in 0.0000000105 lumens/sq.m illuminance onto earth's surface = 0.004 x as bright as zero magnitude
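every illuminance in the list above can be derived from the zero-magnitude figure by scaling with 10^(-magnitude/2.5). a minimal sketch, using the 0.00000265 lux zero point from the list (names and the object list are illustrative):

```python
# illuminance of a zero-magnitude star at earth's surface (from the list above)
ZERO_MAG_LUX = 0.00000265

def magnitude_to_lux(mag):
    # brighter objects have more negative magnitudes, hence the minus sign
    return ZERO_MAG_LUX * 10 ** (-mag / 2.5)

for name, mag in [("full moon", -12.5), ("venus", -4.3), ("sirius", -1.4), ("6th magnitude", 6)]:
    print(f"{name}: {magnitude_to_lux(mag):.3g} lux")
```

the results match the list to within rounding of the zero point (e.g. the full moon comes out at ~0.265 lux vs the quoted 0.267).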
relative brightness ratio = 10^(-(magnitude difference)/2.5)
magnitude difference = -2.5 x log10(relative brightness ratio)
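the two formulas above are inverses of each other, which is easy to check numerically (function names are mine):

```python
import math

def brightness_ratio(mag_diff):
    # relative brightness ratio = 10^(-(magnitude difference)/2.5)
    return 10 ** (-mag_diff / 2.5)

def magnitude_difference(ratio):
    # magnitude difference = -2.5 x log10(relative brightness ratio)
    return -2.5 * math.log10(ratio)

# sirius (-1.4) vs a zero-magnitude star: difference of -1.4 magnitudes
print(brightness_ratio(-1.4))       # ~3.63x brighter, matching the list above
print(magnitude_difference(0.398))  # ~1.0, matching the 1st-magnitude entry
```

note the sign convention: a negative magnitude difference means the first object is brighter, giving a ratio above 1.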