How bright would a blue giant star be 10 ly away?

Assuming that there was a blue giant star 10 light years away, how bright would it be?
Would it be brighter than Jupiter, the Moon, etc.?
 
Rigel has an apparent magnitude of 0.12 at 860 light years away, making it the 7th brightest star in the night sky. If my math is correct, it would have a magnitude of -9.58 at 10 light years away, which would put it at about 429 times brighter than Jupiter, but the full Moon would in turn still be about 23 times as bright as it.

There aren't really any good reference points in between, because in terms of celestial objects the full Moon has an apparent magnitude of about -13 while Venus at maximum brightness is only about -5, where each point lower means 2.512 times brighter.
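If anyone wants to check that arithmetic, here's a rough sketch in Python. The reference magnitudes for Jupiter and the full Moon are round assumed values, so the ratios land close to, but not exactly on, the figures above:

```python
import math

def scaled_magnitude(m_ref, d_ref_ly, d_new_ly):
    """Apparent magnitude at a new distance (inverse-square law):
    m_new = m_ref + 5 * log10(d_new / d_ref)."""
    return m_ref + 5 * math.log10(d_new_ly / d_ref_ly)

def times_brighter(m_bright, m_faint):
    """How many times brighter the first object is than the second."""
    return 10 ** (0.4 * (m_faint - m_bright))

# Rigel: m = 0.12 at roughly 860 light years (figures quoted above).
m_rigel_10ly = scaled_magnitude(0.12, 860, 10)
print(f"Rigel moved to 10 ly: m = {m_rigel_10ly:.2f}")   # about -9.6

# Assumed round reference magnitudes (not precise catalogue values):
JUPITER_MAX = -2.9    # Jupiter near its brightest
FULL_MOON = -13.0     # full Moon, rounded

print(f"vs Jupiter:   {times_brighter(m_rigel_10ly, JUPITER_MAX):.0f}x brighter")
print(f"vs full Moon: the Moon is {times_brighter(FULL_MOON, m_rigel_10ly):.0f}x brighter")
```

The exact factors shift a little depending on which catalogue magnitudes you plug in, but they come out in the same ballpark.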
 
Thanks... If your numbers are even halfway right, it should at least be visible in the day, I think.
 
Sirius is visible during daytime under favorable conditions, but extremely difficult to spot unless you already know where to look.
 
Notably, our hypothetical Giant Star Friend would be a very bright point source, as opposed to showing a visible disc like the Moon does. I don't think it'd be so bright as to be painful or damaging to look at, but it'd definitely leave much more of an impression on the eye than a direct comparison of 'magnitude' would lead one to believe.

Unless I am badly misunderstanding how magnitude works, mind you.
 
Just stumbled over this and thought I might as well answer as an astrophysicist - brightness is a measure of the apparent magnitude (m) of the star, which is a measure of its intrinsic luminosity (i.e. how much light it gives off in total) and its distance (how far away it is). The distance given here is 10 light years, so we need the intrinsic luminosity of a blue giant. We run into a problem here which is that "blue giant" is not actually a well-defined term, but it generally includes an absolute magnitude of 0 or brighter - absolute magnitude being the apparent magnitude at a distance of exactly 10 parsecs (about 32.6 light years) without anything in the way. So we'll take that as our basis.

I won't bore you with the derivations and calculations, but this basically works out to give us an intrinsic luminosity of about 78.7 times the Sun's (bright) and an apparent magnitude of -2.567. Apparent magnitude is measured on a logarithmic scale where the zero-point is defined by the star Vega (it used to be Polaris, but then we realised Polaris actually fluctuates a bit) and every difference of 1.0 in apparent magnitude is equal to multiplying the brightness by a factor of the fifth root of 100, or about 2.512. So a star of m = 1.0 is about two and a half times brighter than a star of m = 2.0, and a hundred times brighter than one of m = 6.0. Obviously, since "brighter stars" go downward, you can pass zero on this scale by being brighter than Vega, which quite a few things are.
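A minimal sketch of where those two numbers come from, assuming an absolute magnitude of exactly 0 for the blue giant and taking the Sun's absolute (bolometric) magnitude as 4.74 (both constants assumed here, not given above):

```python
import math

LY_PER_PARSEC = 3.2616   # light years per parsec
M_SUN = 4.74             # Sun's absolute (bolometric) magnitude, assumed
M_STAR = 0.0             # "blue giant" taken as absolute magnitude 0, as above

# Luminosity relative to the Sun, from the absolute-magnitude difference.
luminosity_in_suns = 10 ** (0.4 * (M_SUN - M_STAR))
print(f"Intrinsic luminosity: {luminosity_in_suns:.1f} Suns")    # ~78.7

# Distance modulus: m = M + 5 * log10(d_parsec / 10).
d_parsec = 10 / LY_PER_PARSEC                 # 10 light years in parsecs
m_apparent = M_STAR + 5 * math.log10(d_parsec / 10)
print(f"Apparent magnitude at 10 ly: {m_apparent:.3f}")          # ~ -2.567
```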

Our Sun has an apparent magnitude of −26.7, which is so much brighter than our blue giant that I'm not going to bother giving the difference (nothing is as bright as the Sun; it's very luminous and only eight light-minutes away). The Moon averages −12.7 when it's full and −10.0 when it's in its first or third quarter (the amount of surface lit matters less than the 90-degree angle it's being lit at, so it's the same either way), but we can compare our blue giant to Venus at −4.2 or Sirius at −1.46. This results in our star being almost exactly 4.5 times dimmer than Venus but 2.77 times brighter than the Dog Star. Jupiter has an apparent visual magnitude of -2.7, so our star is very slightly (about 12%) dimmer than Jupiter. So! Now you know!
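And the comparisons themselves, spelled out with the reference magnitudes quoted above (a sketch using those same values):

```python
def times_brighter(m_a, m_b):
    """Factor by which the object of magnitude m_a outshines the one of m_b."""
    return 10 ** (0.4 * (m_b - m_a))

m_star = -2.567                               # hypothetical blue giant at 10 ly
VENUS, SIRIUS, JUPITER = -4.2, -1.46, -2.7    # magnitudes quoted above

print(f"Venus vs our star:   {times_brighter(VENUS, m_star):.2f}x")    # ~4.50
print(f"Our star vs Sirius:  {times_brighter(m_star, SIRIUS):.2f}x")   # ~2.77
print(f"Jupiter vs our star: {times_brighter(JUPITER, m_star):.2f}x")  # ~1.13
```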
 
Rigel has an apparent magnitude of 0.12 at 860 light years away, making it the 7th brightest star in the night sky. If my math is correct, it would have a magnitude of -9.58 at 10 light years away, which would put it at about 429 times brighter than Jupiter, but the full Moon would in turn still be about 23 times as bright as it.

There aren't really any good reference points in between, because in terms of celestial objects the full Moon has an apparent magnitude of about -13 while Venus at maximum brightness is only about -5, where each point lower means 2.512 times brighter.
It happens that there are, because of the Moon's phase curve:
table III on page 12
It would match the Moon at a phase angle of about 110 degrees (the two directions differ by only 0.05 magnitudes).
 
brightness is a measure of the apparent magnitude (m) of the star, which is a measure of its intrinsic luminosity (i.e. how much light it gives off in total) and its distance (how far away it is).

This is really into the weeds, here, but do you happen to know why astronomy never switched to candelas in the last 70-ish years? I realize that as an engineering standard for quantifying artificial lights the units are nowhere near the same orders of magnitude, but it seems like "photon power per steradian" is a lot closer to what is actually being talked about than going back and forth to the historical comparative "these ones look brighter than those ones" of magnitude.
 
Candelas are a linear scale, and apparent magnitude is measured on a logarithmic scale where every five points is a hundredfold increase in brightness. The apparent magnitudes of known objects range from the Sun at −26.7 to objects in deep Hubble Space Telescope images of magnitude +31.5; a brightness ratio of about ten to the twenty-third power. If you want to fit all of those on a scale without having to measure most of the things Hubble can see in femtocandelas, a log scale is just easier. And bluntly, most of the time you don't care about the "objective" number of photons reaching you; you care about how bright something is relative to other known quantities so you can calculate things like distances and parallaxes and so on. Astronomy is a very comparative science because rather by definition we can't go up there with a measuring tape, only compare what we can see of Thing A to what we can see of Thing B - and if you're going to be doing a lot of comparing in your calculations anyway, you might as well use a scale where that's the explicit basis of definition.

(That, or just the standard explanation that academia is rooted in its ways in certain aspects and doesn't like changing how it's done things literally since Ancient Greece started categorising stars with each successive magnitude above m = 1 being half as bright as the last just because some newfangled measure of photons per unit area has been worked out. :p)
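Putting a rough number on that dynamic range, using the two magnitudes quoted above:

```python
# Sun at m = -26.7 versus the faintest Hubble deep-field objects at m = +31.5:
delta_m = 31.5 - (-26.7)        # 58.2 magnitudes
ratio = 10 ** (0.4 * delta_m)   # 5 magnitudes = a factor of 100
print(f"{ratio:.1e}")           # ~1.9e+23, i.e. roughly ten to the 23rd power
```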
 
I mean it just doesn't seem to me that "b/a = 10^23" is harder to use in a numerical or intuitive sense, compared to "a = 31.5, b = -26.7; b-a = -58.2". Who cares if the important number is the exponent instead of the coefficient, just take the natural log of it and use that as your "useful value" instead, or something.

Like I could see if there were useful derived units that were a few layers of math away from candelas; off-hand, it seems intuitive to me that since a candela is Power/Radian^2 and since parallax measurements are an angle of arc anyways, you could get something that usefully replaced "magnitude" as a derived unit using "candela" + "diameter of Earth's orbit". (Not 100% sure how I'd go from arc angle to solid angle there, but I dunno what would be most useful.) And, like, maybe using a Luminosity Function weighted off like hydrogen absorption lines, rather than the wavelengths of light the human eye is most sensitive to; or outright dropping Luminous Intensity for Radiant Intensity to measure the total photon energy of radiation instead of "visible light."

Like the "well we keep using it because we've always used it" is a valid argument simply because there's all this knowledge inertia in using the same units we've always used and understanding what the Old Timers already wrote -- I still use "horsepower" to quantify how much electricity a motor should consume, even though the motor replaced the horse like a hundred years ago.

But it's like... "some guy put all the stars he could see in four buckets, and we took the average apparent brightness inside each of those four buckets and ran curve-fitting on them, and the best-fit logarithmic that came out of that is what we use to talk about how bright stuff in the sky is" just seems like a... painful way to measure stuff, no matter how important That Guy was.

It just seems weird to me, as like an amateur Space News Follower, that I've never even heard of an astronomer saying "Magnitude is a pain in the ass, here's a fresh new way to do it that we derived with dimensional analysis from fundamental units" even though that has been basically the entire argument that the SI Academy has been pushing in all the other sciences. That's why it's kilowatts instead of horsepower, joules instead of BTUs, etc. (Well, "as well as" not "instead of", but you know.)

Like maybe this is just me thinking about building instruments, but it seems like you frequently would care about the "objective number of photons reaching you" for pretty much all measurements all the time these days, because "how many photons hit this particular pixel of the camera" is the basic measurement you're taking for each pixel. I mean even if you went back to plate exposures you're looking at how many photons hit the plate, right.

And it seems like Astronomers frequently want to talk about "Absolute Magnitude", which seems to me less, I dunno, useful than Radiant (or Luminous) Flux, in a "plug it into equations and go actually noodle through the math you're interested in" sort of sense, when meanwhile getting from Candelas to Flux is pretty straightforward and isn't harassing you with intermediate complexity before you even get to the part you're trying to think about.
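(For what it's worth, physically anchored flavours of the magnitude scale do exist: the AB system defines m = 0 as a fixed spectral flux density of 3631 janskys, so converting between magnitudes and linear physical units is a one-liner. A rough sketch, with illustrative values only:)

```python
import math

AB_ZERO_POINT_JY = 3631.0   # AB system: m_AB = 0 is defined as 3631 Jy
                            # (1 jansky = 1e-26 W m^-2 Hz^-1)

def ab_mag_to_flux_density(m_ab):
    """Spectral flux density in janskys for a given AB magnitude."""
    return AB_ZERO_POINT_JY * 10 ** (-0.4 * m_ab)

def flux_density_to_ab_mag(f_jy):
    """AB magnitude for a given spectral flux density in janskys."""
    return -2.5 * math.log10(f_jy / AB_ZERO_POINT_JY)

print(ab_mag_to_flux_density(0.0))      # 3631.0 Jy, by definition
print(ab_mag_to_flux_density(25.0))     # ~3.6e-07 Jy (a faint galaxy)
print(flux_density_to_ab_mag(3631.0))   # 0.0
```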
 