Inverse Square Law and the Sun

Movybuf1979

New Member
So, as I have mentioned in a couple of previous posts, I like to "debate" with flat Earthers on YouTube. I got this reply today from someone:

"How about the inverse square LAW of light where if you double the distance from the light source it's a quarter the brightness? They tell us it takes 8 light minutes for light to reach earth from the sun. Factor in the sun's size and distance away and we can then calculate how far away the sun would have to be in order for us to not see it anymore which equivalates to around 6 light hours. Then cross reference that data with a "star" and it's size and distance relative to how light propagates from it and the outcome is we shouldn't be able to see these "stars" at the distances they claim! Once you actually realize what's going on your going to come back and delete these messages because of how silly you sound! Go do some research."

So, I am not even sure how to unpack this. There seems to be a mishmash of misunderstood science and terms, and a faulty conclusion drawn from a faulty premise. I am unsure of how to even address it because his "logic" doesn't seem to make sense to me. Can anyone make heads or tails of what he is talking about?

He also made this claim:

"Apply the inverse square law of light to a full moon. The astronauts would of melted at 10,000,000 + lumens. Wake up."

Which just seems patently ridiculous.
 
You've nailed the flaw, which is a fundamental misunderstanding of the inverse square law. I've heard this before but have never found the source for these flawed calculations. I would ask them to show their calculations because they have to be wrong.

Pluto is about 5.5 light hours from the Sun, and even from there the Sun is still by far the brightest object in the sky, hundreds of times brighter than the full Moon appears from Earth.

Both claims are based on the same bad math or misunderstandings. Also, a lumen is a measure of visible light output (luminous flux), not a measure of power or energy, so the number of lumens falling on an astronaut says nothing by itself about heat. A simple example is light bulbs: a 60 watt incandescent and a 10 watt LED bulb both produce about 725 lumens. Watts are power, lumens are perceived light.
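To make the watts-versus-lumens point concrete, here's a rough Python sketch using the approximate bulb figures quoted above (they're ballpark numbers, not precise specs):

# Lumens measure perceived light output; watts measure electrical power.
bulbs = {
    "60 W incandescent": {"power_w": 60, "output_lm": 725},
    "10 W LED": {"power_w": 10, "output_lm": 725},
}

for name, b in bulbs.items():
    efficacy = b["output_lm"] / b["power_w"]  # lumens per watt
    print(f"{name}: {b['output_lm']} lm from {b['power_w']} W ({efficacy:.0f} lm/W)")

# Same light output, very different power draw, which is why a lumen count
# by itself says nothing about heating.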
 
"Apply the inverse square law of light to a full moon. The astronauts would of melted at 10,000,000 + lumens. Wake up."
This one is easy: the moonlight is spread over a large area. The Moon's disc (diameter 3,474 km) covers about 9.5 million km², while the human body has less than 2 m² of skin, i.e. about 0.000002 km². So a single astronaut doesn't get that much light on the moon; not that much more than you do on Earth on a clear day.
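A quick back-of-the-envelope version of that, using the figures above (the 2 m² of skin is just the rough estimate from the post):

import math

moon_diameter_km = 3474
disc_area_km2 = math.pi * (moon_diameter_km / 2) ** 2   # ~9.5 million km^2
astronaut_area_km2 = 2e-6                               # roughly 2 m^2 of skin

print(f"Moon disc area: {disc_area_km2:,.0f} km^2")
print(f"Fraction of that area an astronaut-sized patch covers: "
      f"{astronaut_area_km2 / disc_area_km2:.1e}")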
 
The inverse square law as they think of it applies only to point sources. For extended sources like the Sun and Moon one has to calculate the irradiance by integrating the object's surface brightness over its angular extent.

These people never show the math to support their claims.
 
we can then calculate how far away the sun would have to be in order for us to not see it anymore which equivalates to around 6 light hours.

Politely ask him to show his working. You can't reliably counter his argument until you properly understand it, which means understanding all his premises and all the deductive steps that follow from them.

Six light hours is a Pluto-like distance, so here's a ready-worked solution, with everyday comparisons, that you can check his numbers against:
https://www.sciencefocus.com/space/how-bright-is-daylight-on-pluto/
External Quote:
The average distance of Pluto from the Sun is about 39 times that of the Earth's. So, on average, the Sun on Pluto looks about 1,520 times fainter than it does on Earth. But this isn't particularly faint. The full Moon is on average about 400,000 times fainter than the Sun. So, doing the maths, this means the Sun seen from Pluto is about 264 times brighter than the full Moon. This is about the amount of light you'd see on Earth when the Sun is around four degrees below the horizon, during 'civil twilight', which is more than enough to read by. Even on Pluto, looking directly at the Sun would probably be painful.
Of course, if that's not enough, don't hesitate to copy his working here for us to gnaw on.
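In the meantime, here's a quick check of the arithmetic in that quote (the 39 AU figure and the ~400,000x Sun-to-full-Moon ratio are the article's own numbers):

distance_ratio = 39          # Pluto is ~39 times farther from the Sun than Earth is
dimming = distance_ratio ** 2                # inverse square law: ~1,521x fainter
sun_vs_full_moon = 400_000                   # Sun vs full Moon brightness, from Earth

print(f"Sunlight at Pluto: {dimming:,}x fainter than at Earth")
print(f"Still {sun_vs_full_moon / dimming:.0f}x brighter than a full Moon")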
 
"How about the inverse square LAW of light where if you double the distance from the light source it's a quarter the brightness? They tell us it takes 8 light minutes for light to reach earth from the sun. Factor in the sun's size and distance away and we can then calculate how far away the sun would have to be in order for us to not see it anymore which equivalates to around 6 light hours. Then cross reference that data with a "star" and it's size and distance relative to how light propagates from it and the outcome is we shouldn't be able to see these "stars" at the distances they claim!"

The human eye (fully dark-adapted) can see light down to about 0.000003 cd/m².
Bright noon sunshine reaches about 1,600,000,000 cd/m².
Because of the inverse square law, the distance factor is the square root of that brightness ratio: √(1,600,000,000 / 0.000003) ≈ 23,000,000. The Sun could be 23 million times as far away as it is and we would still (barely) be able to see it.
8 minutes × 23 million ≈ 184 million minutes ≈ 350 years.
We could still (barely) see the Sun if it were 350 light years away.

Our galaxy is 105,700 light years in diameter, so we won't be able to see every single star in our galaxy, but we can see quite a few. And some star types are 30,000 times as luminous as the sun, or more.
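If anyone wants to reproduce that, here's the same back-of-the-envelope estimate as a Python sketch (it treats the Sun as a point source and uses luminance figures that aren't strictly point-source thresholds, so take it as an order-of-magnitude argument only, not a precise limit):

import math

eye_threshold = 3e-6      # cd/m^2, rough dark-adapted naked-eye limit
noon_sun = 1.6e9          # cd/m^2, bright noon sunshine

distance_factor = math.sqrt(noon_sun / eye_threshold)   # inverse square -> square root
light_minutes = 8 * distance_factor                     # the Sun is ~8 light minutes away
light_years = light_minutes / (60 * 24 * 365.25)

print(f"Distance factor: {distance_factor:.1e}")        # ~2.3e7
print(f"Barely-visible distance: ~{light_years:.0f} light years")

# A star 30,000x more luminous than the Sun gains another sqrt(30,000) in range:
print(f"30,000x more luminous star: ~{light_years * math.sqrt(30_000):,.0f} light years")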
 
Think of it this way: the Sun has an absolute magnitude of about 4.8, which means it would still be (just) visible to the naked eye from 10 parsecs away.

Six light hours is 0.0002 parsecs.
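If you want to put numbers on that, here's a minimal sketch of the distance modulus calculation, m = M + 5·log10(d / 10 pc); the 4.83 absolute magnitude and the parsec conversion are standard values:

import math

M_sun = 4.83                                   # absolute magnitude of the Sun
pc_per_light_hour = 1 / (3.2616 * 365.25 * 24) # 1 parsec = ~3.26 light years

def apparent_magnitude(M, distance_pc):
    """Distance modulus: m = M + 5 * log10(d / 10 pc)."""
    return M + 5 * math.log10(distance_pc / 10)

d_pc = 6 * pc_per_light_hour                   # six light hours, in parsecs
print(f"6 light hours = {d_pc:.4f} pc")
print(f"Sun's apparent magnitude at that distance: {apparent_magnitude(M_sun, d_pc):.1f}")
# Naked-eye limit is roughly +6; the full Moon is about -12.7.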
 
Apply the inverse square law of light to a full moon.
Part of that is not really understanding where the light of the moon actually comes from. Flat Earthers already reject the notion that the moon is just a grey rock relatively close to the Earth, lit by the far-away sun.

They don't understand the conventional model, and so explaining the inverse-square law (and point sources vs area sources, etc) isn't going to go anywhere.
 
The claim could be based on one or both of two ideas:

a) there is a minimum angular size at which objects are normally visible to the naked eye. This is usually put at 1 arc minute, i.e. 1/60 of a degree. For comparison, the sun or moon as viewed from earth subtends about 30 arc minutes. Since the angular size diminishes in proportion to distance (not distance squared), on this basis the sun would be 'invisible' at about 30 times the earth-sun distance, i.e. 30 times 152 million km = about 4560 million km. This is approximately the distance of Neptune from the sun (there's a quick numerical sketch of this estimate at the end of this post). Six 'light-hours' would be rather further than Neptune, but much closer than the nearest stars. But in any case it would be fallacious to apply the rule to a bright light source like a star, or (on earth) a lighthouse. Bright lights can be visible far beyond the usual limit of angular resolvability. In the case of the stars, they are visible in telescopes as point sources which do not appear any larger no matter how great the magnification, apart from any blurring due to optical imperfections or atmospheric effects.

b) a light is only visible if a certain minimum number of photons reaches the eye from the light in a short period of time. It should be possible to calculate the distance at which a given light source would become 'invisible' because the number of photons reaching the eye falls below this threshold. I'm not going to attempt that! It is a matter of common knowledge that many stars are invisible to the naked eye, but become visible in a telescope. Unlike the limit of angular resolution, the photon threshold would depend on the square of the distance.

Without seeing the flat-earther's calculations I don't know whether they are using these principles, or maybe some other idea entirely.
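For what it's worth, here's a quick numerical sketch of the angular-size estimate in (a), just to show where the Neptune-ish distance comes from (it deliberately ignores the fact, noted above, that bright point sources stay visible far beyond this limit; the solar diameter is the standard ~1.39 million km figure):

import math

sun_diameter_km = 1.39e6
earth_sun_km = 1.52e8                  # ~152 million km, as in (a)
limit_rad = math.radians(1 / 60)       # 1 arc minute

# Angular size of the Sun from Earth, in arc minutes:
sun_arcmin = math.degrees(sun_diameter_km / earth_sun_km) * 60
print(f"Sun from Earth: about {sun_arcmin:.0f} arc minutes across")

# Distance at which the disc shrinks to 1 arc minute (small-angle approximation):
d_km = sun_diameter_km / limit_rad
print(f"Disc shrinks to 1 arc minute at roughly {d_km:.2e} km (Neptune orbits at ~4.5e9 km)")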
 
a lumen is a measure of visible light output (luminous flux), not a measure of power or energy, so the number of lumens falling on an astronaut says nothing by itself about heat. A simple example is light bulbs: a 60 watt incandescent and a 10 watt LED bulb both produce about 725 lumens. Watts are power, lumens are perceived light.
Just to complete the picture: lumen, lux and candela (cd) refer to visible light, weighted by the human eye's response. Those units are used in photometry.

The (more or less) equivalent units for radiometry (EM radiation across the full spectrum, or portions of it like IR) are watts, watts per m², watts per steradian, and so on.

Of all the EM power the Sun radiates, only about 1,360 W/m² arrives at the Earth (or the Moon).
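That figure drops straight out of the inverse square law applied to the Sun's total output; a minimal sketch, using the standard textbook values for the solar luminosity and the astronomical unit:

import math

L_sun = 3.828e26     # total radiated power of the Sun, in watts
au_m = 1.496e11      # Earth-Sun distance, in metres

# The Sun's output spread over a sphere of radius 1 AU:
irradiance = L_sun / (4 * math.pi * au_m ** 2)
print(f"Irradiance at 1 AU: about {irradiance:.0f} W/m^2")   # ~1361 W/m^2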
 
The inverse square law as they think of it applies only to point sources. For extended sources like the Sun and Moon one has to calculate the irradiance by integrating the object's surface brightness over its angular extent.

These people never show the math to support their claims.

IIRC, isn't the error in treating the extended source as an equivalent point source on the order of (d/D)^2, where d is the diameter of the source and D the distance you're viewing it from?

If so then, grossly, since the Sun's something like 100 times as far from us as it is wide, the error, for us globeheads, is not far from 0.01%. So not worth worrying about; idealised averaged point sources are "good enough".
Of course, in the absence of a concrete model - with numbers - from the globefree, one cannot calculate what difference it would make to them.
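Plugging numbers into that rough (d/D)^2 estimate, just as a sanity check (solar diameter and distance are the usual textbook values; this isn't a proper integration over the disc):

sun_diameter_km = 1.39e6
earth_sun_km = 1.496e8

ratio = sun_diameter_km / earth_sun_km          # d/D, about 1/108
rel_error = ratio ** 2
print(f"d/D is about 1/{1 / ratio:.0f}")
print(f"(d/D)^2 is about {rel_error:.1e}, i.e. roughly {rel_error * 100:.3f}%")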
 
IIRC, isn't the error in treating the extended source as an equivalent point source on the order of (d/D)^2, where d is the diameter of the source and D the distance you're viewing it from?

If so then, grossly, since the Sun's something like 100 times as far from us as it is wide, the error, for us globeheads, is not far from 0.01%. So not worth worrying about; idealised averaged point sources are "good enough".
Of course, in the absence of a concrete model - with numbers - from the globefree, one cannot calculate what difference it would make to them.
My point is that what I've seen the flat earthers do is take the inverse square law and say if distance goes to zero then brightness goes to infinity. This is an argument I've seen as to why going closer to the moon would burn out your eyes. Of course this fallacy works with any light source, even a light bulb.
In reality, the surface brightness of an object is independent of distance; it's the angular extent that changes with distance. So the moon doesn't get brighter as you get closer, it just covers more of your field of view. If you've ever looked at the moon through a telescope and had it fill your field of view, you will have noticed that when you look away your eye has a strong afterimage or is temporarily "blinded", but you don't burn out your eyes with near-infinite amounts of light.
They are completely misunderstanding how this all works, which is standard of flat earth arguments.

edited to add: that effect on your eye is mostly because your eyes have already dark-adapted. If you looked at the moon continuously and gave your eyes time to adjust, it shouldn't look any brighter than sunlit ground on Earth, given the albedo of the lunar regolith.
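A tiny numerical illustration of the "surface brightness stays constant" point, treating the Moon as a uniformly lit disc in arbitrary units (purely schematic; only the scaling with distance matters here):

import math

moon_radius_km = 1737
intensity = 1.0   # arbitrary units

for distance_km in (384_400, 100_000, 10_000):
    flux = intensity / distance_km ** 2                          # inverse square
    solid_angle = math.pi * (moon_radius_km / distance_km) ** 2  # apparent disc size
    print(f"{distance_km:>8} km: flux up by {flux * 384_400**2:7.1f}x, "
          f"surface brightness = {flux / solid_angle:.3e} (unchanged)")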
 
My point is that what I've seen the flat earthers do is take the inverse square law and say if distance goes to zero then brightness goes to infinity. This is an argument I've seen as to why going closer to the moon would burn out your eyes.
I think it'd work out, though: if you simplify the moon into a point light source located at its center, then you'd still be 1,737 km away from it at the moon's surface, and the brightness should come out approximately in the right ballpark. It's not entirely correct because the moon isn't evenly lit.
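A rough version of that ballpark check, treating the Moon as a point source at its centre and scaling a typical full-Moon illuminance by the inverse square law (the 0.25 lux figure is a commonly quoted ballpark for a clear full Moon, and the distances are centre-to-centre averages):

full_moon_lux_at_earth = 0.25   # commonly quoted ballpark for a clear full Moon
earth_moon_km = 384_400         # average Earth-Moon distance
moon_radius_km = 1_737

# Inverse square scaling from Earth down to the lunar surface, pretending all the
# reflected light comes from a point at the Moon's centre:
scale = (earth_moon_km / moon_radius_km) ** 2
print(f"Illuminance near the surface: roughly {full_moon_lux_at_earth * scale:,.0f} lux")
# Direct sunlight on Earth is ~100,000 lux, so this is daylight-ish, nowhere near "melting".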
 
Yeah, I find it rather useless to discuss with these types, as they have no clue about radiometry.
 
"Apply the inverse square law of light to a full moon. The astronauts would of melted at 10,000,000 + lumens. Wake up."
Where do you even start with that one?

Let's not even get into lumens melting. That's just too much.

But I'm very interested in why astronauts are in more danger from lumens than we are. Does Earth get fewer lumens? Or what is protecting us from them, given this person thinks that lumens melt things?

Also kinda confused as to what difference a full moon makes to astronauts.

I'd quite like a diagram to at least try to understand where the confusion comes from or maybe just the equation where they applied the inverse square law of light to a full moon and the calculator that spat out 10,000,000+ (you know, them lazy calculators that can only approximate).
 
This was (sort of) the topic in a long ago thread. It's important to be aware that the older and somewhat more sophisticated version of this argument was based on a fundamental misunderstanding: that photons are pooping out as they travel a certain distance. They just dissolve away until they're gone. People who held this notion would try to do the Math to see how many miles of travel it would take a photon to use up its energy and dissolve away to nothing. And if one photon... then all photons.

So, in their conception, the absolute magnitude of a star is irrelevant. All photons leaving this star would poop out on a journey across that much space. One... or 10 to the hundredth power photons... it doesn't matter.

Just as one car trying to make it across the U.S. from California to New York without refueling would run out of gas, so would a hundred cars, or a million. The same would be true if some cars started from San Diego or from L.A. or from Seattle. So the size, absolute or angular, of a star doesn't seem even vaguely relevant either. See what I mean?

This metaphor is apropos to the psychology of this misunderstanding. In daily life all machines or objects take energy to keep going. If you lift your foot off the gas pedal, your car slows down. You can't throw a rock across the Pacific, and it wouldn't matter if you threw one rock or a million. They would all poop out and fall into the sea. So, by association, that is also how photons work in a vacuum.

I specify the older more sophisticated version of the idea, because this idea has been degraded by naïve people simply repeating what they think other naïve people are saying. It's as if a flock of parakeets were playing a game of telephone. After a few years of this the idea has degraded almost to the level of white noise. E.g. they don't know the difference between lumens and photons.
 
This was (sort of) the topic in a long ago thread. It's important to be aware that the older and somewhat more sophisticated version of this argument was based on a fundamental misunderstanding: that photons are pooping out as they travel a certain distance. They just dissolve away until they're gone. People who held this notion would try to do the Math to see how many miles of travel it would take a photon to use up its energy and dissolve away to nothing. And if one photon... then all photons.
This belief is great for explaining why the sun doesn't light up the whole Flat Earth at once if you don't think too hard about it.

I mean that doesn't help you get past sunsets or the sunlight on the southern hemisphere in December, but it's a start if you want to stop thinking.
 
@Z.W. Wolf That's helpful. Of course, if photons pooped out, the night sky would be completely black, with no stars visible; since we do see stars, you then need a dome over the world with glowing stars painted on it. As usual, a flawed base assumption leads to faulty conclusions.

I guess that some people find it inconceivable that a handful of photons travelled for millions of years before striking a CCD in the Hubble telescope. Most of us find that marvelous but some refuse to believe it.
 
Horrific mix of misconceptions on offer, it seems, annoyingly based on hearing about a concept without understanding it or how to apply it. @Z.W. Wolf's example is basically attenuation in a lossy medium (yes, photons really do just "poop out" (waaat?!?!?) when they interact with something that's in the way), but that's not a particularly good description of the interstellar medium between us and the sun. Put it back in fibre optics where it belongs. And @MyMatesBrainwashed's vid shows just the vaguest understanding of the concepts of "inverse" and "square", but in such a distorted way that my only response was "wow!". He was too soft on him, TBH, IMNSHO; he should have done the "0.1" starting point, just to rub in the magnitude of the misunderstanding.
 