Debunked: Telescope Distances of Billions of Light Years Are Impossible

The optical sensors simply affect the detail that can be attained, and all fall far short of the human eye.


My recent research all seems to contradict you, i.e....
This is how I imagined the answer to be http://www.nasa.gov/missions/highlights/webcasts/shuttle/sts109/hubble-qa_prt.htm
In visible light (at wavelengths near 500 nm) the combination of the Hubble telescope plus its highest resolution cameras achieves an angular resolution of about 0.04 arc seconds. The human eye can resolve objects separated by about 40 arc seconds. In other words, the resolution of Hubble exceeds that of the human eye by about a factor of 1000. The science team that developed the new Advanced Camera for Surveys to be installed in SM3B points out that the high resolution channel of the ACS should resolve two fireflies separated by about 10 feet at a distance corresponding to the distance between New York and Tokyo.
Content from External Source
And for anyone like me interested, here's a Harvard lesson which compares the eye to a telescope http://mo-www.harvard.edu/OWN/pdf/eyeScopeT.pdf
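Those NASA numbers are easy to sanity-check. Here is a minimal Python sketch using the Rayleigh criterion (the ~10,850 km New York-Tokyo distance is an assumption for illustration):

    import math

    ARCSEC_PER_RAD = 180 / math.pi * 3600  # ~206265 arc seconds per radian

    # Rayleigh criterion: theta ~ 1.22 * wavelength / aperture diameter
    wavelength = 500e-9   # m, per the NASA quote
    hubble_d = 2.4        # m, Hubble primary mirror
    theta_hubble = 1.22 * wavelength / hubble_d * ARCSEC_PER_RAD  # ~0.05 arcsec

    # Firefly check: 10 ft apart, seen from ~10,850 km (roughly New York-Tokyo)
    theta_fireflies = 3.048 / 1.085e7 * ARCSEC_PER_RAD            # ~0.06 arcsec

    print(f"Hubble diffraction limit: {theta_hubble:.3f} arcsec")
    print(f"Fireflies at NY-Tokyo:    {theta_fireflies:.3f} arcsec")

Both come out in the same ballpark as the quoted 0.04 arc seconds.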

 
Dust is irrelevant in the calculation of Dmax, because it's a linear factor that is already included in the Andromeda observation.

How do you figure, since it is 30 times more abundant than previously believed? As distance increases that 30 times is going to become a significant factor - you know this - please don't try to avoid what you know: at just a billion light years it would become so significant as to be the primary factor.
 
My recent research all seems to contradict you, i.e....
This is how I imagined the answer to be http://www.nasa.gov/missions/highlights/webcasts/shuttle/sts109/hubble-qa_prt.htm
In visible light (at wavelengths near 500 nm) the combination of the Hubble telescope plus its highest resolution cameras achieves an angular resolution of about 0.04 arc seconds. The human eye can resolve objects separated by about 40 arc seconds. In other words, the resolution of Hubble exceeds that of the human eye by about a factor of 1000. The science team that developed the new Advanced Camera for Surveys to be installed in SM3B points out that the high resolution channel of the ACS should resolve two fireflies separated by about 10 feet at a distance corresponding to the distance between New York and Tokyo.
Content from External Source
And for anyone like me interested, here's a Harvard lesson which compares the eye to Hubble http://mo-www.harvard.edu/OWN/pdf/eyeScopeT.pdf


Alright, we'll plug that into the equations. Since the Andromeda galaxy is 2 million light years away and is the furthest the human eye can see, then 2,000,000 x 735 = 1,470,000,000 light years. Still far short of the claimed 13.7 billion by about 12 billion, is it not?

So we will use your calculations - which also fail to take into account dust extinction as distance increases and the intervening dust becomes more and more of a factor. All these equations of distance put forth ignore extinction and assume a perfect viewing sample.

As I said, the 357.14 included no exposure, so we will use the 735 for exposure. Which, as I also stated, would only be a few times more than without exposure - roughly doubled. I see no contradiction, since I stated my calculation did not account for exposure and would increase several times with exposure.

The only contradiction I see is by those claiming it adds up to 13.7 billion or more.
 
Alright, we'll plug that into the equations. Since the Andromeda galaxy is 2 million light years away and is the furthest the human eye can see, then 2,000,000 x 735 = 1,470,000,000 light years. Still far short of the claimed 13.7 billion by about 12 billion, is it not?

It's 1/10th the distance. So just use a 100x longer exposure, and you can see it.
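A quick sketch of that arithmetic (using the 1.47 billion and 13.7 billion ly figures from the post above):

    ratio = 13.7e9 / 1.47e9       # ~9.3, i.e. roughly 1/10th the distance
    exposure_needed = ratio ** 2  # inverse square law: ~87x, call it 100x longer exposure
    print(f"{ratio:.1f}x the distance needs {exposure_needed:.0f}x the exposure")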
 
The 735 factor was for a 6 inch diameter telescope and a 60 second exposure!

The Hubble has 15.7 times the diameter of a 6 inch telescope.
My bad - introducing bunk again, sorry. I'll crawl back into the shadows.
 
Ignore dust for now, because your original post did not include dust.

Unless you accept the errors in your original post, there's really no point in proceeding to discuss other factors.

Do you now accept that you can get enough light to account for inverse square differences by a combination of large mirror and long exposure? If not, why not?
 
I'm marking this "debunked" as the original claim that a telescope cannot gather enough light to overcome the inverse square law of distant objects has been shown to be false, both by reference to the literature, and by simple calculation.
 
The 735 factor was for a 6 inch diameter telescope and a 60 second exposure!

The Hubble has 15.7 times the diameter of a 6 inch telescope.
And the Hubble has 357.14 times the diameter of the human eye and only allows it to collect 127,551 times more light than the human eye. So at 357.14 times the distance the light collected is 127,551^2 dimmer, since at that distance the aperture would reduce to zero. Since a 6 inch gathers 536 times more light, then at 24 (objective diameter compared to the eye) times the distance it would be 536^2 dimmer. At only 24 times the distance the human eye can see, the light becomes 536^2 dimmer with a 6 inch telescope.

So in reality with a 6 inch telescope one can only see 48,000,000 light years for the same object we can see at 2 million light years with the eye. Because at 48,000,000 light years the light is 536^2 dimmer, since its surface area is only 536 times that of the eye.

http://www.astronomynotes.com/telescop/s6.htm

So the Keck telescope at 10 meters x 1000 mm/meter = 10,000 mm across can see an object at least 10,000/9 = 1111 times further away than with the unaided, dark-adapted eye for the same exposure time.
Content from External Source
The Hubble is 2.4 meters, so 2.4 meters x 1000 = 2,400 mm, or 2,400/9 = 266 times.

2,000,000 x 266 = 532,000,000 light years. So we are even further from your claims now.

As I said - exposure will increase this several times, but you can't even find two references that give the same answer - which should tell you something is wrong right off the bat.
 
Ignore dust for now, because your original post did not include dust.

Unless you accept the errors in your original post, there's really no point in proceeding to discuss other factors.

Do you now accept that you can get enough light to account for inverse square differences by a combination of large mirror and long exposure? If not, why not?

Because you can't show me two sources that agree on the distance telescopes can see. Every single one gives you a different answer. The Keck supposedly sees 1111 times further than the human eye and its objective diameter is over 3 times Hubble's. But you insist Hubble sees further than the Keck. And even a larger gap than the 6 inch compared to the Hubble. Something doesn't add up - but it's not on my side.

Are you going to argue dust in the atmosphere as the reasoning? If so let's include it in space as well then.

https://www.courses.psu.edu/astro/astro001_pjm25/telescopes.html

(10 m / 2.4 m)² = 17.4; i.e., Keck gathers 17.4 times as much light as Hubble (per unit time). Hubble's great advantage, of course, is that it's above the atmosphere.
Content from External Source
Even astronomers admit the Keck gathers more light than the Hubble - but is only limited because of atmospheric dust.
 
And the Hubble has 357.14 times the diameter of the human eye and only allows it to collect 127,551 times more light than the human eye. So at 357.14 times the distance the light collected is 127,551^2 dimmer, since at that distance the aperture would reduce to zero. Since a 6 inch gathers 536 times more light, then at 24 (objective diameter compared to the eye) times the distance it would be 536^2 dimmer. At only 24 times the distance the human eye can see, the light becomes 536^2 dimmer with a 6 inch telescope.

This makes no sense at all. I'm having trouble even explaining why it is wrong, as it seems so arbitrary. Maybe break it down one step at a time. First:

"And the Hubble has 357.14 times the diameter of the human eye and only allows it to collect 157,551 times more light than the human eye."

I agree, with the addition of "in the same exposure length", as doubling the exposure would double the light collected. Agreed?
 
Because you can't show me two sources that agree on the distance telescopes can see. Every single one gives you a different answer. The Keck supposedly sees 1111 times further and its objective diameter is over 3 times Hubble's.

Depends on the exposure. Let's focus on that. Do you agree longer exposure = more light hitting the sensor?
 

You were all ready to accept the 735 factor because you thought it supported your premise. Until you learned it was for a 6 inch scope. And now you're going back to ignoring the exposure time. [...]

The exposure factor given for a 60 second exposure is 900, which increases the distance at which you can see objects by a factor of 30.
 
Because you can't show me two sources that agree on the distance telescopes can see.

Getting sources to agree is not the issue. We are talking basic science here. Do you agree that 10x the exposure time will result in 10x the light hitting the sensor?

Please focus on that one point, so we can move forward to other points.
 
Because you can't show me two sources that agree on the distance telescopes can see.
That's simply because there is no limit. If the light source were strong enough, you could see it even with the naked eye at 13 Gly. For example, this article reports an event at 7.5 Gly that could have been seen with the naked eye (if anyone had looked there at the right moment).

BTW, there are reports (mentioned for example here) of people seeing Bode's Galaxy (12 Mly) or even the galaxy Centaurus A (13.7 Mly) under exceptionally good conditions. If you were in orbit, like Hubble is, you would see much fainter sources of light than that. The distance does not matter; it can be a faint object close by, or a strong light source at billions of light years. For example, if you can see Bode's Galaxy at 12 Mly, then if there were a galaxy 100 times brighter than Bode's at 120 Mly distance, you could see it with the naked eye too. On Earth. In orbit, it would likely be an order of magnitude or two more than that.

It is not the distance that is limiting, it is the light intensity. If a single photon from the light source can reach the sensor, it can (and will) be detected. Of course a single photon will be mostly lost in the noise, so you will need to collect a couple of them before you can confirm the light source.
 
BTW, the edge of the observable Universe is in fact much farther away than the age of the Universe indicates. It is estimated to be some 46.5 billion light years away. This is due to the expansion of space - you can find a detailed explanation for example in the Wiki here: https://en.wikipedia.org/wiki/Observable_universe

However, even this much greater distance (further reducing the intensity of the light) in no way excludes seeing objects near the edge, as long as they are intense enough. So again - no, telescopes do not have any limited distance range. The only limit is the size of the observable Universe.
 
I have also looked up how long the exposure was for the famous Hubble Deep Field image. It was made in four wavelengths, and the total exposure time was over 140 hours. That means it collected light over 3 million times longer than a human eye does. Additionally, it was made in orbit without the disturbing influence of the atmosphere, with a much larger light collecting area, and with sensors more sensitive than the human eye.
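A quick check of that figure (a sketch, assuming an effective eye "exposure" of ~0.15 s, as used elsewhere in this thread):

    deep_field_s = 140 * 3600     # over 140 hours -> ~504,000 s
    eye_s = 0.15                  # assumed effective exposure of the human eye
    print(deep_field_s / eye_s)   # ~3.4 million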
 
PS: Edited (see remarks in next posts)
and yet another comment from me - when comparing the human eye with a telescope, you forgot to consider the width of the angle of view (corresponding to the area from which the sensor gets the light). While a human eye gets light from 60°, the Hubble telescope at the Deep Field imaging used a 5.3 arc minute (0.083°) width of field. That corresponds to a 522,572 times smaller area of observation at Hubble than at a human. In other words, the light is 5*10^5 times more focused (and hence more intense) at Hubble than at a human eye (and that's without counting the light collecting area).

So let's summarize Hubble vs Human eye:
  1. Light collecting area: 4.5 m² vs 0.00016 m² (2x lens Ø10mm) => factor of 3*10^4
  2. Angle of view: 5.3 arc min vs 60° => factor of 5*10^5
  3. Spatial vs terrestrial observation (without adaptive optics) factor estimate at 10^2
  4. Sensor sensitivity - factor 1 to 100 (~1 for perfect conditions, conditioned eye, and excellent health)
  5. Time of exposure - up to millions of seconds vs 0.15s - factor of 10^7
The resulting factor, combining those shown above (excluding item 2, which is retracted below), gives some 3*10^14. The square root of it is ~2*10^7, which means that if you can see the Andromeda Galaxy at 2 Mly (2*10^6 ly) with the naked eye, you could theoretically see an identically intense galaxy at 2*10^13 ly with Hubble. However, the reality is lower, because of different factors like the already mentioned expansion of the Universe, dust, noise, etc. Still, it is largely sufficient to explain the existing Hubble images of big galaxies from the Deep Field.
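That tally can be reproduced in a few lines (a sketch; it leaves out the retracted item 2 and takes 10 as a middle value for item 4):

    import math

    area        = 4.5 / 0.00016   # item 1: light collecting area, ~3e4
    space       = 1e2             # item 3: no atmosphere (estimate)
    sensitivity = 10              # item 4: middle of the 1-100 range
    exposure    = 1e7             # item 5: millions of seconds vs ~0.15 s

    light_factor = area * space * sensitivity * exposure  # ~3e14
    distance_factor = math.sqrt(light_factor)             # ~2e7
    print(f"{light_factor:.0e} x light -> {distance_factor:.0e} x distance")
    print(f"Andromeda twin visible at ~{2e6 * distance_factor:.0e} ly")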
 
So again - no, telescopes do not have any limited distance range. The only limit is the size of the observable Universe.

In practice though, a specific telescope has limits. But if we are talking about a telescope with arbitrary parameters (light collecting area, exposure time), then no, there's no real limit.

@Justatruthseeker started out with these essentially correct observations (assuming the sensitivity of the sensor is about the same as the eye):

Assuming that a star is so far away that it is barely visible to the naked eye, we know that the Hubble telescope can make the star appear 127,551 times brighter. Does this mean that the Hubble telescope enables an observer to see the star if it were 127,551 times farther away? The answer is no. The Inverse Square Law says that the light that we receive from a star is inversely proportional to the square of its distance. According to this law, at that distance, the light of the star becomes 127,551^2 or 16,269,262,700 times dimmer, far too dim for us to see with the telescope.

This raises the question: What is the maximum distance an object can be seen through the Hubble telescope? The answer is 357.14 times the distance that the naked eye can see. The reason is that for an object 357.14 times farther away, its light becomes 127,551 times dimmer. Since the Hubble telescope can make a star appear 127,551 times brighter, then looking through the telescope the star would be barely visible.

Not unreasonable observations. However, he then goes on to totally discount the effect of long exposures.

We can frame the problem using JAT's scenario of a star (or galaxy) that is barely visible to the naked eye. How much further could the Hubble see this same star from?

As the light diminishes with the inverse square of the distance, any increase we achieve in acquiring light will only increase that distance by the square root of the increase in light. So as JAT correctly explains, an increase in light receiving area of 127,551x will push the distance to the square root of that, or 357x.

The real light collecting power comes in the form of long exposures. The increase in the length of exposure is directly proportional to the increase in light collected by the sensor. However, any increase we achieve in acquiring light will only increase that distance by the square root of the increase in light. So if we increase the exposure by a factor of 100, we will only push the distance 10x.

However, the key to the Hubble is that it can do really long exposures: many hours, hundreds of thousands of seconds, or millions of times longer than the eye exposure equivalent (1/10th of a second, being conservative). So we can gather millions of times more light. But any increase we achieve in acquiring light will only increase that distance by the square root of the increase in light, so really it's "only" thousands of times more distance.

So to repeat my earlier calculation to fit this more long-winded but hopefully more understandable explanation: if the Hubble has an exposure of 350,000 seconds (as per Wikipedia) then that's an increase in light of 3,500,000, and of course any increase we achieve in acquiring light will only increase that distance by the square root of the increase in light, hence it's only 1870x more distance.

So we've got 357x the distance for the bigger light collecting area, and 1870x the distance for the longer exposure, so that's 667,000 times the distance with both combined.

So (in our simplified example) if you can see the Andromeda galaxy with the naked eye at 2.5 million light years, then the Hubble can photograph it at 1.6 trillion light years. Or it could see an object that's 0.01% the brightness of Andromeda at 16 billion light years.
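The whole chain in code form (a sketch; the 0.1 s eye exposure and the 350,000 s Hubble exposure are the assumptions stated above):

    import math

    area_gain = 127_551                  # mirror vs pupil: light per unit time
    exposure_gain = 350_000 / 0.1        # 3,500,000x longer exposure

    # distance scales with the square root of the light gained
    distance_gain = math.sqrt(area_gain * exposure_gain)  # ~667,000
    print(f"{distance_gain:,.0f}x the naked-eye distance")
    print(f"Andromeda at 2.5 Mly -> {2.5e6 * distance_gain:.1e} ly")  # ~1.7e12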


[Note: Yes I'm repeating myself a bit here, I'm just interested in how best to explain things, so I thought I'd give it another go]
 
In practice though, a specific telescope has limits.
I understand what you mean, but what I wrote is that no telescope (or naked eye) has any limit in distance (other than the size of the observable Universe). As long as you have a source intense enough, you can see it even at the edge of the observable Universe. What I wanted to say is that it is the light intensity that is limiting, not the distance, so you cannot say of any telescope that it can be used only out to certain distances. That the distance impacts the light intensity is clear, but distance alone is no limit.

Otherwise, I perfectly agree that Justatruthseeker's main mistake was that he did not understand the impact of the length of the exposure.
 
and yet another comment from me - when comparing the human eye with a telescope, you forgot to consider the width of the angle of view (corresponding to the area from which the sensor gets the light). While a human eye gets light from 60°, the Hubble telescope at the Deep Field imaging used a 5.3 arc minute (0.083°) width of field. That corresponds to a 522,572 times smaller area of observation at Hubble than at a human. In other words, the light is 5*10^5 times more focused (and hence more intense) at Hubble than at a human eye (and that's without counting the light collecting area).

This does not seem correct; the same amount of light from that direction is hitting the mirror. If you take a narrower slice of that light, you get less light, not more. Consider this test. I took identical exposures at 50mm and 500mm, then scaled the 50mm image to the size of the 500mm image. The results are basically identical (except I forgot automatic white balance; not a factor here, but it makes them different colors).



So in terms of detecting a dim image, spreading the incident light over the sensor does not help. JAT touched on this earlier, although I'm not sure I agree with the "four times dimmer" thing - that assumes the aperture gets smaller with a longer focal length. In my case I kept the aperture at f/6.3, although it could have been wider at f/4.5 for 50mm.

Incorrect. Magnification defeats the purpose of making an image brighter. A common misconception among the populace is that magnification helps.

https://starizona.com/acb/basics/equip_magnification.aspx

Another reason for keeping the magnification low has to do with image brightness. An unfortunate law of physics dictates that when the magnification is doubled, the image gets four times dimmer.
Content from External Source

This is the same object under the exact same lighting conditions, with automatic exposure, so you can see what you are looking at.
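For what it's worth, the "four times dimmer" rule is just the per-area flux of an extended object scaling with (D/f)²; a sketch assuming a fixed 80 mm aperture (the numbers are arbitrary for illustration):

    def surface_brightness(aperture_mm, focal_length_mm):
        # per-unit-area image brightness of an extended object ~ (D/f)^2
        return (aperture_mm / focal_length_mm) ** 2

    b1 = surface_brightness(80, 400)   # some magnification
    b2 = surface_brightness(80, 800)   # double the magnification, same aperture
    print(b1 / b2)                     # 4.0 -> doubled magnification, 4x dimmer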
 
Yes, that's correct, the same amount of light from given source will reach the sensor, regardless of the width of the angle. I had it wrong. I'll edit it in the previous post.
 
Just to put things in perspective:
During my astronomy studies I remember we used the Palomar Observatory Sky Survey. The limiting magnitude of the faintest objects was +22. It was made by the 1.2 meter Schmidt telescope. Exposure times were 10 minutes for the blue plates up to 45 minutes for the red ones. Longer exposures would saturate the emulsion with background light from the atmosphere.
Now let me make an estimate of the magnitude the Andromeda nebula would have at a distance of 14 billion ly, using the inverse square law and the magnitude calculation rules (in short: 10x as far away means 100x as weak, which means adding +5 magnitudes).
The Andromeda galaxy has an apparent magnitude of 3.5 and a distance of 2 million light years. Moving it 7000x as far away (7x10x10x10) means adding (a bit more than) 4+5+5+5 = +19 magnitudes. At 14 billion ly its apparent magnitude would be +22.5.
The Hubble telescope has twice the width and virtually no background light (no atmosphere), so it can use exposure times up to 2 million seconds (23+ days).
Wouldn't it be able to photograph fainter objects than the POSS-I could 70 years ago?
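The magnitude estimate, spelled out (a sketch of the arithmetic above):

    import math

    m_near, d_near = 3.5, 2e6    # Andromeda: apparent magnitude, distance in ly
    d_far = 14e9                 # move it to 14 billion ly

    delta_m = 5 * math.log10(d_far / d_near)      # inverse square, in magnitudes
    print(f"+{delta_m:.1f} mag -> apparent magnitude {m_near + delta_m:.1f}")
    # +19.2 mag -> about 22.7, right around the POSS plate limit of +22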
 
Just to put things in perspective:
During my astronomy studies I remember we used the Palomar Observatory Sky Survey. The limiting magnitude of the faintest objects was +22. It was made by the 1.2 meter Schmidt telescope. Exposure times were 10 minutes for the blue plates up to 45 minutes for the red ones. Longer exposures would saturate the emulsion with background light from the atmosphere.
Now let me make an estimate of the magnitude the Andromeda nebula would have at a distance of 14 billion ly, using the inverse square law and the magnitude calculation rules (in short: 10x as far away means 100x as weak, which means adding +5 magnitudes).
The Andromeda galaxy has an apparent magnitude of 3.5 and a distance of 2 million light years. Moving it 7000x as far away (7x10x10x10) means adding (a bit more than) 4+5+5+5 = +19 magnitudes. At 14 billion ly its apparent magnitude would be +22.5.
The Hubble telescope has twice the width and virtually no background light (no atmosphere), so it can use exposure times up to 2 million seconds (23+ days).
Wouldn't it be able to photograph fainter objects than the POSS-I could 70 years ago?

All the answers rely on completely unreliable information. [...]

It is claimed a 6 inch telescope can see 536 times further than the human eye, yet the Keck telescope with a 10 meter mirror is only capable of seeing 1,111 times further than the human eye. At the same time they try to claim the Hubble with a mirror of only 2.4 meters can see 6500 times further than the human eye.

If all of you can't see the discrepancies right there in front of you, then there really is no hope at all for science.

By the very math itself the Hubble collects 127,551 times more light than the human eye. The square root of 127,551 is 357.14.

Regardless of how long the exposure, all light received is subject to the inverse square law. This would mean that at 13 billion light years the magnitude would have to be approximately 11x10^10 higher than the Andromeda galaxy's - reaching well into the metaphysical range. And theoretically, since those further galaxies are younger, they should also be dimmer, since they are still in the process of forming stars - not complete star systems.

And then for such distances the K factor must also be taken into account due to the fact that the radiation is shifted into the red end of the spectrum.

https://en.wikipedia.org/wiki/Absolute_magnitude

For very large distances, the cosmological redshift complicates the relation between absolute and apparent magnitude, because the radiation observed was shifted into the red range of the spectrum. To compare the magnitudes of very distant objects with those of local objects, a k correction might have to be applied to the magnitudes of the distant objects.
Content from External Source
As well as Relativity since spacetime would no longer be Euclidean.

For nearby astronomical objects (such as stars in the Milky Way) luminosity distance DL is almost identical to the real distance to the object, because spacetime within the Milky Way is almost Euclidean. For much more distant objects the Euclidean approximation is not valid, and general relativity must be taken into account when calculating the luminosity distance of an object.
Content from External Source
And of course, this is ignoring dust extinction as being no factor at all.

If extinction by gas and dust is not significant, one can compute the absolute magnitude of an object given its apparent magnitude and luminosity distance:
Content from External Source
But at those distances it becomes quite significant, since it is 30 times more abundant than previously believed as measured by the Ulysses Spacecraft.

https://en.wikipedia.org/wiki/Ulysses_(spacecraft)

Ulysses discovered that dust coming into the Solar System from deep space was 30 times more abundant than previously expected.
Content from External Source
And we have not even begun to talk about the huge halos of gas and dust surrounding every galaxy, calculated to have twice the mass of the galaxy itself, because they have no answer to this problem so wish to ignore it.

There is an answer, but the answer falsifies another astronomical myth, and so will not be entertained, as I expect the dust problem to be clipped and not entertained. All so the myths of modern astronomy can continue to survive unchallenged.
 
Regardless of how long the exposure, all light received is subject to the inverse square law.
Here lies the crux of your mistake. With an N times longer exposure you can collect N times more light from the given source (regardless of its distance), hence you can detect an N times weaker light source from the same distance, or a light source of the same intensity from the square root of N times the distance. Hence, there is absolutely no problem in looking up to the edge of the observable universe with a sufficiently long exposure.

I believe that your problem is that you probably believe that when light from a distant source is so weak that it cannot be detected within 0.15 sec (or any other short moment), it cannot be detected with a longer exposure either. That's wrong. It can. You need to be aware of the quantum character of light to understand it. A detector can detect a single photon, and a photon always has the same intensity, regardless of the distance it comes from (it can be 1 nm or 14 Gly). It is not N² times weaker with distance, just N² times less probable to detect within a given time at N times the distance. It can have a different energy (depending on the frequency or the redshift), but that's irrelevant here. What is important is that if you collect the light during ~0.1 s, the likelihood that you detect a photon from a source at 13 Gly is extremely low. However, the longer you record the exposure, the higher the probability that you detect some photons from the source, and hence you can image them.
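The same point with Poisson statistics (a sketch; the photon rate is an arbitrary assumption for a very faint source):

    import math

    rate = 0.01   # assumed photons per second arriving at the sensor
    for t in (0.1, 60, 3600, 350_000):
        expected = rate * t
        p_detect = 1 - math.exp(-expected)   # P(at least one photon) in time t
        print(f"t = {t:>7} s  expected = {expected:8.1f}  P(>=1) = {p_detect:.4f}")

At 0.1 s the detection probability is ~0.1%; at Hubble-like exposures, thousands of photons are expected.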

 
Because you can't show me two sources that agree on the distance telescopes can see. Every single one gives you a different answer. The Keck supposedly sees 1111 times further than the human eye and its objective diameter is over 3 times Hubble's. But you insist Hubble sees further than the Keck. And even a larger gap than the 6 inch compared to the Hubble. Something doesn't add up - but it's not on my side.

Are you going to argue dust in the atmosphere as the reasoning? If so let's include it in space as well then.

https://www.courses.psu.edu/astro/astro001_pjm25/telescopes.html

(10 m / 2.4 m)² = 17.4; i.e., Keck gathers 17.4 times as much light as Hubble (per unit time). Hubble's great advantage, of course, is that it's above the atmosphere.
Content from External Source
Even astronomers admit the Keck gathers more light than the Hubble - but is only limited because of atmospheric dust.

Just to clarify something here: you're aware the main issue with telescopes on Earth isn't "dust" but optical atmospheric distortion? Something that can be corrected to a large degree with adaptive optics.
 
BTW, there is also another reason why it is difficult to see very distant objects from the terrestrial surface (not only with the naked eye, but also with terrestrial telescopes). It is due to the redshift that distant objects exhibit. Because the atmosphere filters the IR part of the light spectrum, we can only see those very distant sources that emit far in the UV spectrum or above it (X-rays, gamma rays), so that the redshift transforms the spectra into frequencies that pass through the atmosphere.
 
It is nonsense to say “the Keck telescope can see 1111x further away than the human eye”

You should talk about the limiting magnitude of an object that can be seen. This limiting magnitude depends on the received amount of light and on the background brightness. When an object is bright enough it doesn't matter how far away it is.

For the human eye the limiting magnitude is at best ca. 6.5, but close to a city perhaps only 4 or 3.

When looking through a telescope you may add a number to that naked-eye-limit, depending on the diameter (D (mm)), optical quality (transmission factor T) and magnification (V): Δm = 2.5*log(DTV) – 2. This new limiting magnitude could mean that an object which was on the threshold for the naked eye would theoretically be visible so much further away.
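As an illustration of that formula (the D, T, and V values here are arbitrary assumptions):

    import math

    def magnitude_gain(d_mm, t, v):
        # dm = 2.5*log10(D*T*V) - 2, per the formula above
        return 2.5 * math.log10(d_mm * t * v) - 2

    dm = magnitude_gain(200, 0.9, 100)   # 200 mm scope, T=0.9, 100x magnification
    print(f"dm = +{dm:.1f}; naked-eye limit 6.5 -> {6.5 + dm:.1f}")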

But when you start making photographs with long exposure times the limiting magnitude simply equals the background magnitude. The diameter of the telescope doesn't matter any more, because you can collect light as long as you please. On Earth a real dark sky has a limiting magnitude of 21.5 to 22 (magnitude/arcsec^2). Absolute darkness (without the Earth's atmosphere), however, has a limiting magnitude of 27. [Edit: another source mentions 31; may have something to do with the broader spectrum, including IR]. The difference of 5 magnitudes corresponds to a brightness ratio of 100. So if you insist on translating that to a distance ratio, it would mean that the Hubble telescope can see an object that was on the threshold of the Keck telescope (or whatever telescope on Earth) 10x further away. That is one of the main reasons why we wanted a telescope outside Earth's atmosphere.

Your magnitude calculation is completely obscure to me. Are you trying to say that moving the Andromeda galaxy 7000x further away would not change its magnitude from 3.5 to about 22.5? Please explain. Where does the 11x10^10 come from and what does it mean?

The K correction only holds for a certain filter, e.g. visible magnitudes, but Hubble can detect infrared just as well.

Younger galaxies are actually brighter, because the star formation rate is much higher.

The furthest objects have a redshift of about 10. Could you explain what effect that would have on the luminosity distance?
 
[Edit: another source mentions 31; may have something to do with the broader spectrum, including IR].
The story becomes even more interesting: https://en.wikipedia.org/wiki/Limiting_magnitude

Telescopes at large observatories are typically located at sites selected for dark skies. They also increase the limiting magnitude by using long integration times on the detector, and by using image-processing techniques to increase the signal to noise ratio. The Keck Telescope, for example, 10 meters in diameter, can detect stars at 24 to 26th magnitude using a one-hour integration and adaptive optics techniques.[5]

Even higher limiting magnitudes can be achieved for telescopes above the Earth's atmosphere, such as the Hubble Space Telescope, where the sky brightness due to the atmosphere is not relevant. For orbital telescopes, the background sky brightness is set by the zodiacal light. The Hubble telescope can detect objects as faint as 31st magnitude,[6][7] and the James Webb Space Telescope (operating in the infrared spectrum) is expected to have an absolute magnitude limit of 34th magnitude.[6]
Content from External Source
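Translating those limits into ratios (a sketch; the naked-eye limit of 6.5 is taken from earlier in the thread):

    m_eye, m_hubble = 6.5, 31                             # limiting magnitudes
    brightness_ratio = 10 ** (0.4 * (m_hubble - m_eye))   # ~6e9 times fainter
    distance_ratio = brightness_ratio ** 0.5              # inverse square: ~8e4
    print(f"{brightness_ratio:.1e}x fainter, {distance_ratio:.1e}x further")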
 
yes, that's my take on it

simply overlooking the role of exposure in all of this

[...mod edit: impolite text...]
 
It seems to me that the OP is under the impression that individual photons are subject to dimming, which could explain why he feels exposure time isn't a factor. If that's the case then maybe it would help the OP to know that the "inverse square law of light" refers to the number of photons captured, not the strength of the photons.
 
Right!

This thread is an example of people talking past each other. The OP is confused about what the inverse square law is saying and the members responding to him never realized it.

The OP thinks that light is pooping out rather than spreading out. Rather than fewer photons, he thinks the problem is less energetic photons.

He thinks that the photons themselves are getting less "luminous." It then wouldn't matter how many of these pooped out photons you gather over a long exposure, they would still be too tired to react with the sensor, no matter what it was. The MB members responding to him aren't picking up on this, thus the whole conversation is a frustrating muddle.
 
The OP thinks that light is pooping out rather than spreading out. Rather than fewer photons, he thinks the problem is less energetic photons.

He didn't even bring up photons until post #38
More photons that still obey the inverse square law.

He did not mention photons again, but his misconception was addressed by @txt29:

Here lies the crux of your mistake. With an N times longer exposure you can collect N times more light from the given source (regardless of its distance), hence you can detect an N times weaker light source from the same distance, or a light source of the same intensity from the square root of N times the distance. Hence, there is absolutely no problem in looking up to the edge of the observable universe with a sufficiently long exposure.

I believe that your problem is that you probably believe that when light from a distant source is so weak that it cannot be detected within 0.15 sec (or any other short moment), it cannot be detected with a longer exposure either. That's wrong. It can. You need to be aware of the quantum character of light to understand it. A detector can detect a single photon, and a photon always has the same intensity, regardless of the distance it comes from (it can be 1 nm or 14 Gly). It is not N² times weaker with distance, just N² times less probable to detect within a given time at N times the distance. It can have a different energy (depending on the frequency or the redshift), but that's irrelevant here. What is important is that if you collect the light during ~0.1 s, the likelihood that you detect a photon from a source at 13 Gly is extremely low. However, the longer you record the exposure, the higher the probability that you detect some photons from the source, and hence you can image them.
 
Yes, I'm generalizing to get to the more meta level...

This is a good example of a common problem. The conversation starts too far into the subject, with the two parties beginning with different basic assumptions.

I've seen numerous arguments between FE believers and globeheads that are puzzling to each side. (Anything to do with Galilean relativity, Coriolis effect, "centrifugal force" at the equator, etc.) The argument devolves into shouting and name calling. The key is this. The FE believers know nothing about inertia, momentum, Newton's three laws of motion, etc. It's not just that they don't understand it. They've never heard of it. Even that isn't going far enough. Inertia isn't a part of their mental landscape. But this mindset is so foreign to the globeheads that they don't realize it and go on talking as if the FE believers are starting with the same basic knowledge and assumptions. On the other side the FE believers are completely frustrated by the senseless things the globeheads are saying.

It's a misunderstanding on such a basic level that neither side realizes what the problem is.
 
It's a misunderstanding on such a basic level that neither side realizes what the problem is.

It's a fundamental problem when debunking (or just explaining). It comes up in 9/11 discussions too, for example where people say things violate Newton's laws of motion. But since they are not familiar with Newton's laws of motion (and the underlying concepts of force, mass, acceleration, inertia, etc.) it is hard to explain why it does not.
 
Uhm, where did the estimate for the exposure time of the human eye come from? I must have missed it.
 