The Shape and Size of Glare Around Bright Lights or IR Heat Sources

I noticed the following video referred to on Twitter in the context of Dave Falch's arguments against Mick's 'rotating glare' hypothesis. It shows planes passing overhead filmed simultaneously in standard light and infrared modes, using two lenses in the same camera. It is striking that the 'glare' of the planes is actually larger in the IR mode, and sometimes obscures the plane itself completely. Whether this is 'glare' in the same sense as in Mick's hypothesis I don't know. The photographer describes it as 'IR over exposes more'. I dare say Dave Falch would reply 'wrong kind of IR', or 'overexposure doesn't count', or maybe 'just a crappy amateur camera, not military grade FLIR', but the fact remains that, whatever you call it, the image of an object in IR can be larger than the image in normal light.


Source: https://www.youtube.com/watch?v=8oQdsyIwsLE
 
The photographer describes it as 'IR over exposes more'.

If IR overexposes more, it means the exposure settings aren't optimized for the targets. They could be adjusted, but perhaps it is a contrast issue; that is, if the targets are substantially brighter than their surroundings in the infrared, then exposing for the target could produce images in which the target is the only thing visible, with no context. If you also want to see clouds or the ground, you may have to purposefully let the target saturate, and that brings out much more of the wings of the point spread function of your optical system.
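To put rough numbers on that tradeoff, here's a minimal sketch in Python (all values are illustrative assumptions, not measurements of any real sensor):

```python
# Illustrative sketch of the exposure tradeoff described above.
# All numbers are assumptions for demonstration, not real sensor data.
target_radiance = 1000.0   # hot target, arbitrary units
background_radiance = 1.0  # clouds/ground, assumed ~1000x dimmer in this band
full_scale = 255           # 8-bit output levels

# Exposure chosen so the target just fills the sensor's range:
gain_for_target = full_scale / target_radiance
print("background level:", background_radiance * gain_for_target)  # ~0.26 counts: black

# Exposure chosen so the background sits near mid-scale:
gain_for_scene = (full_scale / 2) / background_radiance
print("target level:", target_radiance * gain_for_scene)  # 127500: saturated ~500x over
```

Either the target is properly exposed and the scene is black, or the scene is visible and the target is deep into saturation, where the PSF wings show.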
 
The photographer describes it as 'IR over exposes more'. I dare say Dave Falch would reply 'wrong kind of IR', or 'overexposure doesn't count'

Glare is overexposure. It's overexposing the heat/light source.

If you take a correctly exposed photo of the sun, then it's a small round disk on a black background. But cameras expose for a scene, so the sun ends up 10,000x overexposed, and any stray light around it shows up.

2023-08-26_08-14-43.jpg

Notice how much bigger the glare is than the sun. You see the actual size in the filtered version and in the internal reflection.
 
Glare is overexposure. It's overexposing the heat/light source.
That's only half true.
Mie scattering also causes glare, because it is strongly directional (forward-peaked). Some types of scattering are also sensitive to polarization.

http://www.iup.uni-bremen.de/~luca/?download=01_LL_VO.pdf
image.png image.png image.png image.png



Here's a video of cars in fog. The scene has motion, and the fog concentrates moisture that, on a less foggy day, would be distributed over a much larger distance:

Source: https://youtu.be/XdK8BYFY3SY

The screenshots are from the first 20 seconds.

image.jpeg image.jpeg image.jpeg

image.jpeg image.jpeg image.jpeg


The point here is that if you underexpose the scattered light, the glare becomes too dark to see, but it's still there. So glare size depends on exposure because it's a gradient around the light source.
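As a minimal sketch of that point, assume the glare falls off as an inverse power law of distance from the source (the profile shape and exponent are assumptions, purely for illustration). The radius at which the profile crosses the saturation level then grows with exposure:

```python
# Sketch: apparent (saturated) glare radius vs. exposure for an assumed
# power-law glare profile I(r) = I0 / r^k. All values are illustrative.
I0, k, saturation = 1e6, 3.0, 255.0

for exposure in (0.01, 0.1, 1.0, 10.0):
    # Solve exposure * I0 / r^k = saturation for r:
    r_sat = (exposure * I0 / saturation) ** (1.0 / k)
    print(f"exposure {exposure:5}: saturated out to r ~ {r_sat:.1f}")

# The saturated disk grows as exposure^(1/k); the dimmer wings are still
# there at any exposure, just below the saturation (or visibility) level.
```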

If some stars seem bigger than others, that's because of glare.
 
And keep in mind that the surface brightness of the Sun (radiance) is of order 10^4 times higher than a diffuse scatterer illuminated by sunlight. So any camera properly exposed for an outdoor scene on a sunny day would overexpose the Sun’s disk by four orders of magnitude.
 
When I refer to glare, I'm referring to an in-camera effect, but not bloom. Unfortunately the terminology here is not consistent, but I use "glare" because the type of stray light we refer to is discussed in the literature as having a "glare-spread function".

I probably should have just invented a new term.
Blooming can be due to optical scattering and FPA crosstalk.

Article:
Blooming effects in indium antimonide focal plane arrays

Studies of blooming effects in InSb focal plane array (FPA) detectors are presented. Two blooming test devices are described, which have allowed us to isolate optical, charge-diffusion and electronic blooming mechanisms. It is demonstrated that when spurious illumination due to optical scattering is eliminated, no extended blooming occurs, and only normal cross-talk mechanisms cause signal offset in elements adjacent to the hot target image. Cross-talk data are analyzed in terms of the signal decay versus element position, and the lateral carrier diffusion length is derived. Susceptibility of different diode structures to blooming is discussed. It is also shown that an FPA signal processor may cause extensive electronic blooming.

Article:

The Effects of Photo-Generated Carrier Diffusion on Blooming in an IR Detection System Exposed to High Intensity Sources

Wysocki, Bryant; Haeri, Mitchell
April 2005

A study of the blooming effects in a 640 x 480 indium-antimonide (InSb) FPA with 20 μm pixel-to-pixel spacing and operated at 77 K is presented. Electronic and optical blooming from both a high-intensity blackbody spectral source and a 4.6 μm narrow line-width laser source is examined. A 30-μm pinhole/mask was fastened directly to the FPA to achieve spatial isolation of individual pixels in an attempt to separate optical and electronic effects. A series of experiments were run to determine the relative contributions of each. The spectral source and the laser were each used to bloom the FPA both with and without the mask present. Optical effects caused by ghosting, diffraction, lens and housing scatter were shown to dominate, resulting in global loss of image quality. Effects due to electronic phenomenon, such as carrier diffusion and charge bleeding, were shown to be minimal and locally constrained to ∼ 20 μm. A simulation of carrier drift and diffusion was constructed to provide a theoretical model of the crosstalk between pixels.

The second author is Mitch Haeri from Raytheon Space and Airborne Systems, so you can guess what sensor they were testing in 2005.
 
And keep in mind that the surface brightness of the Sun (radiance) is of order 10^4 times higher than a diffuse scatterer illuminated by sunlight. So any camera properly exposed for an outdoor scene on a sunny day would overexpose the Sun’s disk by four orders of magnitude.
I'm figuring 1:10³ in power, for an albedo close to 1. How do you figure your number? I wasn't able to find good sources.
 
I dare say Dave Falch would reply 'wrong kind of IR', or 'overexposure doesn't count', or maybe 'just a crappy amateur camera, not military grade FLIR'.

Article:
Most of the world’s armies use LWIR for “dirty battlefield” conditions and most of the world’s aviation and maritime communities use MWIR due to smaller wavelength, higher standoff range, and more humid atmospheres. The clear benefit of having MWIR for ground applications is that an MWIR camera with given aperture can see about 2.5 times as far as an LWIR imager (due to diffraction limits) under “good” conditions.

However, most armies will not sacrifice LWIR for MWIR because of the dirty-battlefield conditions of hot targets, burning objects, smoke, obscurants, and even cold weather, where MWIR is photon-starved.


Figure 2. Burning barrel in field of view of MWIR imager and LWIR imager. (Courtesy NVESD).

A good example of the dirty-battlefield power of LWIR is burning barrels as shown in figure 2. Hot targets shift left on Planck’s curve with huge energy in the MWIR that blooms the detector and provides veiling glare, whereas the LWIR image is still functional with burning barrels in the field of view.
Also, since there are many more photons in the LWIR band with terrestrial targets and backgrounds, integration times are much shorter, and moving around fast does not blur your image. So having both bands in a single camera can be of significant benefit.
 
That's only half true.
Well it's entirely true if we are only talking about in-camera light spread, which I am.

Of course atmospheric diffusion/scattering exists, but I don't see it as a factor in any of these cases.

If some stars seem bigger than others, that's because of glare.

Sure, but is it atmospheric glare? You get different sized stars when viewed from space, because of in-camera scattering/diffraction.

Hubble:
hubble_ngc2660.jpg

ISS: https://eol.jsc.nasa.gov/SearchPhotos/photo.pl?mission=ISS044&roll=E&frame=45215
2023-08-26_12-22-28.jpg
 
Well it's entirely true if we are only talking about in-camera light spread, which I am.
I think it's only entirely true if you're talking about sensor bleed or bloom.
In every other case, adjusting the exposure darkens the glare, but doesn't make it go away.
If there's something obscured by optical or atmospheric glare, you're not going to make it visible by adjusting your exposure. (unless it's another light source that saturates the sensor like the glare does)
 
I'm figuring 1:10³ in power, for an albedo close to 1. How do you figure your number? I wasn't able to find good sources.
It's essentially the ratio of the angular size of the Sun in the sky versus a full hemisphere of the sky. Radiance is in units of W/m^2/sr, so any point on the ground has an irradiance (W/m^2) equal to the radiance of the Sun integrated over its angular area in steradians. That point, if properly diffuse (Lambertian), will reflect into a full hemisphere, which corresponds to a projected solid angle of pi steradians.
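For concreteness, here's that back-of-envelope calculation in a few lines (using the standard ~0.267° angular radius of the Sun):

```python
# Back-of-envelope check of the ~10^4 figure: ratio of the Lambertian
# projected solid angle (pi sr) to the solid angle subtended by the Sun.
import math

sun_angular_radius = math.radians(0.267)        # ~0.267 degrees
omega_sun = math.pi * sun_angular_radius ** 2   # small-angle approximation, ~6.8e-5 sr

print(f"ratio ~ {math.pi / omega_sun:.2e}")     # ~4.6e4, i.e. of order 10^4
```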
 


This video starts out with the light source correctly exposed. We see the disk of the sun, partly occluded by a plant.
2023-08-26_12-27-05.jpg

I then remove the filter, and the sensor is briefly overloaded.
2023-08-26_12-28-41.jpg

The camera adjusts the exposure and the glare shrinks:

2023-08-26_12-29-26.jpg

Eventually the camera adjusts, and we see various types of glare.
2023-08-26_12-41-51.jpg

We can separate out the in-camera glare by occluding the direct lines of sight to the sun.
2023-08-26_12-42-57.jpg

So now we see the atmospheric glare around the sun. Notable here is that it appears behind the plants.
 
Well it's entirely true if we are only talking about in-camera light spread, which I am.

Of course atmospheric diffusion/scattering exists, but I don't see it as a factor in any of these cases.



Sure, but is it atmospheric glare? You get different sized stars when viewed from space, because of in-camera scattering/diffraction.

Hubble:
hubble_ngc2660.jpg

ISS: https://eol.jsc.nasa.gov/SearchPhotos/photo.pl?mission=ISS044&roll=E&frame=45215
2023-08-26_12-22-28.jpg
Essentially the brighter stars look larger because you are seeing more of the wings of the point spread function as the core saturates. The PSF will depend on the diffraction of the optical system, its geometric image quality (i.e., aberration content) and any scattered-light effects, typically due to surface quality and/or cleanliness of the optical surfaces.
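A quick sketch of that effect for stars, using an assumed Gaussian approximation to the PSF and an arbitrary visibility threshold (real PSFs have diffraction rings and scatter wings, but the principle is the same):

```python
# Sketch: the same PSF makes brighter stars look bigger, because more of
# the wings clear the visibility threshold. Profile and values are assumed.
import numpy as np

r = np.linspace(0, 20, 2001)
psf = np.exp(-r**2 / (2 * 2.0**2))   # assumed PSF radial profile, peak 1.0

threshold = 0.01                     # faintest level visible in the image
for brightness in (1, 10, 100, 1000):
    visible = r[brightness * psf >= threshold]
    print(f"star {brightness:4}x: apparent radius ~ {visible.max():.1f} px")
```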
 
I think it's only entirely true if you're talking about sensor bleed or bloom.
In every other case, adjusting the exposure darkens the glare, but doesn't make it go away.
If there's something obscured by optical or atmospheric glare, you're not going to make it visible by adjusting your exposure. (unless it's another light source that saturates the sensor like the glare does)
But we are not talking about cases where that happens. Like I said, if you expose correctly for the sun, then everything else is black.

The discussion here is about the size and shape of glare around bright lights. We are specifically interested in Gimbal and Chilean Navy. In both of those the exposure is set correctly for the dynamic range of the broader scene, the clouds are correctly exposed. But in both cases the bright engines are vastly over-exposed, which means you get glare.
 
Essentially the brighter stars look larger because you are seeing more of the wings of the point spread function as the core saturates. The PSF will depend on the diffraction of the optical system, its geometric image quality (i.e., aberration content) and any scattered-light effects, typically due to surface quality and/or cleanliness of the optical surfaces.
Yes, and the PSF (or rather the physical state of the optics that creates the PSF) is what leads to irregularly shaped glares.
 
In July I was taking a photo of the rising “super moon”, and here are two photos. The first is exposed for the clouds reflecting the moonlight (though not well focused), and you see an overexposed moon with glare around it (mostly behind the trees, so not in-camera); the second is properly exposed for the moon, in which everything else is black and unseen.

IMG_1447.jpeg
IMG_1448.jpeg
These photos were taken two minutes apart.
 
The camera adjusts the exposure and the glare shrinks:

2023-08-26_12-29-26.jpg

Eventually the camera adjusts, and we see various types of glare.
2023-08-26_12-41-51.jpg
This I don't understand, unless it's caused by the upper picture being defocused.

The sky is a light source that is brighter than the sensor can record? And reducing the glare makes it show. The trees are discernible because they're now darker than the sky, but they don't have discernible shades or color.
 
Yes, and the PSF (or rather the physical state of the optics that creates the PSF) is what leads to irregularly shaped glares.
Yes. Out in the wings of the PSF it's not necessarily well behaved. And for non-point images, what you record is a convolution of the source intensity with the PSF, so that will lead to irregular shapes as well.
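A small sketch of that convolution effect, with a toy extended source and an assumed asymmetric PSF (both arrays are made up for illustration):

```python
# Toy demonstration: an extended source convolved with an irregular PSF
# produces an irregular, enlarged glare. Both arrays are assumptions.
import numpy as np
from scipy.signal import convolve2d

source = np.zeros((64, 64))
source[30:34, 28:36] = 1000.0   # small extended hot source

yy, xx = np.mgrid[-16:17, -16:17]
psf = 1.0 / (1.0 + (xx / 2.0) ** 2 + (yy / 5.0) ** 2)  # elongated wings
psf /= psf.sum()

image = convolve2d(source, psf, mode="same")
saturated = image > 50.0        # assumed saturation level
print("source pixels:", int((source > 0).sum()))
print("saturated glare pixels:", int(saturated.sum()))
```

The saturated region is larger than the source and takes its shape from both the source and the PSF wings.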
 
This I don't understand, unless it's caused by the upper picture being defocused.

The sky is a light source that is brighter than the sensor can record? And reducing the glare makes it show. The trees are discernible because they're now darker than the sky, but they don't have discernible shades or color.
Think of the glare as like the topography of a mountain, with your image cutting a slice at a certain elevation. At low exposure settings it's like cutting at a high elevation, and you just see the tip of the mountain; with more exposure you cut down to a lower elevation, which has much larger contours. Everything within that contour line is overexposed, so the detector is pinned and there's no contrast.

I apologize if you might already understand and I’ve dumbed it down too much.
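The contour analogy maps directly onto thresholding an intensity profile. A minimal sketch, assuming a Gaussian glare profile (shape and numbers are illustrative only):

```python
# The "elevation contour" analogy in code: the pinned (saturated) region
# of an assumed Gaussian glare profile grows as exposure increases.
import numpy as np

yy, xx = np.mgrid[-100:101, -100:101]
profile = np.exp(-(xx**2 + yy**2) / (2 * 15.0**2))  # assumed glare shape

saturation = 1.0
for exposure in (2, 20, 200, 2000):
    pinned = (exposure * profile) >= saturation   # overexposed: no contrast left
    radius = np.sqrt(pinned.sum() / np.pi)        # equivalent contour radius
    print(f"exposure {exposure:4}: pinned out to r ~ {radius:.1f} px")
```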
 
This I don't understand, unless it's caused by the upper picture being defocused.

The sky is a light source that is brighter than the sensor can record? And reducing the glare makes it show. The trees are discernible because they're now darker than the sky, but they don't have discernible shades or color.
I think this particular video is perhaps complicated by the varying levels of occlusion of the sun by the plants.

Here's another illustration.

2023-08-26_13-07-05.jpg

First frame has full sun (note the actual size of the sun is the same as the green dot), with a large glare that goes in front of my garage. The shape is flattened, and oriented relative to the camera's rotation.
Second frame, the sun is occluded (blocked) by the garage, but only just, so the brighter atmospheric scattering still shows up as bright white, and slightly veils the garage with in-camera scattering.
Third frame, the sun is a bit more behind the garage, so we don't see any effect in front of it, but we still see atmospheric spread.

So I suggest the gimbal glare is the in-camera type - largely because of the observables seen in the video.
 
Think of the glare as like the topography of a mountain, with your image cutting a slice at a certain elevation. At low exposure settings it's like cutting at a high elevation, and you just see the tip of the mountain; with more exposure you cut down to a lower elevation, which has much larger contours. Everything within that contour line is overexposed, so the detector is pinned and there's no contrast.
20230826_221754.jpg
My contention is that you won't be able to bring out the green zigzag inside the glare however you adjust the exposure, because it's gone. But Mick's two pictures seem to disprove that idea, until you realise that the trees you think you see are actually underexposed glare that is darker than the sky, which we do see.
2023-08-26_12-29-26.jpg 2023-08-26_12-41-51.jpg
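That contention is easy to demonstrate in code: once a region saturates, no later exposure or level adjustment of the recorded image can recover what was inside it. A minimal sketch with made-up numbers:

```python
# Sketch: detail inside a saturated region is destroyed, not merely hidden.
# The "zigzag" here is an assumed small modulation riding on a bright glare.
import numpy as np

glare = np.full(10, 500.0)
glare[4:7] += 50.0                      # the detail, riding on the glare

recorded = np.clip(glare, 0, 255)       # sensor saturates at 255: all pinned
rescaled = recorded * (255.0 / recorded.max())  # any after-the-fact adjustment

print(recorded)   # flat 255s -- the modulation never made it into the data
print(rescaled)   # still flat -- no adjustment brings the detail back
```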
 
So I suggest the gimbal glare is the in-camera type - largely because of the observables seen in the video.
Yes, we already know that because it rotates with the camera. It probably obscures a smaller atmospheric glare inside it?

I'm working on your claim that exposure affects the size of the glare; occlusion is a distraction in that regard. I'd say exposure affects the size of the overexposed area? Which is obviously true. And because the glare spans so many orders of magnitude of illumination, there's a continuum of sizes. And if you're exposing for the brighter parts of the glare, the darker parts become underexposed, but that doesn't mean it goes away?
 
This shows the effects of exposure on glare size, with the sun fully in view the entire time after I remove the filter.


Again we see the entire scene wiped out at first, then the (in-camera) glare shrinks as the exposure is reduced (i.e., less light being captured). But I think we can see more details of the siding and the tree in the less exposed image.
2023-08-26_13-28-03.jpg

The fully saturated portion of the glare is certainly smaller though.

Dave Falch posted an interesting comparison yesterday showing IR glare vs visible glare. The IR glare has a much more well-defined boundary between fully saturated and not.

2023-08-25_21-52-38.jpg

So we have to keep in mind that the visible light glares are not direct analogs of the IR glares.
 
Dave Falch posted an interesting comparison yesterday showing IR glare vs visible glare. The IR glare has a much more well-defined boundary between fully saturated and not.

2023-08-25_21-52-38.jpg

So we have to keep in mind that the visible light glares are not direct analogs of the IR glares.
It's a little hard to tell from these two because one is inverted and we don't know the relative scalings of the stretch. But also, in the IR, materials may have different relative reflectivities. Things could definitely look different in IR and not be intuitive based on our visible-light understanding of how things look around us.
 
Dave Falch posted an interesting comparison yesterday showing IR glare vs visible glare. The IR glare has a much more well-defined boundary between fully saturated and not.

2023-08-25_21-52-38.jpg

So we have to keep in mind that the visible light glares are not direct analogs of the IR glares.
He keeps proving your points but saying you're wrong.
 


This is FLIR footage of the same scene as earlier; we see the sun emerge, with a very well-defined glare. Here it is in Black Hot:



We see a smaller glare as the sun is emerging (only part of the sun is visible, so less light/heat), then when it's in the open we get a nice, well-defined, larger glare. There's no real veiling glare or atmospheric scattering apparent before it emerges.
2023-08-26_14-33-31.jpg

This reminds me of the Omaha Sphere going behind the horizon.

 
Essentially, in order to demonstrate it we'd need a FLIR camera capable of as narrow an FOV as ATFLIR, and then we'd have to take it to a military test range where we'd have a clear LOS to jets up to 30NM away, ideally at as high an elevation as possible, without getting arrested by the FBI :)

Somewhere like the Mach Loop:

https://machloop.co.uk/

Does Falch have access to this equipment?
 
Back to the Chilean/Gimbal comparison. I replaced one of the engines on an A340-600 with an F/A-18F, to scale.
2023-08-26_15-32-15.jpg


Then dropped this in Sitrec.

2023-08-26_15-39-26.jpg

On the right we see the original video, and the correctly scaled plane viewed with the same FOV (as discussed earlier).
We can zoom in on both the same amount (2000%, 20x)

2023-08-26_15-39-59.jpg

And then position the A340+F18 over the glare:

2023-08-26_15-41-08.jpg

Then we get the comparison with the F-18 in Gimbal:
2023-08-26_15-49-27.jpg


And combine, scaling the F/A-18F to the correct size, centering the engines in the glare.
2023-08-26_15-58-42.jpg

So we see the glare around a single engine is comparable in size to the Gimbal glare:
2023-08-26_16-00-32.jpg

It actually fits inside it!

2023-08-26_16-02-04.jpg

And that one probably wasn't the best fit, as it's 70NM away. Here's one a bit closer at 54NM:

2023-08-26_16-09-13.jpg
 
Dave Falch posted an interesting comparison yesterday showing IR glare vs visible glare. The IR glare has a much more well-defined boundary between fully saturated and not.
One factor here is that IR isn't affected as much by moisture in the air, so there's less atmospheric glare with IR.
 
This shows the effects of exposure on glare size, with the sun fully in view the entire time after I remove the filter.
View attachment IMG_0101-clip.mp4

Again we see the entire scene wiped out at first, then the (in-camera) glare shrinks as the exposure is reduced (i.e., less light being captured). But I think we can see more details of the siding and the tree in the less exposed image.
2023-08-26_13-28-03.jpg

The fully saturated portion of the glare is certainly smaller though.
I don't think I can see more details.
2023-08-26_13-28-03.jpg

Cover up the area outside the red lines to prevent optical illusions, and there's no trace of the thin dark line inside, where the saturated area was.

So it's mostly "lens glare".
 
The target may have been less saturated in the raw video, but the video processor adjusted the level and gain for the whole frame instead of just the center.
That's a good point. The sensor may be able to resolve detail that gets lost when the bit depth is reduced? And when the dynamic range of the image is reduced, these details then have a chance to show up. Kind of like how fiddling with the contrast in image editing can kill details.

This would be a way for detail to become visible inside the "glare" area once the exposure is adjusted.
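A minimal sketch of that mechanism, with assumed bit depths and values: detail that is clearly present in high-bit-depth raw data can collapse into a single output level after a whole-frame level/gain mapping.

```python
# Sketch: small raw-data detail lost when level/gain maps 14-bit data
# to 8 bits for display. All numbers are illustrative assumptions.
import numpy as np

raw = np.array([16000, 16020, 16040, 16020, 16000], dtype=float)  # bright target
                                                                  # with small detail
# Whole-frame level/gain: map 0..16383 raw counts to 0..255 display levels:
display = np.round(raw / 16383 * 255)
print(display)            # [249. 249. 250. 249. 249.] -- detail crushed

# A tracker working on the raw data still sees the modulation clearly:
print(raw - raw.min())    # [ 0. 20. 40. 20.  0.]
```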
 
2023-08-27_08-18-02.jpg
This is at 13:58:15 in the original video. It appears to be shaky motion blur from a magnification change. It's going from 1012 to 675, so this is blurred 1012. I scaled up the A340 to 1.5x (1012.5/675), and it matches the blurred tracks of the four engines - although they fade out/vanish at different points, possibly from being occluded by internal optics.
 
That's a good point. The sensor may be able to resolve detail that gets lost when the bit depth is reduced? And when the dynamic range of the image is reduced, these details then have a chance to show up. Kind of like how fiddling with the contrast in image editing can kill details.

This would be a way for detail to become visible inside the "glare" area once the exposure is adjusted.
That's right: the video processor applies ALG (automatic level and gain), which probably reduces bit depth and blows out the target; it applies image sharpening that causes the glow around the target, and adds symbology for the pilot. The target tracker processes the raw data, not the enhanced video with its reduced bit depth, so it sees more detail.
 
I've updated Sitrec with what I'm pretty sure are correct FOV changes and model sizes for Gimbal, Chilean, and Flir1. They all now have a slider for "video zoom", which will zoom the video and the simulation in sync.
2023-08-27_16-51-37.jpg
If you hold down the shift key you can move and scale those windows.

To verify the sizes, there's a Target Sphere you can adjust with "Target Size" - like setting it to 45 feet for the wingspan of an F/A-18F.

https://www.metabunk.org/sitrec/


An F/A-18 seems about the right size, but not really the right shape - at least with the default setup.
2023-08-27_16-55-57.jpg
 