The Shape and Size of Glare Around Bright Lights or IR Heat Sources

Mick West

When you take a photo or video of a bright light source (or even just look at it) the light often appears a lot bigger than it actually is. The phenomenon is well known, but not really something I'd thought about. Why does it happen? What determines the shape of this glare, and how big it gets?

Glare comes up in several topics here in Metabunk, but there are two in particular. Firstly there's the glare around the Sun, which has led some Flat Earth people to claim that the sun gets visibly bigger during the day. However, it actually stays the same size and is just brighter.

Secondly, there are UFOs on thermal cameras. Jet engines are very hot, and the glare around those engines can be bigger than the plane itself, resulting in an odd shape. The classic example is the Chilean Navy UFO:

engine flares banked closeup overlay.jpg

Here the actual "light" (the heat source) is quite small, but the glare is very large. Multiple glares combine and overlap, obscuring the actual shape of the planes.

That example was confirmed by radar tracks to be a particular plane. A hypothesized example is the "Gimbal" UFO video:


This was discussed at some length in that thread and continues to be discussed on Facebook. The notable things about the Gimbal video are:
  1. The rotation, which seems very unlike a normal aircraft.
  2. The shape of the glare - a flatter oval, with protuberances
  3. The "aura" around the object (which is actually a darkening, as the video is black=hot)

#1 and #3 are discussed at length in the other thread. Both things can be seen in this animation by @igoddard :



The aura is simply something you get around any bright glare in an IR image. It does not indicate anything of interest.

The rotation of the glare is due to the rotation of the camera relative to the horizon. The image is then derotated to keep the horizon level in the display. Since the glare is an artifact of the optical system, it stays fixed relative to the optics and so appears to rotate relative to the horizon, even when nothing in the scene is rotating. This derotation is described in the patent for the ATFLIR system used for the GIMBAL video:

This derotation, and the potential for rotating glare/flare is discussed here. It's not really disputed, but can be difficult to explain. However it's clear that rotating glares can happen.

Glare or Flare
I'd previously referred to this as a "flare" or "glare" interchangeably. However "flare" is more commonly associated with "lens flare", which is a reflection inside the lens resulting in multiple images of the light - often as a line of different circles, or sometimes as a single reflection. To avoid confusion I'm going to try to stick to "glare" - meaning an enlarged and distorted shape of the light viewed directly.
Metabunk 2019-03-30 08-43-13.jpg


So with the rotation and aura explained, the questions remaining with the Gimbal video are: "why is it shaped like that?" and "what is the underlying object?"

The first question raised would be: "is it even a glare?" Perhaps this is instead the actual shape of the object?
Metabunk 2019-03-30 08-53-21.jpg

I think not, largely because of the way the shape changes. Is the above the true shape, or is it this earlier shape?
Metabunk 2019-03-30 09-02-16.jpg
Or the more dramatically different "white-hot" image at the start?
Metabunk 2019-03-30 09-04-03.jpg

But perhaps the biggest problem with this hypothesis is: how do we get a glare with this shape? In the Chilean Navy case, the shape of the glare came from the configuration of the engines. But here the shape rotates, which suggests it is an artifact of the optical system. Can we get saucer-shaped glares that rotate? Yes, we can:

20171219-100429-rpsp1.jpg

This again was discussed in the other thread. A flashlight with the reflector removed was used as a bright light source. Images were taken with an infrared camera at different angles, and the result was a saucer-shaped glare with a distinct long axis that rotated with the camera.

Metabunk 2019-03-30 09-15-20.jpg

This is nothing new, but with the renewed discussion on Facebook I set out to investigate further what actually makes these shapes, and whether the Gimbal shape is plausible.

For one experiment I set up a bright light and videoed it with a camera that I rotated by hand, then derotated in software (the equivalent of the ATFLIR mirror derotator)
Flashing and Contour HD roating flare setup.jpg

The result again was a saucer-shaped glare that rotated.


Now, this is still a ways from replicating the Gimbal glare shape, but it's a start. Given the complexity of the ATFLIR's gimbaled optical path, it seems plausible that a more complex shape would emerge.

Something that struck me was that I consistently got this saucer shape with different cameras. My initial idea was that the glare was due to streaks on the glass at the front of the ATFLIR. This is probably what leads to rotating glare like this:
http://www.military.com/video/opera...trikes/f-18-takes-out-insurgents/658386321001


Streaky glass tends to give these long flares though, which is not what we see. While I did replicate a rotating saucer-shaped glare by rotating streaked glass, it still had some long streaks.

Metabunk 2019-03-30 09-48-05.jpg
Note though that it also had some short streaks, which kind of match the short axis of the Gimbal.

So it's possible that the streaks were just too thin to be recorded in the video.

This whole issue kind of got lost in the other thread, so I'd like to revive (and hopefully resolve) it here. What could cause this rotating glare shape? Can it be replicated?
 

Attachments

  • Rotating flare on flashlight - BW Low.mp4
    598.6 KB

Here I've adjusted the video of my "streaky slide" to be more like the Gimbal video in terms of contrast and resolution. The long streaks are less apparent. It does change shape a lot, but that's probably because the slide is moving around, whereas in the Gimbal situation it would be more fixed.

However, going back to this one:



There was no extra streaky glass. So where does the shape come from? Well, perhaps from the glass on the front of the camera? It's a bit dirty, though not really streaky.
Metabunk 2019-03-30 10-22-03.jpg

I gave it a thorough cleaning, and this did seem to reduce the oblateness of the glare.
Metabunk 2019-03-30 10-29-30.jpg

Hard to get REALLY clean though. This is after I cleaned it.
Metabunk 2019-03-30 10-42-29.jpg
 

Attachments

  • UFO rotating Infrared Flare compare slide torch.mp4
    2.2 MB
For one experiment I set up a bright light and videoed it with a camera that I rotated by hand, then derotated in software (the equivalent of the ATFLIR mirror derotator)
Flashing and Contour HD roating flare setup.jpg

The result again was a saucer-shaped glare that rotated.

I have some questions about this experiment.

Was this an IR camera? Otherwise I'm wondering why the light appears black.

What causes the small secondary black spot that is moving around? - lens reflection?

What does the "derotating" software actually do? Does it track image features to infer a rotation matrix which it then inverts?

It might be helpful to see the un-derotated video to get a better sense of what the software is doing.
 
Was this an IR camera? Otherwise I'm wondering why the light appears black.
It's IR (actually this one is not; the one with the slide is IR), but the image is inverted, to match the Gimbal video (which is mostly inverted, but starts out uninverted, with white=hot).

What causes the small secondary black spot that is moving around? - lens reflection?
Lens reflection: a normal "lens flare" (not glare).

What does the "derotating" software actually do? Does it track image features to infer a rotation matrix which it then inverts?
Essentially yes. See the two yellow balls? I put them there to give the software something easy to track.
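
For anyone who wants to try this at home, here's a minimal sketch of that kind of derotation (Python with OpenCV). It finds the two yellow markers by colour, measures the angle of the line between them, and rotates each frame back by the change in that angle. The file names, HSV thresholds, and frame rate are placeholders, and it assumes the markers are the only strongly yellow blobs in view:

```python
# Minimal derotation sketch: track two yellow markers, then rotate each frame
# so the line between them keeps a fixed angle. Placeholder file names/values.
import cv2
import numpy as np

cap = cv2.VideoCapture("underotated.mp4")          # placeholder input file
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
out = None
ref_angle = None                                   # marker-line angle in the first good frame

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Find the two largest yellow blobs (the tracking balls). OpenCV 4.x API.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (20, 100, 100), (35, 255, 255))   # rough "yellow" range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 20]
    blobs = sorted(blobs, key=cv2.contourArea, reverse=True)[:2]
    if len(blobs) < 2:
        continue                                   # lost track this frame; skip it

    centers = []
    for c in blobs:
        m = cv2.moments(c)
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    centers.sort()                                 # keep a consistent left/right ordering
    (x1, y1), (x2, y2) = centers
    angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))

    if ref_angle is None:
        ref_angle = angle
    h, w = frame.shape[:2]
    # Rotate back by the change in angle (flip the sign if it goes the wrong way;
    # no handling of 360-degree wrap in this sketch).
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle - ref_angle, 1.0)
    derotated = cv2.warpAffine(frame, M, (w, h))

    if out is None:
        out = cv2.VideoWriter("derotated.mp4", fourcc, 30, (w, h))   # placeholder fps
    out.write(derotated)

cap.release()
if out is not None:
    out.release()
```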

It might be helpful to see the un-derotated video to get a better sense of what the software is doing.
 

Attachments

  • FILE0007 Underotated torch with balls.mp4
    2.4 MB
Sorry, just to clarify: this one (in color) is obviously not IR. I just converted it to B&W and inverted it. The video with the slide is using an IR camera, but is also inverted.
 

I think the effect is even clearer without the derotation. You can clearly see the glare move relative to the background, which I guess proves that it's coming from the lens or something attached to the camera rather than from the light.
 
I think to get into the details here we need to understand what causes glare in the first place. Why is there a big light in the image when the actual light is small? The answer seems to be mostly diffraction.

When the light from a bright object makes an image on the sensor (or film), theoretically all the light should fall within that image. However, whenever light hits something, like the edge of the aperture, or dirt or streaks on the glass, or imperfections in the lens, it gets bent by diffraction, which spreads the light out in directions that depend on the shape of the obstruction.

In astrophotography, large telescopes have a setup like this:


This leads to what are called "diffraction spikes", like in this image from the Hubble Space Telescope.



You also get less dramatic spikes from apertures that are not circular. The degree of the spike is related to the size of the aperture. Here's one photo at F/4.5
Metabunk 2019-03-30 11-15-27.jpg
F/10
Metabunk 2019-03-30 11-16-34.jpg

And then F/20
Metabunk 2019-03-30 11-16-00.jpg

Now in all three cases there is still significant glare, but the spikes are smaller with the wider aperture. The wide-open glare seems more circular, and the small-aperture glare is more hexagonal.

So it would seem that the inner glare here is the even diffusion of light all around the edge of the aperture (or just the circular sides of the lens interior if the aperture is fully open).

This glare is lens relative, and so it does the same kind of rotation we are talking about here.

So, to extend this to the Gimbal video, could it be something in the camera itself making this shape? Some diffraction spikes from something in the light path, like the de-rotating mirrors? Or maybe a combination with dirt/water/smudges on the lens?
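
As a rough way of seeing how the aperture shape sets the pattern, here's a sketch (Python with NumPy/Matplotlib) that treats the glare around a point source as the Fraunhofer diffraction pattern of the aperture, i.e. the squared magnitude of its Fourier transform. It's an idealized single-wavelength model, not a simulation of any particular camera, but it shows the general behaviour: an open circular aperture gives a round blob, and thin vanes across it give long spikes at right angles to the vanes.

```python
# Toy diffraction model: the far-field glare pattern of a point source is
# approximately |FFT(aperture)|^2 (Fraunhofer approximation, single wavelength).
import numpy as np
import matplotlib.pyplot as plt

N = 512
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.hypot(x, y)

def psf(aperture):
    """Point spread function (diffraction pattern) of a binary aperture mask."""
    field = np.fft.fftshift(np.fft.fft2(aperture))
    return np.abs(field) ** 2

# 1. Open circular aperture -> Airy-like circular blob.
circle = (r < 60).astype(float)

# 2. Same aperture with thin cross-shaped support vanes (telescope-style)
#    -> four diffraction spikes, perpendicular to the vanes.
vanes = circle.copy()
vanes[np.abs(x) < 2] = 0
vanes[np.abs(y) < 2] = 0

for i, (name, ap) in enumerate([("open circle", circle), ("circle + vanes", vanes)]):
    plt.subplot(2, 2, i + 1)
    plt.imshow(ap, cmap="gray")
    plt.title(name); plt.axis("off")
    plt.subplot(2, 2, i + 3)
    plt.imshow(np.log1p(psf(ap)), cmap="inferno")   # log scale to show the faint spikes
    plt.title("diffraction pattern"); plt.axis("off")
plt.tight_layout()
plt.show()
```

Swapping in other masks (polygonal aperture blades, streaks, smudges) changes the pattern accordingly; in this toy model the glare shape is effectively a fingerprint of whatever is in the optical path.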
 
In the case of the Gimbal video, isn't the purpose of the camera rotation mechanism only to counteract changes in the aircraft's attitude?

If that's the case it shouldn't be moving unless the aircraft's attitude is changing. I think there's HUD symbology in the FLIR videos which indicates the aircraft attitude. Can we tell if the apparent object rotations correlate with changes in that symbology?
 
In the case of the Gimbal video, isn't the purpose of the camera rotation mechanism only to counteract changes in the aircraft's attitude?

No, it has to rotate the view because there are only two axes of rotation on the ATFLIR, one of which is along the long axis, so you can't track something without rotating the view. See this example:

Source: https://www.youtube.com/watch?v=fxlKkn1b4IY


So the derotation is to correct this "pendulum effect". It's also a bit more complicated as there are coarse and fine gimbals, so the derotation need not be continuous.
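
To get a feel for how much roll the tracking alone demands, here's a sketch (Python/NumPy) of an idealized roll-then-pitch gimbal, with the roll axis along the aircraft's longitudinal axis. It's just the geometry of that type of mount, not the actual ATFLIR control law, but for a target passing near the nose at low elevation the required roll (and hence the orientation of the horizon in the raw, un-derotated image) swings through nearly 180 degrees, which is the "pendulum effect" the derotation has to correct.

```python
# Idealized roll-then-pitch gimbal (roll axis along the aircraft's nose).
# For a target direction, find the roll/pitch needed to point the boresight
# at it, and how far the horizon would appear rotated in the raw image.
import numpy as np

def rot_x(a):   # roll about the forward (x) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):   # pitch about the y axis (positive tilts the boresight up)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def gimbal_angles(az, el):
    """Roll and pitch that point the boresight at azimuth/elevation (radians).
    Frame: x forward, y right, z up."""
    t = np.array([np.cos(el) * np.cos(az), np.cos(el) * np.sin(az), np.sin(el)])
    roll = np.arctan2(-t[1], t[2])
    pitch = np.arctan2(np.hypot(t[1], t[2]), t[0])
    return roll, pitch

def horizon_roll(roll, pitch):
    """Apparent rotation of the world 'up' direction in the raw camera image."""
    R = rot_x(roll) @ rot_y(pitch)
    cam_up, cam_right = R @ [0, 0, 1], R @ [0, 1, 0]
    z = np.array([0, 0, 1])
    return np.arctan2(z @ cam_right, z @ cam_up)

for az_deg in range(-20, 21, 5):        # target sweeping across the nose, 2 deg above horizon
    roll, pitch = gimbal_angles(np.radians(az_deg), np.radians(2.0))
    print(f"target az {az_deg:+3d} deg: gimbal roll {np.degrees(roll):+6.1f} deg, "
          f"horizon appears rotated {np.degrees(horizon_roll(roll, pitch)):+6.1f} deg")
```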
 
Glare Generation Based on Wave Optics - Kakimoto, et al,
Metabunk 2019-03-30 12-44-55.jpg
http://nishitalab.org/user/nis/cdrom/pg/glare_m.pdf

Glare is a phenomenon whereby bright light sources or reflections cause a spreading of light, mainly in human eyes. It is perceived as a blurry circle or a set of radial streaks around the light source. Recently, glare generation has become a common rendering technique to enhance reality in computer-generated imagery.

It is widely known that glare is caused by diffraction and scattering at obstacles close-to or inside the eye (Figure 1). Rays from a light source are first diffracted by the eyelashes (Some people make their eyelashes curl upward or have such eyelashes by nature and in these cases the eyelashes have little affect.) and sometimes by the edge of the eyelids [1]. This produces long radial streaks of glare. After entering through the cornea, the light rays are diffracted or scattered by the edge of pupil, which causes a blurry corona to appear around the light sources. Even suspended matter in the vitreous body is said to cause diffraction. Other than diffraction and scattering, aberration of the lens and anomalies of refraction are also sources of glare, but their impact is relatively limited.
Content from External Source
While this article is focussed on the eye, it confirms the glare is largely a diffraction phenomenon from obstacles and the edges of the aperture (the pupil in this case).
 
Some relevant things from the longer thread:
Here's something called the glare-spread function.
Following that link we see it talks about "veiling glare", which is a general term for glare that obscures part of the image, and seems frequently to refer to glare that gives the entire image a hazy look with reduced contrast.
Metabunk 2019-03-30 12-57-47.jpg

In the "Glare Spread Function" chart, the narrow part of the central spike seems to be the actual sun, the widening at the "base" is perhaps hard glare, and the light below this is much weaker.
 
To investigate the shape of glare, I set up an open "camera" consisting of a lens, and a backplane representing where the sensor would be.
Metabunk 2019-03-30 14-23-14.jpg
This was put in the sun pointing at the window. You can see the sun on the paper, and a bit of the window. The lines are used for focus.
Metabunk 2019-03-30 14-24-33.jpg

There's the glare! Or is it? I found it hard to tell what was light spread on the projected image, and what was glare in the camera taking a photo of the "photo"

I set up a bigger camera, at an angle.
Metabunk 2019-03-30 14-27-18.jpg
I then used a variable ND filter to adjust the brightness, getting:


I then moved the camera so it was at a shallow angle, and the image flattened, showing the glare was from the first lens, and that what we see on the paper is what actually arrives at the sensor (for this lens, at that aperture).
 

Attachments

  • MVI_6717 star grow shrink.mp4
    4.2 MB
  • MVI_6718 edge on star glare .mp4
    6.5 MB
Here's a great video by Dave Falch showing a somewhat similar glare around a jet flying away.




Metabunk 2019-04-05 16-17-00.jpg
 
This is great analysis. The rotation seen in gimbal is now clearly explained.

I think however this video introduces some questions:

The pilots would have just needed to switch to TV mode to see the object clearly as the video proves. So this is one more point against the aircraft theory. If you can see the glare from the IR you can see the aircraft in TV mode. They would have had to be completely out of their mind to forget to attempt to switch to TV mode.

What concentrated heat source has a similar TV image to its heat signature (the FLIR video shows a TV image too, and they look similar) and is bright (=warm) enough to cause glare in an ATFLIR at a range between 20 and 40 nautical miles (the maximum theoretical range for ATFLIR)?
  • A flare attached to a balloon?
  • A balloon burning?
  • A balloon with a reflector of some kind or a mirror reflecting sunlight directly?
Incidentally, those wouldn't have much of a radar signature, and radar may not lock onto them, but they may reflect a lot of light/IR.

Pilots reported visual sightings of an "orb with a cube suspended inside of it with the cube's corners touching or nearly touching its edges", and similar balloons have been proposed as an explanation in the past: https://www.thedrive.com/the-war-zo...are-encountering-be-airborne-radar-reflectors

 
This has been offered as a rebuttal. I personally find it a bit curious! Is it really relevant? Hmm?

It's not relevant at all. All that he demonstrates is that IR and visible light cameras use different lenses with different transmissibility.

The glare in GIMBAL is an IR glare. Visible light is not a factor. He seems confused as to what the point is, and has created a demonstration of something that's not in dispute and is essentially irrelevant.
 
It's not relevant at all. All that he demonstrates is that IR and visible light cameras use different lenses with different transmissibility.

The glare in GIMBAL is an IR glare. Visible light is not a factor. He seems confused as to what the point is, and has created a demonstration of something that's not in dispute and is essentially irrelevant.

I don't think they understand what IR is... they seem to think it isn't "light" :p. But hey, they do have a lot of cool equipment!
 
I don't think they understand what IR is... they seem to think it isn't "light" :p. But hey, they do have a lot of cool equipment!
Dave Falch has seemingly, over several years, developed a misunderstanding of thermal radiation. As he's a FLIR tech, he knows there are two different major types of thermal sensor - photon-counting sensors (cooled) and thermal-effect sensors (uncooled). The photon-counting sensors are more like regular camera sensors - photons hit the sensor, and this is detected, but they need to be cooled as there's lots of IR radiation flying around inside a camera.

Thermal-effect sensors work, essentially, like an array of tiny thermal probes. "Microbolometer" is the common term. They work by the incoming IR radiation heating up the sensor. There's a heat sink to carry the heat away. There's a good overview of the tech here:

https://www.flir.com/discover/rd-science/high-speed-thermal-cameras--the-need-for-speed/
In general, there are two types of thermal infrared cameras in use today. These are high performance cooled photon-counting cameras and low cost uncooled microbolometer-based cameras.
Content from External Source
http://www.flirmedia.com/MMC/CVS/Appl_Stories/AS_0015_EN.pdf
Metabunk 2020-05-13 08-37-44.jpg

Note it says "The temperature of the plate changes when a photon falls on it." Both types of camera work off photons - i.e. infrared radiation. The uncooled (heat-effect, microbolometer) cameras just take more photons, as they need to actually heat the sensor. This creates a significant difference in speed.
https://www.flir.com/discover/rd-science/high-speed-thermal-cameras--the-need-for-speed/
Metabunk 2020-05-13 08-45-23.jpg

Dave Falch seems to have developed from this the idea that uncooled thermal cameras don't measure infrared radiation, they measure "heat".

Of course, "heat" here is just the infrared radiation that an object gives off. There's no difference in what comes into the camera. Both types are essentially recording infrared "light"
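
If it helps, you can put rough numbers on that. The "heat" arriving at either type of sensor is blackbody (Planck) radiation, and the in-band power climbs steeply with temperature. Here's a small sketch (Python/NumPy) integrating Planck's law over the roughly 3-5 µm and 8-14 µm bands typically used by cooled and uncooled thermal cameras; the temperatures are just illustrative examples.

```python
# Spectral radiance of a blackbody (Planck's law), integrated over two
# common thermal-imaging bands. Illustrative temperatures only.
import numpy as np

h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength, T):
    """Spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return (2 * h * c**2 / wavelength**5) / np.expm1(h * c / (wavelength * kB * T))

def band_radiance(lo_um, hi_um, T):
    """Radiance integrated over a wavelength band, W / (m^2 sr)."""
    wl = np.linspace(lo_um, hi_um, 2000) * 1e-6
    return np.trapz(planck(wl, T), wl)

for T in (300, 600, 1000):   # e.g. ambient background vs. hot engine parts (examples)
    mwir = band_radiance(3, 5, T)     # band typical of cooled sensors
    lwir = band_radiance(8, 14, T)    # band typical of uncooled microbolometers
    print(f"T = {T:4d} K: MWIR 3-5 um {mwir:10.1f}   LWIR 8-14 um {lwir:10.1f}   W/(m^2 sr)")
```

The point is simply that both kinds of sensor are receiving the same thing - infrared electromagnetic radiation - and a small, very hot source delivers vastly more of it than the surrounding scene, which is exactly the situation that produces glare.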

But he's taken this vague misunderstanding, and then claimed it somehow shows that my glare theory is wrong, because "heat" does not make glares as visible light does.

Of course, this is just nonsense. You don't even need to understand his confusion about heat vs. photons - just point to the numerous videos of infrared glare. Nobody has ever said it's visible light glare. He's even made videos himself of glare, and seen it in other videos, and called it "heat glare"
Metabunk 2020-05-13 09-03-31.jpg

Source: https://www.youtube.com/watch?v=7JuyQWiD5HU&lc=Ugy8A5c3JfjsIzNMzMt4AaABAg.9-JSDYgFstY9-RlbRd-lo-

He also seems to have a mental block regarding the glare rotating independently of the horizon, despite it being explained numerous times.
Metabunk 2020-05-13 09-00-35.jpg

Unfortunately, people are grasping on to him as an expert, and, having no idea of what he's talking about, they think he's actually proven something. Jeremy Corbell being the prime example. But then he tells other UFO fans (like Joe Rogan) that this is meaningful, and the nonsense spreads.

Dave himself seems fully entrenched. I think he takes my criticism personally, which has led him to push back and not listen to anything that contradicts his initial assessment. I had hoped he would have come around by now, but apparently not.

It's a fascinating mess, and an interesting science communication challenge.
 
Dave Falch has seemingly, over several years, developed a misunderstanding of thermal radiation. As he's a FLIR tech, he knows there are two different major types of thermal sensor - photon-counting sensors (cooled) and thermal-effect sensors (uncooled).

I see.

Actually his videos are great. I think they are interesting comparisons of the difference between IR and TV mode. The images of aircraft are usually clearly distinguishable. Not sure how his equipment compares to the Navy's though... why does the Navy ATFLIR TV mode just stay B/W, for example, do you know?
 
ATFLIR uses a low-light TV camera; probably the sensor is more sensitive at the expense of colour vision, similar to the way human vision works with rods/cones. There may be a colour mode available, but it's likely rarely used, as colour information is not that useful to a pilot compared to better low-light vision.

Part of the issue with researching this whole thing is that the technology used is classified, and the exact specs and stuff are not available, leaving a knowledge gap. For instance, we found various sources for the FOV used in NAR mode, and found that newer systems likely have greater zoom and a narrower field of view, but it's hard to know which exact systems were in use on each aircraft across the videos.
 
ATFLIR uses a low-light TV camera; probably the sensor is more sensitive at the expense of colour vision, similar to the way human vision works with rods/cones. There may be a colour mode available, but it's likely rarely used, as colour information is not that useful to a pilot compared to better low-light vision.

Part of the issue with researching this whole thing is that the technology used is classified, and the exact specs and stuff are not available, leaving a knowledge gap. For instance, we found various sources for the FOV used in NAR mode, and found that newer systems likely have greater zoom and a narrower field of view, but it's hard to know which exact systems were in use on each aircraft across the videos.

I don't think there was an RGB camera on that pod in 2004. Other pods have RGB cameras. The technical data is unclassified but export controlled, proprietary, etc. Only certain performance parameters are classified.
 
I think this video gives a good idea of what IR glare would look like on one of those sensors.

Around 9:43 you see an F-18 go to full afterburner. It is followed by basically the same image in visible light. The heat increases significantly. A couple of observations:
  • the sensor is very good/fast at adjusting exposure
  • the heat signature extends way beyond the airframe (obviously); compared to visible light it is around 3-4 times larger
  • aircraft shape is still visible and differences in temp are still detected despite the major new heat source
  • some light glare is visible around the afterburner and slightly rotates
Around 10 minutes in you see another interesting case of an F-18 dropping flares.
  • despite flares being engineered to fool IR sensors, it doesn't look like the sensor has any trouble detecting the F-18.
  • on the contrary, it lowers exposure and more detail is visible. The sensor adjusts very fast and precisely here.
Notice that several bomb drops are shown in the video. Not many glares, despite the sensor clearly being momentarily overwhelmed (see 10:10 for example: huge explosion, no glare).

The video is from 2016. So around the time of the Roosevelt videos. The "glare" seen in GIMBAL seems very dramatic. I think regarding that video:
  • the dramatic glare seen is not normally seen with ATFLIR. This would explain the reactions heard despite a WISO being trained to look at the world through ATFLIR for years. They said "it's rotating" because normally glare wouldn't show up like that at all, let alone rotate.
  • ATFLIR didn't adjust exposure at all across the video. So it didn't "feel" overexposed
  • This is as much detail as ATFLIR could give us of the thing. Which is weird. It was either uniformly bright or had a powerful source of IR shooting directly in the direction of the sensor and overwhelming it locally.


Source: https://youtu.be/Hvy5RyJpB5g?t=583
 
Pardon the probably silly question, but don't the FLIR Raytheon cameras on the F/18 have anti glare lenses?
This reflects a problem with the terminology. What is an "anti-glare lens"?

Anti-glare glasses have an anti-reflective coating. This lets a bit more light in (as less is reflected), and does not really help much with glare by itself. Anti-reflective coatings inside a multi-element lens will reduce lens flare, but not glare around a light source.
https://en.wikipedia.org/wiki/Anti-reflective_coating

"Anti-glare" photographic filters and some glasses sold for night driving are polarized light filters. The "glare" they address is an entirely different optical effect - basically sunlight that has either been reflected off a surface, or (to a lesser extent) diffused through clouds. The reflection partly polarizes the light. The "night driving" glasses often also have a yellow tint which can improve scene contrast for the human eye.

But the glare we are talking about here is the unfocused or scattered light glare that surrounds an overexposed bright light. The only filter that can remove the glare around a bright light is one that makes that light darker - i.e. a neutral density filter. The most obvious example is the sun, where you can remove the glare by reducing the scene brightness by a factor of 100,000. Of course, this results in the scene being black.
solar-filter.jpg
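
For scale, a typical white-light solar filter is around optical density 5, i.e. it passes roughly 1/100,000 of the light. A quick back-of-the-envelope conversion (Python):

```python
import math

attenuation = 100_000                          # factor needed to tame the sun's glare (from above)
optical_density = math.log10(attenuation)      # the "ND" / OD rating of the filter
stops = math.log2(attenuation)                 # equivalent exposure reduction in stops

print(f"OD {optical_density:.1f} filter = about {stops:.1f} stops of exposure reduction")
# -> OD 5.0, about 16.6 stops; everything else in the scene goes essentially black.
```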

Is there an "anti-glare" lens? Well, you can certainly reduce glare with better quality optics. The closer a lens is to optically perfect, the less glare there is going to be. Reducing glare is a desirable function of a lens, yet you are not going to find one that removes it, and for very bright lights (like the sun, or in IR a super-hot small heat source on a large cold background) the amount of reduction possible is small.
 
I think this video gives a good idea of what IR glare would look like on one of those sensors.
People keep making comparisons like this. But they always show a jet that's much closer, and hence larger in the image, than the Gimbal object.

Here I've matched the vertical fields of view, and inverted the "Wildcats" video.
Metabunk 2020-05-16 11-28-40.jpg

What the Gimbal is more like, is this:
Metabunk 2020-05-16 11-35-05.jpg

It won't look like that though. With the smaller size of the object, the glare gets correspondingly larger relative to the object, as the camera exposes for the entire scene. So a bit more like:

Metabunk 2020-05-16 11-41-30.jpg

In addition the object is most likely flying away from the camera, and does not have full afterburners on, so you won't see the exhaust plume, so more like:
Metabunk 2020-05-16 11-43-06.jpg
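
To make the "exposes for the entire scene" point concrete, here's a toy model (Python/NumPy). It treats the glare as a fixed falloff profile around a point source (a Gaussian, purely for illustration) and the display as clipping everything above the scene's white point. The apparent size of the "object" is then set by how far out the glare stays above the clipping level, so it grows with the brightness of the source relative to the rest of the scene, not with the physical size of the source:

```python
# Toy model: apparent glare size vs. source brightness under fixed scene exposure.
import numpy as np

r = np.linspace(0, 50, 5001)       # distance from the source centre, in pixels
sigma = 3.0                        # width of the glare falloff (illustrative)
clip_level = 1.0                   # display white point, set by the overall scene exposure

for source_brightness in (10, 100, 1_000, 10_000, 100_000):
    glare = source_brightness * np.exp(-r**2 / (2 * sigma**2))   # toy falloff profile
    saturated = r[glare >= clip_level]                           # pixels that clip to "white"
    radius = saturated.max() if saturated.size else 0.0
    print(f"source {source_brightness:>7} x scene white point -> "
          f"saturated blob radius ~{radius:4.1f} px")
```

With a Gaussian profile the saturated blob only grows with the logarithm of the source brightness; real glare-spread functions tend to have much longer tails, so the effect is stronger, but the principle is the same.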
 
People keep making comparisons like this. But they always show a jet that's much closer, and hence larger in the image, than the Gimbal object.

I'm trying to find "long distance" ATFLIR shots. No luck so far. We don't really know the GIMBAL range though. Or do we? Can it be estimated given the clouds behind? The clouds make it look like a relatively close object. Also: we are in NAR 2X zoom. It might be more powerful than the ATFLIR used at the time of Nimitz.

It won't look like that though. With the smaller size of the object, the glare gets correspondingly larger as the camera exposes for the entire scene. So a bit more like:

Metabunk 2020-05-16 11-41-30.jpg

How do you know the size of the object? Also, the camera should be adjusting exposure for the target. What do you mean by the "whole scene"?


In addition the object is most likely flying away from the camera, and does not have full afterburners on, so you won't see the exhaust plume, so more like:
Metabunk 2020-05-16 11-43-06.jpg
How do you know it's flying away? What's the evidence for that? And why no afterburners?

The pilots say: "they are going against the wind" and "there's a whole fleet of them on SA".
The SA screen can be seen here:
Source: https://www.youtube.com/watch?v=xdr2_SD3nq4


At 2:25 you have an overview and at 5:52 you see how targets look and the info you can receive (such as heading). The comments by the pilots indicate they could see range, direction and number of objects on this screen. So they probably had radar lock here (either them or others) or the SA wouldn't show them (unless they are laser designating them maybe?).

So the range can't be too crazy, and the objects were probably not going away from the observer, given the bearing of the pod and the movement of the clouds (moving rapidly to the side as the aircraft banks), but moving to the side. I think we can probably establish the exact path by using the info in the video. And the pilots clearly say they are moving against the wind (SA would have confirmed this with a little line indicating heading).

Also: if we know the camera does not create glare with bright afterburners at short range, or with big explosions, why should it glare at long range with less bright objects?

I think to create such glare this needs to be a much more direct, concentrated, and bright source of IR.
Could this be something shining IR directly into the sensor? Maybe on purpose? Or just in general emitting IR directly.

The aircraft in the distance theory just raises more questions than it solves.
 
How do you know the size of the object? Also, the camera should be adjusting exposure for the target. What do you mean by the "whole scene"?
It's smaller on screen.
The whole scene is everything in the image. There would be no point in exposing so that everything except a single target was invisible.

Also: if we know the camera does not create glare with bright afterburners at short range, or with big explosions, why should it glare at long range with less bright objects?
We know IR cameras create glares at long distances from normal-power engines, because that's what we see. Like in this classic example:
engine flares banked closeup overlay.jpg

You can replicate what is going on with a thermal camera:
Candle Flare Comparison.jpg

And we see it in footage using military IR cameras. A glare that's big enough to cover the jet, when it's a certain size on the screen.


And you've seen it yourself in Dave Falch's videos. A glare from an ordinary jet doing nothing special, a glare that obscures the plane.

The aircraft in the distance theory just raises more questions than it solves.
You don't seem to be trying to answer those questions. But they have answers. While I appreciate a rigorous critique, I'd also appreciate it if that was based on actually following what you are critiquing. Glare around IR sources should not be controversial at this point.
 

Attachments

  • optical and digital zoom ATFLIR.mp4
    8 MB
It's smaller on screen.
The whole scene is everything in the image. There would be no point in exposing so that everything except a single target was invisible.

As the cruise videos demonstrate, the camera is very good at avoiding those problems. Exposure is adjusted very fast to avoid loss of detail. Those are not your typical IR cameras, but a complex sensor suite made exactly for this purpose: to track flying targets. The environment is useless here (actually it may be preferable to remove it in most cases, as the sky is empty and the environment would mean clouds or fog). If engines caused this glare it would be unusable, especially for a target seen from the side as in this case (maybe you missed my edit to the above comment... I've added more after my initial post).

We know IR cameras create glares at long distances from normal-power engines, because that's what we see. Like in this classic example:

You can replicate what is going on with a thermal camera:

And we see it in footage using military IR cameras. A glare that's big enough to cover the jet, when it's a certain size on the screen.


And you've seen it yourself in Dave Falch's videos. A glare from an ordinary jet doing nothing special, a glare that obscures the plane.

The last screenshot is the only really relevant one. What kind of military IR camera is it? ATFLIR? As you can see: the cloud detail is maintained. And the jets are clearly visible in the optical zoom and the apparent magnitude of the objects is not different from the GIMBAL videos.
In the GIMBAL videos however: cloud details visible, but the jets are nowhere to be seen and completely blurred. Quite strange, even with full burner.

To be clear: when I say "glare" I mainly refer to the "spikes" that rotate. The military example you show above has no "spikes" and there are no "spikes" that can rotate in the cruise videos examples (including flares that are engineered to overwhelm the sensor).
That's why I believe for such artefacts to appear a lot of direct IR would be needed (just a theory).

I am not questioning the fact that glare can exist. I'm wondering what conditions would need to exist for such "spiked" glare to appear if this is a target seen from the side, at a range at which ATFLIR normally operates (as the data seems to indicate). But also "the blurry kind" of glare, which would cover the target, is strange in this case. ATFLIR can clearly see the clouds behind, distinguishing details. So why is so much detail lost in a closer target it is actively tracking? No examples of this have been shown in any of the ATFLIR videos we have. Why would ATFLIR behave so differently from all the other IR videos we have? When a bomb drops, ATFLIR obscures the scene behind it, adjusting for the target's high luminosity. It clearly has a very sophisticated exposure adjustment algorithm. Why is it failing to operate here?

Once again: remember this is an object seen from the side (given its visible movement) and not directly from the back.

You don't seem to be trying to answer those questions. But they have answers. While I appreciate a rigorous critique, I'd also appreciate it if that was based on actually following what you are critiquing. Glare around IR sources should not be controversial at this point.
Glare from IR is not controversial. What would be needed to produce such glare in an ATFLIR observing an object from the side is. I think we have demonstrated that an aircraft would have a very different image on ATFLIR.
 
Glare from IR is not controversial. What would be needed to produce such glare in an ATFLIR observing an object from the side is. I think we have demonstrated that an aircraft would have a very different image on ATFLIR.
No, you have not.

"ATFLIR" isn't a magic technology. It's a combined targeting/tracking/imaging system that includes an IR camera. The IR camera is just an IR camera. We don't know much about its specification.

I am not questioning the fact that glare can exist. I'm wondering what conditions would need to exist for such "spiked" glare to appear if this is a target seen from the side, at a range at which ATFLIR normally operates (as the data seems to indicate). But also "the blurry kind" of glare, which would cover the target, is strange in this case. ATFLIR can clearly see the clouds behind, distinguishing details. So why is so much detail lost in a closer target it is actively tracking? No examples of this have been shown in any of the ATFLIR videos we have. Why would ATFLIR behave so differently from all the other IR videos we have? When a bomb drops, ATFLIR obscures the scene behind it, adjusting for the target's high luminosity. It clearly has a very sophisticated exposure adjustment algorithm. Why is it failing to operate here?

It's not failing to operate. Glare is utterly unavoidable if there is a large difference between a small part of the scene and the rest of it. You don't see it in the ATFLIR videos we have because there are hardly any ATFLIR videos. A video of a faraway plane being obscured by glare is not really an interesting video (other than in this context). Bombs going off are irrelevant. The video is of a CONSTANT, SMALL, and BRIGHT source of IR.
 
To be clear: when I say "glare" I mainly refer to the "spikes" that rotate. The military example you show above has no "spikes" and there are no "spikes" that can rotate in the cruise videos examples (including flares that are engineered to overwhelm the sensor).
That's why I believe for such artefacts to appear a lot of direct IR would be needed (just a theory).
Spikes are largely diffraction artifacts. You see them in cameras with a variable aperture, like with these three shots:
Different len glares.jpg

Here the location of the spikes relates to the shape of the aperture, with the one in the middle (Sigma 17-50) being an octagon.
MG_0111.jpg
The spikes are part of the glare, but the glare we see in Gimbal has more limited spikes.

Metabunk 2020-05-16 21-54-44.jpg

I suspect this is due to a diamond-shaped aperture, like this:
Metabunk 2020-05-16 21-58-08.jpg
Such an aperture might be used because it's simple and robust.
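
Running a diamond mask through the same Fourier-transform sketch as earlier in the thread gives a compact four-pointed pattern, with the spikes perpendicular to the four straight edges. Again this is an idealized single-wavelength model, not a simulation of the actual pod:

```python
# Diffraction pattern of a diamond-shaped aperture (same toy Fraunhofer model as above).
import numpy as np
import matplotlib.pyplot as plt

N = 512
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
diamond = (np.abs(x) + np.abs(y) < 60).astype(float)     # a square rotated 45 degrees

pattern = np.abs(np.fft.fftshift(np.fft.fft2(diamond))) ** 2

plt.subplot(1, 2, 1); plt.imshow(diamond, cmap="gray"); plt.title("diamond aperture"); plt.axis("off")
plt.subplot(1, 2, 2); plt.imshow(np.log1p(pattern), cmap="inferno"); plt.title("diffraction pattern"); plt.axis("off")
plt.show()
```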
 
With the aperture wide open, you don't get the starburst of spikes. Instead you get a more "blob"-shaped glare. Like:

Metabunk 2020-05-16 22-16-19.jpg

Reducing the aperture makes the spikes more and more dramatic.
 

Attachments

  • ScreenFlow aperture PP BW.mp4
    5.7 MB
We have seen FLIR videos of confirmed distant aircraft that have IR glare, so we know it is possible for this effect to be a plane. The Chilean case is the most easily sourceable one: a confirmed distant aircraft with IR glare larger than the airframe.

Heat radiates in all directions; it's not very directional, especially not at range, so the exact direction of the engines doesn't really matter. Heat from engines on an aircraft can be seen from the side or at a 45-degree rear angle. Again, the Chilean case shows this: the aircraft was at some points in the video not heading directly away from the FLIR camera.

A lot of the other FLIR videos we see around are from aircraft that are a lot closer. At this close range the airframe extends past the extent of the glare; at longer ranges the glare obscures the airframe. Again, the Chilean case shows this.

Yes, the IR camera will go out of range, i.e. show a solid colour for the major heat source while maintaining the floor of the range for displaying the clouds. This is pretty much what we see in the Gimbal video. It is possible pilots prefer this, as the clouds provide some level of situational awareness or context for movement, and they are less bothered about seeing details in the target IR source, which would usually be a hot jet engine for airborne objects.
 


I'd written off my little FLIR camera as not being able to replicate the effect, but someone asked me to try it. So I fiddled around with smudging the lens. Then I thought back to the old experiment with a piece of glass in front of the lens. Normal glass blocks IR, but plastic does not. I had a Ziploc bag in front of me, so I put the camera in it, and took it out to film the sun. Bingo!
 

Attachments

  • GIMBAL Recreation with thermal camera.mp4
    17.9 MB