Why does the Gimbal shape change?

The topic of the thread is the shape in the Gimbal video and how it changes. Please stay on topic.
 
Your explanations on glare are very detailed, but frankly they are just speculation about how a glare would behave in those systems.
No, what I'm talking about is how glare fundamentally works on any optical system. The mathematics of wave propagation are the same whether we're dealing with visible light, infrared, radio, or even sound.
You can make all sorts of theories like this to explain what we observe, but without evidence or backing from a real expert, it's just speculation.
I was pretty transparent in saying that the simple convolution with a kernel is speculative, and just one example of how something like what is observed could happen. But we already know it's glare on the grounds of extremely strong evidence, and my speculations here have no bearing on that. The question I'm answering here is simple: can a glare change shape like that? Yes, it can.
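To make that concrete, here is a toy sketch of the mechanism in Python. It is not a model of the ATFLIR and every number in it is made up; it only shows that a point-like source spread by a fixed glare kernel and clipped by the display range produces a blob whose apparent size grows with the source brightness (or the gain), even though the source itself never changes shape.

```python
# Toy demo: apparent size of a clipped glare vs. source intensity.
# All numbers (kernel width, clip level, intensities) are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def apparent_blob_area(source_intensity, sigma=6.0, clip_level=1.0, size=201):
    """Pixels driven to (or past) the display clip level by a blurred point source."""
    img = np.zeros((size, size))
    img[size // 2, size // 2] = source_intensity   # point-like hot source
    img = gaussian_filter(img, sigma)              # stand-in for a fixed glare kernel
    return int(np.sum(img >= clip_level))          # area rendered as a flat white blob

for intensity in (1e3, 3e3, 1e4, 3e4):
    print(f"source intensity {intensity:>7.0f} -> blob area {apparent_blob_area(intensity)} px")
# The saturated blob grows steadily with source intensity even though the
# underlying point source and the kernel are both completely fixed.
```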
At the end of the day we have aviators saying they have never seen an IR signature like this before, and no directly comparable examples that show an IR signature like this either. We need to accept that we are left in the dark here.
Just the other day a commercial pilot on Reddit posted a video of a UFO that turned out to be another airliner on a regularly scheduled flight; it only appeared to hang stationary next to a contrail because the two aircraft's relative motion happened to line up just right. He, as a commercial pilot, would (and did!) say he had never seen anything like it. But it was just another airplane -- the number one thing pilots are trained to look out for. Sometimes common things look unusual.

More to the point, we already have one example (in the Chilean case) of a glare imaged by a military pilot that turned out to be an airplane farther away than he expected, so the idea that it's impossible for a trained aviator to make this sort of mistake contradicts the evidence.
Putting all the evidence together makes this case incredibly puzzling, objectively. I believe it needs to be presented as such, and not as "debunked" or "mundane", something I still see all the time in online discussions (Twitter, Reddit, etc.).
From a scientific perspective, we don't have to show that it's "mundane", though of course it's very satisfying when a case admits a complete resolution. What matters is: does this case admit only an extraordinary explanation, or could it be a confluence of ordinary factors? When I say "defeat that line of reasoning", I don't mean winning internet debater points; I mean asking whether the evidence is sufficient to reject the null hypothesis that this is an ordinary airplane doing ordinary airplane things. I'll have more to say on the other thread, but to stick to the topic here I'll simply ask: is there anything about this glare, whether its shape (and the change thereof) or its movement, that's impossible to produce with a jet engine?
 
It's also possible the exposure is being adjusted, or that the software filters are doing something different as the video progresses.
Based on what I've found here, it's a little unclear whether the ATFLIR changes its exposure or only the gain and level, but either way, if those are being adjusted here then we should see their effect on the entire image, not just on the blob in the middle. And based on some initial checks, it seems that maybe we do? I'm not sure. The whole image seems to get brighter in WHT (or darker in BHT) in a way that's somewhat correlated with the change in the size of the blob. The correlation is not perfect, but you wouldn't expect it to be, since multiple things go into determining the size of the blob. It's not clear whether something other than exposure/gain/level could be causing this correlation, though. It has been suggested that heat from the object might be reflected by other things in the scene, but could it really light up the cold night sky like that?
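To illustrate the "entire image" point, here is a minimal sketch of what a level/gain adjustment does in principle. The linear remapping and every number below are illustrative assumptions, not documented ATFLIR processing; the only point is that a global setting change brightens the background and enlarges the saturated blob together, which is the kind of correlation described above.

```python
# Sketch: a global level/gain change moves the whole frame, not just the blob.
# The linear map and all values are assumptions for illustration only.
import numpy as np

def apply_level_gain(raw, level, gain, out_max=255):
    """Map raw sensor values to display values: out = clip(gain * (raw - level))."""
    return np.clip(gain * (raw.astype(float) - level), 0, out_max)

# Synthetic scene: nearly uniform cold sky plus one soft-edged hot spot (glare-like).
rng = np.random.default_rng(0)
raw = rng.normal(1000, 5, (480, 640))
yy, xx = np.mgrid[0:480, 0:640]
raw += 600 * np.exp(-((yy - 240) ** 2 + (xx - 320) ** 2) / (2 * 15 ** 2))

for level, gain in [(990, 2.0), (980, 3.0)]:     # two hypothetical auto-settings
    disp = apply_level_gain(raw, level, gain)
    blob_px = int((disp >= 255).sum())           # pixels clipped to full white
    print(f"level={level} gain={gain}: frame mean={disp.mean():5.1f}, blob={blob_px} px")
# The more aggressive setting raises the mean of the whole frame and grows the
# saturated white blob at the same time.
```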

Here's the first frame of the video compared with one of the last white hot frames. It's subtle, but if you squint I think you can see that the latter is slightly brighter overall.

Here's one of the first black hot frames compared to the last frame. It's slightly darker here, since black hot is inverted.

I made a quick comparison between the average pixel intensity (blue) over the entire image and the area of the blob (green), as found using the method I described here. Note that the area does not change discontinuously when switching from white hot to black hot because the frames in the white hot region are adjusted as described. The drop in the blob's area around 33 seconds is just my blob detection algorithm failing when the object moves behind the tracking bars. Here the area is scaled 10x and the brightness is scaled 3x.

[Attached plot 1675123124347.png: average image brightness (blue) and blob area (green) over the whole video]
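For anyone who wants to reproduce something like this, here is a rough sketch of the kind of per-frame measurement described above. The actual script isn't posted in this thread, so the OpenCV calls, the threshold value, and the filename are placeholders rather than what was actually used; the sky_rows option corresponds to the upper-third measurement below.

```python
# Sketch of a per-frame measurement: mean brightness and blob area.
# Filename, threshold and the WHT/BHT handling are assumptions for illustration.
import cv2
import numpy as np

def frame_stats(gray, black_hot, thresh=200, sky_rows=None):
    """Return (mean brightness, largest blob area in px) for one grayscale frame."""
    if black_hot:
        gray = 255 - gray                         # invert BHT so hot is bright
    region = gray if sky_rows is None else gray[:sky_rows]   # e.g. upper third only
    mean_brightness = float(region.mean())
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blob_area = max((cv2.contourArea(c) for c in contours), default=0.0)
    return mean_brightness, blob_area

cap = cv2.VideoCapture("gimbal.mp4")              # hypothetical filename
means, areas = [], []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    m, a = frame_stats(gray, black_hot=False)     # the WHT/BHT flag would come from the frame
    means.append(m)
    areas.append(a)
cap.release()
```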

To try to ignore any noise introduced by the clouds I also looked at the average brightness over only the upper third of the image. Here are the results for the white hot region. Both the area and brightness are scaled 10x here. Both seem to trend up in steps that are somewhat correlated in time.

[Attached plot 1674529256598.png: blob area and upper-third brightness, white hot region]

And here are the results for the black hot region. This time the brightness is only scaled 2x, so the swings in brightness are actually larger here than before. The whole image can sometimes be seen flickering in brightness, perhaps as a result of the auto exposure/gain/level algorithm struggling to find the right value?

[Attached plot 1674530699022.png: blob area and upper-third brightness, black hot region]

So how can we go further here to figure out how much of the change in shape/size is due to exposure/gain/level changes and how much is due to other factors, like the heat source being seen from different angles or otherwise changing how much IR it's emitting? What criteria might the auto exposure/gain/level algorithm use to decide whether to change the settings, and can we measure that? What might be the telltale signs of those other factors contributing?
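One concrete way to look for a telltale sign: if the driver is an automatic gain/level algorithm, the frame-wide brightness should move in discrete steps rather than drift smoothly. Here is a minimal sketch of a step detector over the per-frame mean-brightness series from the earlier sketch; the window size and jump threshold are arbitrary choices, not calibrated values.

```python
# Sketch: flag abrupt jumps in the frame-wide brightness series.
# Window and jump threshold are arbitrary; 'means' is the list built earlier.
import numpy as np

def find_brightness_steps(means, window=5, min_jump=2.0):
    """Return frame indices (in the smoothed series) where brightness jumps abruptly."""
    smoothed = np.convolve(means, np.ones(window) / window, mode="valid")  # tame noise
    jumps = np.abs(np.diff(smoothed))
    return list(np.flatnonzero(jumps > min_jump))

steps = find_brightness_steps(means)
print("possible gain/level adjustments near frames:", steps)
# Steps here that line up in time with jumps in the blob area would point to the
# auto exposure/gain/level settings; blob-area changes with no matching
# frame-wide step would point to something else (aspect angle, engine power, etc.).
```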
 