Gimbal Glare, Rotation, Clouds, and Angles

Zaine M.

I apologise for jumping into this thread. I was discussing 3D camera orientation and issues I found while implementing it,

https://www.metabunk.org/threads/some-refinements-to-the-gimbal-sim.12590/post-358343

And as this has an impact on glare, I asked where the appropriate place was to further that discussion, and I was directed here.


By way of background, the above-mentioned thread essentially came about by asking: if the clouds are oriented this way, how did they get oriented that way? And I would like to take this opportunity to dive into "it looks like glare, so where is the energy coming from to generate glare?"

1. Yes, we have the object and the light patterns that appear to be consistent with glare.


Source: https://www.youtube.com/watch?v=KXvVJsg37Yk


In that video, I demonstrate that an F-18 taking off from a carrier in full afterburner does NOT generate the same light patterns we see in the Gimbal footage.

I then apply the inverse-square law, since IR energy/light is subject to it, noting that a distant plane would be at a significant disadvantage in overwhelming the sensor to such a degree that light patterns would appear.
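To put rough numbers on that reasoning, here is a minimal inverse-square sketch. The radiated power and the ranges are illustrative placeholders, not measured values; only the near/far ratio matters:

```python
import math

# Minimal inverse-square sketch with ILLUSTRATIVE numbers (not measured values).
# Irradiance at the aperture falls off as 1/r^2 for a point-like source:
#     E = P / (4 * pi * r^2)

def irradiance(power_watts: float, range_m: float) -> float:
    """Irradiance (W/m^2) from an isotropic point source at a given range."""
    return power_watts / (4 * math.pi * range_m ** 2)

P = 1.0e6             # hypothetical radiated IR power (W); placeholder only
near = 300.0          # carrier-deck example, roughly 300 m from the camera
far = 30 * 1852.0     # 30 nautical miles in metres (~55.6 km)

ratio = irradiance(P, near) / irradiance(P, far)
print(f"near/far irradiance ratio = {ratio:,.0f}x")   # (far/near)^2, ~34,000x
```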

I have also looked into DIRCM and the effects of a laser countermeasure being the cause,


Source: https://www.youtube.com/watch?v=7gqQuwRX-ps


Now, that footage is pretty old and the patterns it generated aren't consistent with Gimbal, so when I looked into a more appropriate time period, I found this:


Source: https://youtu.be/K97DQIRKZtg?t=53


Which still has the hallmarks of the first example.

What I was able to locate is people just shooting lasers at IR/low-light home camera systems, and I did finally get a result "similar" (used loosely) to Gimbal,

laser7K.jpg


laser9.jpg


link to source,


Source: https://www.youtube.com/watch?v=wREpnGqEhSM



Now it should be noted that these are only visible for a few frames, and the shape effect does not hold for any length of time comparable to Gimbal.

I "get" the broader point "glare is responsive to camera orientation", but with the discussion, I previously mentioned, that it is alleged that glare can rotate independent of what the camera is doing,


Source: https://x.com/lbf_tweet/status/1853613752559579298


So if I may humbly ask,
1. how do we know that this is glare from a distant plane?
2. how is it generating enough IR energy to create the effects we are seeing?
 
1. how do we know that this is glare from a distant plane?
We don't. But we do know that jet engines are very hot. And we know there's a plausible path across the LOS that physically matches a plane.


2. how is it generating enough IR energy to create the effects we are seeing?
A combination of being hot and the camera settings - specifically focus and gain.
 
it is alleged that glare can rotate independently of what the camera is doing,
There is no such allegation. The glare rotates relative to the background due to changes in the physical orientation of the front-facing pod components relative to the background, mainly due to pod roll, but in combination with pod pitch and yaw. The whole image, both the glare and the background, is then rotated by the derotation device. The rotation of the glare in the final video is thus a combination of those two factors. It always depends on both what the camera and the dero are doing.
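Schematically (angle names illustrative; this is my reading of the explanation above, not the simulator's actual code):

```python
# Schematic of the two-stage rotation described above. Angle names are
# illustrative placeholders, not the simulator's actual code.
def video_angles(pod_roll_deg: float, dero_deg: float,
                 horizon_deg: float) -> tuple[float, float]:
    """Apparent angles of the glare and the background in the final video.

    The glare is fixed to the front-facing pod optics, so its physical
    orientation follows pod roll; the dero then rotates the WHOLE image,
    glare and background together.
    """
    glare_in_video = pod_roll_deg + dero_deg
    background_in_video = horizon_deg + dero_deg
    return glare_in_video, background_in_video

# The glare-minus-background difference depends only on physical pod roll,
# because the dero term cancels out:
g, b = video_angles(pod_roll_deg=20.0, dero_deg=-15.0, horizon_deg=0.0)
print(g - b)  # 20.0
```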
You make a lot of false claims about the model on a regular basis. The most precise current answer to all of these questions can be found in the source code of the model. It's messy and complicated, but you already spend a lot of time on this, so in the end you would save both yourself and others a lot of time if you became familiar with it.
 
The whole image, both the glare and the background, is then rotated by the derotation device. The rotation of the glare in the final video is thus a combination of those two factors. It always depends on both what the camera and the dero are doing.
Just to be specific: not the mirror, but this de-rotation device, correct?


Source: https://x.com/lbf_tweet/status/1853613752559579298


Because of the CW rotation, meaning,

that it is alleged that glare can rotate independently of what the camera is doing,

I don't want to misunderstand you, so please elaborate.
 
I then apply the inverse-square law, since IR energy/light is subject to it, noting that a distant plane would be at a significant disadvantage in overwhelming the sensor to such a degree that light patterns would appear.
Do you, though? How much energy is radiated by a commercial jet engine? How much does the background radiate?
Your example footage shows an overloaded sensor, so it can't be used for scaling.
So if I may humbly ask,
1. how do we know that this is glare from a distant plane?
2. how is it generating enough IR energy to create the effects we are seeing?
We don't "know". But the geometry is consistent with an airliner at cruise 30nm distant. We lack the data to verify which airliner it was.

Compare https://www.metabunk.org/threads/ex...deo-aerodynamic-contrails-flight-ib6830.8306/ for IR footage of another airliner at similar range.
 
Do you, though? How much energy is radiated by a commercial jet engine? How much does the background radiate?
Your example footage shows an overloaded sensor, so it can't be used for scaling.

We don't "know". But the geometry is consistent with an airliner at cruise 30nm distant. We lack the data to verify which airliner it was.

Compare https://www.metabunk.org/threads/ex...deo-aerodynamic-contrails-flight-ib6830.8306/ for IR footage of another airliner at similar range.
I'm not sure I'd be as specific as "airliner"; rather, a jet-engined aircraft.
 
Seeing that thread now, this is false:
That said, the basic way the ATFLIR operates is generally understood. I don't think Marik and Peings dispute that the lens and mirrors move in the pod to track a target, and that movement will hit a physical limit, causing the assembly to rotate in the pod to continue tracking. Their main argument is that the craft is very close and defying physics, per the 2nd hand testimony of Graves.
There is zero mention of "defying physics" in our paper. Simply that a close path with a sudden reversal/quick radius of turn, as reported by witnesses and found in the data, is anomalous. That doesn't mean it "defies physics". Something flying in the wind then reversing with the wind does not "defy physics"; it's just weird and unexpected from a regular aircraft or balloon. It's just a slow mover in the end. Anomalous, not breaking the laws of physics, as far as we can tell.
 
I honestly have no idea what you are asking, Zaine.

Allow me to explain another way,


Source: https://youtu.be/Eozxt_HnPu4?t=2652


"but you also see light patterns in the sky rotate coincidentally with this thing here at the same time... its something in the camera system" (should be cued up)

If an F-18 in full afterburner is unable to overwhelm the sensor and generate these internal camera reflections, how is a non-afterburning distant plane with "hot engines" able to do that from 32 miles away? The inverse-square law puts into context how much IR energy would be received by the camera sensor at range.

How much energy is radiated by a commercial jet engine?

Chilean Navy footage, used as an example of a distant plane being obscured by glare:

Screenshot (3929).png


It doesn't generate enough to create these light patterns. *Noting the different camera system as a limitation.
 
Just to add to my comment in my video: I have not seen these light patterns in any other footage.


Source: https://www.youtube.com/watch?v=91H771zhTRg


There are 20 minutes of ATFLIR footage cut from the cruise videos; everyone spent hours looking at those videos for data supporting the claims. Even when bombs are going off, the energy doesn't overwhelm the sensor to create those light patterns.

[edited for more context]
 
There is zero mention of "defying physics" in our paper. Simply that a close path with a sudden reversal/quick radius of turn, as reported by witnesses and found in the data, is anomalous. That doesn't mean it "defies physics". Something flying in the wind then reversing with the wind does not "defy physics"; it's just weird and unexpected from a regular aircraft or balloon. It's just a slow mover in the end. Anomalous, not breaking the laws of physics, as far as we can tell.
"Defying physics" is a a bit much, but it would be moving in a very aphysical manner that I can't see how a conventional craft would do that. There's just no physical explanation that makes sense, even if you posit some kind of Lazarian drive.

It essentially, as @Edward Current pointed out years ago, has to precisely adjust its position with continually varying acceleration, just to mirror the curve of the camera plane in a way that traces out a distant straight-line, constant-speed path.
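As a minimal numerical sketch of that geometry (all numbers illustrative, not the actual Gimbal data): pin a close object to the line of sight from a turning camera to a distant straight-line target, and its required acceleration keeps varying.

```python
import numpy as np

# Sketch of the point above: a CLOSE object that always sits on the line of
# sight from a turning camera to a DISTANT straight-and-level target must
# keep varying its acceleration. All numbers are illustrative.
dt = 0.1
t = np.arange(0.0, 30.0, dt)

omega = np.radians(0.5)                       # camera turn rate (deg/s)
cam = np.stack([20000 * np.sin(omega * t),    # camera position, metres
                20000 * (1 - np.cos(omega * t))], axis=1)

far = np.stack([55000 + 0.0 * t,              # distant target: straight line,
                -5000 + 250 * t], axis=1)     # constant speed, ~55 km away

u = (far - cam) / np.linalg.norm(far - cam, axis=1, keepdims=True)
close = cam + 10000 * u                       # close object pinned at 10 km

acc = np.gradient(np.gradient(close, dt, axis=0), dt, axis=0)
a_mag = np.linalg.norm(acc, axis=1)[2:-2]     # trim gradient edge effects
print(f"|a| varies between {a_mag.min():.2f} and {a_mag.max():.2f} m/s^2")
```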
 
It doesn't generate enough to create these light patterns. *Noting the different camera system as a limitation.
apparently enough to create glare, though
Just to add to my comment in my video: I have not seen these light patterns in any other footage.
the pattern with the spike / the rotating background may indicate a blemish/scratch/dirt on a lens or camera window. It looks very much like a diffraction spike.
 
It doesn't generate enough to create these light patterns. *Noting the different camera system as a limitation.
I don't think people are claiming the distant jet is creating those light patterns. They are just fixed regions of the sensor that vary in particular ways, like NUC. It probably just reflects changes in temperature of part of the camera.

We see it because we have a clear sky background during a significant gimbal motion, with the contrast very high.
 
It essentially, as @Edward Current pointed out years ago, has to precisely adjust its position with continually varying acceleration, just to mirror the curve of the camera plane in a way that traces out a distant straight-line, constant-speed path.

Assuming you know the elevation angle to the hundredth of a degree, which of course we don't, because we don't even have it to the tenth of a degree.

What you describe is just a consequence of:
1- Assuming the last cloud line has a constant elevation and is a perfect indicator of change in elevation.
2- Interpolating between the beginning and end values of El angle (-2 to -1.95 or so), after estimating from the clouds that El angle only changes by 0.05°. This creates an artificial curve in the altitude evolution of the close path, what you call "continually varying acceleration", but it is just a consequence of limited data. And probably poor assumptions, such as "elevation angle only changes by 0.05°".
 
This creates an artificial curve in the altitude evolution of the close path, what you call "continually varying acceleration", but it is just a consequence of limited data.
Can you suggest what the actual curve might be? We could test it against the video.
 
"Defying physics" is a a bit much, but it would be moving in a very aphysical manner that I can't see how a conventional craft would do that. There's just no physical explanation that makes sense, even if you posit some kind of Lazarian drive.
None of us can, but what the data shows is why we are all here.
 
We cannot invent data we don't have, but I tried something more like that -0.35° estimated change.
and by using the FOV of 0.36 × 0.35, we can measure how much pod elevation change has occurred

410.jpg


Screenshot (3841).png


This is the same thing we see occur in @Mick West's example of a helicopter getting closer to a balloon. The background moves through at an angle because the pod's elevation is looking further down.

Doing a crude interpolation between these values, this is what I get in the "best-guess" scenario we present in our paper. Of course this is a crude estimation of the change in El angle; we'd need its precise temporal evolution. But using this, we get the same altitude for the object at the beginning versus the end (~23,300 ft).

1765319831844.png
 
This is all extremely sensitive to very small variations in El angle, as you know. That's why I insist in the other thread that estimating that it has only changed by 0.05°, and thinking that's going to give a robust path in the distance, is wishful thinking.
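To put numbers on that sensitivity (a back-of-the-envelope sketch, not data from the video): the vertical offset of a line of sight at range r is roughly r·sin(El), so an elevation error maps almost linearly to an altitude error.

```python
import math

# Back-of-the-envelope sensitivity of LOS altitude to elevation angle:
# vertical offset ~ r * sin(El), so d_alt ~ r * d_el (with d_el in radians).
NM_TO_FT = 6076.12

for r_nm in (10, 30):
    r_ft = r_nm * NM_TO_FT
    for d_el_deg in (0.01, 0.05, 0.10):
        d_alt = r_ft * math.radians(d_el_deg)
        print(f"range {r_nm:2d} nm, dEl {d_el_deg:4.2f} deg -> ~{d_alt:4.0f} ft")
# At 30 nm, even a 0.05 deg error in El already moves the LOS by ~160 ft.
```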
 
If an F-18 in full afterburner is unable to overwhelm the sensor and generate these internal camera reflections, how is a non-afterburning distant plane with "hot engines" able to do that from 32 miles away?

ATFLIR technician Jeremy Snow raised the possibility that the ATFLIR that videoed GIMBAL might have been deployed without the sensor array having been calibrated against its on-board black body; see OP in thread ATFLIR Technician Jeremy Snow discusses Gimbal, FLIR1, and GoFast; approx. 55:44 to 56:32 on the video, 54:40 - 56:03 transcript.

I'm not sure that posters here who think GIMBAL is likely to have been caused by a bright IR source are claiming that internal camera reflections played any major role in producing what we can see in the video.
 
@Mendel I see you disagree, but about what exactly? You don't think that lines of sight are sensitive to small variations of El angle? If that's the case, you haven't tried playing with it; it's just factual.
 
I'm not sure that posters here who think GIMBAL is likely to have been caused by a bright IR source are claiming that internal camera reflections played any major role in producing what we can see in the video.

Sorry, can you word that a different way?

(I agree the first part of this is better suited to the other thread, but Marik has made mention of the rotation indicating it's a real object, plus others have brought up the flight path based on level clouds etc., so I do feel all of this is on topic for "Mick v Marik rotation glare")


Source: https://x.com/MvonRen/status/1838053654638841968


But here's the tl;dr of where everything is coming from:
1. When the footage is derotated on the artificial horizon, we get angular motion of the clouds.
2. A formula was used to account for that, to place the clouds in an always-level orientation (sin-cos method).
3. This now resulted in rotation of the object (the glare, the thing we see in the footage), something consistent with a physical object, or meaning the pod is rolling due to the way the pod rotates (glare is relative to the glass).
4. A computer-generated rotation system was invoked, separate from the de-rotation mirror (they haven't cleared that up, so maybe there's no derotation mirror?), so now glare can rotate independently of what the camera is doing, but it doesn't rotate the glare, only the background, when it gets to the stepped rotation, so maybe there's now a mirror involved?
5. In my initial thread I accounted for the factors that need to be included to de-rotate the footage, which results in background angular motion, with no one able to provide where these extra degrees of rotation to put the clouds level come from (except if you start with "well, the clouds look right, so it's not angular motion, it's now the camera rotating"... Go Fast looked like it was going fast; that didn't mean it was the truth).
6. I see 3 degrees of rotation, but I will say that's not definitive,

I can see three degrees of rotation, from the start to the first major rotation of the target, but at this stage I can NOT say that is definitive; I will have to work on a more stable version.


Source: https://www.youtube.com/watch?v=-KVvebg4cXc

7. Cue this thread, where I am again challenging the "how did it get there" assumptions: if it's glare from a strong IR source, how is it doing that?

It probably just reflects changes in temperature of part of the camera.

8. If it's due to changes in temperature, we would expect to see that in the footage from 10 minutes earlier, in Go Fast. I don't, and I acknowledge "a lot can happen in ten minutes".
9. If those light patterns are from the object, then, since the footage was contextually presented as an anomalous craft, they may have played a major part as a result.
 
6. I see 3 degrees of rotation, but I will say that's not definitive,
If you have looked into it, how much rotation do you get for the clouds (realignment) vs the thing?

EDIT to clarify, @Zaine M. : I'm referring to this angle, the angle of the clouds after removing bank, then the effect of pitch in your stabilized video (https://www.metabunk.org/threads/some-refinements-to-the-gimbal-sim.12590/post-358791)

1765342987314.png


It's about 10°.

There is almost no rotation of the object in that section, according to your post here: https://www.metabunk.org/threads/some-refinements-to-the-gimbal-sim.12590/post-358832

That section is when dero without roll should rotate both the thing (glare) and clouds, so I'm unclear how the dero can do that here.

Your tl;dr of Mick/LBF's theory is not quite right, I think:
1. When the footage is derotated on the artificial horizon, we get angular motion of the clouds.
Yes, we see a mismatch between cloud slant and artificial horizon.
2. A formula was used to account for that, to place the clouds in an always-level orientation (sin-cos method).
Yes, or the other method that isn't sin-cos but depends on elevation (the LBF algorithm). It realigns both horizons ("real", i.e. the clouds, and artificial).
3. This now resulted in rotation of the object (the glare, the thing we see in the footage), something consistent with a physical object, or meaning the pod is rolling due to the way the pod rotates (glare is relative to the glass).
Not sure I follow you here. My understanding is that it should result in a CW rotation of the object that follows the progressive realignment of the clouds with the artificial horizon. Because dero is invoked to explain the realignment of the clouds, and in the absence of pod roll it should rotate the entire image (glare + clouds). I do not see that rotation in tandem at all.
 
I'm not sure that posters here who think GIMBAL is likely to have been caused by a bright IR source are claiming that internal camera reflections played any major role in producing what we can see in the video.
Sorry, can you word that a different way?

Er, what about
"I get the impression [which might be incorrect] that most of the posters here who are of the opinion that the GIMBAL video is the result of glare from a mundane IR source (e.g. a jet aircraft) think that what is seen in the video- the "GIMBAL"- is the result of the optoelectronic sensor array imaging a bright IR source and/or post-detection processing, but perhaps not by internal camera reflections of IR."

A weak analogy might be with older image intensifiers; they could display flares/white-out if something bright was imaged, but this wasn't caused by the optics (lenses, prisms etc.) per se, but by the electronics. This analogy might be more relevant if Jeremy Snow's conjecture re. the ATFLIR being deployed before the sensor head was calibrated against the black body is correct.
I'm not going to argue the point, though (and in retrospect there might not be much evidence to support my impression of what others think).
 
Ahh, thank you. A combination of black-body calibration AND, as per Mick, temperature issues resulting in an over-exposure-type event, meaning the light patterns and the object are not linked; the object is the target, not the cause of the light patterns.
 
We cannot invent data we don't have, but I tried something more like that -0.35° estimated change.
at 30 nm, that's 1000 feet of altitude, over 10000 feet of "sideways" distance. That's twice as steep as a mountain road. Is that even possible at that altitude over flat terrain?

Isn't it more likely that the stitching curvature is caused by barrel distortion of the camera, since the clouds are below the center of the lens?
 
@Mendel I see you disagree, but about what exactly? You don't think that lines of sight are sensitive to small variations of El angle? If that's the case, you haven't tried playing with it; it's just factual.
You made a claim about the known elevation angle precision. I agree with Edward Current and his precision measurement, which you implicitly rejected in your post.
 
at 30 nm, that's 1000 feet of altitude, over 10000 feet of "sideways" distance.
I apologise, I thought you were talking about the cloud line, but yes, a distant plane at 30 NM would require 1,200 feet of descent before increasing again.



Yes, if at that distance it were vertical, but it's more of an elevation change. The Az values for sideways are the same as Mick saying 3.5 - 4 degrees. I get the same: 10.5 FOV boxes, 3.65 degrees.

Each of the large blue boxes is the FOV of 0.35 degrees × 0.35 degrees.


410.jpg





[edit to correctly address]
 
Elevation is an angle; using the sin() of the elevation gives you the vertical distance,
same with azimuth and the sideways distance.

So these heavily tilted clouds are really strange, because air can flow to equalize this difference,
but if it did, there would be turbulence,
but we see no signs of it over the "away from the camera" distance, which is a lot longer.
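A quick check of those numbers under the stated assumptions (~30 nm range, ~0.35° elevation change, ~3.5° azimuth change across the cloud line; these are estimates, not measurements):

```python
import math

# Quick check of the vertical-vs-sideways numbers; all inputs are estimates.
r_ft = 30 * 6076.12                # 30 nautical miles in feet
d_el = math.radians(0.35)          # estimated elevation change
d_az = math.radians(3.5)           # estimated azimuth change

vertical = r_ft * math.sin(d_el)   # ~1,100 ft
sideways = r_ft * math.sin(d_az)   # ~11,100 ft
print(f"vertical ~{vertical:.0f} ft, sideways ~{sideways:.0f} ft, "
      f"slope ~{100 * vertical / sideways:.0f}%")   # ~10% grade
```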
 
Screenshot from 2025-12-10 08-44-57.png

Screenshot from 2025-12-10 08-45-08.png

at 30 nm, that's 1000 feet of altitude, over 10000 feet of "sideways" distance.

Precisely, the "sideways" distance seems to change. Do you feel like the cloud layer at the very beginning (top picture, to the right) looks the same as the cloud layer at the end (bottom picture, to the left), regardless of white vs black hot ? We see a flat and distant-looking layer first, with some perception of depth, versus what gradually becomes a more vertical-looking and bumpy layer of clouds at the end.

What makes you feel sure this is a consistent marker for elevation angle, to the hundredth of a degree?
 
Ahh, thank you. A combination of black-body calibration AND, as per Mick, temperature issues resulting in an over-exposure-type event, meaning the light patterns and the object are not linked; the object is the target, not the cause of the light patterns.

Yes, something like that.
ATFLIR technician Jeremy Snow raised the possibility of the F/A-18 crew deploying ATFLIR before calibrating it against its black body; unless we have evidence that this can't (or in this case didn't) happen then it might be something worth considering.*
I don't know what effects a poorly calibrated ATFLIR might have on the images it produces, but Mr Snow probably has some idea, and it certainly wouldn't be a beneficial effect (i.e., it's likely the images produced will be less accurate than those from a calibrated ATFLIR).
I guess it's important to note that Jeremy Snow doesn't say this is what happened, more sort of muses that it might have happened.
Thinking about it, even if non-calibration was an issue, or "over-stimulation" (or overload, whatever the term is) of the sensors played any part, this doesn't mean that flare at the objective lens wasn't also an issue; you were right (@Zaine M.) to query my statement in post #20.

A hypothetical failure to calibrate the sensor might not be necessary to explain the GIMBAL footage;
YouTube poster Ian Goddard (channel GoddardsJournal) has a video, "NY Times UFO Debunked?" posted c. 2018.

Between approx. 1 min 30 secs and 1 min 50 secs into the video, Goddard shows footage from a Navy F/A-18 with lens flare rotating independently of the background:

v 1-39 lens glare.jpg


v 1-40 lens glare.jpg

v 1-45 lens glare.jpg


There is also an example of how conventional aircraft viewed in IR can be seen as (well, displayed as) unidentifiable blobs;
External Quote:

It's not until the pilots zoom in on these jets that we can see their wings and other signs of their terrestrial origin.
Narration at approx. 2 mins 03 secs - 2 mins 11 secs; these 2 screengrabs are from the video:

v 2-01 gimbal compared to f-14s.jpg

v 2-06 gimbal compared to jets.jpg


The video itself

Source: https://www.youtube.com/watch?v=Y2-4rL20ju0&t=177s


A note of caution: Goddard, quoting a respondent to another post of his, appears to think the GIMBAL (and presumably the jets in his examples) is black due to sensor overload (again, I'm unsure of the correct term); this is displayed at 1 min 52 secs - 1 min 56 secs in the video
(red underlining is mine),
v black.jpg


...but we know the targets (GIMBAL and the jets) are black because they are being filmed in black-hot mode.


*Perhaps it would be possible to engineer in a feature which automatically calibrated the sensor head "on start up" every time ATFLIR was used; maybe there are practical considerations why this (apparently) hasn't been done. Supposition: The calibration takes time; during a mission the ATFLIR sensor needs calibrating once, but subsequently within a given timespan the ATFLIR can be turned on and off without the need to re-calibrate each time, avoiding the delay in functionality that re-calibration might entail (speculation on my part, though).
 
You keep saying you did those things, but don't explain how.

Define "elevation", in this context, exactly.

We know the altitude of the camera is fixed at 25000 feet


I really don't know what you are trying to get at here. Why is the cloud horizon curved?
View attachment 86824

This seems more like a stitching artifact than anything. Like, why not:
View attachment 86825

@Zaine M. the problem with these stitched panoramas is that we are trying to project a 3D scene with a moving point of view onto a 2D plane, so they will always look weird. However, I think doing it after getting things level as you do ("up is up") is the least bad way of doing it (more natural). And what's missing is how the perspective would actually change as the camera is getting closer to the thing.

I try to illustrate this with an edit of the stitched panorama:
export(2).png


We are looking sideways at first, with the perspective changing as the plane is moving. There is more depth in the clouds at the beginning, and as the plane gets closer to the object the elevation goes down (or the scene rises here), revealing closer clouds, which explains why they look more vertical, bumpy, and less distant at the end versus the beginning.

This is now how I see the encounter happening, and everything clicks as far as I'm concerned.
 
I post here because this thread is a bit about everything Gimbal-related.

It's about the light patterns in Gimbal, probably seen as the most compelling evidence that the step rotations of the object are from the camera.

A common argument is that similar light patterns happen in FLIR1, when the pod takes a big roll near singularity. Around 49 sec here:
https://www.metabunk.org/sitrec/?sitch=video&video=FLIR1

In FLIR1, the pod is looking up (5° elevation angle) and it's tracking right to left (4R to 8L), so we expect a counterclockwise (CCW) pod roll during tracking. But the light patterns rotate clockwise (CW).

Another common example of abrupt roll near singularity is in this video, at 5'03:

Source: https://youtu.be/PTy5letXGuo?si=bYwgLGgE_GsoOb_R&t=303



We see a plane being tracked left to right, looking down, as can be more clearly seen at the very end (5'14)
1767562340490.png


It looks to me like this FLIR video comes from the dogfight being filmed from the cockpit just before, but it doesn't really matter (except for the cloud layer in view, which may be interesting for comparison to Gimbal).
1767562409641.png


In that configuration, tracking left to right, looking down, we expect CCW roll like in Gimbal. But again, the light patterns rotate CW.

My interpretation is that the light patterns in FLIR1 and this video are sunlight/ambient radiation entering the pod at a constant angle and hitting a portion of the sensor, or at least having a non-uniform impact on its surface. Then, when the pod and the sensor roll CCW (they go together, because the sensor/camera is in the pod head), another portion of the sensor is affected by this ambient radiation that hits at the same angle, i.e. CW relative to the initial position in the image -> the light patterns rotate CW in the image.
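A sign-convention sketch of that argument (purely illustrative):

```python
# Sign check for the argument above: a pattern at a FIXED angle in the
# outside/world frame, viewed by a sensor rolling with the pod, appears at
#     angle_in_image = angle_in_world - sensor_roll
# so a CCW (positive) sensor roll shows up as CW (negative) drift on screen.
def pattern_angle_in_image(angle_world_deg: float, sensor_roll_deg: float) -> float:
    return angle_world_deg - sensor_roll_deg

for roll in (0, 10, 20, 30):                        # pod/sensor rolling CCW
    print(roll, pattern_angle_in_image(45.0, roll))
# 45, 35, 25, 15 -> the pattern drifts clockwise in the image
```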

Now, I don't recall having read/heard what exactly the source of the light patterns is hypothesized to be in the glare theory of Gimbal, but they are supposed to be independent of the object. If this is ambient radiation affecting the sensor, they should rotate opposite to roll, as seen in the two comparable examples of light patterns/pod roll.

So the question is: what is their source, and what is the theory behind them rotating in the direction of roll, if they are not a result of the high-contrast object being tracked?
 
They are just fixed regions of the sensor that vary in particular ways, like NUC. It probably just reflects changes in temperature of part of the camera.

@Zaine M. pointed me to this bit, which I had missed.

NUC: "Thermal imaging sensors are sophisticated devices that require precise temperature readings. However, these readings are only accurate if all pixels on the sensor are calibrated to read the same temperatures at the same value. Non-Uniformity Correction plays a pivotal role in this calibration process. It adjusts for minor fluctuations in heat on internal components, lens temperature, and lens characteristics, ensuring that all pixel readings are consistent and accurate."

How would the affected pixels rotate in the scene? Are you now saying that there is post-processed derotation (after the sensor)?

How is this less speculative than a high-contrast object, in a low-resolution video, inducing light streaks?
 
A simple way to see if the dero is before or after the sensor is to check for dead pixels. Oh, here is one.
Screenshot (4251).png
 
Ahem, yeah, that's right, I can see this little guy in there. Not moving one bit throughout the entire video.

We've been doing a bit of back and forth with @Zaine M. tonight and noted a few interesting things:

- the light patterns being "fixed regions of the sensor that vary in particular ways, like NUC" requires post-processed dero, or computer dero, whatever you want to call it. Dero after the sensor, because if it were before the sensor, the affected pixels would stay where they are in the frame all along.

- a simple way to check whether the dero is happening before or after the sensor is to look for dead pixels that remain at outlier values throughout the video versus the surrounding pixels. One of them is marked by @Zaine M. just above. It doesn't move one bit in the frame, pointing to the derotation system being before the sensor (a rough detection sketch follows after this list).

- dero before the sensor rules out the NUC effect on the sensor, and I've shown above that ambient radiation entering the pod would not cause light patterns rotating along with roll; they'd rotate in the opposite direction (like in FLIR1 and the other example video).
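A rough version of that dead-pixel check (thresholds are arbitrary placeholders):

```python
import numpy as np

# Rough dead/stuck-pixel check: such a pixel sits at an outlier value AND
# barely changes across frames. Thresholds are arbitrary placeholders.
def find_stuck_pixels(frames: np.ndarray, var_thresh: float = 1.0,
                      dev_thresh: float = 50.0) -> np.ndarray:
    """frames: (n_frames, H, W) grayscale stack -> boolean (H, W) mask."""
    temporal_var = frames.var(axis=0)                 # how much a pixel changes
    deviation = np.abs(frames.mean(axis=0) - np.median(frames))
    return (temporal_var < var_thresh) & (deviation > dev_thresh)

# If a flagged pixel never moves in the frame while the scene rotates,
# the derotation must be happening upstream of the sensor.
```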

This leaves us with what?

The hypothesis that the light patterns are induced by the high-contrast object has been labeled ridiculous multiple times; I'm looking forward to hearing what the alternative is, and how much less ridiculous it is.
 
Thinking about this, @logicbear and @Mick West, sorry to bother you, but why does that burnt pixel stay in the exact same spot in the OG footage? If there were some "formula rotation" going on after the camera and before the recording device, wouldn't that be rotating also?

It rotates the whole image, right? That was the claim of what the formula was doing?

[edit-clarity]
 
The hypothesis that the light patterns are induced by the high-contrast object has been labeled ridiculous multiple times; I'm looking forward to hearing what the alternative is, and how much less ridiculous it is.
It's been labeled that because you don't have any plausible story for how a high-contrast object rotating about its own axis would cause patterns that look and rotate like that. Even if some of the light that gets scattered throughout the image were to come from a high-contrast object, I think it would rotate with the camera, not the object, because that scattering would be internal/optical. So there's a question of where exactly the patterns come from, but it's not a matter of finding "an alternative".

I think it's fairly clear from both the manual and what Jeremy told us that the dero happens before the sensor. I've said as much a couple of times, e.g. here and here. I doubt anyone is suggesting that the dero happens after the sensor. At least not intentionally.

My hypothesis has mainly been that it has to do with imperfections somewhere in the front facing optics (window/lenses/mirrors), that either dampen the intensity of the incoming IR by an uneven amount, or scatter the incoming IR in an uneven way.
Another alternative is that there are slight temperature variations inside the pod, not necessarily in the sensor (whose temperature variations can be corrected by a NUC) but elsewhere along the optical path, some part of the pod that rotates with the front window, and then small amounts of IR emitted by those internal components might get reflected towards the sensor.
 
It's never been extremely clear to me that post-sensor derotation was completely discarded. It's been discussed. Saying that the light patterns come from the sensor (no further back than this thread) also suggests post-sensor dero.


Source: https://x.com/MickWest/status/1882183226850230679?s=20


Anyway. The light patterns have been directly compared to FLIR1 to prove that the rotating light patterns in Gimbal are from pod roll.
But unless shown otherwise, we expect the pod to roll CCW in FLIR1, while the light patterns rotate CW. So another bad example/approximation, made in this video in particular.

Source: https://youtu.be/4Btns91W5J8?si=jNxQ8lSvQjk0nJbf&t=25


Your hypothesis of imperfections affecting incoming IR is speculative, with no evidence to support it. Incoming IR reflections in the two examples I discuss in post #33 (FLIR1, the other footage) rotate opposite to roll.

A hot object rotating in a high-contrast, low-resolution video may induce reflections/light artifacts that rotate with it; this is just an example here:
https://www.metabunk.org/threads/some-refinements-to-the-gimbal-sim.12590/post-359522

So once again we have to deal with allegedly irrefutable, non-negotiable evidence for rotating glare (the 3rd observable) that turns out to be backed up with unsupported explanations, bad examples, and unverified speculations.
 

A hot object rotating in a high-contrast, low-resolution video may induce reflections/light artifacts that rotate with it; this is just an example here:
https://www.metabunk.org/threads/some-refinements-to-the-gimbal-sim.12590/post-359522
The light patterns there mainly appear to be optical. They would rotate with the camera. Their position does also depend on the position of those torches, but in Gimbal the object's position doesn't change that much; it mainly just rotates about its axis. There's some change in the background illumination in a small room, but the sky is not a small room, and the field of view is constantly moving. So you talk about bad examples while providing one of your own. It's not clear, even in principle (hypothetically/theoretically), how what you're suggesting could possibly ever work. Meanwhile, there are plausible alternatives for patterns that rotate with the camera.
 