Some Refinements to the Gimbal Sim

Mick West

There are two issues with the Gimbal simulation, which I think I've now resolved:

1 - The difference between the cloud horizon and the artificial horizon over the first 22 seconds
2 - The possible slight apparent continuous clockwise rotation of the object over the first 22 seconds

Both these issues only affect the first 22 seconds, and not by much. They don't make any difference to the four observables that demonstrate we are looking at a rotating glare, except in the sense that the sim is now more accurate, both in the physics and in the end results.

The original Gimbal sim: https://www.metabunk.org/gimbal1/
And the latest one https://www.metabunk.org/gimbal/

The main differences are illustrated by comparing these graphs (old on the left, new on the right)
2022-08-20_14-45-05.jpg


And looking at the simulated view, first frame:
2022-08-20_14-47-52.jpg


The first issue is discussed here:
https://www.metabunk.org/threads/gimbal-derotated-video-using-clouds-as-the-horizon.12552/
The short version is that by measuring the direction the clouds moved in (parallel to the horizon), it was clear that the cloud horizon was at a shallower angle than the artificial horizon, most noticeably when looking left.

The reason for this is that the horizon angle in the ATFLIR image should match what the pilot sees when they look in the same direction. I was previously assuming the horizon angle would be the same as the jet's bank angle (i.e. the same as the artificial horizon). But that's only true when looking forward. When looking more to the side, the bank angle contribution is diminished, and you get more of the pitch angle. The precise math for calculating the desired horizon angle is found here: https://www.metabunk.org/threads/gi...using-clouds-as-the-horizon.12552/post-276183 and explained:
I'm assuming that it's intuitive for pilots to look left/right, up/down, but not to tilt their head, so I try to recreate what the horizon would look like if you had a camera strapped to the jet that can only rotate left/right in the wing plane, or up/down perpendicular to that. First I find how much I need to rotate the camera along those lines to look directly at the object, then I compute the angle between a vector pointing right in the camera plane, and the real horizon given by a vector that is tangential to the global 'az' viewing angle.
(Thanks to @logicbear)
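As a rough illustration of the effect described above, the apparent horizon tilt can be approximated by weighting bank by cos(az) and pitch by sin(az). This is a small-angle sketch with a hypothetical helper name, not the exact 3D math linked above:

```javascript
// Hypothetical sketch (not the sim's code): approximate apparent horizon tilt.
// The bank contribution falls off with cos(az), while pitch contributes with sin(az).
function approxHorizonAngle(bankDeg, pitchDeg, azDeg) {
    const az = Math.abs(azDeg) * Math.PI / 180;
    return bankDeg * Math.cos(az) + pitchDeg * Math.sin(az);
}
```

At az = 0 this returns just the bank angle (the old assumption), and at az = 90° the bank contribution vanishes entirely, leaving only pitch.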

This was easy to implement in Sitrec (not uploaded yet), but it was more complicated in the Gimbal Simulator. The problem is that the horizon angle (previously the bank angle) is used to calculate the derotation amount, which is equal to the glare angle. This relationship was used in reverse so that the glare angle could drive the pod's rotation (one line of code), and hence the necessary derotation, and it was then shown that this created a stepped curve (the green curve in the charts above) that closely followed the ideal derotation (the white curve). In the chart it's labeled "pod roll vs. glare angle" since pod roll = derotation.

But with the more accurate and complicated calculation of the desired horizon, there was now the problem of how you go from glare angle to pod roll. Glare angle is still equal to derotation, so you can phrase it as how to find the pod roll for a given derotation.

Essentially what I do is first calculate the pod pitch and roll for the ideal track (the white curve), then keep everything else constant and do a binary search, varying pod roll, until I get the desired derotation. This new pod roll then physically drives the pod head. In the sim, the green dot indicates where the pod head is physically looking. The white dot indicates the actual target, which needs to stay within 5° of the green dot so that the additional mirrors can keep it centered.

Doing this adjustment gives the graph below. The magenta line is a magnified measure of the angle between the green and white dots (not between the graphed lines, which are dero). It stays under 5°, but it has gotten worse.

2022-08-20_15-08-46.jpg


This is where the second issue comes in. Many people have argued that the gimbal shape has a slight clockwise rotation over the first 22 seconds. This is hard to measure due to the switching from WHT to BLK, but a good case has been made that it's there. I'd not worried too much about it, as it does not make a huge difference, but on reviewing it I think it's quite plausible. Incorporating a slight clockwise rotation ends up removing the increased error. The actual amount is not that important: anything from 3° to 16° keeps the error under 4°, but I picked 6°. You can play with this value at "Glare Initial Rotation" under Tweaks.
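A minimal sketch of what a linear version of this initial rotation could look like (hypothetical helper, not the actual sim code; the 6° default and the 22-second window are the values discussed above):

```javascript
// Hypothetical sketch: a linear clockwise glare rotation over the first 22 seconds.
// glareInitialRotation is the total extra rotation applied by the end of the ramp.
function initialRotationOffset(timeSec, glareInitialRotation = 6, rampEnd = 22) {
    if (timeSec >= rampEnd) return glareInitialRotation; // ramp complete
    return glareInitialRotation * (timeSec / rampEnd);   // linear ramp up
}
```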

This was all rather complicated. It's probable that there's an analytical solution for finding roll from dero, but the numerical solution (binary search) is robust, being accurate essentially by definition.
Here's the code for that bit, full new code attached.

JavaScript:
// given a jet pitch and roll, and the el and az (the ideal, i.e. the white dot)
// find the pod pitch and roll
// THEN modify roll until the dero for that pitch and roll matches the needed dero for the ideal
// this new roll will give us the green dot (i.e. where the pod head is physically pointing)
function getPodRollFromGlareAngleFrame(frame) {

    // This is what we want the horizon to be
    const humanHorizon = get_real_horizon_angle_for_frame(frame)

    // actual Jet orientation
    var jetPitch = jetPitchFromFrame(frame) // this will get scaled pitch
    var jetRoll = jetRollFromFrame(frame)

    // ideal az and el (white dot)
    var az = Frame2Az(frame)
    var el = Frame2El(frame);

    // start pod Pitch and Roll for ideal az and el
    var podPitch, totalRoll;
    [podPitch, totalRoll] = EAJP2PR(el, az, jetPitch);

    var podRoll = totalRoll - jetRoll

    // what we want the dero to be
    const targetDero = getGlareAngleFromFrame(frame)

    // and hence what we want the pod horizon angle to be
    const horizonTarget = targetDero + humanHorizon

    // binary search here modifying podRoll until the dero calculates from jetRoll, jetPitch, podRoll and podPitch
    // is close to targetDero

    var rollA = podRoll - 90;
    var rollB = podRoll + 90;
    var horizonA = getPodHorizonFromJetAndPod(jetRoll, jetPitch, rollA, podPitch)
    var horizonB = getPodHorizonFromJetAndPod(jetRoll, jetPitch, rollB, podPitch)
    var maxIterations = 1000
    while (rollB - rollA > 0.01 && maxIterations-- > 0) {
        var rollMid = (rollA+rollB)/2;
        var horizonMid = getPodHorizonFromJetAndPod(jetRoll, jetPitch, rollMid, podPitch)
        // is the horizon from A to B increasing or decreasing? that affects which way we compare
        if (horizonB > horizonA) {
            // horizon is increasing from A to B
            if (horizonTarget < horizonMid) {
                // target is in the lower part, so Mid is the new B
                rollB = rollMid; horizonB = horizonMid;
            } else {
                // target is in the upper part, so Mid is the new A
                rollA = rollMid; horizonA = horizonMid;
            }
        } else {
            // horizon is decreasing from A to B
            // (the original code duplicated the condition here, leaving the
            // target >= Mid case unhandled; fixed below)
            if (horizonTarget < horizonMid) {
                // target is in the upper half of the roll range, so Mid is the new A
                rollA = rollMid;
                horizonA = horizonMid;
            } else {
                // target is in the lower half of the roll range, so Mid is the new B
                rollB = rollMid;
                horizonB = horizonMid;
            }
        }
    }

    return rollA;

}
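The bisection pattern used in the function above can be isolated as a standalone, runnable sketch: find x such that f(x) is close to a target value, assuming f is monotonic (increasing or decreasing) over the bracket:

```javascript
// Generic bisection sketch (not the sim's code): find x in [a, b] with f(x) ≈ target,
// assuming f is monotonic on [a, b], mirroring the search over podRoll above.
function bisect(f, a, b, target, tol = 0.01, maxIterations = 1000) {
    let fa = f(a), fb = f(b);
    while (b - a > tol && maxIterations-- > 0) {
        const mid = (a + b) / 2, fm = f(mid);
        // which half contains the target depends on whether f increases or decreases
        if ((fb > fa) === (target < fm)) { b = mid; fb = fm; }
        else { a = mid; fa = fm; }
    }
    return a;
}
```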

And just to reiterate, this does not really change any of the observables. The initial slight continuous rotation does not change the fact that when the jet banks, the horizon rotates but the object does not. And the fourth observable, the derotation matching the glare angle, is still just as accurate as before. The main change is that the simulator now looks even more like the real video in terms of the horizon angle and the slight initial rotation.
 


The magnitude and nature of the initial rotation are uncertain; see: https://www.metabunk.org/threads/calculating-and-visualizing-gimbal-angles.12237/post-274060

But it still works regardless of whether you do this or not.

@logicbear, did you ever try to extract the gimbal angle from the original file?

I'm just using a linear rotation now. It would be interesting to see if something derived from the video might give different results.
 
@logicbear, did you ever try to extract the gimbal angle from the original file?
I did switch to the originals but the results didn't change much. With my histogram matching there doesn't seem to be an obvious discontinuity around the switch from BLK to WHT, but there could be a somewhat more significant drift that is not entirely detected by the algorithm due to the glare gradually changing shape. Could there also be a more gradual change in the exposure/gain/contrast settings over this period? Maybe I could try doing some histogram matching over longer periods, not just at the WHT/BLK boundary.
1661245294665.png


The math wasn't too complicated for that. The key was your insight that the horizon should look more intuitive when looking to the side :).
find the pod roll for a given derotation ... binary search, varying pod roll, until I get the desired derotation
But doesn't the pod only need to rotate just enough to be able to track the target, with the desired horizon angle then set by the derotation device without additional pod roll, so that the pod roll shouldn't depend on the desired horizon angle? Is it possible that the green line with keyframing (with that step I added during the first seconds, perhaps) that we had previously was already close to the correct graph for the real stepped pod rotation? Any additional derotation (without pod roll) due to the desired horizon angle should not affect the error angle graph (magenta)? Also, shouldn't that error stay within 2.5 degrees, so that all of the peaks where the roll motor engages happen at the same error angle?
 
But doesn't the pod only need to rotate just enough to be able to track the target ... so the pod roll shouldn't depend on the desired horizon angle?
Yeah, I think I focussed so much on the math to find the pod roll that I forgot the pod should not be rotating. So I think what I'm calculating is essentially just the error in the glare angle measurement.

Plotting green = glare angle, white = ideal dero, and yellow = resultant pod roll, with an initial glare rise of 7.2
2022-08-23_06-43-19.jpg

The calculated pod roll (yellow) is pretty flat, which means I could have just left it at that value, and we'd get, essentially, the same thing.
 
What version should we use then, the new one? Also, what is the "Scale jet pitch with roll" option?
The new one is probably closest to reality, although I'm not super happy with it. It's a refinement, but it bumps up against the low quality of the recorded data - mostly the difficulty of getting an accurate cloud velocity track. But using the simple ramp, or calculating it based on the old movements, yields essentially the same result. And it really changes nothing after the first 22 seconds (when dero and pod roll are essentially identical).

Again, it's a refinement that does not change the observables - and in fact, demonstrates an even closer correlation of the model and the rotating glare hypothesis with the data. With or without refinement, the basic results are unchanged.

"Scale jet pitch with roll" has been there for some time. It adjusts the jet pitch to maintain constant lift when rolling (banking). It does not make a huge difference as the roll does not change much.

JavaScript:
if (par.scaleJetPitch) {
    var roll = jetRollFromFrame(f)
    jetPitch *= 1/cos(radians(abs(roll)))
}
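A standalone version of the same scaling, with a worked number (3.6° is just an example level-flight pitch, as discussed later in the thread):

```javascript
// Equivalent standalone form of the scaling above: in a bank, the vertical
// component of lift drops by cos(roll), so pitch is scaled by 1/cos(roll)
// to maintain level flight.
function scaledPitch(levelPitchDeg, rollDeg) {
    const radians = d => d * Math.PI / 180;
    return levelPitchDeg / Math.cos(radians(Math.abs(rollDeg)));
}
// e.g. at a 30° bank, a 3.6° level-flight pitch becomes 3.6 / cos(30°) ≈ 4.16°
```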
 
Shouldn't the error angle graph be reverted to something like the old version? It's supposed to approximate the difference between where the pod is actually looking without the steering mirrors and where it should be looking in order to track the object. But now it depends on the desired horizon angle as well, which you agree probably shouldn't cause additional pod roll. Previously it still depended on the glare angle, which is also incorrect, but the keyframed glare angle graph (with the step added during the first seconds) was probably a good approximation of the stepped behavior of the pod roll, as the resulting error graph was really good at making sense of all the bumps and sudden rotations. Maybe it'd be better to write an algorithm that predicts the stepped pod roll behavior without looking at the glare angle (only at the location of the bumps, perhaps, to resolve some ambiguity), but until we have such a thing, wasn't the previous error angle graph better?
 
It's supposed to approximate the difference between where the pod is actually looking without the steering mirrors and where it should be looking in order to track the object.
It's measuring that angle in the sim, i.e. the angle between the white dot (the target) and the green dot (the physical pointing direction of the pod). It (the magenta graph at the bottom) is not hugely different:
2022-08-26_10-57-16.jpg

There are a couple of factors here. I'm using the glare angle to drive the pod roll in both, but previously it was just glare angle = pod roll, and now it's glare angle = derotation to get the desired horizon, from which we can calculate pod roll. I've also shoehorned in a 6° constant rotation of the glare.

For glare to drive roll (or glare->dero->roll) we need a "glare start angle", as we can't determine the 1:1 relationship, only the delta values (i.e. a change in glare should produce a corresponding change in roll). This value is arbitrary, and you can adjust it in Tweaks/Glare Start Angle. But I have to give it a value.
In the old code it was simply the average of the ideal roll over the first 22 seconds (so the green line goes through the middle of the white line):
2022-08-26_11-07-28.jpg
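A minimal sketch of that old averaging approach (hypothetical helper and parameter names, not the actual sim code): the free constant is chosen as the average difference between the ideal roll and the glare-angle deltas over the first 22 seconds.

```javascript
// Hypothetical sketch: choose the glare start angle as the constant offset that
// centers the glare-driven curve on the ideal roll (average difference becomes zero).
function glareStartFromAverage(glareDeltas, idealRolls) {
    // glareDeltas: glare angle relative to its first value; idealRolls: ideal pod roll
    const n = Math.min(glareDeltas.length, idealRolls.length);
    let sum = 0;
    for (let i = 0; i < n; i++) sum += idealRolls[i] - glareDeltas[i];
    return sum / n; // the constant offset to add to the glare deltas
}
```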

In the new code I pick a value that attempts to minimize the largest error over the entire track. Similar end result:
2022-08-26_11-09-34.jpg


However, as you can see above, this now reduces the largest error to 2.5°
2022-08-26_11-11-41.jpg

2022-08-26_11-10-37.jpg


However, you could also get pretty much the same result in the old code.
 
In the new code I pick a value that attempts to minimize the largest error over the entire track.
JavaScript:
function calculateGlareStartAngle() {

    var n = 650;
    var errSum = 0
    var glareSum = 0;
    for (var f=0;f<n;f++) {
        errSum += getGlareAngleFromFrame(f)-podRollFromFrame(f)
        glareSum += getGlareAngleFromFrame(f)
    }
    var avg = errSum/n // the average difference is the adjustment needed to make it zero
    var glareAvg = glareSum/n

    // this is a bit ad-hoc and does not really work
    par.glareStartAngle -= avg -  (getGlareAngleFromFrame(0) - glareAvg)

    // use that as a start, and now refine it to minimize the maximum error
    // this is a somewhat expensive (slow) operation

    var bestError = 100000
    var bestStartAngle = par.glareStartAngle

    var start = par.glareStartAngle-5
    var end = par.glareStartAngle+5
    for (var angle = start; angle < end; angle+=0.2) {
        par.glareStartAngle = angle; // set it temporarily to see if it's any good
        var biggestError = 0
        for (var frame = 0; frame < Sit.frames; frame += 5) {
            var podRoll = podRollFromFrame(frame); // this is JUST the pod roll, not including jet roll
            var pitch,globalRoll;
            [pitch, globalRoll] = pitchAndGlobalRollFromFrame(frame)

            var glarePos = PRJ2XYZ(pitch, getPodRollFromGlareAngleFrame(frame) + jetRollFromFrame(frame), jetPitchFromFrame(frame), vizRadius)

            var v = PRJ2XYZ(pitch, globalRoll, jetPitchFromFrame(frame), vizRadius)
            var errorAngle = degrees(v.angleTo(glarePos))
            if (errorAngle > biggestError)
                biggestError = errorAngle
        }
        //console.log("Angle = "+angle+": biggestError = "+biggestError)
        if (biggestError < bestError) {
            bestError = biggestError;
            bestStartAngle = angle
            // console.log("bestStartAngle = "+bestStartAngle)

        }
    }
    par.glareStartAngle = bestStartAngle

}

It's not perfect, as it's an expensive operation so I do it at low resolution.
 
If one wants to explain all the bumps by pod roll, this error angle graph does not explain the bump at 1s (the error peak occurs before 1s), and the model predicts a bump at 10s that does not happen.

I'm intrigued by this step-rotating pod. First because I don't see how it puts less strain on the pod: every ~3° of error the motors need to kick in, creating bumps/steps in the image. And also because if it were something that happens all the time, and not only near the singularity (between -3° and 3° Az), this would be a very common feature of ATFLIR that we would see in at least some examples. I've only seen smooth rotations so far (like in DCS). Experts/pilots would be very familiar with it too.

Unless it was a specific algorithm tweak for this generation of ATFLIR. @Mick West, did you ever contact John Ehrhart, the ATFLIR expert who talked to Corbell?
 
this now reduces the largest error to 2.5°
As mentioned previously, now you have one more point, at 10 seconds, where the error seems to reach 2.5° but does not result in any noticeable bump or sudden glare rotation. Maybe you could say that in reality it doesn't actually reach 2.5°, just gets really close to it, but the previous version didn't have this issue at all. Also, the error peak at 24 seconds moves from 2.2° to about 1.7°, so that one already had a bit of a keyframing bug on it, but now it has moved even further from 2.5°. We can't know for sure that the peaks have to line up at the same value, but it makes a lot more sense if there's a common error threshold at which the pod starts rolling. That some of the peaks were slightly higher than 2.5° previously seems like a lesser problem that could be due to other factors as well.
you could also get pretty much the same result in the old code.
This is what I get if I use the new default glare start angle (59.157..) in my version of the old sim. In the second half of the video it's similar but something has changed in the first half.
1661609059037.png


Maybe the new calculated pod roll graph is pretty flat, but not flat enough? I can't fiddle with the gimbal sim for too long as it overheats my laptop, so I still need to implement the new changes in my version to be able to experiment more freely. The new getPodHorizonFromJetAndPod function is a bit of a headache for me to implement, as the 3D model is in the GUI layer, not the backend layer. Have you checked whether doing it that way is noticeably different from just rotating some vectors to get the pod horizon?
 
is [getPodHorizonFromJetAndPod] noticeably different from just rotating some vectors to get the pod horizon?
Turns out the answer to that was no, it's just equivalent to rotating some vectors.
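For completeness, a sketch of the "rotating some vectors" version (assumed math, using plain arrays rather than the sim's vector class): project world-up into the plane perpendicular to the pod's forward vector, then measure the signed angle from the pod's up vector, mirroring pod_horizon_from_bases in the C++ code below.

```javascript
// Minimal vector helpers (arrays [x, y, z]); assumed, not the sim's vector class.
const cross = (a, b) => [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]];
const dot = (a, b) => a[0]*b[0] + a[1]*b[1] + a[2]*b[2];

// Pod horizon angle in degrees: project worldUp into the camera plane
// (forward x up x forward), then take the signed angle from podUp around podForward.
function podHorizonAngle(podUp, podForward, worldUp = [0, 1, 0]) {
    const podFwdY = cross(cross(podForward, worldUp), podForward);
    return Math.atan2(dot(cross(podUp, podFwdY), podForward), dot(podUp, podFwdY)) * 180 / Math.PI;
}
```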

So now I've added all of these changes to my version and I get the same error angle graph:
1662584005837.png

It's probable that there's an analytical solution to finding roll from dero.
As an exercise to help me understand the math here I wrote an analytical solution for it. It's always within 0.05 degrees of the horizon angle that the binary search finds.
C++:
// a x^2 + b x + c = 0
std::pair<double, double> solve_ax2_bx_c(double a, double b, double c) {
    double delta = b * b - 4 * a * c;
    double x1 = (-b + sqrt(delta)) / (2 * a);
    double x2 = (-b - sqrt(delta)) / (2 * a);
    return { x1, x2 };
}

//a cos x + b sin x + c = 0
std::pair<double, double> solve_acosx_bsinx_c(double a, double b, double c) {
    // rewrite as x = cos x
    // a x + b sqrt(1-x^2) + c = 0
    // a x + c = -b sqrt(1 - x^2)
    // a^2 x^2 + 2 a c x + c^2 = b^2 (1 - x^2)
    // x^2 (a^2 + b^2) + 2ac x + (c^2 - b^2) = 0
    auto [cos_x1, cos_x2] = solve_ax2_bx_c((a * a + b * b), 2 * a * c, (c * c - b * b));
    return { acos(cos_x1), acos(cos_x2) };
}

// based on extractRollFromMatrix but using a formula for the signed angle to avoid branching
double pod_horizon_from_bases(vec3d podUp, vec3d podForward, vec3d vy = { 0,1,0 }) {
    auto podFwdY = podForward.cross(vy).cross(podForward);
    return degrees(atan2(podUp.cross(podFwdY).dot(podForward), podUp.dot(podFwdY)));
}

double pod_roll_from_horizon_analytic(double jetRoll, double jetPitch, double podPitch, double horizonTarget) {
    vec3d vx = { 1, 0, 0 }, vy = { 0, 1, 0 }, vz = { 0, 0, 1 };

    // Given jetRight,jetForward after applying jetRoll,jetPitch we have:
    // podRight = jetRight.rotate(jetForward, -podRoll)
    // podForward = jetForward.rotate(podRight, -podPitch)
    // podUp = podForward x podRight
    // podFwdY = podForward x vy x podForward // based on extractRollFromMatrix
    // horizon_angle = podUp.angleTo(podFwdY)
    //   = atan2((podUp x podFwdY) . podForward, podUp . podFwdY)
    // (podUp x podFwdY) . podForward = tan(horizon_angle) (podUp . podFwdY)  -- (1)
 
    // It's easier to solve these equations in the frame of reference of the jet.
    // jetRight becomes vx, jetForward becomes vz, and so it can be shown that:
    // podRight = { cos(podRoll), -sin(podRoll), 0 }
    // podUp = { cos(podPitch) sin(podRoll), cos(podPitch) cos(podRoll), -sin(podPitch) }
    // and the vy used above becomes:
    vec3d vy_jet = vy.rotate(vx, radians(-jetPitch)); // reversed the order and sign of these rotations
    vy_jet = vy_jet.rotate(vz, radians(jetRoll));

    // Using (a x b) x c = b (a.c) - a (b.c):  -- https://en.wikipedia.org/wiki/Triple_product#Vector_triple_product
    // podFwdY = vy - podForward (vy . podForward)
    //
    // podUp . podFwdY = (vy.podUp) - (podForward.podUp)(vy.PodForward) = vy.podUp -- (2)
    //
    // podUp x podFwdY = (podForward x podRight) x podFwdY
    // = podRight (podForward . podFwdY) - podForward (podRight . podFwdY)
    // = -podForward (podRight . podFwdY)  // since podForward.podFwdY = (vy.podForward) - (podForward.podForward)(vy.podForward) = 0
    // = -podForward (vy . podRight) // since podRight.podFwdY = (vy.podRight) - (podForward.podRight)(vy.podForward) = (vy.podRight)
    // so (podUp x podFwdY) . podForward = -vy . podRight -- (3)

    // (1),(2),(3) => -vy . podRight = tan(angle) vy . podUp   // now we can substitute podRight,podUp from above:
    // (v . (cos x, -sin x, 0)) = a * (v . (cos p sin x, cos p cos x, -sin p)) // rename v = vy_jet, a = -tan(angle), p = podPitch, x = podRoll
    // vx cos x - vy sin x = a * (vx cos p sin x + vy cos p cos x - vz sin p ) // rename vx = v.x, vy = v.y, vz = v.z
    // cos x (vx - a vy cos p) - sin x (vy + a vx cos p) + a vz sin p = 0
    // and now this is an equation in x that we can solve:
 
    double tan_horizon = tan(radians(horizonTarget)), podPitch_rad = radians(podPitch);
    auto [podRoll_1, podRoll_2] = solve_acosx_bsinx_c(
        vy_jet.x + tan_horizon * vy_jet.y * cos(podPitch_rad),
        -(vy_jet.y - tan_horizon * vy_jet.x * cos(podPitch_rad)),
        -tan_horizon * vy_jet.z * sin(podPitch_rad)
    );

    // acos returns the absolute value of the angle, so try all possible signs
    double podRoll_results[] = { podRoll_1, -podRoll_1, podRoll_2, -podRoll_2 };
    double podRoll = podRoll_1;
    double err_min = std::numeric_limits<double>::max();
    for (int i = 0; i < 4; i++) {
        double x = podRoll_results[i];
        auto podForward = vec3d { sin(podPitch_rad) * sin(x), sin(podPitch_rad) * cos(x), cos(podPitch_rad) };
        auto podUp = vec3d { cos(podPitch_rad) * sin(x), cos(podPitch_rad) * cos(x), -sin(podPitch_rad) };
        double horizon = pod_horizon_from_bases(podUp, podForward, vy_jet);
        double err = abs(horizon - horizonTarget);
        if (err < err_min) {
            podRoll = x;
            err_min = err;
        }
    }
    return degrees(podRoll);
}
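A quick numeric sanity check of the quadratic substitution in solve_acosx_bsinx_c, ported to JavaScript (this just mirrors the C++ above; as noted there, squaring can introduce a spurious root and acos drops the sign, which the C++ handles by testing all sign combinations):

```javascript
// JS port of solve_acosx_bsinx_c: solve a cos x + b sin x + c = 0 by
// substituting u = cos x and solving the resulting quadratic in u.
function solveACosBSinC(a, b, c) {
    const qa = a * a + b * b;   // u^2 coefficient
    const qb = 2 * a * c;       // u coefficient
    const qc = c * c - b * b;   // constant term
    const delta = qb * qb - 4 * qa * qc;
    const u1 = (-qb + Math.sqrt(delta)) / (2 * qa);
    const u2 = (-qb - Math.sqrt(delta)) / (2 * qa);
    return [Math.acos(u1), Math.acos(u2)]; // candidate |x| values
}
```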
 
Nice work, just a quick question: has anyone analysed the other gimbal video, of the "UFO" flying low over the ocean? This is when they get a lock on it.
 
This patent has been cited to claim that the dero should rotate the image by exactly the pod roll, but the patent never says that. It appears to be just an interpretation that some people have assigned to rather unclear sections like this one:
Article:
In order to correct for the rotation of the image, the derotation device 330 is configured to counter-rotate so that the image output by the derotation device is in the same direction independent of the rotation of the roll gimbal around the roll axis 242 or the rotation of the first coelostat mirror 220 around axis 244 or the rotation of the second coelostat mirror 230 around axis 248.
The precise relationship between the rotation of the image and the rotation of the roll axis is not defined. And in any case the usual caveat applies: it's just a patent, not a blueprint for the ATFLIR that was used during Gimbal.

I came up with the following argument to show, in a simple way, that the pod cannot reproduce what people expect the final horizon angle to be if the dero only rotated the image by exactly the pod roll.

The expectation people have is that if the jet is in a left bank and the pilot looks directly ahead, the horizon will be tilted by the bank angle, and if they look directly to their left they will see a roughly horizontal horizon, for a wide range of bank angles. For some, this intuition can be so strong that they cannot see how the pod wouldn't just do this naturally, simply by derotating the image by exactly the pod roll. I think that just shows why Raytheon might've chosen to make the derotation device artificially rotate the horizon by more than that, to the angle which preserves that intuition. It would confuse people too much if the horizon angle were significantly different.

We can use the gimbal sim to illustrate some situations. For simplicity, jet pitch will be set to 0 and the elevation (under "Tweaks") will be 0. To set a particular azimuth and bank angle in the sim, the "Unlock azimuth" option needs to be enabled and the "Use recorded bank angle" option disabled. Jet Roll (bank) will be set to -45. Under Show/Hide, the 'Pod's Eye Views w' dero' is selected, and to declutter things a bit the 'spherical boresight grid', the 'frustum of camera', and the 'error circle' are deselected.

In situation A we assume a tracked target is almost in front of the jet, at azimuth = -5.
1728912137619.png


We could've assumed an idealized version of the pod without yaw and started from a smaller negative value for 'az', arbitrarily close to 0, looking straight ahead for simplicity. But starting from a value assumed to be just far enough from the gimbal singularity for the role of yaw to be diminished, and for pod roll to have made the pitch plane horizontal, avoids bad-faith objections that attempt to dismiss the whole argument based on irrelevant details.

The derotated view shows what is expected, a banked horizon. In this case the orientation of the image in the pod's eye view (what the pod would see without a dero) should just be the physical orientation of the imager, the IR sensor chip, relative to the horizon. An ATFLIR manual says the imager is inside the pod's EOSU and shows the roll motor behind the EOSU rotating the whole thing, while some have interpreted the patent as saying the imager might be body-mounted instead, not physically affected by roll. Those would produce different imager orientations in this situation. The manual should be more authoritative even though it's from 2003, and that's what the vertical horizon here results from, but for the purpose of this particular argument it actually doesn't matter. In any case the imager is certainly outside of the ball: its physical orientation is not affected by pitch, and so it will not change relative to situation B.

In situation B the tracked target has moved along the horizon to where it is now more towards the left of the jet, at azimuth = -75.
1728967991430.png


With the jet banked to the left, and the pod in reality being under the wing, the pod can't quite see directly to the left. So I'm using az = -75 here, though it's unclear what the maximum 'az' is in this case.

The pod roll is still the same as in situation A; only the pod pitch has changed. Since the pitch plane is parallel to the ground, whatever the pitch may be, the horizon viewed along the LOS will always have the same orientation relative to the pod window, and the imager's physical orientation also doesn't change with pitch. So without a derotation device we should see the same image, the same horizon angle in the pod's eye view, as in situation A. But in the derotated image we now expect, according to our intuition, to see a horizontal horizon. The pod's eye view did not change, but the expected derotated view has rotated by almost 30 degrees, without any change in pod roll. If the derotation device only rotated the image by exactly the pod roll, i.e. if the amount of derotation did not depend on pitch, then the same input image from the pod's eye view would produce the same derotated output image. The final derotated image in situation A or situation B, or both, would then have to be off by up to 30 degrees in this case, compared to what is expected. That would be noticeable to pilots.
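To put rough numbers on this, using a cos-weighted approximation of the expected horizon (a crude sketch under the assumption that, with jet pitch 0, the bank contribution falls off with cos(az); not the sim's exact 3D math):

```javascript
// Crude approximation (assumed weighting, jet pitch = 0 in both situations):
// expected horizon tilt from bank alone, scaled by cos(az).
const expectedHorizon = (bankDeg, azDeg) =>
    bankDeg * Math.cos(Math.abs(azDeg) * Math.PI / 180);
// situation A: expectedHorizon(-45, -5)  is about -44.8°
// situation B: expectedHorizon(-45, -75) is about -11.6°
// roughly 30° of difference in the expected horizon, with no change in pod roll
```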

In response to this, some people have invented a kind of derotation device which is somehow supposed to physically rotate around its axis by only the pod roll, while still rotating the image according to pitch as well, to match what is expected. I suspect that's nonsense, but that sleight of hand still amounts to a concession that the dero needs to rotate the image by more than just pod roll. Without dero, the orientation of a glare (caused by e.g. the pod window or some optics in the ball) would be the same in situation A and situation B. The dero would still end up rotating both the glare and the horizon in the final image by 30 degrees. And during the Gimbal video that is still the effect which has been calculated to match a measured CW rotation of the horizon of about 7 degrees (in addition to the change in bank angle, which wouldn't rotate the glare), and a measured CW rotation of the glare of about 4-8 degrees over the same period. It has been suggested that the calculations might still be off somehow and something else might explain those observations; perhaps that's possible, but it would be an unlikely coincidence, adding to all the other evidence that what we see in the video is glare.
 
I apologise for commenting, I am here for the Yemen encounter only, but as per @Mick West's suggestion I shall share what I have found regarding the cloud line, frustum rotation, and horizon. (Mick provided links to two threads; as @logicbear has referenced, with illustration, the pod's pitch axis above, and that being central, I replied in this thread and did not comment on the other threads, as there are a few. So I respectfully ask that this commentary stays here.)

Firstly, Plane bank incorporated into the view.

There is zero dispute that plane bank is reflected in the footage, although I maintain that it is reflected directly, at 1:1, as opposed to the claim embodied in the formula:

"jetPitch = (ObservedCloudAngle - jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))"

When we remove the plane bank, we are still left with a mismatched cloud line.

Screenshot (3838).png


This is more easily digestible via @TheCholla's image:

Screenshot (3837).png


I do note that @logicbear has made a representation here:
Since the pitch plane is parallel to the ground
I am of the opinion that it is parallel to the plane's fuselage, not the ground. As the pod is mounted parallel to the plane's boresight, the camera would be responsive to the plane's pitch, not the ground. So the camera will be tilted an appropriate amount, with maximum tilt occurring at 90 degrees azimuth. To check this, I calculated, based on a 3.6-degree pitch for level flight, what the pitch increase would actually be in a bank.

Screenshot (3839).png


This seems incompatible with some of the earlier assertions (might be a different thread), that pitch could be 6-7 degrees.

So, I then did some simple representations to calculate it myself.

Screenshot (3820).png


A plane flying straight and level has a lift value of 1. When we bank the plane:

Screenshot (3822).png


At a 30-degree bank we now have vector sums, with only cos(30°) ≈ 87 percent of the lift acting vertically, so lift must increase by a factor of 1/cos(30°) ≈ 1.155. The plane then needs 3.6 × 1.155 ≈ 4.16 degrees of pitch to keep level flight, which matches what the formula came up with ("4.1589 degree pitch"). So I have high confidence that the pitch values are correct.
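The constant-lift reasoning above can be sketched in code (a minimal sketch of the same arithmetic, not code from the sim; the function name is mine):

```javascript
// In a bank only cos(bank) of the lift vector points vertically, so total
// lift (and, to first order, the pitch needed to produce it) must grow by
// a factor of 1/cos(bank) to hold altitude.
function pitchInBankDeg(levelPitchDeg, bankDeg) {
  return levelPitchDeg / Math.cos(bankDeg * Math.PI / 180);
}

console.log(pitchInBankDeg(3.6, 30)); // ~4.16 degrees at a 30 degree bank
```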


Next, I calculated how much Frustum Roll (essentially how much the camera is tilted) is required with those pitch values:

Screenshot (3840).png


And de-rotating the footage, removing plane bank at 1:1 and removing Frustum Roll, this is the result.


Source: https://www.youtube.com/watch?app=desktop&v=PpJ6ppruH70


The end result demonstrates that the clouds are not the horizon, and that the footage is more consistent with the F-18 closing in on a nearby target, like @Mick West's example of a helicopter getting closer to a balloon. I have my Twitter private, but this image should be remembered by everyone as to what happens.

d7dfa0dd-0fa0-4161-ba65-862a4928cb58.png

* To clarify: "Frustum Roll" is NOT the RAW camera frustum (Pod's Eye View) that Mick refers to in his Gimbal analysis video, but the frustum AFTER the derotation mirror. This is because the camera will ALWAYS see the image in the correct orientation; it's behind the dero, as Mick states in his Gimbal analysis video.

Using this, I have stitched the footage together,

fill.jpg


I have also worked out what the elevation figures would be, based on the FOV being 0.35 × 0.35 degrees.

410.jpg

The values look like this (I used 0 as a baseline, so I could use the range to see total changes with respect to the initial angle); these are the elevation figures:

Screenshot (3841).png
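For reference, converting between pixel offsets and degrees with the 0.35 × 0.35 degree NAR field of view can be sketched like this (the function name and the example frame size are my own illustrative assumptions, not values from the post):

```javascript
// Hypothetical helper: convert a pixel offset from frame centre to degrees,
// assuming a fovDeg-degree field of view spanning frameSizePx pixels.
function pixelsToDegrees(pixelOffsetPx, frameSizePx, fovDeg) {
  return pixelOffsetPx * (fovDeg / frameSizePx);
}

// e.g. a 100 px offset in a hypothetical 480 px frame spans about 0.073 degrees
console.log(pixelsToDegrees(100, 480, 0.35));
```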



So again, as per Mick's suggestion, I have placed it on Metabunk for discussion.

(I haven't forgotten about Yemen, but I am working on my own 3D recreation software, incorporating all of this, as it was the catalyst for commenting on "using the cloud line". I had to deep-dive this so I could ensure the accuracy of my "Marik-rec" (once completed I will give it to him, along with all copyright to it), and I need it before I can go further on Yemen, AND Go Fast, Gimbal, Omaha, etc.)

I am also tagging @Kyle Ferriter , @Mendel and @flarkey as I humbly ask for their feedback on the above.

[edited for clarity]

Screenshot (3790).png
 
Last edited:
Just to further clarify what I am saying:

As @logicbear states, the pitch axis can only move left and right; the amount varies, with the maximum at 90 degrees azimuth.

Screenshot (3852).png


And, illustratively only, when the pod/camera behind the dero mirror is looking at 90 degrees azimuth,

f_a-18c.09.jpg


it is orientated like that, so it is parallel to the plane, and when the plane pitches,

Sequence 01.00_00_01_19.Still164.jpg


the camera is tilted.

With elevation controlled by the roll axis, and the pitch axis only able to slew left and right, the camera must be responsive to the plane's pitch.

I hope that better illustrates what I am saying.

We remove the bank value, remove the above tilt value (frustum roll) and what's left is the actual footage.
 
Due to limited time I will just clarify a misunderstanding here:
I am of the opinion that it is parallel to the plane's fuselage, not the ground. As the pod is mounted parallel to the plane's boresight, the camera would be responsive to the plane's pitch, not the ground.
This is trivially true. It is difficult to imagine how anyone familiar with this discussion could possibly miss that obvious fact. So when it sounds like someone said the opposite, one should assume there's been a misunderstanding. Indeed, when I said "the pitch plane is parallel to the ground" that was only in the context of situations A and B, not in general. Above I said that in these situations "for simplicity jet pitch will be set to 0". So the plane's boresight just happens to be parallel to the ground here. Of course that is not the case during Gimbal, but that does not matter, for the following reason.
My post above was a counterargument to the idea that the dero always rotates the image by an amount equal only to pod roll (in the right direction to cancel it out), instead of the more complicated, pitch-dependent formula assumed by the refined Gimbal sim. To falsify that claim it is enough to find *any* situation where it does not hold. So I showed why it doesn't work in a simplified situation that should be easier to understand.
 
Last edited:
To work towards identifying the correct method of de-rotation, I tested the two methods against each other by de-rotating Go Fast via:

- Frustum roll and bank ("Zaine")
- The formula "jetPitch = (ObservedCloudAngle - jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))" ("Logic Bear")

There is significant variance in the results

Screenshot (3853).png


When exported it looks like this, keyframed at 1-second intervals, with the rotation value at the top and the background motion in degrees to the right of each video. (The line is only shown from halfway; "Zaine" is left static, and we can see that the background doesn't follow that line until it becomes dynamic, as I was testing the linear motion while the plane maintains constant bank.)


Source: https://www.youtube.com/watch?v=WFhi_Kq-WEk


What really stands out to me is how the background moves in each.

I placed a blue line from the target illustrating the direction the background moves in; "Zaine" on the left displays dynamic change, while "Logic Bear" appears far more constant, with no change required.

In all the examples we have of two objects in motion, the background motion is dynamic; it changes throughout the entire video, as in the Unresolved UAP Report Middle East 2023:


Source: https://www.youtube.com/watch?v=FfIFbP94xho&t=13s


And a far shorter example in this recreation of a drone being overtaken:


Source: https://www.youtube.com/watch?v=PT8_O8-piPw


The other formula, "Logic Bear", removes natural motion, i.e. it isn't representative of real-world examples, and apparently was:

1. Derived from the Gimbal Sitrec, in which the clouds were already levelled? Can you clear that up, @logicbear: did you work forwards, generating a hypothesis and testing it, or work backwards, using the levelled clouds to arrive at a formula that fit them? (Going by your response above, you don't want me to misunderstand what you are saying.)
2. Requiring that a plane in a bank maintain the same pitch as level flight, without regard to the necessity of increasing pitch in a bank? (If you can also clear this up, I would appreciate it: how does that work?)

I respectfully submit that background motion being dynamic, as a litmus test, gives weight to frustum roll plus plane bank being the correct method of de-rotation, because:

1. We all agree that the camera is tilted with respect to the plane's pitch, and
2. Testing both models side by side yields one result that matches what we expect in a real-world encounter, while the other appears to remove natural background change.

[edited for proper use of English, but i probably missed more of that somewhere else]
 
Last edited:
I'm trying to get my head around what that formula actually does; can you confirm this, @logicbear?


In my Excel sheets, I calculated the FOV in Go Fast.

Screenshot (3854).png

What that illustrates is that the camera's orientation (and the FOV orientation), not frustum roll, changes due to azimuth; the bottom will always be closest to the plane.

I inserted two red lines with the same angle; this clearly demonstrates how my method would behave, and we expect to see changes as it progresses.

Now the "logic bear" formula, maintains the background going in only one direction, that would only happen, and correct me if i am wrong, but that would only happen if you are artificially rotating the camera which removes the cameras orientation changes due to azimuth. And at which point you would be unable to claim that it is "derotating the horizon".

I say that because the bottom of the video, would now no longer be closest the to the plane, as it should, but angled and artificially twisted, if i am understanding this correctly. (The clouds would need to be in the correct position based on azimuth with the camera orientated the correct way).

If you can clear up the above points, as well as your position on the formula artificially rotating the camera (and image), I would truly appreciate it.

[Edited for better clarity]
 
Last edited:
- The Formula "Formula "jetPitch = (ObservedCloudAngle - (jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))" " "Logic Bear"
Please don't attribute things to others when they've said no such thing. You even put it in quotation marks, but there's no such quote.
That looks a bit like you rewrote the spherical interpolation that Mick originally proposed to fit the cloud angles here, which we sometimes refer to as the "Sin Cos" method, but made it take the absolute value of az in an attempt to generalize it? You appear to be using it to calculate jetPitch, but the Gimbal sim doesn't do that. The jetPitch in the Gimbal sim is a constant value (3.6 by default), scaled according to the bank angle as mentioned above:
"Scale jet pitch with roll" has been there for some time. It adjusts the jet pitch to maintain constant lift when rolling (banking). It does not make a huge difference as the roll does not change much.

JavaScript:
if (par.scaleJetPitch) {
    var roll = jetRollFromFrame(f)
    jetPitch *= 1/cos(radians(abs(roll)))
}
I proposed a different function here for calculating what people might intuitively expect the horizon angle to be, to match horizon angles produced by a much simpler camera mounted like a Wescam instead of an ATFLIR. That's the one that is used by default in the Gimbal sim, not the one that you spent considerable effort on. There is an option in the sim to set it to the "Sin Cos" method instead just for comparison. The two happen to produce values that are somewhat close during Gimbal, but for example my function takes pod elevation into account while "Sin Cos" does not. So clearly "Sin Cos" was never intended to work during GoFast where the elevation has a much more significant impact.
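For comparison, the "Sin Cos" interpolation referred to above can be sketched like this (my reconstruction of its intended form, on the assumption that it simply blends bank and pitch by azimuth, with no elevation term):

```javascript
// "Sin Cos" style horizon angle: a pure blend of bank and pitch by azimuth.
// Note that no elevation term appears anywhere in this function.
function sinCosHorizonDeg(jetRollDeg, jetPitchDeg, azDeg) {
  const a = Math.abs(azDeg) * Math.PI / 180;
  return jetRollDeg * Math.cos(a) + jetPitchDeg * Math.sin(a);
}

console.log(sinCosHorizonDeg(30, 3.6, 0));  // looking forward: just the bank angle
console.log(sinCosHorizonDeg(30, 3.6, 90)); // at 90 degrees az: essentially the pitch
```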

The other formula, "Logic Bear", removes natural motion, i.e. it isn't representative of real-world examples, and apparently was:

1. Derived from the Gimbal Sitrec, in which the clouds were already levelled? Can you clear that up, @logicbear: did you work forwards, generating a hypothesis and testing it, or work backwards, using the levelled clouds to arrive at a formula that fit them? (Going by your response above, you don't want me to misunderstand what you are saying.)
2. Requiring that a plane in a bank maintain the same pitch as level flight, without regard to the necessity of increasing pitch in a bank? (If you can also clear this up, I would appreciate it: how does that work?)
None of that is even remotely accurate. The formula you mentioned is not used by default in the Gimbal sim or Sitrec. The clouds are not "already leveled" in the Gimbal sim. Neither that formula, nor anything used in the Gimbal sim, requires maintaining the same jet pitch as level flight. How was any of that "apparent" to you?
 
2. Requires that a plane in a bank, maintain the same pitch as level flight, without regard to the necessity to increase pitch in a bank? (If you can also clear this up I would appreciate it also, how does that work?).
Sitrec adjusts the plane pitch for bank (roll)
2025-11-28_23-52-59.jpg


by
Code:
jetPitch *= 1 / cos(radians(abs(roll)))
 
I appreciate both of your responses.
So clearly "Sin Cos" was never intended to work during GoFast where the elevation has a much more significant impact.
So you can't use it to de-rotate the footage from the same pod for a different event ten minutes earlier?

Due to elevation? Why?

And to qualify my remarks, two examples,

1. If I am filming with a phone, point it down, and pan it up, the orientation to the horizon hasn't changed; the same goes for when I point the camera 45 degrees left and pan up.
2. Checking in my Excel, I get the same result: I'm not seeing any rotation due to elevation change. And as the FOV orientation is driven by azimuth, can you explain it a different way?

(Adding here: de-rotating means making the up direction the actual, global up direction, as opposed to the frustum roll (camera tilt), in which the up direction is relative to the plane's pitch. We can account for everything by using bank and pitch, not elevation.)
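To illustrate why elevation can matter at all, here is a minimal vector sketch (a toy model of my own, not code from the sim or Sitrec) of the "human horizon" camera described earlier in the thread: it first yaws about the jet's up axis, staying in the wing plane, then pitches about its own right axis. The horizon angle in the image is the signed angle between the camera's right vector and the world-horizontal direction perpendicular to the line of sight:

```javascript
const d2r = Math.PI / 180;
const cross = (a, b) => [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]];
const dot = (a, b) => a[0]*b[0] + a[1]*b[1] + a[2]*b[2];

// Rodrigues rotation of vector v about unit axis k by angle t (radians)
function rotate(v, k, t) {
  const c = Math.cos(t), s = Math.sin(t), kv = dot(k, v), kxv = cross(k, v);
  return [0, 1, 2].map(i => v[i]*c + kxv[i]*s + k[i]*kv*(1 - c));
}

function horizonAngleDeg(jetPitchDeg, jetRollDeg, azDeg, elDeg) {
  // Jet body axes in world coordinates (y is world up, heading ignored)
  let fwd = [0, 0, -1], right = [1, 0, 0], up = [0, 1, 0];
  fwd = rotate(fwd, right, jetPitchDeg * d2r);  // jet pitch about the right axis
  up  = rotate(up,  right, jetPitchDeg * d2r);
  right = rotate(right, fwd, jetRollDeg * d2r); // jet roll about the forward axis
  up    = rotate(up,    fwd, jetRollDeg * d2r);
  // Camera: yaw in the wing plane, then pitch about the camera's right axis
  const camRight = rotate(right, up, -azDeg * d2r);
  let los = rotate(fwd, up, -azDeg * d2r);
  los = rotate(los, camRight, elDeg * d2r);
  // World-horizontal direction perpendicular to the line of sight
  const h = cross(los, [0, 1, 0]);
  // Signed angle from the camera's right vector to h, measured about the LOS
  return Math.atan2(dot(cross(camRight, h), los), dot(camRight, h)) / d2r;
}
```

With zero jet pitch and roll this horizon stays level at any azimuth and elevation, and looking forward the function simply returns the bank angle. But with a few degrees of jet pitch at an off-axis azimuth, changing only the elevation changes the computed horizon angle by a fraction of a degree; that is the elevation dependence in question, and whether the real pod optics behave like this toy camera is exactly what is being debated.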

Screenshot (3857).png


Just to this,
Sitrec adjusts the plane pitch for bank (roll)

To that: I can't see how impactful that is, as the Sitrec for the Gimbal encounter doesn't display pitch, nor the "within 2.5/5 degrees", or maybe I'm missing that feature? I can see some shift in the orientation of the pod, which means that the degrees from the two degrees down are altered, which would impact the degrees of freedom, but I can't check how that alters the camera rotation in either configuration; it just seems to always be as per the video?


Source: https://www.youtube.com/watch?v=LuYgd-of_zY


I did a follow-up check on the gimbal roll model and I couldn't see how the scaling of plane pitch due to bank impacts it. I can adjust the plane's pitch, but it still operates as normal, with no corrections, until it reaches the degrees-of-freedom limit, i.e. it can go 10 degrees of variance and still rotate as normal. (I appreciate it is really old code and wouldn't have been updated; I only raise it because I thought maybe there's a dynamic pitch example driven by plane bank.)

Screenshot (3860).png


The point I respectfully raise is this: there are claims made in the Gimbal analysis video, and even though we have progressed well past that, to using dynamic pitch, the rotation points (bumps) have stayed the same? Put another way: when it was explained in the "old" analysis video, the limits were defined based on 3.6 degrees of pitch, but as we know the entire encounter is actually 3.9 to just under 4.7 degrees, so surely that impacted the degrees from centre? And is that testable?

Screenshot (3839).png


I won't dwell on this. The part I want to understand, the broader context, is that I can't see how the footage is derotated to the extent required, when, by my assertions, you would need the plane to have some 12 degrees of pitch to produce the amount of slant we see in the footage.


Source: https://www.youtube.com/watch?v=vIcGBv8BZ3g


Noting that the formula is Gimbal-specific and can't be used to de-rotate Go Fast: when it is, it appears that what is occurring is the camera being rotated to hold a certain perspective, not reflecting the actual footage. How do we know whether that is or isn't also occurring with the de-rotated footage in Gimbal? The real test is whether the formula being relied upon can be used on another video and give the same result (of successfully de-rotating the footage), because when I tried, it didn't work.

I assert that known parameters are being exceeded if the clouds are level throughout; my claim rests on the method I demonstrated, identifying those parameters and working from them to get my result.

If either of you can provide feedback on these three main points:

- how elevation is impactful in de-rotation,
- what activating/deactivating "scale jet pitch with roll" does, and
- where, specifically, the extra degrees above camera tilt are coming from to level the footage so the clouds stay level throughout (we all agree that the camera is parallel to the plane's fuselage, and we can all agree that bank is reflected in the footage, but where are the other degrees required to level the clouds for the entire encounter coming from?)

it would go a long way.

Adding again that, using the parameters plane bank and camera tilt (after I worked out the plane's dynamic pitch to compensate for bank), I get this result. How is that incorrect?

410.jpg


Again, I appreciate the feedback, and this information will be used in both of our upcoming pieces of software for the 3D camera view (I noticed you deprecated version one on GitHub, and made remarks about the V2 coding, so this should be beneficial all round).

Many thanks for both of your time and consideration

[edited a few times to be clearer and more defined]
 
Last edited:
I assert that known parameters are being exceeded if the clouds are level throughout; my claim rests on the method I demonstrated, identifying those parameters and working from them to get my result.
I feel like you might need to go back to whatever that method is and reevaluate it.

The model that I use in the simulator is essentially a physical model. There's a minor tweak with the "human" horizon, and another even more minor tweak with the pitch adjusted for bank (which is not impactful, just there for completeness). But it works. It worked within reasonable bounds both with and without those tweaks. It works fine with the 3.5 degrees of pitch.

And really, I'm not that interested in this, unless you can make a clear and compelling case that convinces people. Until then, I see no reason to revisit this.
 
I appreciate your input. As I said in DM, I was trying to work out how the 3D camera view works for my recreation software; it wasn't some kind of "gotcha" (I was keeping it private). You mentioned popping it onto Metabunk, so I took that suggestion and did just that.

I do ask that you understand: I went to a lot of effort to understand what is happening in the footage. I went so far as to calculate the plane's pitch while banking and how much camera tilt is required, stitched the version I had together for more context, and tested the Gimbal derotation against Go Fast. I wasn't trying to relitigate Gimbal; Yemen is what I am focussed on. But I was using Go Fast, as everyone can see in my screenshot of the software, because I have far more camera-orientation data for it, and went to check it against Gimbal because they both must operate the same way.

I believe we are all after the most accurate recreation that will prove or disprove, the claims being made about different footage.

[edited for more context]
 
Last edited:
Sorry, and I apologise; I felt I must be doing something really embarrassing to myself. I have spent a lot of time working on this since you said this:

I feel like you might need to go back to whatever that method is and reevaluate it.

I took your direction and have been focussed on resolving my perceived errors.

But despite looking for where my error is, I have been unable to locate any glaring problems, or even minor ones. I then focussed on the formula being used to claim the clouds are level, and on why the background moves through in only one direction, and Grok came up with the following, specifically related to Go Fast, as we don't have anything definitive for Gimbal.

"Because the cot(|az|) term is literally the proportional-navigation guidance term that drives the line-of-sight rate to zero. When you subtract it in post-processing, you are electronically turning the jet into a perfect pure-pursuit or PN interceptor that always flies straight at the current target position. In that world, the only geometry that keeps the target centred while the background slides in one fixed direction is exactly 180° opposition."

Noting that's AI output, with the usual "AI limitations".

And I don't want to trouble either of you, but it may be far more beneficial to just come out and ask directly: if you can point out where my method fails, I would really appreciate that. My method is:

1. Remove bank at 1:1 (I say this because when we level on JUST the artificial horizon, any banking is cancelled out and we don't see any appreciable change in the cloud line, i.e. it is derotated at 1:1, not as per sin or cos of bank).
2. Account for the plane's pitch increase due to bank, over the standard 3.6-degree pitch for level flight, and apply that value to determine how much camera tilt there is (as we all agree the camera is parallel to the plane's fuselage, and any pitch the plane has is reflected in how much the horizon is offset from the plane's artificial horizon).
3. Use those figures to derotate the footage, putting it into a global state where up is up for everyone.

And what's left, the degree mismatch, must be due to elevation change.

Mick, who is under zero obligation to respond, has commented that he isn't going to revisit this, but, and I can't stress this enough, I am not looking for a "gotcha". All I want to do is ensure the accuracy of 3D recreation software that would be beneficial to everyone. If it's so obvious, LBF, Kyle, anyone can reply saying "it doesn't work because of X, Y, Z, and the additional degrees the background is off by are because of ABC and DEF."

Many thanks to anyone reading this for your time and consideration.
 
@Zaine M. it would probably be easier to follow along if you could provide concrete Python/JavaScript/Excel/etc. files or snippets to demonstrate what you think is inconsistent, which could then be downloaded by someone else to replicate what you're referring to. If you have files or code, putting them in a GitHub repository would make it much easier to refer to specifics, which can be harder to describe purely with words and static images.
 
Many thanks for your reply @Kyle Ferriter

I'll briefly take this opportunity, by way of background, to highlight that @TheCholla was the first to notice that when we level the video just on the plane's artificial horizon, the clouds do not traverse level. (This is the main thrust of where the inconsistency claim comes from.)

There were discussions on Metabunk about this, with the formula,

Formula "jetPitch = (ObservedCloudAngle - (jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))

being used to address that mismatch. I was able to confirm the formula did do that when using the base pitch, with no other pitch variances, i.e. the plane not increasing pitch when banking.


Source: https://www.youtube.com/watch?v=Z9O5L2MASOs


The first half uses a 2.9-degree plane pitch, and we can see the clouds do not progress through level; the second half uses the 3.6 pitch value, and the clouds pass through level.

So that is/ was being used as the successful de-rotation formula for footage.

However, while doing additional work, I found the following accounts for camera orientation once you correct for plane bank.

1. Calculate the plane's pitch during banking using the formula "=$V$3/COS(RADIANS(G2))", where V3 is the plane's pitch for level flight at 25k feet (3.6 degrees) and G2 and subsequent cells are the plane's bank.

*(Mick has jetPitch *= 1 / cos(radians(abs(roll))), so it's the same thing, only calculating the additional pitch over the initial value, with roll = bank? I am assuming so; you can also find that in my Excel.)

2. Calculate how much Frustum Roll (camera tilt) is applicable to de-rotate the video footage using the formula "=DEGREES(ATAN(TAN(RADIANS(I2)) * SIN(RADIANS(A2))))", where I2 and subsequent cells are point 1's calculated pitch, and A2 and subsequent cells are the azimuth. The reason is post #23: the camera's orientation reflects the plane's pitch, which @logicbear agrees with by stating,

It is difficult to imagine how anyone familiar with this discussion could possibly miss that obvious fact.

Please see attached excel labelled "Frustum Roll" for the calculations/ figures.
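Step 2, translated from the Excel formula into JavaScript (a direct transcription of the cell formula, with the cell references replaced by named parameters):

```javascript
// Frustum roll (camera tilt): how much of the jet's pitch shows up as image
// tilt at a given azimuth. Transcribed from
// =DEGREES(ATAN(TAN(RADIANS(I2)) * SIN(RADIANS(A2))))
function frustumRollDeg(pitchDeg, azDeg) {
  const d2r = Math.PI / 180;
  return Math.atan(Math.tan(pitchDeg * d2r) * Math.sin(azDeg * d2r)) / d2r;
}

console.log(frustumRollDeg(4.16, 0));  // 0: no tilt looking straight ahead
console.log(frustumRollDeg(4.16, 90)); // at 90 degrees az the full pitch appears as tilt
```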

Doing so still results in the clouds not traversing level through the video; there is still a mismatch, as seen below.


Source: https://www.youtube.com/watch?v=PpJ6ppruH70

And in the zoomed out overview it looks like this,


Source: https://www.youtube.com/watch?v=6dXHWFAXB3Q


Which is where this stitched version comes from

fill.jpg


So doing points 1 and 2 yields that result. Now, not wanting to essentially copy and paste the Go Fast comparison from post #25: I went and tested both methods on the Go Fast video, to see which successfully de-rotated the footage.


Source: https://www.youtube.com/watch?v=WFhi_Kq-WEk


It became clear that the initial formula,

"jetPitch = (ObservedCloudAngle - jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))"

was, in my opinion, unsuccessful; see my posts from earlier.

Which brings us to today, where I am simply asking:

1. Where is my methodology incorrect?
2. Where do the additional degrees, needed to make the clouds pass through level and not at an angle, come from?


demonstrate what you think is inconsistent.

So to be clear,

1. I think the formula being used to de-rotate Gimbal is inconsistent with an accurate representation of what the encounter looked like, with the comparison footage for Go Fast being used to support my claim. Said a different way: as the Sin Cos formula was unsuccessful in de-rotating other footage, how can we be certain it isn't giving that same, apparently erroneous, result for Gimbal?

[Edited to include IAS BANK excel, which has the calculations for go fast using my method and the metabunk formula]
 

Attachments

Last edited:
I do want to add that when using my method, whilst we would expect there to be only one portion of the video where the background moves through level (closest approach, where its motion is at right angles to the camera's direction), there are actually two parts: one at 23L and another at 14L.

Now is that because
1. my methodology is incorrect, or
2. is it demonstrable evidence that the object does something that results in that?

Noting that we shouldn't see that with two objects in motion unless one or more of the variables (plane, clouds, target) actually changes and does something different.


Source: https://www.youtube.com/watch?v=eD-9Jr_g2OU

*Caveat: whilst I say this is consistent with the F-18 getting closer to a nearby object, a distant plane, noting that it is reportedly flying a wavy flight path, *could* also be descending in altitude by 1,215 feet in addition to any other previously identified altitude changes, and then starting to increase its altitude, to account for the elevation changes I have derived. But, an additional caveat: I am uncertain how a distant plane would make the horizon pass through level twice.*

I am NOT relitigating Gimbal, but demonstrating why the correct formula/method is incredibly important. (Noting that I cannot use the horizon levelling twice to rule out my method, as the claim is that "the object stops and reverses direction", and that highly anomalous characteristic *may* account for it.)

[Edited for caveat]
 
Last edited:
Hey @Zaine M., good to see you here and thanks for tagging me. Honestly, going back to this after more than a year not thinking about it hurts my head...

I see a lot of merit in your analyses, although my opinion is that at this point it's very hard to prove one or the other side of the argument (about what method is used for dero) without additional evidence. Sorry for not being helpful.

I will just note that your estimated change in the El angle of the pod (0.35° or about one FOV) would pretty much cancel out the need for gain in altitude of the object in the <10Nm close path (0.35° corresponds to 220-300ft at 6-8Nm, which was our estimate of the object rise in altitude with assumed close-to-constant El angle). I.e. no need for elevation change of the object, just an object going level and reversing course in the wind. Too bad the Gimbal crew never came out of the woods, that would have been interesting, I feel like it's past due at this point.

1764790002905.png
 
Many thanks for your reply @TheCholla. This isn't about litigating Gimbal per se, but about identifying the accurate method for de-rotating the footage. As I recall, @logicbear made claims that software is responsible for the mismatched angle the clouds move through:


Source: https://x.com/lbf_tweet/status/1770882366367473988

*Noting that there is a dero mirror between the pod's eye and the camera, so this would now involve another method, on top of the mirror, to generate the additional degrees of difference for the cloud line.


Source: https://x.com/lbf_tweet/status/1889848937928151448


I'm adding that tweet as it highlights the importance of solving the de-rotated footage: "We just disagree about whether the software of the pod might've done in Gimbal."

And noting that that method does NOT work in Go Fast, and that all angles are now accounted for with the method I put forward (elevation change, frustum roll due to pitch (camera tilt), and bank angle): this is about ensuring the accuracy of recreations and of claims about what is happening, and about why the object is rotating when the pod is allegedly not rotating. But that was NOT the intent of my posting.
 
Last edited:
Putting this a different way,

Facts NOT in dispute:
1. The object in Gimbal rotates in the first 20 seconds.
2. The camera's orientation is responsive to the plane's pitch, due to how it is mounted.


Now, position one is that the formula (used to account for the rotation from point 1)

"jetPitch = (ObservedCloudAngle - jetRoll*cos(abs(radians(az)))) / sin(abs(radians(az)))"

is used to de-rotate the footage and put it back into the same state it was filmed in, i.e. how the camera actually saw the encounter. This is the clouds being perfectly level.

The issues with that are:
1. The formula clearly fails when used on the Go Fast footage.
2. We would still expect to see a mismatch of the clouds due to the camera tilt (due to how it is mounted, which @logicbear agrees with), yet we do not, and we do not need to correct for it with the formula.

Screenshot (3873).png


My methodology, as put forward, accounts for all angles, and a distant plane would only need to be reducing its altitude before increasing it again.

To put it as simply as possible: if the formula de-rotates the footage to what the camera sees, then, because we all agree the camera is tilted due to pitch, the clouds must move through at an angle consistent with that tilt.

However, and NOT litigating Gimbal but highlighting what would follow from this, the flow-on effect is that the rotation in the first 20 seconds can only be attributed to:
1. The pod actually rolling and not being static, which would cause issues for stepped rotation, or
2. The object not being glare but a real object, with the rotation consistent with perspective.
 
Now, position one is the formula, (used to account for the rotation from point 1)
I've already made it clear, twice, that this formula is not actually used to account for anything in the Gimbal sim. Therefore any analysis that includes it and tries to infer anything about the Gimbal sim is technically wrong.

The formula that we do use instead has a built-in proof by construction for why it should represent the real horizon angle of a simple Wescam-style camera. If you wish to claim that some other formula, in particular one that does not depend at all on the elevation, can physically represent the horizon, then you need to prove that with more than just words.
 
Last edited: