(ME23) AARO release — Unresolved UAP Report: Middle East 2023

Partial fit. Plenty of tweaking, thanks to the aforementioned multiple variables.
https://www.metabunk.org/sitrec/?cu...t-2.amazonaws.com/1/ME23 2/20250515_233347.js

2025-05-15_16-40-08.jpg


Zooming in on the saddle
2025-05-15_16-40-46.jpg


What I did there was calculate the track that traversed the LOS while being as close as possible to a target track (a fixed point here, but could also be windblown).

Adding wind does not improve things, which seems consistent with the object being close to the camera (no relative wind motion between camera and target).

Complicated stuff, made more fiddly here by the long video: 14,845 frames, and I do calculations for each one. I'd like to get it back closer to real time when editing values (it's already real-time during playback).
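The per-frame geometry, as I understand it, reduces to a closest-point problem. Here's a minimal sketch of the idea (not Sitrec's actual code; the positions and directions below are made-up placeholders): for each frame, find the point on that frame's line of sight that is closest to the candidate target point.

```python
import numpy as np

def closest_point_on_los(cam_pos, los_dir, target):
    """Point on the line of sight (a ray from cam_pos along los_dir)
    that is closest to target. All arguments are 3D vectors."""
    d = los_dir / np.linalg.norm(los_dir)             # unit LOS direction
    t = max(0.0, float(np.dot(target - cam_pos, d)))  # clamp: never behind the camera
    return cam_pos + t * d

# Per-frame usage: slide the candidate object along each frame's LOS so it
# stays as close as possible to a fixed target point (or a windblown one).
cam_pos = np.array([0.0, 0.0, 7000.0])       # hypothetical camera position (m)
los_dir = np.array([0.4, 0.9, -0.3])         # hypothetical LOS direction
target = np.array([6000.0, 14000.0, 800.0])  # hypothetical fixed target point
print(closest_point_on_los(cam_pos, los_dir, target))
```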
 
A at 03;54;18 at 36°49'6.75"N, 38° 8'7.43"E
B at 03;55;18 at 36°50'40.81"N, 38° 8'33.97"E
C at 04;07;20 at 36°55'2.21"N, 38° 6'44.48"E
EDIT: to tidy up.

Label | Description | Frame No. | Time | Coordinates | Screen cap from video | Satellite image
1 | white blob/hill | 6604 | 03:40 | 36°43'56.13"N 38° 7'11.19"E | Screenshot 2025-05-16 at 10.37.11.png | Screenshot 2025-05-16 at 13.04.11.png
A | curved road | 7032 | 03:54 | 36°49'6.75"N 38° 8'7.43"E | Screenshot 2025-05-16 at 10.38.29.png | Screenshot 2025-05-16 at 13.05.20.png
B | dark field / diagonal line | 7062 | 03:55 | 36°50'40.81"N 38° 8'33.97"E | Screenshot 2025-05-16 at 10.40.20.png | Screenshot 2025-05-16 at 13.06.14.png
C | converging roads | 7422 | 04:07 | 36°55'2.21"N 38° 6'44.48"E | Screenshot 2025-05-16 at 10.41.48.png | Screenshot 2025-05-16 at 13.07.22.png
 
View attachment 80310
The target appears, and movement starts around frame 2100; before that there's a slight CCW rotation.

Tracking lock is acquired around frame 6750, which is about the middle of the slope.

If we were circling something then the rate of change of the angle would be constant.

Here, the rate of change of angle looks like what you'd expect when flying in a straight line past something that's nearly stationary. It's not perfectly symmetrical, which suggests a small wind component.
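To see why that shape reads as "straight-line flyby", here's a minimal sketch with made-up numbers (speed, offset, and timing are all placeholders): a camera moving in a straight line past a stationary object produces a smooth arctangent-shaped heading curve, steepest at closest approach and flattening at both ends.

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0.0, 600.0, 600)     # time in seconds (a stand-in for frames)
cam_x = -30_000.0 + 100.0 * t        # camera flying straight along x at 100 m/s
cam_y = 0.0
obj_x, obj_y = 0.0, 5_000.0          # stationary object 5 km off to the side

# Bearing to the object, measured from north (the +y axis), positive clockwise
heading = np.degrees(np.arctan2(obj_x - cam_x, obj_y - cam_y))

plt.plot(t, heading)
plt.xlabel("time (s)")
plt.ylabel("camera heading (deg from N)")
plt.show()
```

A small wind drift on the object skews that curve, which is what the slight asymmetry suggests.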
I don't understand how to read this graph. I think a lot of us could use some more detailed explanation of what you're doing.

Also
It seems you're comparing two types of motion in the analysis:

- The rotation (panning) of the camera, as indicated by the changing heading (i.e., the direction the camera is pointing, in relation to "N").

- The apparent motion of the background through the frame, caused by parallax as the camera pans to keep the object centered.

From this comparison, the actual heading of the drone itself—not just the camera—is being inferred through mathematical or geometrical analysis. I understand the basic principle behind this approach, but what I don't understand is how the background motion is being quantified. Is it measured in angular displacement across the frame? If so, how exactly is that angular movement calculated or referenced?

Or am I entirely off?
 
View attachment 80354

If we assume both the aircraft and the UAP are moving in straight lines at constant speed, the theoretical heading of the camera tracking the UAP would be of the form:

heading over time:
\[ \tan(\theta(t)) = \frac{y_{UAP}(t) - y_{AIRCRAFT}(t)}{x_{UAP}(t) - x_{AIRCRAFT}(t)} \]
\[ \theta(t) = \arctan\!\left(\frac{y_{0_{UAP}} - y_{0_{AIRCRAFT}} + (v_{y_{UAP}} - v_{y_{AIRCRAFT}})\,t}{x_{0_{UAP}} - x_{0_{AIRCRAFT}} + (v_{x_{UAP}} - v_{x_{AIRCRAFT}})\,t}\right) \]
\[ \theta(t) = \arctan\!\left(\frac{A + B\,t}{C + D\,t}\right) \]
with A,B,C and D constants.


I used gradient descent to estimate A, B, C and D from Mick's measured headings (using data from frames between 2500 and 10000):

A = 19500
B = -2.91
C = 4000
D = 0.21



These values multiplied by the same non-zero factor would also work.
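For anyone wanting to replicate the fit, here's a minimal sketch of the idea. The measured headings are replaced by synthetic stand-in data (the arrays below are placeholders, not Mick's actual measurements), and I've swapped in scipy's least_squares for plain gradient descent, since it does the same curve fit with less tuning:

```python
import numpy as np
from scipy.optimize import least_squares

# Stand-in data: in practice `frames` and `measured` would be Mick's per-frame
# headings (in radians) for frames 2500-10000. Here they are generated from
# the posted constants, just to show that the fit recovers the curve's shape.
frames = np.arange(2500.0, 10000.0, 10.0)
A0, B0, C0, D0 = 19500.0, -2.91, 4000.0, 0.21
measured = np.arctan2(A0 + B0 * frames, C0 + D0 * frames)

def residuals(p):
    A, B, C, D = p
    return np.arctan2(A + B * frames, C + D * frames) - measured

# The post used gradient descent; least_squares is an off-the-shelf optimizer
# doing the same job of minimizing the heading error.
fit = least_squares(residuals, x0=[10000.0, -1.0, 1000.0, 1.0])
A, B, C, D = fit.x
print(A, B, C, D)    # recovered only up to a common positive scale factor
print(B / A, D / C)  # scale-free ratios are what the data actually pins down
```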

With A, B, C and D you can simulate the heading over time:

View attachment 80352
There is a good match between the measured and simulated data; the constant-speed hypothesis seems to be close to reality.

A,B,C and D can be used to reconstruct the trajectory of the UAP in the aircraft frame of reference.

UAP trajectory angle = atan(B/D) ≈ 5 degrees

View attachment 80357

I animated the scene in Blender. The camera rotation is from Mick's heading data; the UAP position is calculated from A, B, C and D.
(Animation sped up 100x.)
Left side: view from the camera.
Right side: top view; the black triangle is the camera, the sphere is the UAP.
View attachment 80358
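In case it helps anyone reproduce that top view: up to the unknown common scale factor, the UAP's position in the aircraft's frame at time t is just the numerator and denominator of the heading formula. A small sketch (units are arbitrary because of the scale factor):

```python
import numpy as np

A, B, C, D = 19500.0, -2.91, 4000.0, 0.21     # the fitted constants from above

t = np.linspace(0.0, 12000.0, 7)              # frame numbers
x_rel = C + D * t                             # UAP x relative to the aircraft
y_rel = A + B * t                             # UAP y relative to the aircraft
theta = np.degrees(np.arctan2(y_rel, x_rel))  # camera angle, per the formula

for fr, x, y, th in zip(t, x_rel, y_rel, theta):
    print(f"frame {fr:7.0f}: rel pos ({x:8.1f}, {y:9.1f}), angle {th:6.1f} deg")
```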
Ditto. Please explain in layman's terms.


I would particularly appreciate an explanation about this...

I used gradient descent to estimate A, B, C and D from Mick's measured headings (using data from frames between 2500 and 10000):

A = 19500
B = -2.91
C = 4000
D = 0.21

What is this measuring?
 
I don't understand how to read this graph. I think a lot of us could use some more detailed explanation of what you're doing.
The graph shows the heading of the camera. It has been calculated by:
  1. Motion-tracking the position of the "N" in the video (the X Pixels and Y Pixels positions).
  2. Converting this to an offset from the center by subtracting the center X and Y, which gives the next two columns.
  3. Using Pythagoras to calculate the distance of the N from the center (R), the next column.
  4. Using the ATAN2(X,Y) function to calculate the angle of the N from the top of the screen.
  5. Negating this to get the camera's heading from North.
The graph then shows the camera's heading on the Y axis, with the frame number on the X axis. Since not all the frames are tracked, I use a scatter plot rather than a line chart.

Consider the diagram below:

2025-05-17_15-27-56.jpg


Here, 0,0 is the center of the video, and X, Y is where the N is. Then ATAN2(X,Y) gives you the angle.
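Here's the same calculation as a Python sketch, for anyone following along. Two caveats: Python's atan2 takes (y, x), the reverse of Excel's ATAN2(x, y), and pixel y grows downward, so "up" on screen is -y. The example values are made up.

```python
import math

def heading_from_N(px, py, cx, cy):
    """Steps 2-5 above: a tracked "N" pixel position -> camera heading."""
    x, y = px - cx, py - cy               # step 2: offset from center
    r = math.hypot(x, y)                  # step 3: distance R (a sanity check)
    angle_from_top = math.degrees(math.atan2(x, -y))  # step 4: angle of N from "up"
    return (-angle_from_top) % 360        # step 5: negate for heading from North

# Made-up example: N tracked 100 px right of a 1920x1080 frame's center.
# The compass card has rotated 90 deg clockwise, so the heading is 270.
print(heading_from_N(1060, 540, 960, 540))  # 270.0
```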
 
Still not understanding how you infer the heading of the drone itself from this. Or is that what you're trying to do?

Basically... you're assuming too much knowledge in your audience. You understand this stuff but your explanation is not clear because you're not starting with the basics.

You've got to teach to the student's zone of proximal development.

Not everyone is going to understand that. It's specialized jargon I'm familiar with. But I can't assume everyone's going to understand zone of proximal development. (Studied irony.)

ZPD: a concept developed by psychologist Lev Vygotsky; it refers to the space between what a learner can do independently and what they can achieve with guidance or support from someone more knowledgeable. It's the range of tasks a learner is close to mastering with assistance but cannot yet master alone.
 
It seems you're comparing two types of motion in the analysis:

- The rotation (panning) of the camera, as indicated by the changing heading (i.e., the direction the camera is pointing, in relation to "N").

- The apparent motion of the background through the frame, caused by parallax as the camera pans to keep the object centered.

From this comparison, the actual heading of the drone itself—not just the camera—is being inferred through mathematical or geometrical analysis. I understand the basic principle behind this approach, but what I don't understand is how the background motion is being quantified. Is it measured in angular displacement across the frame? If so, how exactly is that angular movement calculated or referenced?
It is not. So far I've just roughly eyeballed it. It's not at all quantified, other than being roughly in the right direction and at roughly the right speed. The change of direction does not really match. We don't know what the velocity of the drone/camera is, so that's one of several variables, along with the turn rate.

The Sitrec sitch is not the solution; it's an illustration of something that roughly fits the data if you adjust the unknowns a certain way.
 
Still not understanding how you infer the heading of the drone itself from this. Or is that what you're trying to do?
I don't. I rotated the heading until the lines of sight converged, showing there was a partial solution for a roughly non-moving object.
 
So far, what can we say as to... Was the drone flying straight or turning throughout the video? Or some combination?
 
Ditto. Please explain in layman's terms.


I would particularly appreciate an explanation about this...

A = 19500
B = -2.91
C = 4000
D = 0.21
What is this measuring?
Mick calculated the camera heading from the position of the "N" in each frame of the video.
If you assume the aircraft and the UFO are moving in straight lines at constant speed, you can find the mathematical formula for this heading over time. A, B, C and D are constant values that appear in the formula. They are related to the flight characteristics (position at t=0s and speed of both objects).
We can't calculate them with the heading data as the only input, but you can use several algorithms to get an estimate. I used gradient descent to estimate A, B, C and D values that would match Mick's data. What I get are not the real values; there is an additional scale factor: multiplying all four constants by the same number would give the same shape to the heading plot.

With A, B, C and D "calculated", we have all the needed values in the theoretical heading formula, so we can use it to compute a simulated heading over time. The simulated heading matches Mick's heading data well, therefore we can conclude that the "straight lines, constant speeds" hypothesis is close to reality. If the hypothesis were wrong, I wouldn't have been able to estimate values that give a plot shape that closely resembles the real data.
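For what it's worth, the scale factor drops out simply because it cancels inside the fraction:

\[ \arctan\!\left(\frac{k(A+Bt)}{k(C+Dt)}\right) = \arctan\!\left(\frac{A+Bt}{C+Dt}\right), \quad k>0 \]

so the heading data only pins down the ratios of A, B, C and D, not their absolute values.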
 
If we assume both the aircraft and the UAP are moving in straight lines at constant speed, the theoretical heading of the camera tracking the UAP would be of the form:

heading over time:
\[ \tan(\theta(t)) = \frac{y_{UAP}(t) - y_{AIRCRAFT}(t)}{x_{UAP}(t) - x_{AIRCRAFT}(t)} \]
\[ \theta(t) = \arctan\!\left(\frac{y_{0_{UAP}} - y_{0_{AIRCRAFT}} + (v_{y_{UAP}} - v_{y_{AIRCRAFT}})\,t}{x_{0_{UAP}} - x_{0_{AIRCRAFT}} + (v_{x_{UAP}} - v_{x_{AIRCRAFT}})\,t}\right) \]
\[ \theta(t) = \arctan\!\left(\frac{A + B\,t}{C + D\,t}\right) \]


A = Relative Y position
B = Relative Y speed
C = Relative X position
D = Relative X speed

This can be derived with AI, if you know what to ask. I asked
External Quote:
we have two aircraft "UAP" and "AIRCRAFT", with their motion specified as x and y components for position and constant velocity. e.g. y_UAP = y0_UAP + vy_UAP*t

Use that notation to find a formula (using Atan2) for the angle between UAP and AIRCRAFT at time (t)

Simplify that to use relative positions and velocities. Use:

A = Relative Y position
B = Relative Y speed
C = Relative X position
D = Relative X speed
2025-05-17_15-54-03.jpg


Note atan(y/x) = ATAN2(x,y) (Excel's argument order). atan2 is used as it better handles cases where x is 0 or close to 0 (division by zero; y/x increases asymptotically for small x), and it returns the angle in the correct quadrant.
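A quick made-up example of the difference (using Python, whose math.atan2 takes (y, x), the reverse of Excel's ATAN2(x, y)):

```python
import math

x, y = 1e-9, 1.0
print(math.atan(y / x))         # ~1.5707963, and a ZeroDivisionError once x == 0
print(math.atan2(y, x))         # 1.5707963..., well-defined even when x == 0
print(math.atan2(-1.0, -1e-9))  # ~-1.5707963: quadrant preserved; atan(y/x) gets this wrong
```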
 
A = Relative Y position
B = Relative Y speed
C = Relative X position
D = Relative X speed

If we also assume a balloon, then the relative speed between the UAP and the AC (which I'm shortening from AIRCRAFT) becomes the absolute speed of the AC, and hence gives the heading of the aircraft.
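Spelling that out (my derivation, assuming heading is measured from north, the +y axis): if the UAP's own velocity is zero, the fitted relative speeds are just the negated aircraft speeds, so

\[ B = -v_{y_{AC}}, \quad D = -v_{x_{AC}} \quad\Rightarrow\quad \text{AC heading} = \operatorname{atan2}(-D, -B) \]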

I visually optimized for a balloon-like entity (not knowing your values); ignoring wind, this gives:

https://www.metabunk.org/sitrec/?cu...onaws.com/1/ME23 Zero wind/20250517_234304.js

2025-05-17_16-24-09.jpg


Red is the tracking portion of the video; I start at frame 2300.

Your recreation gives:

2025-05-17_16-25-31.jpg


Overlaying:
2025-05-17_16-27-19.jpg


Close (as mine is manual, and yours cuts off the last portion of the movement, which might be why the longer leg is more at an angle).

I'm not immediately clear on why yours needs rotating by about 62°. Your Blender model has a fixed camera and a moving balloon, whereas I have a moving camera and a fixed balloon; but why is your initial heading not around 260°?
 



We can also tell the camera is facing to the right of the aircraft's nose. If the camera were pointing in the direction the plane was traveling, everything would be moving toward the bottom of the screen in uniform motion. We don't see that: the foreground objects appear to move through the frame at different angles, and given the limitations, it has to be looking to the right.
I agree with this. He's looking at the apparent direction of motion of the clouds/landscape through the frame. He has inserted some lines to show the angle. The camera is clearly looking to the right at the beginning of the video.
View attachment 80452

We can also tell that the camera is looking roughly WbS, about 258°. If you extend those lines, the drone seems to be heading roughly... what?




The Reaper could be flying in a straight line or, more likely, in a right-hand bank, with the napkin math suggesting the Reaper is flying on a heading somewhere between 260° and 200°.
 