Automated Motion Tracking in Videos Like Gimbal and GoFast

Mick West

Administrator
Staff member
There's very little, if any, parallax in the clouds. If there is some then you should be able to demonstrate it - and then demonstrate that it changes.

You seem to be applying some subjective metric. You need to be able to measure it - or at least point it out so other people can see it.
I was experimenting with OpenCV today, and got this velocity field overlay video:


I think others have done similar things before, but I can't immediately find them. Unfortunately there's a lot of noise, and it's not good at tracking in WHT mode. But it's interesting, and might be worth more work.
 

Mick West

Administrator
Staff member
This (OpenCV) is something of a new subject for me. I've generally done motion tracking and data extraction with Adobe After Effects, but that's focused on tracking an object, and it's fiddly to use for measuring the speed of the background. So I thought I'd experiment with the OpenCV library - which lets you program more exactly what you'd like to look at, but is a bit of a pain to use.

The lines are motion vectors, so they represent the direction and distance a group of pixels moves over a period of time (three frames in this example). OpenCV is calculating a per-pixel "Optical Flow" map. I started with this tutorial
https://learnopencv.com/optical-flow-in-opencv/
and code
https://github.com/spmallick/learnopencv/tree/master/Optical-Flow-in-OpenCV
(and the usual StackOverflow, etc. code results)

Getting the code to run was no easy task on ARM MacOS, but eventually I got something working using Python. I modified the example to average the vectors over squares of pixels (hence the spacing) - but the result is still noisy, as the original video has both noise and a lack of fine structure.

I don't think this is the best approach to using OpenCV, but I have very little experience with it.

Going back to Adobe After Effects - if you simply drop a track point on the video, it will track it across the screen. I can then reset it to another point and continue:


The data from the track can then be compiled into a spreadsheet. I remove the jumps back and compute the per-frame delta ("delta" = "difference" i.e. how much it moves each frame). This gives a per-frame speed and a per-frame angle. Very noisy data. But using a 20-frame moving average shows something of what is going on:
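The per-frame delta computation described above can be sketched like this (a hypothetical helper, not the actual spreadsheet formulas; the 20-frame window matches the moving average mentioned):

```python
import numpy as np

def track_to_speed_angle(xs, ys, window=20):
    """Per-frame speed (pixels/frame) and direction (degrees) from a list of
    tracked positions, plus a moving average of the speed to tame the noise."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    dx, dy = np.diff(xs), np.diff(ys)           # per-frame delta
    speed = np.hypot(dx, dy)                    # pixels moved each frame
    angle = np.degrees(np.arctan2(dy, dx))      # direction of that motion
    smooth = np.convolve(speed, np.ones(window) / window, mode="valid")
    return speed, angle, smooth
```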

AE Gimbal Speed.png
The jump at the end from the loss of lock distorts the true trendline; I should do it again with the video fully stabilized on the target.

AE Gimbal Angle.png


The blue is the raw data (with the resets removed). The red is the 20-frame moving average. The yellow is a polynomial trendline - which I would not read too much into for the angle, as it's influenced a lot more by the bumps, especially at the end. The data extraction was also semi-manual, and a poor choice of regions might have led to misleading results.
 

Attachments

  • AE Background speed and angle extraction.xlsx
    143.1 KB

sitarzan

Member
Apologies for the redundant info on OpenCV. I was in the process of drafting the following when you posted your comment @MickWest.


cvCalcOpticalFlowBM
Calculates the optical flow for two images by using the block matching method

Code:
void cvCalcOpticalFlowBM(
    const CvArr* prev,
    const CvArr* curr,
    CvSize blockSize,
    CvSize shiftSize,
    CvSize maxRange,
    int usePrevious,
    CvArr* velx,
    CvArr* vely );

...

The function calculates the optical flow for overlapped blocks blockSize.width×blockSize.height pixels each, thus the velocity fields are smaller than the original images...
OpenCV Reference Manual - p. 373 - Motion Analysis and Object Tracking

Content from External Source

If I understand OpenCV's documentation correctly, it sounds like the above cvCalcOpticalFlowBM function makes it possible to use Particle Image Velocimetry techniques; but all in software (i.e., no "laser light sheets" required)...
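As a rough illustration of the block matching idea behind `cvCalcOpticalFlowBM` - this is a naive pure-NumPy sum-of-absolute-differences search over a small shift range, not OpenCV's actual implementation:

```python
import numpy as np

def block_match_flow(prev, curr, block=16, search=4):
    """Brute-force block matching: for each block in `prev`, find the shift
    (within +/- search pixels) that best matches `curr`, giving a coarse
    velocity field, one (dx, dy) per block - smaller than the original image."""
    h, w = prev.shape
    vel = []
    for y in range(search, h - block - search, block):
        row = []
        for x in range(search, w - block - search, block):
            ref = prev[y:y + block, x:x + block].astype(float)
            best, best_dxy = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = curr[y + dy:y + dy + block,
                                x + dx:x + dx + block].astype(float)
                    sad = np.abs(ref - cand).sum()  # sum of absolute differences
                    if best is None or sad < best:
                        best, best_dxy = sad, (dx, dy)
            row.append(best_dxy)
        vel.append(row)
    return np.array(vel)  # shape (rows, cols, 2)
```

The same "compare patches between two frames" idea is what PIV's spatial cross-correlation does, just with correlation instead of a SAD search.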


Particle Image Velocimetry (PIV) is a non-intrusive state-of-the-art technique for flow measurements...The PIV technique is based on image recording of the illuminated flow field using seeding particles ... The light scattered by the particles is recorded on a sequence of image frames ... by applying a spatial cross-correlation function as implemented by the OpenPIV resulting with a two dimensional two component velocity field ... The flow is illuminated twice by means of a laser light sheet forming a plane where the camera is focused on...

...

...to find out the velocity field of the moving fluid...from images of small particles, called tracers. The basic principle is to use two images of
the same particles with a small time delay between them. For that purpose, typically two laser shots are created and two images are taken...
OpenPIV Documentation - pp. 3, 8
Content from External Source

...I got something working using Python...

I believe DeepFlow (demoed in this Velocity from animated sequence / Optical Flow forum post) is Python too...


...
AFAIK people generate vectors in external apps (usually After Effects or Nuke with plugins like Twixtor, or go fancy with OpenCV libs or hacked ffmpeg builds), and then use that image sequence to make a velocity field. I assume that's how the amazing video below was done.

(edit)

Ah, he says in another video he uses deep flow: https://thoth.inrialpes.fr/src/deepflow/

(/edit)

...

Source: https://youtu.be/N8Sed-c1sJI


...
Content from External Source
 

Edward Current

Active Member
I overlaid black lines where I have consistently found "new clouds," and then did a very rough eyeballing calculation of the area under the curve for each segment:

AE Gimbal Speed.png

This supports that the curve may be a bit high at the end. (About 1/4 of a field passes in the last segment.)

This is the position curve of my "camera goal," which is the animated line that I use to make sure the camera panning is on track. It's the antiderivative of the speed curve.

Screen Shot 2022-02-26 at 12.48.37 PM.png
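Per-frame, that antiderivative is just a running sum of the speed curve - a minimal sketch (hypothetical helper name, not the actual animation setup):

```python
import numpy as np

def position_from_speed(speed_px_per_frame, start=0.0):
    """Numerically integrate a per-frame speed curve into a position curve:
    position[n] = start + sum of the first n per-frame speeds."""
    speed = np.asarray(speed_px_per_frame, float)
    return start + np.concatenate(([0.0], np.cumsum(speed)))
```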
 

Leonardo Cuellar

Active Member
I think we can go a little further. How many rad/sec does a given crest of clouds move during the movie? Is it possible to get it?
 

Mick West

Administrator
Staff member
I think we can go a little further. How many rad/sec does a given crest of clouds move during the movie? Is it possible to get it?
That's what the "speed" graph is showing, except in pixels: going approximately from 7 down to 1 pixel per frame, or (×30 for the frame rate) from 210 down to 30 pixels per second. The video used is 480 pixels high, so multiply by 0.35/480 to get 0.15 to 0.02 degrees per second. All VERY rough.
 

Mick West

Administrator
Staff member
I'm a little out of the loop on where 0.35 might come from though.

I'm guessing that would be this number...

Yes, the display in the 2x NAR mode is 0.35° by 0.35°. Since the video is 480 pixels high, we can divide by 480 and multiply by 0.35 to get degrees per pixel.
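Putting the whole conversion together as a sketch (the 30 fps frame rate is an assumption taken from the ×30 step earlier in the thread):

```python
FOV_DEG = 0.35         # 2x NAR field of view, per the post above
FRAME_HEIGHT_PX = 480  # height of the video in pixels
FPS = 30               # assumed frame rate (the x30 step above)

def px_per_frame_to_deg_per_sec(px_per_frame):
    """Convert background drift from pixels/frame to degrees/second."""
    return px_per_frame * FPS * FOV_DEG / FRAME_HEIGHT_PX
```

With the thread's numbers, 7 px/frame comes out to about 0.15°/s and 1 px/frame to about 0.02°/s.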
 