F-16 Pilot Chris Lehto's Interpretation of the GoFast footage [Focus, Parallax, Inaccurate Range]

This is the LTD/R indicator (LASER TARGET DESIGNATOR / RANGER) on the ATFLIR

1622978085503.png

Laser Status Indication - The laser status display provides 6 labels as follows:
  • LTD/R (flashing) - Laser is firing with ranging available.
  • LTD (flashing) - Laser is firing with no ranging available.
  • LTD (steady) - Laser is in standby.
LTD/R is not flashing in Go Fast. Note that the ATFLIR sim manual here only mentions use of the LTD/R in A/G (Air-to-Ground) mode, while the ATFLIR here appears to be in A/A mode, judging by the configured display options.

The manual also says LTD/R then switches to LTD only in the display statuses. This could be a typo in the manual, or a version difference between the sim and the real ATFLIR. Or it could mean that the indicator is displayed in A/A mode but there is no option to actually use the laser in A/A mode.

The following video contains a montage of ATFLIR footage showing LTD/R flashing, and also positioned immediately underneath the pointing direction indicator. I thought it might be possible he's just referring to the indicator being in that general area, in which case, depending on the model of ATFLIR, he is probably right according to the footage.




Buuut he also seems to be doubling down on the pointing direction indicator also being the laser fire indicator.

flir nonsense.png
 

Source: https://twitter.com/alpha_check/status/1401853489295380488?s=20


This is baffling to me. I was assuming that range was from the radar and probably accurate. Pilots disagree. Obviously, we should listen to them as they know how those systems really work (we really don't unfortunately).

Especially if they are F-18 Navy pilots. Other pilots such as Chris make small mistakes (e.g. the L for left mistaken for laser as ATFLIR is not on the planes he used to fly). In general however the principles are the same.
 
Until they start talking publicly and add some detail about what exactly they mean, it's really hard to know what to make of this.

I mean, if you go back to basics: when the videos were released, the accompanying analysis we were provided with was what we analysed the video against. That figure was accurate according to TTSA etc., and it has never been questioned by anyone else until now; it now just appears that someone did the maths wrong, if they did it at all.

https://thevault.tothestarsacademy.com/2015-go-fast-footage

1623072683005.png

So who is wrong here? The first experts to analyse it, or the second lot a few years later? Did TTSA/AATIP "not ask any pilots" even though they apparently had way more access to them than us? Did David Fravor, an actual F-18 pilot, or Lt Col Chris Cook (ret), who was on the Unidentified TV show, not point it out?

I mean, where is the bad range coming from? Just a ghost in the system? Which system: the RADAR? The LASER? Some other ranging system?

Even if the figure is wrong, why must the object be at/near sea level? Why not closer to the jet? Why does it smoothly change as the jet moves towards it, staying in line with an object at a steady altitude? A coincidence? Is it wrong but referenced to a base figure that makes it right if you know the adjustment? Saying the range is wrong just raises a lot more questions.
 
After watching Chris' video I was really bugged by the very poor explanation of depth of field, and the very misleading (I would venture to say deliberately misleading) indoor demonstration of depth-of-field effects, which tried to show that an object and a background several miles apart could not both be in focus because of "physics", apparently.

So I got to wondering if there was any way to determine what the focal ratio of the ATFLIR system is, and since you guys seem to be more knowledgeable about this system I thought this would be a good place to ask.

Focal ratio is a pretty simple calculation: Focal Length (mm) / Aperture (mm) = Focal Ratio. Using my ES 127 APO refractor as an example, with a focal length of 952mm and an aperture of 127mm, it has a focal ratio of f/7.5.

Looking at the ATFLIR, I think the narrow field of view has been calculated by others here to have a focal length of around 2000mm (though this may be exaggerated somewhat by the sensor size of the camera). For comparative purposes, an 8" Schmidt-Cassegrain telescope with an aperture of 203mm and a focal length of 2032mm has a focal ratio of f/10.

So for those who are more informed on the specs of these ATFLIR systems than I am, what's the estimate of the front window aperture? Just looking at it in YouTube videos my 5" refractor seems about the same size if not bigger, but for argument's sake let's say it's somewhere between 5" and 8". That would still give the ATFLIR, at best, an f/10 focal ratio in narrow mode, unless I am dramatically oversimplifying the internal optics of these systems.
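Just to make that arithmetic explicit, here's the same estimate in a few lines of Python. The ~2000mm focal length and the 5"-8" aperture range are the assumptions from above, not published specs.

Code:
MM_PER_INCH = 25.4

def focal_ratio(focal_length_mm, aperture_mm):
    """Focal ratio = focal length / aperture diameter."""
    return focal_length_mm / aperture_mm

focal_length_mm = 2000.0  # assumed narrow-FOV effective focal length
for aperture_in in (5.0, 8.0):
    f_ratio = focal_ratio(focal_length_mm, aperture_in * MM_PER_INCH)
    print(f'{aperture_in}" aperture -> f/{f_ratio:.1f}')
# 5.0" aperture -> f/15.7
# 8.0" aperture -> f/9.8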

Thoughts?
 

Source: https://twitter.com/alpha_check/status/1401853489295380488?s=20


This is baffling to me. I was assuming that range was from the radar and probably accurate. Pilots disagree. Obviously, we should listen to them as they know how those systems really work [emphasis added] (we really don't unfortunately).

Especially if they are F-18 Navy pilots. Other pilots such as Chris make small mistakes (e.g. the L for left mistaken for laser as ATFLIR is not on the planes he used to fly). In general however the principles are the same.

But Lehto doesn't know how the system works, does he? By his own account he doesn't know where the RNG figures come from. He just has a gut feeling that they must be wrong, based on his conviction that the object is much closer to the sea than the figures imply, which in turn is based on his misconceptions about focus and parallax.

If pilots know from experience that the RNG data are often wrong, that would be important for many reasons, but I'm not sure where it leaves us. We can still say that if the RNG data in the Gofast case are correct, then the object is not 'low and fast', but otherwise (if the data are not correct), it would only be possible to make a wide range of estimates, which would include the possibility that the object is even closer (and therefore smaller) than previously thought.
 
Even if the figure is wrong, why must the object be at/near sea level? Why not closer to the jet? Why does it smoothly change as the jet moves towards it, staying in line with an object at a steady altitude? A coincidence? Is it wrong but referenced to a base figure that makes it right if you know the adjustment? Saying the range is wrong just raises a lot more questions.

I'm really suspicious of all of this. When you throw two of Chris' observations together (range data being inaccurate, "physics" making it impossible for two objects miles apart to appear in focus) it really seems like an argument specifically designed to speed up the Go Fast object and eliminate more mundane possibilities.

If the depth of field demonstration wasn't so poor I'd be more charitable about accepting the range data being inaccurate.
 
But Lehto doesn't know how the system works, does he? By his own account he doesn't know where the RNG figures come from. He just has a gut feeling that they must be wrong, based on his conviction that the object is much closer to the sea than the figures imply, which in turn is based on his misconceptions about focus and parallax.

If pilots know from experience that the RNG data are often wrong, that would be important for many reasons, but I'm not sure where it leaves us. We can still say that if the RNG data in the Gofast case are correct, then the object is not 'low and fast', but otherwise (if the data are not correct), it would only be possible to make a wide range of estimates, which would include the possibility that the object is even closer (and therefore smaller) than previously thought.

Lehto said that it was the flir system performing trigonometry using internal software, basically making calculations about what it sees and making a best-guess.
 
Doesn't make a lot of sense, though, without the laser, as there's no reference to estimate size and therefore range.

How I understood him, the laser does fire and gives back a range value, BUT the laser is solely meant to measure distances against ground targets and delivers false values against air targets.

For whatever reason. Maybe it fires a shit ton of photons and can collect enough that bounced back from a large surface to estimate range, but receives too few returns from smaller airborne objects that move and most likely reflect and distort the beam too much.

(I'm talking out of my ass now, but given my limited understanding of particle physics this is what would make sense to me.)
 
Lehto said that it was the flir system performing trigonometry using internal software, basically making calculations about what it sees and making a best-guess.

And yet he uses rough calcs to come up with his distance and bam presto, it's much more accurate than the computer designed to calculate it.

I mean, I find this hard to believe.
 
Really what guys like Mick, Chris, and the other debunkers need to do is have like a weekend symposium, or a way to meet in person as a group. The group could have a moderator. But anyway, it would be an opportunity for these guys to get together, and share expert opinions, compare notes, do analysis, and then put forth refined theories. Just a thought.
 
Doesn't make a lot of sense, though, without the laser, as there's no reference to estimate size and therefore range.

How I understood him, the laser does fire and gives back a range value, BUT the laser is solely meant to measure distances against ground targets and delivers false values against air targets.

For whatever reason. Maybe it fires a shit ton of photons and can collect enough that bounced back from a large surface to estimate range, but receives too few returns from smaller airborne objects that move and most likely reflect and distort the beam too much.

(I'm talking out of my ass now, but given my limited understanding of particle physics this is what would make sense to me.)
I have to be honest here in saying that I don't understand what you don't understand. He clearly told us that the L symbol would be blinking if it had fired a laser and that in any other situation the camera is only observing photons hitting it and that's it. I don't see how there is any possible way to interpret what was said differently.

And yet he uses rough calcs to come up with his distance and bam presto, it's much more accurate than the computer designed to calculate it.

I mean, I find this hard to believe.

I suppose it's possible that the FLIR pod uses a different method of guesstimating a distance. Perhaps it's worth investigating that aspect to compare with what he has shown us. One difference I can think might be present is that the FLIR pod would be trying to get a distance moment by moment (only a single "point", where the plane is in the current moment), whereas Lehto is using multiple "points", looking back at the situation in hindsight.

It would seem to me that there is a clear 'separation of concerns' here in terms of the responsibility of the equipment. Something like the radar array would provide the accurate distance, while the FLIR's main job is to see the target through clouds and from long distances where eyesight is not reliable. It's entirely possible that whatever method of estimation the FLIR pod uses to get a distance without the laser/radar was lazily implemented, since that's not the main concern of the technology.
 
He's wrong about ATFLIR as far as I can tell

His assumptions seem to be

ATFLIR is in A/G mode - it isn't, based on the sim manuals; it's in A/A.
L is the laser indicator - it isn't; it's LTD/R on the ATFLIR.

I can see that the laser ranger might give an invalid ground range under certain circumstances (dust/fog etc.), but the problem is that this ATFLIR is in A/A mode, and the LASER is not used for ranging in that mode.
 
Really what guys like Mick, Chris, and the other debunkers need to do is have like a weekend symposium, or a way to meet in person as a group. The group could have a moderator. But anyway, it would be an opportunity for these guys to get together, and share expert opinions, compare notes, do analysis, and then put forth refined theories. Just a thought.

I think a Discord would be useful; the amount of sources/data is getting overwhelming, and discussions are probably needed.
 
I have to be honest here in saying that I don't understand what you don't understand. He clearly told us that the L symbol would be blinking if it had fired a laser and that in any other situation the camera is only observing photons hitting it and that's it. I don't see how there is any possible way to interpret what was said differently.



I suppose it's possible that the FLIR pod uses a different method of guesstimating a distance. Perhaps it's worth investigating that aspect to compare with what he has shown us. One difference I can think might be present is that the FLIR pod would be trying to get a distance moment by moment (only a single "point"), whereas Lehto is using multiple "points", looking back at the situation in hindsight. It would seem to me that there is a clear 'separation of concerns' here in terms of the responsibility of the equipment. Something like the radar array would provide the accurate distance, while the FLIR's main job is to see the target through clouds and from long distances where eyesight is not reliable. It's entirely possible that whatever method of estimation the FLIR pod uses to get a distance without the laser/radar was lazily implemented, since that's not the main concern of the technology.
I said it doesn't make a lot of sense how the FLIR would estimate distance to an object solely from visible light data, without a radar or laser assisting, because you can't do trigonometry in this case if you don't know what the size of the object is.

So because this doesn't make a lot of sense, because the value you get in return could really be anything, that's probably the reason why pilots ignore this range value when it comes to flying objects.

The other scenario is laser-guided estimation of distance, but apparently this doesn't work for A/A situations, only A/G.

The last possibility is that the FLIR receives additional data from the onboard radar or another source. And this is the question that got raised a couple of times: if this is true, and if so, how would we know whether the RNG value is provided by radar and trustworthy, or calculated by estimating nonsense and therefore not usable?
 
Excuse my ignorance, but wouldn't Radar measure the direct distance along the line of 'sight'? No trig needed for that. (Assuming we are talking about radar beams transmitted from the plane itself, not some other source.)

Would you want a radar system that detects an object 12 NM north of you and 5 NM below you to report a range of 13 NM or 12 NM? Radar would tell you 13 NM. I'm tempted to say that I'd like the range to go to 0 NM as I approach being right below the object, but it depends what I want to do with the data. For an example of such ranging not being useful, watch /Aliens/ (1986).
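To make the 12/5/13 example concrete (these are the hypothetical numbers from the previous paragraph, nothing measured from the video):

Code:
import math

horizontal_nm = 12.0  # hypothetical ground distance to the object
vertical_nm = 5.0     # hypothetical height difference

slant_range_nm = math.hypot(horizontal_nm, vertical_nm)
print(slant_range_nm)  # 13.0 -> the line-of-sight (slant) range a radar would report
print(horizontal_nm)   # 12.0 -> the ground range; deriving it needs the look-down angle too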

 
Lehto said that it was the flir system performing trigonometry using internal software, basically making calculations about what it sees and making a best-guess.
Can you say where and when he said that, preferably with a link? On Saturday at #12 above I copied and pasted his explanation, probably also from Saturday, in the comments thread under his video. It doesn't say anything about trigonometry or software. If he has changed his mind, or maybe acquired further information, it would be useful to know when.
 
I said it doesn't make a lot of sense how the FLIR would estimate distance to an object solely from visible light data, without a radar or laser assisting, because you can't do trigonometry in this case if you don't know what the size of the object is.

So because this doesn't make a lot of sense, because the value you get in return could really be anything, that's probably the reason why pilots ignore this range value when it comes to flying objects.

The other scenario is laser-guided estimation of distance, but apparently this doesn't work for A/A situations, only A/G.

The last possibility is that the FLIR receives additional data from the onboard radar or another source. And this is the question that got raised a couple of times: if this is true, and if so, how would we know whether the RNG value is provided by radar and trustworthy, or calculated by estimating nonsense and therefore not usable?

I mean it could be anything at this point, and maybe this starts breaking the rules in terms of speculation. For all we know there is an onboard database of known aircraft shapes and it does its best job fitting what it sees to a known aircraft shape, and then determines a range based on how big that shape appears on the screen and what it thinks that shape matches up to. But IMO it's not even worth speculating at this point; it seems moot when we both agree that pilots definitively don't trust that range indicator.
 
I mean it could be anything at this point, and maybe this starts breaking the rules in terms of speculation. For all we know there is an onboard database of known aircraft shapes and it does its best job fitting what it sees to a known aircraft shape, and then determines a range based on how big that shape appears on the screen and what it thinks that shape matches up to. But IMO it's not even worth speculating at this point; it seems moot when we both agree that pilots definitively don't trust that range indicator.
Have you got a reply to my question at #93 above? Take your time if you need to, but the usual custom here is that if you can't back up a claim you withdraw it.
 
I have had another look at the comments under Lehto's video. Unfortunately I cannot find the one I quoted at #12 above. There are over 600 comments so I may just have missed it. Or he may have deleted it. For the record, the quoted passage in #12 was copied and pasted from Lehto's comment without amendment. I now wish I had screencapped it at the time.

I was looking for any further comments on the disputed question of range (RNG), for example along the lines mentioned by abyssal dission in #84 above. Lehto repeatedly insists that the range information is unreliable, and in this case inaccurate, but he doesn't get much closer to explaining where it comes from. Perhaps the closest is this:
I believe I state it clearly at the end of the video that the range data displayed in the pod is erroneous. I think the object is a round ball 2-3 meters in diameter traveling in a straight line approx. 1,000' to 4,500' above the sea between around 0.8 mach and 0.93 mach. Because of the nature of the engagement, the range to this target is ambiguous. Pods are passive sensors and can only give us Line of Bearing to a target (very accurate). If we knew the size of the object we could mil size it. That would be another way to debunk West's claims. If it really is a weather balloon, which is by far the most plausible argument, based on the zoom range of the pod etc...we could reverse mil-size the object to determine range. I didn't have access to the TGP data though to get the mil sizing chart. if anyone knows what I am talking about it, please do this. It would be a great way to kill these range arguments
I didn't screencap that either, so let's hope it doesn't disappear!
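Incidentally, the "mil sizing" he mentions is just the small-angle relation between assumed size, angular size and range. A quick sketch with made-up numbers (the 2 m size and 1.5 mrad angular size below are purely illustrative, not measurements from the video):

Code:
# "Reverse mil-sizing": assume a physical size, measure the angular size, get range.
assumed_size_m = 2.0        # assumed object size (illustrative)
angular_size_mrad = 1.5     # angular size subtended on the display (illustrative)

range_m = assumed_size_m / (angular_size_mrad / 1000.0)
print(f"{range_m:.0f} m")   # ~1333 m for these made-up inputs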

While looking through the comments I noticed that a lot of people are disputing his claims about focus, but he isn't really replying.
 
But if the range indicator in a bloody fighter plane is giving false data, why is it there? Sounds very handy in crisis situations... "We are approaching our target, but uh, we have no idea how close we are, sorry".
 
While looking through the comments I noticed that a lot of people are disputing his claims about focus, but he isn't really replying.

The ATFLIR does have an opto-mechanical system in the design to focus. Thus it could indeed be that active autofocus is happening in the Go Fast video.

ir.png
 
It isn't that the ATFLIR can't focus, it's that all camera systems have a hyperfocal distance; focus there and everything from half that distance out to infinity is acceptably in focus.

For my 560mm lens setup, the hyperfocal distance is 2027 meters (2 kilometres).

That means something at 2 kilometres and something at 20 kilometres would both be in focus if we were "focused" on the 2 km object.

In the photo below the bird is only at ~6 metres, so the background can't be in focus, but at much longer distances this same rule doesn't apply.

1623086644002.png
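To put rough numbers on that, here's a sketch using the standard near/far depth-of-field approximations, taking the ~2027 m hyperfocal distance quoted above as given (the 6 m and 2000 m focus distances are just the examples from this post):

Code:
import math

H = 2027.0  # hyperfocal distance in metres, as quoted for the 560mm setup

def dof_limits(s, H):
    """Approximate near/far limits of acceptable focus when focused at s metres."""
    near = H * s / (H + s)
    far = math.inf if s >= H else H * s / (H - s)
    return near, far

for s in (6.0, 2000.0):
    near, far = dof_limits(s, H)
    print(f"focused at {s} m: sharp from {near:.2f} m to {far:.2f} m")
# focused at 6.0 m: sharp from 5.98 m to 6.02 m            (background blurred)
# focused at 2000.0 m: sharp from 1006.70 m to 150148.15 m (2 km and 20 km both sharp)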
 
In Chris' second debunk video, his plot doesn't coincide with the camera's behaviour, because he assumes the object is moving to the left (Gimbal) when it's actually stationary. So at each plotted point the angle from the jet's heading to the LOS of the object won't match. Given this, in his situation a gimbal rotation won't take place, because the craft will always be in the left degree range (to the left of the jet). Also, according to him the UFO is moving at constant velocity, but in the video it clearly appears to slow down because of LOS and parallax as the jet approaches. From the video we can see the jet has almost constant velocity.
 

Source: https://twitter.com/alpha_check/status/1401853489295380488?s=20


This is baffling to me. I was assuming that range was from the radar and probably accurate. Pilots disagree. Obviously, we should listen to them as they know how those systems really work (we really don't unfortunately).

Especially if they are F-18 Navy pilots. Other pilots such as Chris make small mistakes (e.g. the L for left mistaken for laser as ATFLIR is not on the planes he used to fly). In general however the principles are the same.


I'm dubious.

What is the point of adding data to the display that is not reliable? How often is it wrong? Is it only right/wrong under certain conditions? When it's wrong, is it only off by 5%? Or can it be wrong by 500%?

If it is often wrong, how can we know these sorts of errors aren't common with other sensors? If other sensors are sometimes wrong, perhaps that can explain lots of the "sensor data" for exotic UFOs that is rumored to be in the report?
 
If the object is stationary why do the pilots mention the radar returns (along with the object we can see) are moving against the wind? What type of stationary airborne object would produce the type of image we see in the video?
 
If the object is stationary why do the pilots mention the radar returns (along with the object we can see) are moving against the wind? What type of stationary airborne object would produce the type of image we see in the video?

What radar returns? Chris Lehto seems to think there weren't any (for the Gofast object).
 
It isn't that the ATFLIR can't focus, it's that all camera systems have a hyperfocal distance; focus there and everything from half that distance out to infinity is acceptably in focus.

For my 560mm lens setup, the hyperfocal distance is 2027 meters (2 kilometres).

That means something at 2 kilometres and something at 20 kilometres would both be in focus if we were "focused" on the 2 km object.

In the photo below the bird is only at ~6 metres, so the background can't be in focus, but at much longer distances this same rule doesn't apply.

Yes, indeed, I totally agree with this. It's just that the system at least has the ability to focus when needed. No dispute here.
 
If the object is stationary why do the pilots mention the radar returns (along with the object we can see) are moving against the wind? What type of stationary airborne object would produce the type of image we see in the video?
Mick's "gimbal need not move" video explains everything.
 
@Mick West

Shouldn't calculations be based on ground speed? Particularly when there were strong winds, as called out by the Pilot/WSO (120 knots to the west).
Actually, even ground speed may not work, particularly if the jet is being blown sideways by strong winds.

I'm really struggling to see how it's possible to calculate positions with any accuracy at all
 
@Mick West

Shouldn't calculations be based on ground speed? Particularly when there were strong winds, as called out by the Pilot/WSO (120 knots to the west).
Actually, even ground speed may not work, particularly if the jet is being blown sideways by strong winds.

I'm really struggling to see how it's possible to calculate positions with any accuracy at all

No, all the calculations are done in the frame of reference of the moving mass of air. You can kind of think of the air as being like a train, and the calculations are all about things moving around inside the train. It does not matter how fast the train is moving.
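A toy illustration of the train analogy (all the velocities below are made up): add the same wind vector to both the jet and the object, and the relative velocity, which is what the line-of-sight geometry depends on, is unchanged.

Code:
import numpy as np

wind = np.array([-120.0, 0.0])      # a 120 kt wind blowing toward the west (made-up axes)
jet_air = np.array([240.0, 60.0])   # jet velocity relative to the air mass (made up)
obj_air = np.array([30.0, -10.0])   # object velocity relative to the air mass (made up)

# Ground-frame velocities are just the air-frame ones plus the wind...
jet_gnd = jet_air + wind
obj_gnd = obj_air + wind

# ...so the jet-to-object relative velocity is identical in either frame.
print(obj_air - jet_air)  # [-210.  -70.]
print(obj_gnd - jet_gnd)  # [-210.  -70.]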
 
Yes, indeed, I totally agree with this. It's just that the system at least has the ability to focus when needed. No dispute here.

I'm not sure I'd demand that objects at infinity be at best focus, as they're not going to be of interest for a while, and even if they could be focussed perfectly, they'd just be a perfectly focused unresolvable single pixel. So one could fix focus on HFD/1.1, and have everything between HFD/2.1 and HFD/0.1 in focus. Focus is a two-sided bell curve; optimising only one side of it could be considered short-sighted.
 
I've done a response video focussing mostly on Gimbal, but I cover the focus issue of GoFast.

Source: https://www.youtube.com/watch?v=fBeqP4z3rXo

I think the accuracy of the RNG requires some more evidence.


Great video, Mick.

The only way I think it could be improved would be if you could explain the relationship between the hyperfocal distance, which is a property of the optics independent of the actual distance it's focussed at, and the depth of field when focussed at various fractions of the hyperfocal distance. You did touch on this with your large scale vs. small scale comment, but perhaps some real numeric examples, or even some graphs, would help explain the difference.

(Diversion for those not familiar with the concept of hyperfocal distance:
If you centre focus on HFD/R, then everything between HFD/(R+1) and HFD/(R-1) will be sufficiently in focus.
E.g. If you focus on HFD, then HFD/2 to infinity will be in focus. However, if you focus on HFD/5, then only HFD/6 to HFD/4 will be in focus.
I imagine the ATFLIR's HFD is a few kilometres, in which case "HFD/2 to infinity" could definitely cover the "5NM to 10NM" from Lehto's "It's obvious ... you can't focus on something 10 miles away and ... focus on something 5 miles away".
Let's take some numerical examples (I'll drop the units):
HFD=10. Focus on 5, in focus=(3.33, 10). Focus on 10, in focus=(5, inf). Focus on 6, in focus=(3.75, 15)
HFD=20. Focus on 5, in focus=(4, 6.67). Focus on 10, in focus=(6.67, 20). Focus on 6.67, in focus=(5, 10)
So if the HFD of the ATFLIR is anything less than 20NM, then Lehto's obvious assertion becomes obviously false. I will confess that I did snigger when he started his sentence with "It's obvious", it did make me expect a false claim.

I presume his video camera setup probably had a hyperfocal distance of about 5m, focussing on his hand or face would be perhaps .5m and 1m, so again let's run the numbers for that example too:
HFD=5. Focus on .5, in focus=(0.45, 0.56). Focus on 1.0, in focus=(0.83, 1.25). Focus on 0.67, in focus=(0.59, 0.77).
Therefore the hand and face will not be in simultaneous focus, it's one or the other (or neither!).
Conclusion - depth of field is much shallower for near things than for far things, but note that "near" and "far" are not absolutes, they're a property of the camera.
End Diversion.)
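For anyone who wants to check those figures, the HFD/R rule above is only a few lines of code (units arbitrary, as in the examples):

Code:
import math

def dof_from_hfd(hfd, s):
    """Near/far limits of acceptable focus at focus distance s, given hyperfocal distance hfd."""
    near = hfd * s / (hfd + s)                           # = HFD/(R+1) with R = hfd/s
    far = math.inf if s >= hfd else hfd * s / (hfd - s)  # = HFD/(R-1)
    return near, far

for hfd, s in [(10, 5), (10, 10), (10, 6), (20, 5), (20, 10), (20, 6.67), (5, 0.5), (5, 1.0), (5, 0.67)]:
    near, far = dof_from_hfd(hfd, s)
    print(f"HFD={hfd}, focus on {s}: in focus = ({near:.2f}, {far:.2f})")
# Reproduces the figures above, e.g. HFD=10, focus on 5 -> (3.33, 10.00)
# and HFD=5, focus on 0.5 -> (0.45, 0.56).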

Can we get the ATFLIR's HFD? It's comparatively lowish resolution, so the CoC can be crappy, and therefore its HFD should be lower than one would get just by extrapolating DSLR data.

Damn, I've forgotten the other thing I was going to say now, so I'll stop here.
 
I found this very interesting forum discussion at

https://forums.eagle.ru/topic/272552-atflir-laser-in-aa-mastermode/

"Laser is not used for ranging in AA. The range is either from FLIR passive ranging or the FLIR Autotrack target merely being correlated to a Radar target and thus fused into a single MSI trackfile, and the range data would be from the Radar. With no passive range or another range contributor then the trackfile for the FLIR target would be angle-only without range."

This is basically what I deduced from the manual.

"FLIR passive ranging" is mentioned in this Raytheon document:

https://www.yumpu.com/en/document/read/4020605/raytheon-brings-eo-technology-to-defend-our-nation

and I found a Raytheon-owned patent:

1623142259972.png

https://patents.google.com/patent/EP0820040A3/en

"The sensor (1) provides electrical signals representing the observed scene (3) and can be a visible light or infrared sensor. A computer (9) is used to identify the target from the data base, estimate the initial range to the target"

It is unclear whether the ATFLIR does any of this, though, and even if it does, whether it does it in this situation.

It's becoming clear that what would be really useful is the recording of the other MFD with the RADAR/SA on it.
 
Aha, I've remembered - this one would perhaps be a lot more work to expand on with any real depth, but I think it's important to note the limited precision and accuracy of *all* of the readings, and the inherent instability in the calculations.

53, 38, and 21 degrees *aren't* 53, 38, and 21 degrees; they are 52.5-53.5 degrees, 37.5-38.5 degrees and 20.5-21.5 degrees.
Half a degree one way on one line and half a degree the other way on a different line, and the extrapolated intersection point can easily go from on the page to infinity and beyond! Again, you do mention how easy it is to make the intersections grow without bound by changing the rate of turn near a critical point, but an essential property of this asymptotic behaviour is that very small changes in almost any of the inputs, such as the on-screen IR bearings, can have enormous effects on the output. The lines on your diagrams are in fact triangles, and their intersection "points" potentially infinite quadrilaterals. In Lehto's regime (assuming it's a tight turn and a close target) it's less important, but in your more believable (looser turn, far away) scenario, it becomes more significant.
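To show how violent that sensitivity can be, here's a small triangulation sketch. The two camera positions, the target position and the half-degree errors are all made-up numbers purely to illustrate the effect:

Code:
import numpy as np

def bearing(p, target):
    """Bearing angle (radians) from point p to target."""
    d = target - p
    return np.arctan2(d[1], d[0])

def intersect(p1, a1, p2, a2):
    """Intersection of two sight lines given their origins and bearing angles."""
    d1 = np.array([np.cos(a1), np.sin(a1)])
    d2 = np.array([np.cos(a2), np.sin(a2)])
    t1, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t1 * d1

p1, p2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])  # two camera positions 1 NM apart (made up)
target = np.array([5.0, 8.0])                        # hypothetical target position
a1, a2 = bearing(p1, target), bearing(p2, target)

half_deg = np.radians(0.5)
for e1, e2 in [(0.0, 0.0), (+half_deg, -half_deg), (-half_deg, +half_deg)]:
    est = intersect(p1, a1 + e1, p2, a2 + e2)
    print(f"estimated range from p1: {np.hypot(*est):.1f} NM")
# ~9.4 NM with the exact bearings, but ~11.5 NM and ~8.0 NM with just
# half a degree of error on each sight line.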

Of course, by making some simplifying assumptions and frame counting, one can come up with interpolated estimates to regain some accuracy, and reduce the extent of the areas involved.
 
Here's a diagram of the 10-mile, 5-mile situation to show how you can calculate the amount of blurring you'd expect if the ocean is out of focus. The diagram assumes the object is in focus. It's based on this Wikipedia section which you can use to check my work.
CoC.png
Each out-of-focus point (blue) on the ocean creates a circular image in the focal plane. When the blue point is twice as far as the focal plane (10 miles vs 5 miles), the circle's diameter is half that of the camera's aperture (C = A/2). You can confirm that with the equation from Wikipedia:
1623184328821.png
Because the ATFLIR pod is 0.33 meters wide, the camera aperture has to be less than 0.33 m. So the circular image of the blue point has a size in the focal plane of at most 0.33 m / 2 = 0.165 m.

If you know the size of the object or the angle of view of the camera then you can calculate how many pixels a point on the ocean gets blurred over. I recall seeing a size of 1 m for the target somewhere. Given that it's about 9 pixels across, that would mean the ocean would be blurred by at most 0.165 * 9 ~= 1.5 pixels. I think it looks blurrier than that, probably due to a combination of other factors.
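Here's the same arithmetic spelled out (the 1 m object size and 9-pixel width are the rough recollections above, not measured values):

Code:
# Blur of the far background when the camera is focused on the nearer object,
# using the geometry described above.
aperture_m = 0.33        # upper bound on aperture diameter, set by the pod width
s_object = 5.0           # focus distance (miles, but the units cancel)
s_background = 10.0      # background distance (miles)

# Blur-disc diameter projected into the plane of the focused object:
blur_m = aperture_m * (s_background - s_object) / s_background
print(blur_m)                     # 0.165 m, i.e. aperture/2 when the background is twice as far

# Convert to pixels, assuming a ~1 m object spanning ~9 pixels:
pixels_per_metre = 9 / 1.0
print(blur_m * pixels_per_metre)  # ~1.5 pixels of blur, at most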
 
Here's a diagram of the 10-mile, 5-mile situation to show how you can calculate the amount of blurring you'd expect if the ocean is out of focus. The diagram assumes the object is in focus. It's based on this Wikipedia section which you can use to check my work.
CoC.png
Each out-of-focus point (blue) on the ocean creates a circular image in the focal plane. When the blue point is twice as far as the focal plane (10 miles vs 5 miles), the circle's diameter is half that of the camera's aperture (C = A/2). You can confirm that with the equation from Wikipedia:
1623184328821.png
Because the ATFLIR pod is 0.33 meters wide, the camera aperture has to be less than 0.33 m. So the circular image of the blue point has a size in the focal plane of at most 0.33 m / 2 = 0.165 m.

If you know the size of the object or the angle of view of the camera then you can calculate how many pixels a point on the ocean gets blurred over. I recall seeing a size of 1 m for the target somewhere. Given that it's about 9 pixels across, that would mean the ocean would be blurred by at most 0.165 * 9 ~= 1.5 pixels. I think it looks blurrier than that, probably due to a combination of other factors.
The ocean is a distributed source and I bet it wouldn’t look significantly different even if it were a little out of focus, which it probably isn’t.
 
ATFLIR has a manual focus adjustment; you can see this in the FLIR1 footage and in the technical manual:
2021-06-08_17-55-31.jpg
2021-06-08_17-46-24.jpg2021-06-08_17-47-05.jpg

Described as:

FOCS increase/ decrease pushbutton switches

Pushbutton switches. Displayed by pressing FOCS pushbutton switch. Adjusts focusing of the infrared video on the FLIR control display. Pressing and releasing changes focus in small amounts. Pressing and holding changes focus very rapidly. Focus is adjusted from 0 (least sensitivity) to 8 (highest sensitivity) for close targets with an adjustment of 9 for far targets. Initially focus is set to 8.
Content from External Source
It's not clear what "sensitivity" here means, or what they consider a "close target" or "far target".

In FLIR1, the FOCS is set to 8. It is not displayed in GIMBAL or GOFAST.
 