Quote:
20mm, 2mm thick, for 20 bucks. Not bad! Be careful with them, is my advice. If you drop em, they crack.

It's unpolished, but I've got a polisher, and it will be interesting to see how well it works without polishing. I was quite surprised what a good image you get reflected from a matt metal surface, so maybe a window might work somewhat even without polishing.
Quote:
The images demonstrating glare from jet engines do show the typical spikes radiating out, which indeed do rotate with the optics, but the center of the blob does not since it is a thermal image of something physical (a hot exhaust cloud), not glare.

Can you show an example of what you mean?
From the ATFLIR manual:
External Quote:
The digital non-uniformity correction (DNUC) corrects pixel-to-pixel non-uniformity and converts the analog signal to digital. The low-noise 14-bit DNUC receives the ATFLIR detector video output and applies level and gain corrections to each pixel. A fine level calibration takes less than one second.
...
Level is adjusted from 0 (least brightness) to 9 (highest brightness)
...
Gain is adjusted from 0 (least contrast) to 9 (highest contrast).
...
ALG pushbutton switch: ... When boxed, selects automatic level and gain controlled by the FLIR pod
The ATFLIR manual doesn't say which type of gain the pod uses, but as the gain is applied by the unit that converts the analog signal to digital, and as it is intended to increase the contrast, it is most likely real gain. There is still the question of whether the exposure time in ATFLIR is fixed, with only the "automatic level and gain" algorithm that the manual mentions, or whether there is some undocumented auto-exposure algorithm in addition to that. According to an article about the "Basics of Digital Camera Settings":
External Quote:
Gain can be before or after the analog-to-digital converter (ADC). However, it is important to note that gain after the ADC is not true gain, but rather digital gain. Digital gain uses a look up table to map the digital values to other values, losing some information in the process. ...
In general, gain should be used only after optimizing the exposure setting, and then only after exposure time is set to its maximum for a given frame rate.
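As an aside, the distinction the article draws can be sketched in a few lines of Python. This is only a toy model: the 14-bit figure comes from the DNUC quote above, but the uniform test signal, the gain of 4, and the ideal quantiser are made up for illustration and are not a model of the actual ATFLIR processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical analog detector signal, well below full scale (arbitrary units).
signal = rng.uniform(0.0, 0.25, size=100_000)

def adc(x, bits=14, full_scale=1.0):
    """Ideal ADC: clip and quantise to 2**bits levels (14-bit per the DNUC quote above)."""
    levels = 2 ** bits
    return np.clip(np.round(x / full_scale * (levels - 1)), 0, levels - 1)

gain = 4.0
analog_gain_codes = adc(signal * gain)                            # amplify before the ADC ("real" gain)
digital_gain_codes = np.clip(adc(signal) * gain, 0, 2 ** 14 - 1)  # multiply after the ADC (LUT-style digital gain)

print("distinct output codes, analog gain :", len(np.unique(analog_gain_codes)))
print("distinct output codes, digital gain:", len(np.unique(digital_gain_codes)))
```

Amplifying before the ADC spreads the signal over roughly four times as many output codes as multiplying the already-quantised values, which is the "losing some information" point in the quote.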
That said, it's still of interest, and related to the above video, where I'm demonstrating glare from the "hot parts" that you see when roughly tail-on. The paper breaks down sources of IR into four components, and this diagram roughly illustrates which components are important at various viewing angles.
External Quote:
Details of military IR technology, missiles, and countermeasure systems are necessarily classified, and the EW courses at NAWCWD Point Mugu are taught at that level. This report is unclassified for distribution to a larger audience than is able to attend the courses. This report being unclassified limits the material included to general principles and the figures provided to drawings and photographs that have been released to the public domain. The IR images used are of F-4 and F-14 aircraft that are no longer operational in the United States.
Quote:
A similar garage experiment but with a turbine engine and IR sensors. More relevant than a lamp torch blinding a regular camera.
https://www.mdpi.com/1424-8220/18/2/428
MWIR (as a function of aspect angle)

Now do it with correctly exposed, high-contrast clouds and sky as a background.
I think the Chilean plane-UFO does not show glare as you define it. I think we see the exhaust plume, or a region of hot air mixed by wake turbulence behind an airliner. See the paper below for an example of how large wake turbulence can be.
This shows up as a blob in IR, but I'd bet that if the camera was rotating, the blob would rotate (i.e., not a glare as you define it with your lamp torch experiments).
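For what it's worth, the "does the blob rotate with the camera?" test can be sketched with a toy image model. Nothing below is ATFLIR-specific; the blob, the spike pattern, and the roll angles are all made up for illustration. The point is only that a physical hot object changes orientation in the raw frame when the sensor rolls, while an additive glare pattern tied to the optics does not.

```python
import numpy as np
from scipy.ndimage import rotate

def elongated_blob(shape=(200, 200), sigma=(30, 8)):
    """A horizontal elongated Gaussian standing in for a physical hot object."""
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
    return np.exp(-((x - cx) ** 2 / (2 * sigma[0] ** 2) + (y - cy) ** 2 / (2 * sigma[1] ** 2)))

def glare_spikes(shape=(200, 200)):
    """A crude X-shaped spike pattern, fixed in the sensor/optics frame."""
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
    return np.exp(-np.abs((x - cx) - (y - cy)) / 3.0) + np.exp(-np.abs((x - cx) + (y - cy)) / 3.0)

def principal_axis_deg(img):
    """Orientation of an image's bright region, from its second moments."""
    y, x = np.indices(img.shape)
    m = img.sum()
    cx, cy = (x * img).sum() / m, (y * img).sum() / m
    mxx = ((x - cx) ** 2 * img).sum() / m
    myy = ((y - cy) ** 2 * img).sum() / m
    mxy = ((x - cx) * (y - cy) * img).sum() / m
    return 0.5 * np.degrees(np.arctan2(2 * mxy, mxx - myy))

scene = elongated_blob()   # physical hot object, fixed in the world
glare = glare_spikes()     # artifact generated inside the camera, fixed to the sensor

for roll in (0, 40):       # camera/optics roll, degrees
    seen = rotate(scene, -roll, reshape=False, order=1)  # world content rotates in the raw frame
    # a composite frame would be `seen + glare`; only the `seen` term depends on roll
    print(f"camera roll {roll:2d} deg: blob axis in frame = {principal_axis_deg(seen):6.1f} deg, "
          f"spike pattern unchanged")
```

In a display that is derotated to keep the scene level, the same geometry appears the other way round: the scene stays put and it is the optics-locked glare that rotates.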
Quote:
Gimbal is a completely different story.

How is it different? An F/A-18's engines fit inside one of the A340's engines, and the entire plane is well within a single engine's glare.
Quote:
You can contact these guys and ask them to do another study. I don't have an engine turbine and IR cameras in my garage. And it's your job to prove your theory, not others' to disprove it!

It was a rhetorical suggestion. The point is that here they have exposed for the exhaust. In Gimbal it's exposed for the narrow range of temperatures in very cold clouds, an ideal setting for a large glare - similar to the Chilean case, where even clouds and contrails are black.
Quote:
The figure I posted is for jet regime (close to the engines).

The gap between an engine and the start of a contrail is the gap between ~600°C and -40°C.
[Attachment 61572]
Quote:
But two F/A-18 engines will be a lot hotter than one A340 engine.

Generally, a high-bypass engine (A340) will be hotter than a conventional jet engine, but the F/A-18 exceeds that when it uses its afterburners.
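The ~600°C vs -40°C gap mentioned above is easy to put a rough number on with plain blackbody arithmetic. The sketch below is deliberately crude: it ignores emissivity, atmospheric transmission, plume chemistry and the real temperature distribution, and just integrates Planck's law over a nominal 3-5 µm MWIR band for the two round-number temperatures quoted above.

```python
import numpy as np

# Physical constants (SI)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def band_radiance(T, lam_lo=3e-6, lam_hi=5e-6, n=2000):
    """Blackbody radiance integrated over a wavelength band, in W / (m^2 sr)."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = (2 * h * c ** 2 / lam ** 5) / np.expm1(h * c / (lam * k * T))
    return np.sum(spectral) * (lam[1] - lam[0])   # simple rectangle-rule integral

hot = band_radiance(273.15 + 600.0)    # ~600 C: engine "hot parts" ballpark
cold = band_radiance(273.15 - 40.0)    # ~-40 C: contrail / cold-cloud ballpark
print(f"3-5 um blackbody radiance ratio, 600 C vs -40 C: about {hot / cold:,.0f} to 1")
```

Even as a blackbody-only estimate, the hot end out-radiates the cold end in the MWIR band by several orders of magnitude, which is why exposing for cold clouds leaves a hot exhaust hugely over-exposed.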
Quote:
Some new footage of fighter jets in infrared, 5-10 miles

Not super relevant to the 30NM hypothesis.
Quote:
good to take

Can you explain what you mean by this phrasing?
Some new footage of fighter jets in infrared, 5-10 miles, taking a turn.
Source: https://twitter.com/DaveFalch/status/1690128011125743616?s=20
Quote:
Can you explain what you mean by this phrasing?

It's good to have more footage of planes in IR, to discuss how planes look in IR.
Quote:
Super nice, looks like GIMBAL. Note what happens near 20s when they change the exposure: it directly affects the size of the glare, and even at this short distance it can obscure the aircraft completely.

That's when the camera refocuses. I don't think it looks like Gimbal at all, more like a jet exhaust seen from the rear and then the side (see the experiment posted above).
Quote:
That's when the camera refocuses. I don't think it looks like Gimbal at all, more like a jet exhaust seen from the rear and then the side (see the experiment posted above).

The refocus looks to be at 20 seconds, whereas there's a definite change in contrast at 18 seconds, where the sky looks darker and the size of the glare around the exhaust gets larger.
But I'm out of this discussion already, just thought people might be interested in footage/experiments of jet exhausts in IR.
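The "changing the exposure changes the size of the glare" observation is easy to reproduce with a toy frame. Everything below is invented for illustration (a Gaussian hot spot and an arbitrary level/gain mapping to an 8-bit display); it is not the ATFLIR algorithm, it just shows that the footprint of fully white pixels grows with gain even though the underlying source is unchanged.

```python
import numpy as np

# Toy "IR frame": a blurred hot spot on a cold background (arbitrary radiance units).
y, x = np.indices((201, 201))
r2 = (x - 100) ** 2 + (y - 100) ** 2
frame = 0.02 + np.exp(-r2 / (2 * 25.0 ** 2))   # hot spot, sigma = 25 px

def to_display(img, level, gain):
    """Crude level/gain mapping to an 8-bit display (NOT the real ATFLIR algorithm)."""
    out = (img - level) * gain * 255.0
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

for gain in (1.0, 2.0, 4.0):
    disp = to_display(frame, level=0.02, gain=gain)
    white = int(np.count_nonzero(disp == 255))
    print(f"gain {gain}: {white} fully white pixels")
```

The hot spot never changes; only the mapping to the display does, yet the apparent size of the saturated blob grows with each gain step.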
Quote:
One interesting thing that occurred to me when watching it is that the orientation of the jet might explain the asymmetry of the glare - especially earlier in the video. Specifically, the direction in which the engines are pointing has a larger glare, which might be from the exhaust plume.

Yes, but if that is true, the jet has to rotate with the glare later, doesn't it?
Quote:
Yes, but if that is true, the jet has to rotate with the glare later, doesn't it?

No, the glare just has to be a bit bigger in that direction. There are two components to the shape of a glare: ...
The HFOV we care about in this video is for the MWIR camera. From previous data we have collected, an IR focal length of 675 mm corresponds to 1.0849 degrees (for the active portion of the video). For the entire width of the screen (including the black bars to the left and right), it would be 1.0849 degrees * (1920/1280) = 1.62735 degrees HFOV. Because this is essentially a prime lens, the accuracy of the reported field of view is generally quite high and consistent across cameras.
The FOVs quoted by Wescam in their datasheet are for the active portion of the video. That is, the 675mm focal length (which corresponds to what Wescam calls "1.1 degrees") means the angular field of view across the active (inset) image, not including the black bars. The HFOV of the entire image (including the black bars with no video) would be 1.1 * (1920/1280) = 1.65 degrees.
The fields of view they report for their EOW (Electro-Optical Wide) sensor are for the full frame width.
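For anyone who wants to check the arithmetic, here is the pinhole-geometry version. The 675 mm focal length, the 1.0849-1.1 degree figures, and the 1280/1920 pixel widths come from the posts above; the 640-pixel, 20 µm-pitch detector is my assumption, used only to show where a number like 1.08 degrees can come from.

```python
import math

def hfov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view of an ideal (distortion-free) lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# ASSUMPTION for illustration only: a 640-px-wide MWIR detector with a 20 um pitch,
# i.e. 12.8 mm of active width. The Wescam datasheet quoted above gives the FOV, not the detector.
active_width_mm = 640 * 0.020
active_hfov = hfov_deg(active_width_mm, 675)      # 675 mm focal length, from the posts above
full_frame_hfov = active_hfov * (1920 / 1280)     # same linear scaling used in the posts
                                                  # (exact trig differs by <0.01% at these small angles)

print(f"active (1280 px) HFOV   ~ {active_hfov:.4f} deg   (posts quote 1.0849-1.1 deg)")
print(f"full frame (1920 px) HFOV ~ {full_frame_hfov:.4f} deg (posts quote 1.627-1.65 deg)")
```

The assumed detector reproduces the quoted active-area figure to within a few hundredths of a degree, and the full-frame numbers follow directly from the pixel-width ratio.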
Quote:
Have you found any literature about aviation/infrared mentioning "glare"? I looked over a few papers about exhaust plumes, etc.; it seems it's an unknown term in that field.

The literature usually talks about veiling glare and sensor blooming.
Article: Discrimination between electronic and optical blooming in an InSb focal-plane array under high-intensity excitation
The blooming of IR FPA's by high-intensity thermal or laser sources is a significant problem in military IR imaging systems. In order to understand how to mitigate such blooming, we have attempted to understand the nature of the blooming phenomena by performing experiments exposing a 640 × 480 InSb FPA to high-irradiance that significantly degraded image quality. These experiments showed that stray radiation produced in the optics is the dominant contributor to the blooming effects.
Quote:
I will concede that my understanding of the word blooming was incorrect and it is used for optical effects as well.

After this discussion, my understanding is that glare and bloom are similar effects, caused by dispersion of strong light sources. Bloom encompasses in-camera phenomena (sensor/film and optics), while glare indicates phenomena outside the camera (usually atmospheric).
Quote:
After this discussion, my understanding is that glare and bloom are similar effects, caused by dispersion of strong light sources. Bloom encompasses in-camera phenomena (sensor/film and optics), while glare indicates phenomena outside the camera (usually atmospheric).

Having worked with optical imaging and spectroscopic systems for many years, I have a lot of experience with all these effects. But, having been isolated to a specific scientific field, I am used to a particular set of terminology that isn't necessarily universally used.
Other in-camera artefacts include diffraction spikes, bokeh, and lens flares.
Quote:
I will concede that my understanding of the word blooming was incorrect and it is used for optical effects as well.

Were you thinking "bleed" not "bloom"? In the digital sensor domain, such bloom is literally charge bleeding out of one cell into neighbouring ones.
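That charge-bleed picture can be sketched with a toy one-dimensional sensor column. The full-well value, the 50/50 spill rule, and the iteration count are all arbitrary; real devices have anti-blooming drains and more complicated overflow paths, so this only shows the qualitative effect: one grossly over-exposed pixel turns into a saturated streak across its neighbours.

```python
import numpy as np

FULL_WELL = 10_000   # electrons per pixel (arbitrary toy value)

def bleed(charge, full_well=FULL_WELL, iterations=200):
    """Repeatedly spill any charge above full well into the two neighbouring pixels."""
    charge = charge.astype(float).copy()
    for _ in range(iterations):
        excess = np.clip(charge - full_well, 0, None)
        if not excess.any():
            break
        charge -= excess
        charge[:-1] += 0.5 * excess[1:]    # half of each pixel's excess goes to the left neighbour
        charge[1:] += 0.5 * excess[:-1]    # and half goes to the right neighbour
    return np.minimum(charge, full_well)   # readout clips anything still above full well

column = np.full(21, 100.0)    # dim background
column[10] = 100_000.0         # one hugely over-exposed pixel (the bright source)
print(np.round(bleed(column)).astype(int))
```

That is the "charge bleeding into neighbouring cells" behaviour, as opposed to the stray-light blooming in the optics that the InSb paper above found to be the dominant contributor.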
Quote:
Were you thinking "bleed" not "bloom"? In the digital sensor domain, such bloom is literally charge bleeding out of one cell into neighbouring ones.

It seems that "bloom" is a general term for an effect with multiple potential causes, including both optical ones (e.g., stray light) and sensor ones (e.g., charge bleed).
Much of the "bloom" in the optics domain I tend to consider just a soft focus effect, whether by engineering design or not.
So all in all, I think I was with you on the interpretation of "bloom".
As different people from different backgrounds start to overlap fields, terminology can certainly become weird. Just consider yourself lucky that tonight you won't utter the phrase "a half-litre plastic pint glass", which is absolutely a thing now in some parts of the world.