Ukrainian UAP Study: "Observation of Events"

Mick West

2022-08-26_11-43-24.jpg
Article:
The Main Astronomical Observatory of NAS of Ukraine conducts an independent study of UAP also. For UAP observations, we used two meteor stations installed in Kyiv and in the Vinarivka village in the south of the Kyiv region. Observations were performed with colour video cameras in the daytime sky. We have developed a special observation technique, for detecting and evaluating UAP characteristics. According to our data, there are two types of UAP, which we conventionally call: (1) Cosmics, and (2) Phantoms. We note that Cosmics are luminous objects, brighter than the background of the sky. We call these ships names of birds (swift, falcon, eagle). Phantoms are dark objects, with contrast from several to about 50 per cent. We present a broad range of UAPs. We see them everywhere. We observe a significant number of objects whose nature is not clear. Flights of single, group and squadrons of the ships were detected, moving at speeds from 3 to 15 degrees per second.


There are a variety of observations. Basically it seems like they pointed cameras at the sky and waited for things to fly across the field of view. They estimated the distances to be quite large, but I'm a bit suspicious, especially since the "phantom" objects look very much like flies.

2022-08-26_11-41-13.jpg
2022-08-26_11-41-35.jpg

What do they say about "phantoms"?
Article:
Phantom shows the colur [sic] characteristics inherent in an object with zero albedos. It is a completely black body that does not emit and absorbs all the radiation falling on it. We see an object because it shields radiation due to Rayleigh scattering. An object contrast makes it possible to estimate the distance using colorimetric methods. Phantoms are observed in the troposphere at distances up to 10 - 12 km. We estimate their size from 3 to 12 meters and speeds up to 15 km/s.

Phantoms are dark objects, with a contrast, according to our data, from 50% to several per cent.

Figures 7 and 8 show the image and color charts of the phantom object. The object is present in only one frame, which allows us to determine its speed of at least 52 degrees per second, taking into account the angular dimensions of the frame.

Fig. 8 shows the color characteristics inherent in an object with zero albedo. This means that the object is a completely black body that does not emit and absorbs all the radiation falling on it. We see an object only because it shields radiation in the atmosphere due to Rayleigh scattering. An object contrast of about 0.4 makes it possible to estimate the distance to the object as about 5 km. The estimate of the angular velocity given above makes it possible to estimate the linear velocity not less than 7.2 km/s.
2022-08-26_11-54-22.jpg


The claim that Fig 7 shows an object with zero albedo seems specious. The image is entirely consistent with a normal dark object like a fly, and there's no comparison with such an object to eliminate it.

The spacing of the object in fig 13 is also suspect. If it were a high fast object, the frames would be equally spaced. That they are not, suggests a curved trajectory, consistent with a fly, like this:
Mother Cow Fly Seq  at 38s.jpg
 

Attachments

  • 2208.11215.pdf
    1.3 MB · Views: 243
Did they identify/exclude any insects?
The only control they cite is:
In Figure 9 we can see a local feature (water tower). The color diagram of the tower in Fig. 12 gives a distance estimate of 0 ± 1 km. The actual distance is about 300 meters. Thus, colorimetric measurements confirm our estimates.
Content from External Source
2022-08-26_16-01-04.jpg
2022-08-26_16-01-25.jpg
I assume they are referring to the lower object in Fig 9.
2022-08-26_16-02-08.jpg
It's a bit odd, though, as that's clearly a false-color image, not an RGB image. The object is shown as blue, but the RGB values are essentially a noisy grey outside the object and black inside it. Their conclusion that "The color diagram of the tower in Fig. 12 gives a distance estimate of 0 ± 1 km." seems entirely unfounded based on this data. (Not to mention 0 ± 1 means -1 to +1, which is nonsensical)
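As an aside, the kind of contrast number the paper leans on is simple to compute from any image patch. A minimal sketch, using synthetic pixel values standing in for the actual frames (the brightness levels here are made up for illustration):

```python
import numpy as np

# Synthetic stand-ins for real frame data: a noisy grey sky patch
# and a darker patch of pixels where the object sits.
rng = np.random.default_rng(0)
sky = rng.normal(180, 5, size=(20, 20, 3))   # background RGB samples
obj = rng.normal(108, 5, size=(8, 8, 3))     # darker object pixels

# Weber-style contrast: how much darker the object is than the sky.
contrast = 1 - obj.mean() / sky.mean()
print(round(contrast, 1))  # 0.4, the value quoted for the Fig. 7/8 phantom
```

Note that this number says nothing about the object's distance or albedo by itself; it only compares pixel means.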

Maybe I'm missing something?
 
Equipment used:

ZWO ASI 178MC:

1661554301313.png
https://astronomy-imaging-camera.com/product/asi178mc-color

ZWO ASI 294MC-Pro:

1661554331432.png
1661554354252.png
1661554375601.png
https://astronomy-imaging-camera.com/product/asi294mc-pro-color

Computar lenses 6mm (they don't specify the exact model or how they are used, but it looks like the model below would attach to the ZWO ASI 294MC-Pro above):

1661554814362.png
1661554888440.png
1661554964841.png
https://www.ebay.co.uk/itm/134002114215
https://www.computar-global.com/

Software SharpCap 4.0 used for data recording in file format .SER:

https://www.sharpcap.co.uk/sharpcap/sharpcap-downloads/4-0-beta
 
Maybe I'm missing something?
So the R,G, and B values in their graphs all start at 1. So they essentially set a reference point, then plot the relative values of R,G,B against that.

2022-08-26_16-28-59.jpg

So it would seem that the graph is a series of point samples of color, probably from a line across the object, with each R,G,B component relative to the first sample.
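If that reading is correct, the graph is easy to reproduce. A sketch, assuming a single horizontal sample line through the object with each channel normalised to its first sample (the image here is synthetic):

```python
import numpy as np

def rgb_profile(image, row):
    """Sample one pixel row and express each R,G,B channel relative
    to its first sample, as the paper's figures appear to do."""
    line = image[row].astype(float)   # shape (width, 3)
    return line / line[0]             # each channel starts at 1.0

# Synthetic frame: flat sky with a dark object in the middle of row 5.
img = np.full((10, 100, 3), 200.0)
img[5, 40:60] *= 0.6                  # a 40% dip where the object sits

profile = rgb_profile(img, 5)
print(profile[0])    # [1. 1. 1.] -- the reference sample
print(profile[50])   # [0.6 0.6 0.6] -- inside the object
```

With real footage the reference sample is just whatever pixel happens to come first, which is why the curves all start at exactly 1.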

What they say about that graph (fig 8) is interesting:
Fig. 8 shows the color characteristics inherent in an object with zero albedo. This means that the object is a completely black body that does not emit and absorbs all the radiation falling on it. We see an object only because it shields radiation in the atmosphere due to Rayleigh scattering. An object contrast of about 0.4 makes it possible to estimate the distance to the object as about 5 km.
Content from External Source
But how??? They also say:

Fig. 11 shows an object contrast of about 0.3. It makes it possible to estimate the distance to the object as about 3.5 km
Fig. 14 shows an object contrast of about 0.55. This makes it possible to estimate the distance to the object at about 6.0 km
Content from External Source
And the water tower has a contrast of 0.0, and they estimate the distance as about 0.0 km (0 +/- 1)

All of which suggests they are using a very simple function to translate a single contrast number into a distance, although it doesn't look particularly sensible.

2022-08-26_16-40-21.jpg
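One candidate for such a simple function is standard atmospheric extinction, reading the contrast as C = 1 − e^(−br) and inverting to r = −ln(1 − C)/b. The scattering coefficient b ≈ 0.102/km below is my fit, not a value from the paper; it reproduces two of the three quoted contrast/distance pairs but overshoots the third:

```python
import math

# Candidate "simple function": distance from a single contrast number,
# assuming C = 1 - exp(-b*r). b is an assumed fit, not from the paper.
def distance_km(contrast, b=0.102):   # b in 1/km
    return -math.log(1 - contrast) / b

pairs = [(0.4, 5.0), (0.3, 3.5), (0.55, 6.0)]   # (contrast, paper's km)
for c, claimed in pairs:
    print(c, round(distance_km(c), 1), claimed)
# 0.4  -> 5.0 (matches), 0.3 -> 3.5 (matches), 0.55 -> 7.8 (paper says 6.0)
```

So even a charitable single-parameter reading doesn't fit all their numbers; whatever function they actually used remains unclear.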
 
What the heck is the Main Astronomical Observatory of NAS of Ukraine?
It has its own little site. But is it listed anywhere else?

Not on the most exhaustive list of research observatories I know of:
https://en.wikipedia.org/wiki/List_of_astronomical_observatories

By country: https://en.wikipedia.org/wiki/Category:Astronomical_observatories_in_Ukraine


Not here either:
https://www.go-astronomy.com/observatories-ukraine.php
List of publications:

https://www.researchgate.net/profile/Boris-Zhilyaev

https://www.researchgate.net/profile/Volodymyr-Reshetnyk

Petukhov, V. N.

Related previous publications:

DAYTIME OBSERVATIONS OF SPACE INTRUSION (abstract only)

B.E.Zhilyaev, V.N.Petukhov, V.N.Reshetnyk, A.P.Vid’machenko, I.R.Buriak Main Astronomical Observatory of National Academy of Sciences of Ukraine, 03143, Kyiv, 27 Zabolotnoho St, Ukraine, zhilyaevv@mao.kiev.

Daytime observations of space intrusions have their own specifics. The brightness of the daytime sky is 4-6 magnitudes per square arc second. This makes it possible to observe stars up to magnitude 3 during the day with small telescopes. Objects of observation can be bright meteors, objects of artificial origin and of unknown nature. Devices and technologies for observing intrusions in the daytime sky have been developed. The hardware allows for subsecond exposures and a dynamic range of up to 96 decibels. The software implements 2D and 3D filtering and pattern recognition. We demonstrate observations of traces of meteor intrusions in the daytime sky, as well as objects of unknown nature with unusual properties for which we find no rational explanation.

The physics of space intrusions. III. Colorimetry of meteors (abstract only).

This article describes our approach to quantifying the characteristics of meteors such as temperature, chemical composition, and others. The program includes new algorithms for estimating temperature, heat radiation emitted by a fireball, and spectra of meteors containing emission lines. We are using a new approach based on colourimetry. We analyze an image of Leonid meteor-6230 obtained by Mike Hankey in 2012. Analysis of the temporal features of the meteoroid trail is performed. The main fluctuations in the brightness and wobbling of the meteor trail are observed at a frequency of about 3 Hz. The brightness variations in the integrated light are about 3%. The amplitude of the wobbling is about 2%. For determining the meteor characteristics we use the "tuning technique" in combination with a simulation model of intrusion. The progenitor of the meteor was found as an object weighing 900 kg at a speed of 36.5 km/s. The meteoroid reached a critical value of the pressure at an altitude of about 29 km in a time of about 4.6 sec with a residual mass of about 20 kg, and a residual speed of about 28 km/s. At this moment, a meteoroid exploded and destroyed. We use the meteor multicolour light curves revealed from a DSLR image in the RGB colour standard. We switch from the RGB colour system to Johnson's RVB colour system introducing colour corrections. This allows one to determine the colour characteristics of the meteor radiation. We are using a new approach based on colourimetry. Colourimetry of BGR three-beam light curves allows the identification of the brightest spectral lines. Our approach based on colourimetry allows direct measurements of temperature in the meteor trail. We find a part of the trajectory where the meteoroid radiates as an absolutely black body. The R/G and B/G light curves ratio allow one to identify the wavelengths of the emission lines using the transmission curves of the RGB filters. 
At the end of the trajectory, the meteoroid radiates in the lines Ca II H, K 393, 397 nm, Fe I 382, 405 nm, Mg I 517 nm, Na I 589 nm, as well as atmospheric O I 779 nm.
 
What about the possibility of drones or aircraft since Ukraine is an active war zone right now? I don't see when this data was collected. Drone + miscalculation of distance and speed?
 
They say that the phantom object exhibits 0 albedo, reflecting and emitting no energy, no light. Their own graph seems to contradict that, assuming the sharp dip in the RGB values is the object, since the dip does not go to 0. I do not understand why they mention Rayleigh scattering -- which may just mean that I don't understand it. Can anybody explain?

(Respectfully, were I in Ukraine right now I might be focusing my attention on other things. This just seems very strange.)
 
What about the possibility of drones or aircraft since Ukraine is an active war zone right now? I don't see when this data was collected. Drone + miscalculation of distance and speed?
Their previous papers on colorimetry are from 2021. The first one that mentions unexplained phenomena is from August 2021 at the latest, so they have probably been observing unexplained phenomena since they started the space-intrusion (meteors and meteorites) observation campaign around the end of 2020 or the beginning of 2021.
 
They say that the phantom object exhibits 0 albedo, reflecting and emitting no energy, no light. Their own graph seems to contradict that, assuming the sharp dip in the RGB values is the object, since the dip does not go to 0. I do not understand why they mention Rayleigh scattering -- which may just mean that I don't understand it. Can anybody explain?

(Respectfully, were I in Ukraine right now I might be focusing my attention on other things. This just seems very strange.)
It's normal human behaviour to try to fall back to normality after the initial shock of a tragic event.

They mention Rayleigh scattering because that's the only light source. It's the sky scattering the light emitted by either the Moon, or the Sun, or background stars of known magnitude. If an object blocks a patch of the sky, then it produces a measurable contrast relative to the background. If the object is highly reflective, as the only source of light is scattering from a source of known magnitude, then, from the measured contrast and assumptions, it's possible to estimate the thickness of the airmass between the sensor and the object, theoretically giving away its distance to the sensor.
 
Their conclusion that "The color diagram of the tower in Fig. 12 gives a distance estimate of 0 ± 1 km." seems entirely unfounded based on this data. (Not to mention 0 ± 1 means -1 to +1, which is nonsensical)

Maybe I'm missing something?
That's because they need considerable airmass between the object and the sensor in order to calculate the distance, so it seems they can't accurately measure the distance to anything closer than about 3 km with those cheap astrophotography cameras. The water tower was too close to serve as a "calibration" target.

For the detections with only one camera, the limitation in accuracy for close objects opens up the possibility of insects, birds and bats being the culprit, if they can't demonstrate that they are undetectable within 1km. Also, being close to the ground opens up the possibility that targets could also be illuminated by light scattered from man-made sources, affecting the contrast and thus the estimated distance.

For detections with two cameras, their results look more interesting, as they can use parallax, and the distance between cameras is so big that possibly they can only look at objects at high altitude and in orbit.
 
The experiment seems relatively cheap to run. Why don't we set up cameras like these ourselves with identical methodology (as close as possible with the info given) and see how easy it is to get a faulty distance calculation from a known insect flying across the camera. Maybe we'll even catch some aliens while we're at it.
 
Regarding distance calculations in the paper - if objects are smeared across a frame due to fast motion, their spectra are obviously mixed with the spectra of background sky (motion blur), thus invalidating any measurements based on authors' "colorimetric methods". See Fig. 9 for an example.
 
Maybe their methodology is comparable to the one Hartmann used in the Condon report to study the McMinnville photographs, see http://project1947.com/shg/condon/case46.html
Screenshot_2022-08-21-13-17-33-379~2.jpeg
One would expect that the scattering coefficient b is related to albedo.
Also note how in equation (5) everything is normalized to the sky brightness, i.e., the contrast with respect to the sky brightness is taken.
For a black body with known temperature, B0 would also be known.

Maybe this gives a small piece of the puzzle. I'm not sure what they are doing either...
 
How are they making black body observations using RGB values from an optical CCD? You need a diffraction spectroscope right? I'm pretty sure anyone who understands EMS detection could critique this methodology.
 
How are they making black body observations using RGB values from an optical CCD? You need a diffraction spectroscope right? I'm pretty sure anyone who understands EMS detection could critique this methodology.
The CCD is essentially also separating colour bands, but just with greater bandwidth (RGB). A good and nicely resolving spectrometer, like a hyper-spectral imager is great to have but perhaps they cannot afford it.


My "theory" is that this group is out of funding, because of the war. So, why not set up a nice interesting thing that attracts people? I cannot prove this though.
 
In the formula of post #20 above, B0 is the object's brightness if you were right in front of it, without changing anything about the object's position. So the albedo, the light directly reflected, and any light radiated by the object itself are all packed into this one variable. (So forget my earlier remark in post #20 that the small b is somehow related to albedo; it is not. It purely represents the scattering coefficient of the atmosphere.)

If you move away from the object, the intensity B0 diminishes because of scattering by the atmosphere in between you and the object (the extinction term in the equation). At the same time, the air in between you and the object emits photons on its own in your direction due to atmospheric Rayleigh scattering - photons that appear to come from the object but in reality don't. This is the first term in the equation.

As a result, the observed brightness of the object will approach the brightness of the sky as you move further and further away, because the light from the object itself becomes increasingly scattered and atmospheric Rayleigh scattering from the air in between you and the object becomes increasingly dominant.
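The two terms described above can be written out directly. A sketch of the post #20 equation; the value of b and the sample distances are assumptions for illustration only:

```python
import math

def observed_brightness(r, B0, Bsky=1.0, b=0.1):
    """Brightness of an object at distance r (km), per the post #20 equation:
    airlight scattered toward you by the air in front of the object, plus
    the object's intrinsic brightness B0 attenuated by extinction.
    b (1/km) is an assumed scattering coefficient."""
    airlight = Bsky * (1 - math.exp(-b * r))   # air between you and object
    extincted = B0 * math.exp(-b * r)          # object's own light, dimmed
    return airlight + extincted

# A perfectly black object (B0 = 0) fades toward the sky brightness:
for r in (0, 5, 20, 100):
    print(r, round(observed_brightness(r, B0=0.0), 3))
# 0 -> 0.0 (fully blocks the sky), 5 -> 0.393, 20 -> 0.865, 100 -> 1.0
```

At r = 0 the object is a pure silhouette; far away it becomes indistinguishable from the sky, exactly the fade-out described above.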

It seems a bit like for the 'phantoms' they assume B0 = 0 (see post #20 above). In other words, the 'phantoms' are 'completely black' if you were right in front of them. Then the scattering coefficient b and the brightness relative to Bsky are sufficient to calculate distance.
Somehow, the RGB ratios give them a clue that the 'phantoms' can be assumed to be black. I just can't figure out how...
 
Satellite?
A 20 Hz signal indicates it's spinning at most 20 times a second (1,200 RPM). If it's a failed satellite with 4 solar panels extended, it would be spinning at least at 5 Hz (300 RPM) while giving out a 20 Hz signal. If what they detected was the reflection of the panels as the satellite rotated, that's too fast for normal operations. Thus it's either space junk from an anti-satellite missile test, a failed satellite or rocket, a faulty measurement or assumption, a new kind of space asset, or not man-made.
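Spelled out, the flash-rate arithmetic (the panel count is the post's assumption, not something established from the data):

```python
# A satellite with n flat reflective surfaces gives n specular flashes
# per rotation, so a 20 Hz flash signal implies a spin of 20/n Hz.
signal_hz = 20
panels = 4                      # assumed: 4 extended solar panels
spin_hz = signal_hz / panels
spin_rpm = spin_hz * 60
print(spin_hz, spin_rpm)        # 5.0 300.0
```

With a single reflective face (n = 1) the same signal would mean the full 20 Hz / 1,200 RPM upper bound.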
 
What the heck is the Main Astronomical Observatory of NAS of Ukraine?
It has its own little site. But is it listed anywhere else?

Not on the most exhaustive list of research observatories I know of:
https://en.wikipedia.org/wiki/List_of_astronomical_observatories

By country: https://en.wikipedia.org/wiki/Category:Astronomical_observatories_in_Ukraine


Not here either:
https://www.go-astronomy.com/observatories-ukraine.php
Wikipedia:
The National Academy of Sciences of Ukraine (NASU; Ukrainian: Національна академія наук України, Natsional’na akademiya nauk Ukrayiny, abbr: NAN Ukraine) is a self-governing state-funded organization in Ukraine that is the main center of development of science and technology by coordinating a system of research institutes in the country. It is the main research oriented organization along with the five other academies in Ukraine specialized in various scientific disciplines. NAS Ukraine consists of numerous departments, sections, research institutes, scientific centers and various other supporting scientific organizations.
Content from External Source
https://en.m.wikipedia.org/wiki/National_Academy_of_Sciences_of_Ukraine

NAS website:
https://www.nas.gov.ua/EN/Pages/default.aspx
 
So the R,G, and B values in their graphs all start at 1. So they essentially set a reference point, then plot the relative values of R,G,B against that.

2022-08-26_16-28-59.jpg

So it would seem that the graph is a series of point samples of color, probably from a line across the object, with each R,G,B component relative to the first sample.

What they say about that graph (fig 8) is interesting:
Fig. 8 shows the color characteristics inherent in an object with zero albedo. This means that the object is a completely black body that does not emit and absorbs all the radiation falling on it. We see an object only because it shields radiation in the atmosphere due to Rayleigh scattering. An object contrast of about 0.4 makes it possible to estimate the distance to the object as about 5 km.
Content from External Source
But how??? They also say:

Fig. 11 shows an object contrast of about 0.3. It makes it possible to estimate the distance to the object as about 3.5 km
Fig. 14 shows an object contrast of about 0.55. This makes it possible to estimate the distance to the object at about 6.0 km
Content from External Source
And the water tower has a contrast of 0.0, and they estimate the distance as about 0.0 km (0 +/- 1)

All of which suggests they are using a very simple function to translate a single contrast number into a distance, although it doesn't look particularly sensible.

2022-08-26_16-40-21.jpg
I am trying to work out how that graph means anything other than "it was darker than an arbitrary reference point".
 
This reminds me of something another UFOlogist was trying to do in relation to the Phoenix Lights. He was trying to do a "spectral analysis" of ordinary photos and videos. What he presented were actually crude histograms of the brightness across a pixel-high strip of the photos.

Colorimetry cannot be used in the way these researchers seem to be attempting.

Colorimetry is "the science and technology used to quantify and describe physically the human color perception."

https://web.archive.org/web/2009051...ilities/photo/Publications/OhnoNIP16-2000.pdf
The perception of color is a psychophysical phenomenon, and the measurement of color must be defined in such a way that the results correlate accurately with what the visual sensation of color is to a normal human observer.
Colorimetry, in my experience, is used in photography and printing to make things look right. It has no meaning in relation to analyzing the nature of objects recorded on photos or videos.

It is in no way related to a spectral analysis.

A spectral analysis must use data observed and recorded with special equipment. It cannot be done in a post hoc fashion with ordinary photos or videos.

In general, this group (?) seems to be trying to do a post-hoc analysis of videos that have not been collected with specialized equipment. This has been described as like trying to do a DNA analysis of Abraham Lincoln using a portrait photo. The data isn't there.
 
I am trying to work out how that graph means anything else other than "it was darker than an arbitrary reference point"
Yeah.. maybe their reasoning is something like: if the object itself does not emit any radiation, you will only see the contribution of the atmosphere behind the object subtracted from the reference. This means a bigger drop in blue and a lesser drop in red, since the atmosphere has a bluish tint due to Rayleigh scattering. Something along those lines..
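That blue-vs-red asymmetry follows from the λ⁻⁴ wavelength dependence of Rayleigh scattering. A quick sketch; the RGB channel-centre wavelengths are rough assumptions:

```python
# Rayleigh scattering strength goes as 1/wavelength^4, so the sky's
# blue channel carries much more scattered light than the red one.
wavelengths_nm = {"R": 650, "G": 550, "B": 450}   # rough channel centres
strength = {ch: (550 / wl) ** 4 for ch, wl in wavelengths_nm.items()}
print({ch: round(s, 2) for ch, s in strength.items()})
# {'R': 0.51, 'G': 1.0, 'B': 2.23}
# i.e. blue is scattered ~(650/450)^4 ≈ 4.4x more strongly than red.
```

So if an object only blocks skylight, the blue channel should indeed drop more than the red, which may be what they are keying on.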
 
It's normal human behaviour to try to fall back to normality after the initial shock of a tragic event.

They mention Rayleigh scattering because that's the only light source. It's the sky scattering the light emitted by either the Moon, or the Sun, or background stars of known magnitude. If an object blocks a patch of the sky, then it produces a measurable contrast relative to the background. If the object is highly reflective, as the only source of light is scattering from a source of known magnitude, then, from the measured contrast and assumptions, it's possible to estimate the thickness of the airmass between the sensor and the object, theoretically giving away its distance to the sensor.
My problems with that: do we know the object is not lit, either by its own light, or by the Sun, or by reflected Earth-light? Is the object highly reflective, or "0 albedo", or somewhere in between, and how do they know that?
 
Yeah.. maybe their reasoning is something like: if the object itself does not emit any radiation, you will only see the contribution of the atmosphere behind the object subtracted from the reference. This means a bigger drop in blue and a lesser drop in red, since the atmosphere has a bluish tint due to Rayleigh scattering. Something along those lines..
It reads like pseudoscience to me, but I am not an SME here. They need to:

  • Release all the footage/data and have the article's conclusions rigorously peer reviewed.
  • Write a description in more layman's terms; science communication is important if you want your extraordinary conclusions to be accepted.
 
Yeah.. maybe their reasoning is something like: if the object itself does not emit any radiation, you will only see the contribution of the atmosphere behind the object subtracted from the reference. This means a bigger drop in blue and a lesser drop in red, since the atmosphere has a bluish tint due to Rayleigh scattering. Something along those lines..
I've made a diagram to illustrate the basic underlying Physics of their method for the "Phantom" object (camera looking straight up):

1661615411222.png
 
This looks like a comic book to me. An illustration based on nothing.


Scattering of light doesn't work as shown.

Luminous column of air? What's that?

Blocked light? Blocked by what?
 
I've added an RGB profiler to Sitrec.
https://www.metabunk.org/sitrec/?sitch=rgb

2022-08-27_09-04-41.jpg

The default image is from a video of a swarm of bees I took. You can add your own image by dragging it over the region on the left.

Click and drag to select a region; the default behavior is to sum columns left to right. The first column is used as a reference for each of R, G, B, so on the graph they all start out at 1 (the graph's y-axis runs 0..2).

In the options, you can select centerLine to just sample a line through the center of the box (relative to the first pixel on that line). The other options are not really that relevant to the RGB profiler, so leave them alone.
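In pseudocode terms, the sum-columns mode is just this (a sketch, with a synthetic region standing in for a real frame):

```python
import numpy as np

def column_profile(region):
    """Sum each pixel column of an RGB region, then normalise every
    channel to the first column, as the profiler described above does."""
    sums = region.astype(float).sum(axis=0)   # shape (width, 3)
    return sums / sums[0]                     # each channel starts at 1.0

# Synthetic region: uniform sky with a dark streak in the middle columns.
region = np.full((30, 80, 3), 150.0)
region[:, 30:50] *= 0.5
profile = column_profile(region)
print(profile[0], profile[40])   # [1. 1. 1.] [0.5 0.5 0.5]
```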

This is, I think, essentially what they are doing in this paper. Here's a few bees
2022-08-27_09-14-27.jpg

2022-08-27_09-15-13.jpg
2022-08-27_09-17-02.jpg

Considerable variation in objects just a few meters from the camera. The last one has a contrast of about 0.6, which they seem to suggest would make it 6km away.

Of course this is just an image from a video. But I suspect any image containing insects would have similar variability for reasons of focus, motion blur, and the color and lighting of the insect.

Absent more details, I think their distance estimating methodology is not useful.
 
This looks like a comic book to me. An illustration based on nothing.


Scattering of light doesn't work as shown.

Luminous column of air? What's that?

Blocked light? Blocked by what?
Blocked by the object. The columns of air between the object and the sensor, and behind the object, each have their own brightness and colour, which vary with the depth of the airmass.

From their paper:

"The colors of the object and the background of the sky make it possible to determine the distance using colorimetric methods. The necessary conditions are (1) Rayleigh scattering as the main source of atmospheric radiation; (2) and the estimated value of the object’s albedo. The object partially shields the diffuse sky background and thus becomes visible. (...)"
 
I can't probe your mind to see what you see in the diagram. Explain what you see first, so it can be clarified. Also, read the paper before commenting, as you clearly missed the line I highlighted in their ranging method.
 
I can't probe your mind to see what you see in the diagram. Explain what you see first, so it can be clarified. Also, read the paper before commenting, as you clearly missed the line I highlighted in their ranging method.
It's not complicated. Things look more like the sky as they get further away. So if you know all the parameters involved you can roughly calculate the distance.

But they don't.
 
Light spreads, it doesn't self-organize into columns.

Light scatters, it doesn't self-organize into columns.

The concept of a "luminous column of air" and a "dark column" is a naïve and unique(?) conjecture.

The authors try to invent a new definition of Colorimetry as far as I can tell: "We use colourimetry methods to determine of distance [sic] to objects and evaluate their colour characteristics."

The only thing measured with their technique is the video frames themselves, not the objects depicted in the frames.

Once again like trying to do a DNA analysis of Abraham Lincoln by taking scrapings from this photo. The data isn't there.
Abraham_Lincoln_O-77_matte_collodion_print.jpg
 
https://www.mao.kiev.ua/index.php/en/home
Observatory was headed by Acad. of USSR Academy of Sciences O.Ya.Orlov (1944 - 1948 and 1951 - 1952); Corresponding Member of USSR Academy of Sciences V.P. Tsesevych (1948 - 1951), Corresponding Member of USSR Academy of Sciences A.A. Yakovkin (1952 - 1959), Academician of UkSSR Academy of Sciences E.P. Fedorov (1959 - 1973), Ph.D. I.K. Koval (1973 - 1975). Since 1975 the Observatory of NAS of Ukraine has been headed by Academician of NAS of Ukraine Yaroslav Stepanovich Yatskiv.
http://ukr-vo.org/personalities/index.php?b&6&lit=Y&idp=658

Yaroslav Stepanovich Yatskiv is currently 81 years old, the last dated activity in his biography is: During 2010-2011 he was in charge of the laboratory "Ukrainian Center for determining the parameters of the Earth's rotation."

This observatory seems to be SW of Kyiv "... 12 km from the center of Kyiv Holosiiv forest."


Absolute conjecture on my part: This small legacy facility is all but abandoned and someone.... a naïve, but well meaning soul ... a caretaker?... a student?... is responsible for this "research"?
 
I do understand the concept. There is a part of the atmosphere from which photons that happen to be scattered in your direction cannot reach you because these photons are blocked by the object. This means less atmospheric photons will be able to reach you, which makes that part of the sky a little darker. This is how a distant black object that does not reflect any radiation becomes visible - as a part of the sky which is darker than the surrounding sky.

The term Bsky(1 - e^(-br)) in post #20 represents this effect. The term Bsky/e^(br) that is subtracted from Bsky represents the scattered light that is not able to reach you because it is blocked by the object. The further away the object is, the smaller this term will be. If the object is at distance 0, the term equals Bsky, so no atmospherically scattered photon will reach you because the object is blocking them all.

The only question I have is how do they conclude the object is a 'black body'? My guess is that the ratios of attenuation between red and blue play a role in this. The insects shown by Mick all show a larger drop in red than in blue. The 'phantoms' in the article show the reverse: a larger drop in blue than in red. My gut feeling is that this is one of the reasons why they conclude it is a 'black body' object (at least by approximation).
 