Ukrainian UAP Study: "Observation of Events"

Adding my two cents to the discussion: I believe this photo best illustrates what the paper is attempting to do: notice that the more distant a shadow is, the brighter and bluer it appears.
AA02E2B5-43B8-474A-9960-DD665C238DDD.jpeg
That's generally true with distance, but in absolute numbers it's greatly dependent on the precise atmospheric conditions at the time, to the point that I'd suspect even a highly calibrated instrument might give different readings from the top to the bottom of its vertical field of view: think of ground-hugging bands of moist air, for example. The concept, although nicely illustrated by the large distances in your mountain photo, doesn't seem designed to work with a small, close sighting of an insect. And of course comparing shadows requires conditions that produce shadows.

I don't have your expertise with cameras; I'm just suggesting some limiting conditions that might make such methodology problematic.
 
That's generally true with distance, but in absolute numbers it's greatly dependent on the precise atmospheric conditions at the time, to the point that I'd suspect even a highly calibrated instrument might give different readings from the top to the bottom of its vertical field of view: think of ground-hugging bands of moist air, for example.
Calibrating horizontally against a water tower and then looking at an object higher up practically ensures there'll be problems, because air density decreases with altitude.
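The altitude effect is easy to put in numbers. Here is a minimal sketch (my own illustration, not from the paper) using the standard isothermal barometric approximation, with an assumed ~8.5 km density scale height:

```python
import math

# Isothermal barometric approximation: air density, and hence Rayleigh
# extinction per unit path length, falls roughly exponentially with altitude.
SCALE_HEIGHT_M = 8500.0  # approximate density scale height (assumed value)

def relative_density(altitude_m: float) -> float:
    """Air density relative to sea level."""
    return math.exp(-altitude_m / SCALE_HEIGHT_M)

# A horizontal calibration at ground level overestimates the scattering
# along any elevated line of sight:
for h in (0, 1000, 2000, 5000):
    print(f"{h:5d} m: relative density {relative_density(h):.2f}")
```

At 2 km the air is only about 79% as dense as at the surface, so a ground-level horizontal calibration systematically misjudges any line of sight that climbs.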
 
I think it is important to note, without judging the group's expertise, that of all the objects astronomers photograph, they have the least experience with close-proximity ones. I think that is not trivial.
 
That's generally true with distance, but in absolute numbers it's greatly dependent upon the precise atmospheric conditions at the time
It's also highly dependent on the color of the object and how it is lit. The only chance you have of getting a distance calculation is if you know all the variables. Look at these bits of mountain: I've highlighted regions at similar distances that have very different tones.
2022-09-16_09-16-00.jpg
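That ambiguity can be made concrete. A minimal sketch using Koschmieder's law of contrast attenuation; the extinction coefficient and the two intrinsic contrasts below are invented purely for illustration:

```python
import math

BETA = 0.2  # assumed extinction coefficient, 1/km (clear-ish air)

def apparent_contrast(c0: float, d_km: float) -> float:
    """Koschmieder's law: apparent contrast against the horizon sky."""
    return c0 * math.exp(-BETA * d_km)

# A dark object (intrinsic contrast 0.9) at 10 km and a mid-tone object
# (intrinsic contrast 0.4) at roughly 5.9 km read identically at the camera:
d2 = 10.0 - math.log(0.9 / 0.4) / BETA  # distance that matches the reading
a = apparent_contrast(0.9, 10.0)
b = apparent_contrast(0.4, d2)
print(f"{a:.4f} at 10.0 km vs {b:.4f} at {d2:.1f} km")
```

Without an independent handle on the object's intrinsic tone and lighting, inverting apparent tone to distance is degenerate: many (tone, distance) pairs fit the same pixels.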
 
Mick, did you try emailing Mr. Reshetnyk?
His contact information is available here:

https://space.univ.kiev.ua/reshetnyk-volodymyr-mykolajovych/

His last 4 publications suggest that he should be able to answer our questions (Google translated):
  • Pokhvala SM, Reshetnyk VM, Zhilyaev BE Tests of commercial color CMOS cameras for astronomical applications // Advances in astronomy and space physics. - 2013. - Vol. 3, N 2. - P. 145-146.
  • Pokhvala SM, Zhilyaev BE, Reshetnyk VM, Shavlovskij VI Low-resolution spectroscopy of the chromospherically active stars 61 Cyg AB with small telescopes // Kinematics and physics of celestial bodies. - 2014. - Vol. 30, No. 6. - P. 25-27.
  • Simon A. O., Reshetnyk V. M. Determination of the photometric system based on observations of the open star clusters NGC 7243, NGC 7762 and IC 5146 // Bulletin of the Astronomical School. - 2014. - Vol. 10, No. 2. - P. 152-156.

No. If they respond, I will post about it here.
 
His last 4 publications suggest that he should be able to answer our questions (Google translated):
  • Pokhvala SM, Reshetnyk VM, Zhilyaev BE Tests of commercial color CMOS cameras for astronomical applications // Advances in astronomy and space physics. - 2013. - Vol. 3, N 2. - P. 145-146.
  • Pokhvala SM, Zhilyaev BE, Reshetnyk VM, Shavlovskij VI Low-resolution spectroscopy of the chromospherically active stars 61 Cyg AB with small telescopes // Kinematics and physics of celestial bodies. - 2014. - Vol. 30, No. 6. - P. 25-27.
  • Simon A. O., Reshetnyk V. M. Determination of the photometric system based on observations of the open star clusters NGC 7243, NGC 7762 and IC 5146 // Bulletin of the Astronomical School. - 2014. - Vol. 10, No. 2. - P. 152-156.
If his techniques were developed with astronomical applications in mind, it's no surprise that they're not meant to apply to an insect close to the camera.
 

I think it's impossible to ascertain genuine light and reflectivity from any photo. For example, my 'UFO' in this pic appears to be glowing under its own light. I could easily pass it off as having its own bright internal light, blazing away in the sky... and off in the distance a mile or so away. In fact the 'UFO' is a butterfly caught in the sunlight... and about 6 feet away. I cite this primarily as an example of how easily a mere insect can be turned into a UFO. Note also the flies in the image, which have exactly the same triangular appearance as those in the Ukraine study.


P1050755 - Copy.JPG
 
If his techniques were developed with astronomical applications in mind, it's no surprise that they're not meant to apply to an insect close to the camera.

His last 4 publications suggest that he should be able to answer our questions (Google translated):
  • Pokhvala SM, Reshetnyk VM, Zhilyaev BE Tests of commercial color CMOS cameras for astronomical applications // Advances in astronomy and space physics. - 2013. - Vol. 3, N 2. - P. 145-146.
  • Pokhvala SM, Zhilyaev BE, Reshetnyk VM, Shavlovskij VI Low-resolution spectroscopy of the chromospherically active stars 61 Cyg AB with small telescopes // Kinematics and physics of celestial bodies. - 2014. - Vol. 30, No. 6. - P. 25-27.
  • Simon A. O., Reshetnyk V. M. Determination of the photometric system based on observations of the open star clusters NGC 7243, NGC 7762 and IC 5146 // Bulletin of the Astronomical School. - 2014. - Vol. 10, No. 2. - P. 152-156.

This research involves spectroscopy, which uses equipment that disperses light before it is photographed.


640px-spectroscope_psf.svg7526027619667425123.png


Spectroscopy doesn't even rely on color film or video. It predates color film.

Diagrams1.jpg



It predates film.

Kirchhoffs_improved_spectroscope-572x500px.jpg



The research cited above seems to be about slitless spectroscopy, which involves inserting a disperser (in most cases a grism) into the optical path of light that would otherwise result in a regular image.

(And I suspect it involves developing techniques to use relatively cheap equipment, because of pitifully inadequate funding: for example, color CMOS cameras. They developed a technique to calibrate for the quirks of these cheap cameras. But using these cameras without the telescope, the grism, and everything else involved is nonsense.)

Grism
download (4).jpg


There must be something - prism, grism, whatever - in the optical path. It doesn't and can't involve an ordinary camera and the post-hoc analysis of an ordinary photo.

An analysis of an ordinary photo, or video frame, involves uncontrolled sources of variation: the quirks of the camera, undefined sources of light and shadow, unknown sources of color. How do you calibrate for light affected by Rayleigh scattering when you don't know the exact conditions across the entire sky? You can't.

How do you calibrate, on top of that, for light affected by things other than Rayleigh scattering? You can't.
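For what it's worth, the wavelength dependence of Rayleigh scattering itself is simple; it's the unknown conditions, not the physics, that make calibrating from a single photo hopeless. A quick back-of-envelope ratio, with nominal blue and red wavelengths chosen by me:

```python
# Rayleigh scattering intensity scales as 1/wavelength^4, which is why
# distant dark objects acquire a blue cast: blue light is scattered into
# the line of sight much more strongly than red.
blue_nm, red_nm = 450.0, 650.0  # nominal blue and red wavelengths (assumed)
ratio = (red_nm / blue_nm) ** 4
print(f"blue scatters ~{ratio:.1f}x more strongly than red")
```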




More Reading: https://jwst-docs.stsci.edu/methods-and-roadmaps/jwst-wide-field-slitless-spectroscopy
 
Nice. Spectrometry in astronomy and its instrumentation has been my specialism for many years. It needs accurate spectral calibration using known sources (spectral lines), but it is a very powerful measurement method.
 
https://www.mao.kiev.ua/index.php/en/

The Main Astronomical Observatory is distancing itself from the UAP-Study

“Because this paper and results have drawn a lot of attention and sometimes an inadequate reaction, we organized the Astrophysical Seminar of the MAO NASU on September 15, 2022, where one of the authors, Dr.Sci. B.E. Zhilyaev, reported about these observations, interpretation, and argumentation why the observed events are precisely UAPs. The results of the discussion were reported at the meeting of the Scientific Council of the MAO NASU. Members of the Scientific Council came to the following conclusion:
- Observations conducted by B.Yu. Zhilyaev and his colleagues are original. But the processing and interpretation of results were performed at an inappropriate scientific level and with significant errors in determining distances to the observed objects. Also, the dates of observations are absent in the paper; the authors do not indicate which events were observed simultaneously from two sites; the authors do not provide arguments that natural phenomena or artificial objects of earthly origin may be among the observed UAPs (meteors; objects carried by the wind over long distances; space debris, artifacts, etc.).
- Instead of a critical analysis of observations (possible errors, adequacy of models, accuracy in post-processing), the authors postulate unreasonable conclusions about the characteristics of the observed objects as UAPs.
- The Scientific Council of the MAO NAS of Ukraine requests the authors to update the version of the article on the website of the relevant archive with the mandatory replacement of the sentence "The Main Astronomical Observatory of NAS of Ukraine conducts an independent study of UAP ..." to the phrase "The authors of this paper conduct an independent study of UAP ...".
- The Scientific Council of the MAO of NASU believes that the information published in the aforementioned paper by B.E. Zhilyaev et al. was premature and did not meet the professional requirements for publication of the results of scientific research.“
 
He's taking credit (or responsibility) for this? I can think of several different scenarios to explain this... all of them sad.
 
There are two comments appended to that report.
The following is from the comment made by O.A. Veles, Ph.D. in Phys & Math, MAO NAS of Ukraine:

A partial translation from Ukrainian via Google Translate.

A critical review of the opus B.E. Zhilyaev et al. "Unidentified aerial phenomena Observations of events"

After listening to the seminar, I became convinced of the frank anti-scientific nature of these studies. Instead of a critical analysis of observations (taking into account errors, adequacy of models, accuracy in post-processing), the authors fit the data to absolutely non-physical results.

The author of the "research" openly refuses to lead a discussion, answer questions, etc. As colleagues noted, the author openly professes a "religious" method of cognition, a belief in UFOs or UAP, which is incompatible with the methods of modern science. He repeatedly weaves in the words or visions of some military personnel as arguments in the discussion, which have nothing to do with his observations or with the topic under discussion.

When faced with uncomfortable questions, or with explanations of why the interpretation of his observations is unrealistic within the framework of modern physics, the author refers to metaphysical or supernatural laws or extraterrestrial technologies. Skeptical views on the accuracy of the observations, and remarks about the uncertainty or outright fallacy of the models, are ignored or rejected by the author.

Accordingly, further scientific discussions are absolutely impossible, the place of such "research" is somewhere in a sect of "UFO witnesses" or a church, where "faith" and "miracles" are the main arguments for knowing the world.

Ouch.

Veles continues...
Nevertheless, I present my main comments and questions to the "article":

Photometry

1. The uncertainty of the shape of the object leads to significant errors in distance measurement even under otherwise ideal conditions. For example, an asymmetry of only 2 times leads to an error in determining the distance up to 3 times in the lower layers of the atmosphere and up to ten times at heights of more than 5-6 km, where the nonlinearity of the contrast-altitude dependence is the largest.

2. The uncertainty of the object's albedo leads to similar errors, even under otherwise ideal circumstances. The assumption of zero albedo is very far-fetched, if only because the authors themselves observe both bright and dark objects. And in nature it is difficult to find surfaces whose albedo is less than 0.05 or more than 0.95.

Accordingly, the uncertainty of albedo can introduce an error in determining the distance by tens of times in the direction of decreasing distance.

Colorimetry

1. The authors use color cameras with a 6 mm lens. The corresponding scales are: ASI 178MC (2.4 µm pixel): 80 arcsec/pixel ≈ 1.3 arcmin/pixel; ASI 294MC (4.63 µm pixel): ≈ 2.6 arcmin/pixel. Both values contradict the parameters given in the article (10 pixels = 3 arcmin) by approximately 4 and 8 times respectively.


2. At this scale, the distance between the blue and red pixels is comparable to the angular dimensions of the objects, which makes any colorimetry impossible.

For example, for the ASI 294MC camera, the distance between color pixels (6.5μm) becomes about 11 meters at a distance of 10 km. Accordingly, it simply does not make sense to measure the color of the declared objects with a size of 3-12 meters.

3. The article does not consider the problem of camera lens aberrations, but as can be seen from Fig. 21 they are quite significant. The best wide-angle lenses have chromatic aberrations at the level of 0.5-1 pixel, which can significantly affect the colors in neighboring pixels of the Bayer matrix and, accordingly, introduce huge errors into the colorimetry.

Methodology

1. The authors use the Adobe RGB format for processing, which is inconvenient and very inaccurate for data manipulation because it is non-linear and involves some simplification (it covers approximately 50% of the colors visible to the eye).

2. When converting RAW data from .SER to an RGB space, interpolation over neighboring pixels using reference points (white balance) is performed. This nullifies all subsequent colorimetry.

Errors and problems in the text:
Page 1 and 7
1. It is not clear how the authors measured a flash width of one-hundredth of a second (10 ms), if in Fig. 23 the pulses contain 3-4 points at 125 Hz. That is, the pulses last a minimum of 20-30 ms.
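Veles's plate-scale figures above are easy to sanity-check with the small-angle formula (scale ≈ 206265 · pixel pitch / focal length). A short script, using only the camera specs quoted in the review:

```python
ARCSEC_PER_RADIAN = 206265.0

def plate_scale_arcsec(pixel_um: float, focal_mm: float) -> float:
    """Angular size of one pixel for a given pixel pitch and focal length."""
    return ARCSEC_PER_RADIAN * (pixel_um * 1e-6) / (focal_mm * 1e-3)

asi178 = plate_scale_arcsec(2.4, 6.0)   # ~82 arcsec/px, i.e. ~1.4 arcmin/px
asi294 = plate_scale_arcsec(4.63, 6.0)  # ~159 arcsec/px, i.e. ~2.7 arcmin/px

paper_scale = 3.0 * 60.0 / 10.0  # the paper's "10 pixels = 3 arcmin" -> 18 arcsec/px
print(asi178 / paper_scale)  # ~4.6x discrepancy
print(asi294 / paper_scale)  # ~8.8x discrepancy

# The same small-angle arithmetic gives the review's ~11 m separation of
# same-color Bayer pixels projected to 10 km (6.5 um spacing, 6 mm lens):
bayer_m = (6.5e-6 / 6e-3) * 10_000.0
print(f"{bayer_m:.1f} m")
```

The computed scales match the review's figures, and both disagree with the paper's stated scale by roughly the factors of 4 and 8 Veles gives.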

I think we've reached the point of diminishing returns...
 

Attachments

  • TRANSLATED review-2 Chief AKIOC of the GAO of the National Academy of Sciences, Ph.D. Veles O.A.pdf
  • review-1 Ph.D. S. Kravchuk.pdf
  • TRANSLATED review-1 Ph.D. S. Kravchuk.pdf
  • review-2 Chief AKIOC of the GAO of the National Academy of Sciences, Ph.D. Veles O.A.pdf
The last paragraph in review 1 is pretty interesting, and I think it might be useful for analyzing other UFO photos if someone figures out how to take advantage of it.


"That is, the dependence is completely different from what the authors use. It can be seen that, in principle, it is possible to estimate the distance based on the data of observations at different frequencies to the object and this idea has the right to life, but the method of determining the distance implemented in the work is erroneous.

S. Kravchuk"
 
My main takeaway is one of respect towards the high scientific standards upheld at the Academy of Science of Ukraine (NASU) and its Main Astronomical Observatory (MAO). The succinct, sharp and informed analysis with which they debunked Zhilyaev's methodology and UFO conclusions puts the likes of UAPTF/AOIMSG to shame.

We may rightly criticise the former Soviet Union for many things, but they left a legacy of great science in certain disciplines.

Former UAPTF "Chief Scientist" Travis Taylor should go to Ukraine for a few lessons.
 
Avi Loeb has weighed in and determined they are artillery shells. As this forum discussed, the distance calculation is wrong.


Source: https://twitter.com/TOEwithCurt/status/1577650736686571524?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1577650736686571524%7Ctwgr%5E25379a849ca1e176e0cfe597a59409bc895bf21f%7Ctwcon%5Es1_&ref_url=https%3A%2F%2Fwww.reddit.com%2Fr%2FUFOs%2Fcomments%2Fxwb5tr%2Faaro_director_visited_dr_avi_loeb_last_night_and%2F


https://avi-loeb.medium.com/down-to...fied-aerial-phenomena-in-ukraine-6d8bb9f64f85

I concluded that the reported speeds and sizes of the "phantom" objects would have generated fireballs of detectable optical luminosity at their suggested distances, and so these objects could not have appeared dark. However, if the phantom objects are ten times closer than suggested, then their angular motion on the sky corresponds to a physical velocity that is ten times smaller, 1.5 kilometers per second, and their inferred transverse size would be 0.3-1.2 meters, both characteristic of artillery shells.
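The rescaling in Loeb's argument is pure proportionality: the camera only measures angles, so the inferred speed and size both scale linearly with the assumed distance. A trivial sketch plugging in his figures:

```python
def rescale(value_at_assumed: float, assumed_range: float, actual_range: float) -> float:
    """Angular measurements fix value/distance, so inferred values scale linearly."""
    return value_at_assumed * (actual_range / assumed_range)

# If the objects are 10x closer than the paper assumed:
speed_kms = rescale(15.0, 10.0, 1.0)  # 15 km/s -> 1.5 km/s
size_lo_m = rescale(3.0, 10.0, 1.0)   # 3 m  -> 0.3 m
size_hi_m = rescale(12.0, 10.0, 1.0)  # 12 m -> 1.2 m
print(speed_kms, size_lo_m, size_hi_m)
```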
 
Avi Loeb has weighed in and determined they are artillery shells.

I disagree with his determination, based on the image they show in the paper. I emailed Avi:


The biggest issue with the artillery shell hypothesis is the composite "phantom" image in the Ukrainian paper:

2022-08-26_11-41-35.jpg

Firstly, and most significantly, the spacing (at 50hz) indicates a varying velocity perpendicular to the line of sight. That's not something an artillery shell could do in 1/25th of a second, but it's entirely consistent with the curved flight path of an insect such as a fly.

Secondly, the object has slight variations in its already irregular shape and variations in brightness on the sides parallel to the flight path.

2022-10-05_08-14-46.jpg

Both features can be seen in this composite video frame of a known fly.

2022-08-26_11-43-24.jpg
He replied, saying it might be a rocket and that I must have forgotten gravity.

In a second email I wrote:

I assure you I did not forget about gravity. It's just not a factor for large fast objects over a very small time frame. The three frames represent two movement intervals of 1/50th of a second (0.02s)

If we say the object is of length L, then one movement is approximately 7.25L and the other is 10.5L.

So assuming constant acceleration, that acceleration is (10.5L - 7.25L)/t^2,
or 3.25L/0.0004

For a half-meter shell (or small rocket)

3.25*0.5/0.0004/9.81 = 414g

Unfeasibly large, and certainly not gravity.

For a half-cm insect, on the other hand, that's just about 4g, a quite reasonable value for a fly (renowned for their swat-dodging aerobatics)
Acceleration.jpg
[EDIT: fixed small issue with diagram labels]
 
Coincidentally, this was the exact same three-point calculation of acceleration I did for Sitrec a few days ago, so I had high confidence in the math.

Code:
    // Equations of motion say
    // v = u + at
    // so a = (v-u)/t
    //
    // s1 = ut + 0.5at^2
    // v = u + at
    // s2 = vt + 0.5at^2
    // s2 = ut + at^2 + 0.5at^2
    // (s2-s1) = ut + at^2 + 0.5at^2 - ut - 0.5at^2
    // s2-s1 = at^2

You don't really need to derive it from the equations of motion, it's just a nice double-check that it fits them.
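The same result in runnable form; plugging in the displacements from the email (7.25 and 10.5 object-lengths per 1/50 s frame interval) reproduces both acceleration figures:

```python
G = 9.81        # standard gravity, m/s^2
T = 1.0 / 50.0  # frame interval, seconds

def implied_accel_g(length_m: float, s1: float = 7.25, s2: float = 10.5) -> float:
    """Constant acceleration implied by two successive displacements,
    via s2 - s1 = a*t^2. s1 and s2 are in units of the object's length."""
    return (s2 - s1) * length_m / T**2 / G

print(f"0.5 m shell:  {implied_accel_g(0.5):.0f} g")   # ~414 g
print(f"5 mm insect: {implied_accel_g(0.005):.1f} g")  # ~4.1 g
```

The only free parameter is the object's physical length, which is exactly why the same pixels are absurd for a shell and mundane for a fly.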
 
I wanted to point out a rather BIG flaw in the paper. Well, one of them at least, but the most important one that I saw: "The stations are equipped with ASI 178 MC and ASI 294 Pro CCD cameras". Both of these cameras are CMOS, not CCD...
 
I think the curvy arrows going into columns are going to make this diagram confusing for folks. Correct me if I'm wrong, but I think what you are trying to illustrate is that the further away the object, the more space there is for light to be scattered toward your eye by particles/molecules in the column, and thus the more washed out the object will appear. If that's it, then I can see how that could be used to estimate how much atmosphere is between you and the object, but I think you'd have to know how much light the object is emitting or reflecting.
The authors have been using this technique for ranging asteroids, which I would accept. I don't see how this method could possibly be used for determining distances under 100 km, yet they are using it to determine distances of around 10 km...
 
If his techniques were developed with astronomical applications in mind, it's no surprise that they're not meant to apply to an insect close to the camera.
If I remember correctly, Boris Zhilyaev developed this technique for calculating the distance of asteroids, and has several very credible research papers on the topic. This paper is nothing near the quality of his prior work, to the point where I have to wonder if it is somehow a joke, or simply intended to bring more awareness to the Ukraine war. The guy isn't a quack at all, at least no more of a quack than any other astronomer/astrophysicist :) https://www.researchgate.net/profile/Boris-Zhilyaev
 
I can see how this could work if you have a large enough sample of object materials and frames from before and after the object appears. By object materials I mean they need to know how various materials influence the measured data (glass, tempered glass, reflective / refractive / absorbing (such as Vantablack paint) surfaces, etc.).

Then they can arguably perform statistical calculations, based on the mean scatter / light levels of the sample, to account for all the reflected light that will hit the sensor regardless of an object blocking a "light column" and estimate its distance with an alpha / p-value that allows you to come up with a reasonable estimation.

The issue that I see, however, is the outliers in this approach.

I tried to draw an example:

48661346-7DE3-416C-B7FC-5458C84D173A.jpeg

1) A very small object in front of the lens / sensor of the measuring device.

2) A very large object in front of the main light source.

I would guess the applied method would only work for objects between 1 and 2, while the closer the object in question gets to the limits (1 and 2, the grey corridor), the higher the chance of it exceeding an alpha of 0.05 and/or becoming indistinguishable from something extremely far away or extremely close.

Bugs, for example, would qualify IMO as approaching limit 1, while the moon (solar eclipse situation) would approach limit 2.

Without a large enough sample that includes the mentioned object variations, I just can't see how their functions will have any relevance. At the very least they have to say that this only applies to a hypothesized object with specified characteristics, and that they can't form a conclusion for anything else.
 