Ukrainian UAP Study: "Observation of Events"

JMartJr

Senior Member
I've made a diagram to illustrate the basic underlying physics of their method for the "Phantom" object (camera looking straight up):

1661615411222.png
I think the curvy arrows going into columns are going to make this diagram confusing for folks. Correct me if I am wrong, but I think what you are trying to illustrate is that the further away the object, the more space there is for light to be scattered toward your eye by particles/molecules in the column, and thus the more washed out the object will appear. If that's it, then I can see how that could be used to estimate how much atmosphere is between you and the object, but I think you'd have to know how much light the object is emitting or reflecting.
 

Itsme

Active Member
I think the curvy arrows going into columns are going to make this diagram confusing for folks. Correct me if I am wrong, but I think what you are trying to illustrate is that the further away the object, the more space there is for light to be scattered toward your eye by particles/molecules in the column, and thus the more washed out the object will appear. If that's it, then I can see how that could be used to estimate how much atmosphere is between you and the object, but I think you'd have to know how much light the object is emitting or reflecting.
Screenshot_2022-08-21-13-17-33-379~2.jpeg
There are two effects at work here.

What you describe is represented by the term B0·e^(-br), the second term in equation (4). For a completely black object that does not emit or reflect any light, this term will be 0 (if we neglect black-body radiation caused by temperature), i.e. B0 is 0.

The first term in this equation describes what is illustrated in john.phil's picture. That term is NOT about the light emitted or reflected by the object itself; it describes the brightness caused by normal background atmospheric scattering - the thing that makes the sky look bright. The presence of the object prevents photons that happen to be scattered from the atmosphere in your direction from reaching you, because they are blocked by the object (see post #40 for more info). These photons originate from a cone-shaped (not column-shaped) part of the atmosphere that is behind the object.
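The two terms can be sketched numerically. Below is a minimal illustration of the standard aerial-perspective form that equation (4) appears to follow; the extinction coefficient b = 0.02/km and the unit sky brightness are arbitrary illustrative values, not the paper's:

```python
import math

def observed_brightness(r_km, b_per_km, B0, B_sky=1.0):
    """Aerial-perspective form matching eq. (4): airlight scattered
    into the line of sight plus the attenuated light of the object."""
    t = math.exp(-b_per_km * r_km)      # atmospheric transmittance
    return B_sky * (1.0 - t) + B0 * t

# For a perfectly black object (B0 = 0) the observed brightness
# climbs toward the sky value as more atmosphere sits in front of it.
for r in (0.5, 2.0, 10.0, 50.0):
    print(f"r = {r:5.1f} km -> relative intensity "
          f"{observed_brightness(r, 0.02, 0.0):.3f}")

# Inverting for distance works ONLY under the assumptions B0 = 0 and
# a known extinction coefficient b -- which is the crux of the debate.
def distance_from_intensity(rel, b_per_km):
    return -math.log(1.0 - rel) / b_per_km
```

With b = 0.02/km, a relative intensity of 0.4 would imply roughly 25 km of air, but a different b or a non-zero B0 changes the answer completely.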
 

Z.W. Wolf

Senior Member.
https://goo.gl/maps/3PNSrWnpeT5eGaQQA


Main Astronomical Observatory of Academy of Science of Ukraine on Google Earth


It looks to me to be nothing more than a tourist attraction now... maintained by volunteers.

https://www.google.com/maps/place/M...7e413f480!8m2!3d50.3643236!4d30.4957274?hl=en


From Google Earth reviews:

3 years ago
"Almost forgotten place surrounded by nature, almost nothing to see except old buildings. It’s your 15 min short bike-around trip."

a year ago
"Conduct very interesting excursions. Children and adults are provided with a lot of cognitive information. Be in a picturesque place. It would be necessary to properly finance, but only altruists remained to work here."

a year ago
"Impressive place! It is interesting to visit here and touch the moon and the stars, which are many millions of distances away from you. It is unfortunate that astronomy is poorly funded and the buildings are in disrepair, and the museum was raised at the expense of the people who work there for the idea and in the name of science. I advise you to visit, the impressions are great. Entrance costs 150 UAH per person (~ 5 $)."



This is a list of papers by B. E. Zhilyaev.
https://link.springer.com/search?dc.creator=B. E. Zhilyaev

Who is he? This isn't very informative:
https://www.researchgate.net/profile/Boris-Zhilyaev
 

Z.W. Wolf

Senior Member.
I wonder if they were trying to use a slitless spectroscopy method? And the translation came out as colourimetry? I can't see how this could be a valid technique, though.

Edit: Nah. Just trying to make some kind of sense of this.
 

Mick West

Administrator
Staff member
2022-08-27_16-49-02.jpg
In the RGB charts, the bottom scale is pixels. The noisy region on either side of the dip represents the noise in the sky. We only have the monochrome image on the left, but we can replicate it in the Sitrec RGB profiler by saving the image as a 42-pixel PNG (attached), and then taking a line sample across the darkest part.

2022-08-27_16-59-06.jpg

Suggesting that the RGB version of this somehow demonstrates a "black body" is ludicrous. The separation of the channels instead seems to suggest either an object that is reflecting or emitting light unevenly, or possibly chromatic aberration. Given the low resolution, and the noise, it might even simply be noise.
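For anyone who wants to repeat the line-sample step without Sitrec, here is a rough NumPy equivalent. The image below is a synthetic stand-in (uniform sky with a slightly coloured dark patch), not the actual 42-pixel crop:

```python
import numpy as np

# Synthetic stand-in for the 42-pixel crop: uniform "sky" at 200
# with a small dark patch whose channels are deliberately unequal.
w = 42
img = np.full((w, w, 3), 200, dtype=np.uint8)
img[18:24, 18:24] = (90, 80, 70)

# Line sample across the darkest part, as described above.
profile = img[21].astype(float)         # shape (42, 3): R, G, B per x
dip = profile.min(axis=0)
print("channel minima (R, G, B):", dip)

# If the three curves separate at the dip, the pixel has a colour
# cast; a truly grey or black object would keep R = G = B together.
```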

The paper seems deeply flawed.
 

Attachments

  • Phantom 42 pixels.png (5 KB)

FatPhil

Senior Member.
They say that the phantom object exhibits 0 albedo, reflecting and emitting no energy, no light. Their own graph seems to contradict that, assuming the sharp dip in the RGB values is the object, since the dip does not go to 0. I do not understand why they mention Rayleigh scattering -- which may just mean that I don't understand it. Can anybody explain?

(Respectfully, were I in Ukraine right now I might be focusing my attention on other things. This just seems very strange.)

I disagree with their definition of albedo 0 too - obsidian has 0 albedo, despite reflecting light well enough to use as a mirror. Hmm, thinking about it, polished silver should have albedo 0 too. Albedo is about *diffuse* reflection, and mirrors have absolutely none of that.

I think their reference to Rayleigh scattering is that they're simply trying to contrast things against "the sky" - which they are assuming is uniform, Rayleigh-scattering-dominated illumination. They seem to have forgotten that the uniform mixing of the frequencies depends on the input frequencies, and thus is in constant flux.

At the moment, they're teetering on the edge of the "not even wrong" filing cabinet - you know, the inverted-conical-frustum-shaped one.
 

Ravi

Senior Member.
In optical labs, solid-body diffuse reflectance standards are used. The best (most used) ones are from Labsphere, a company that makes a plastic material called "Spectralon" from sintered PTFE - a very open structure in which light can bounce around while remaining perfectly Lambertian. The highest "albedo" they offer is 99%, but they also offer lower values, made by adding an amount of carbon.

I have used them myself; it is a pretty cool material and is also used in many optical instruments on satellites.
spec.png
Spectralon

At the same time, in the atmosphere, a huge number of tiny water droplets (a cloud) also becomes a great diffuser with a high albedo value, as light can bounce around in there too. It is actually used by (Earth-observing) scientists to determine albedo and calibrate instruments.
 

FatPhil

Senior Member.
Light spreads, it doesn't self-organize into columns.

Light scatters, it doesn't self-organize into columns.

It doesn't need to, nor is there any claim of "self-organising" - you're introducing that bunk yourself. Every photon that enters the optical apparatus appearing to come from the direction of the blocking object *necessarily* came from outside the column of air between aperture and object, and was then scattered such that it entered the aperture. The image *can* be interpreted to have meaning.
 

Mick West

Administrator
Staff member
Every photon that enters the optical apparatus appearing to come from the direction of the blocking object *necessarily* came from outside the column of air between aperture and object, and was then scattered such that it entered the aperture.
Except for the photons that come from the object itself. Of course, those photons also originally come from the sun (probably).
 

FatPhil

Senior Member.
Except for the photons that come from the object itself. Of course, those photons also originally come from the sun (probably)

But those are excluded: "It is a completely black body that does not emit and absorbs all the radiation falling on it." (from quote in post #1)
 

Ravi

Senior Member.
But those are excluded: "It is a completely black body that does not emit and absorbs all the radiation falling on it." (from quote in post #1)
Just for the record, a perfect black body does not exist, but some materials come close (99.9%). Therefore some photons still get scattered by the "black body" surface. One can also still expect some radiation coming from it, probably long-wave IR.
 

Mick West

Administrator
Staff member
But those are excluded: "It is a completely black body that does not emit and absorbs all the radiation falling on it." (from quote in post #1)
Excluded by the writers of the paper - but they produce no evidence to back the claim of the "phantom" being a black body. That's one of the things I asked them about via email (no response).
 

Mick West

Administrator
Staff member
Took a short video of a plane today, and something flew by. The plane's relative intensity of 0.8 indicates it's far away.
2022-08-31_11-28-20.jpg

Nearby leaves are 0.25, but they are green.
2022-08-31_11-30-32.jpg

The flying object is about 0.7, with a separation of the color channels. So, far away?
2022-08-31_11-31-49.jpg

It's a fly, of course. I saw it zip by. I'm just including this as an example of the flawed methodology.
 

FatPhil

Senior Member.
It's a fly, of course. I saw it zip by. I'm just including this as an example of the flawed methodology.

An in-focus and non-motion-blurred fly would be a fairer test. But any test that requires the test subject to be just so isn't a particularly useful test.

As a heuristic, the principle being employed can work; we use it all the time - we interpret the distances of mountains relative to each other using it, for example, whether we do it consciously or not. But in that situation, it's always a relative measurement, with the two candidate objects under the same conditions. I think I've seen some 3D game rendering engines employ this too, but I suspect that's so that they can seamlessly impose a fake horizon.

Edit: Fence posts (with chain link fence between, so that everything's relative to sky) might be a more controlled test case.
 

Mick West

Administrator
Staff member
An in-focus and non-motion-blurred fly would be a fairer test.
Well, you could use the leaves, noting that there's a variety of different intensities, even though they are all at the same distance, and are the same color.
2022-09-01_08-23-29.jpg
The separation of the R,G,B lines indicates something that's not black.
 

Z.W. Wolf

Senior Member.
It doesn't need to, nor is there any claim of "self-organising" - you're introducing that bunk yourself. Every photon that enters the optical apparatus appearing to come from the direction of the blocking object *necessarily* came from outside the column of air between aperture and object, and was then scattered such that it entered the aperture. The image *can* be interpreted to have meaning.
I disagree. The diagram self-produced by member john.phil shows light doing things it doesn't do. I think that's important. If you don't think that's important... I don't know what to say.

The main issue remains that this so-called "colorimetry" technique is useless and random.

MW has not received a response to his email. I suspect that's because the address is long abandoned. There's no indication that Boris Ephimovich Zhilyaev is still an active researcher or even alive.

The facility known as the Main Astronomical Observatory of Academy of Science of Ukraine is a relict. A tourist attraction. It's unclear whether or not there was an official skeleton staff still there even before the current lamentable war. I've seen some indication that at least one emeritus professor still used one of the minor telescopes - a 16 inch Celestron - and there was a demonstration using the solar telescope several years back.

Zhilyaev's research seems to have been largely concerned with slitless spectroscopy and techniques for interpreting data gathered by the highly specialized equipment used in that technique. Ordinary video cameras aimed at the sky can't gather that kind of data.

I suspect that someone... and this is just my suspicion... someone used Zhilyaev's name, and attempted... in a very naïve way... to use some of his techniques for interpreting data. (A museum volunteer? An undergraduate student?)

Here's an example of Zhilyaev's work: https://link.springer.com/article/10.3103/S0884591313030057

The telescope: http://www.terskol.com/telescopes/zeiss.htm
Used a grating spectrograph.

The facility at Terskol peak is modern, and was in use at a time when the relicts at the Main Astronomical Observatory of Academy of Science of Ukraine (barely above sea level and under the light-polluted skies of a major city) probably were not.
http://www.terskol.com/telescopes/lhs_telescope.htm
One should be able to get some idea of the level of sophistication involved in spectroscopy.

Slitless spectroscopy is used... in very specific circumstances... in astronomy.


But even with the proper equipment, it seems highly unlikely to this (very) amateur astronomer that this technique could be used to measure distances(!) of moving objects(!) inside the atmosphere(!).

We all understand Rayleigh scattering. The issue is that the technique has no validity. The only thing measured is the video frames themselves, not the objects depicted in the video frames.
 

Litchy

New Member
2022-08-27_16-49-02.jpg
In the RGB charts, the bottom scale is pixels. The noisy region on either side of the dip represents the noise in the sky. We only have the monochrome image on the left, but we can replicate it in the Sitrec RGB profiler by saving the image as a 42-pixel PNG (attached), and then taking a line sample across the darkest part.

2022-08-27_16-59-06.jpg

Suggesting that the RGB version of this somehow demonstrates a "black body" is ludicrous. The separation of the channels instead seems to suggest either an object that is reflecting or emitting light unevenly, or possibly chromatic aberration. Given the low resolution, and the noise, it might even simply be noise.

The paper seems deeply flawed.
Are you sure the x-axis is pixels? (Figure 8) Given the return to a base or "normal" relative intensity, I suspect this is a function of time, likely milliseconds.
 

Mick West

Administrator
Staff member
Are you sure the x-axis is pixels? (Figure 8) Given the return to a base or “normal” relative intensity, I suspect this is a function of time, likely ms
Yes, it shows 42 pixels, the same as in the image, and the greyscale intensity graph is a match.
 

Charlie Wiser

Active Member
Just from a graph-drawing point of view, I'm unhappy that the y-axis does not start at zero, given they're illustrating intensity and claiming the phantom is "completely black". The graph may be accurate, but it's misleading as a visual representation.
1663119741443.png
 

yoshy

Member
Has anyone found any responses to this by other experts in the field? Vice just wrote an article on it. No new information. No comments from other experts; they don't even mention reaching out to other experts. And most interestingly,

Boris Zhilyaev, the lead researcher on the paper, declined to comment.
 

jarlrmai

Senior Member
Yeah that's exactly what I would do if I had world changing verifiable scientific evidence of unknown objects coming from space, just clam up and ignore everyone.
 

Mick West

Administrator
Staff member
I'm trying to see how they justify the two-site claims. Here are the images they show:
2022-09-14_12-04-57.jpg

2022-09-14_12-12-20.jpg

2022-09-14_12-12-51.jpg

Article:
Fig. 21 demonstrates two-site observations of UAPs. It is necessary to synchronize two cameras with an accuracy of one millisecond. Shoot at a rate of at least 50 frames per second is needed. In a field of view of 5 degrees at a base of 120 km, objects above 1000 km can be detected.

An object against the background of the Moon was detected at zenith angle 56 degrees. Parallax about 5 degrees was evaluated. This allow us to evaluate distance equal to 1524 km, altitude 1174 km, and linear speed of 282 km/s.

Coincidence of 2-point light curves in Fig. 22 means: we observe the same object. Fig. 23 shows the light curve at a sampling rate of 125 Hz. The object flashes for one-hundredth of a second at an average of 20 times per second.


These seem to be two different patterns. They don't show the actual lights (although it would be trivial to make a composite image demonstrating this). The images are oddly distorted. Here I've rescaled them to make the Moon circular, and the same size in both.
2022-09-14_12-21-16.jpg

They also appear to be taken with very different camera systems, with the Kyiv image being low resolution and compressed. The X and Y scales seem meaningless. They claim a field of view of 5°, but the images are clearly cropped from a much wider field of view. The Kyiv image is just 184 pixels across.

They do not explain where the "2-point light curve" in Fig. 22 comes from, but it seems difficult to relate to the photos, because it shows very low resolution on/off equal-length durations of exactly 0.02 seconds, i.e. a flash every 0.04 seconds. Even at that resolution, the two curves are quite different, with the second peak being higher than the third in one, but lower in the other.

Fig. 23 claims a sampling rate of 125 Hz, but only shows one graph. The spacing of the flashes is about 0.05 seconds for the first three, then 0.04 for the last four.
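As a sanity check on the quoted figures, here is the simplest two-station parallax relation (distance ≈ baseline / parallax in radians; altitude = distance × cos(zenith angle)). The paper does not state which formula it used, so this is only an assumed reconstruction:

```python
import math

# Quoted values from the paper's two-site claim.
base_km = 120.0
parallax_deg = 5.0
zenith_deg = 56.0

d = base_km / math.radians(parallax_deg)       # slant distance, small-angle
alt = d * math.cos(math.radians(zenith_deg))   # altitude above ground

print(f"slant distance ~ {d:.0f} km (paper: 1524 km)")
print(f"altitude       ~ {alt:.0f} km (paper: 1174 km)")
```

Neither result matches the paper's 1524 km and 1174 km, so either they used a different geometry or something is left unstated.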
 

Mick West

Administrator
Staff member
Here are the two graphs with the scales adjusted to be the same. They claimed the cameras are synchronized to 0.001 seconds (1ms). I've also scaled and aligned them in various ways. They are not the same, in any sense.
2022-09-14_13-50-54.jpg


2022-09-14_13-55-49.jpg2022-09-14_13-56-30.jpg
 

Mick West

Administrator
Staff member
There seem to be a variety of people trying to replicate it, with mixed results. So I thought I'd give it a go too, to get some reference.


I used a Sony A6400 camera with a 500mm lens, recording 4K video at 30 Hz and 1/1000th exposure (electronic shutter). I focused on the Moon and got about 12 minutes of video, so 12×60×30 = over 20,000 frames at fairly high resolution (3840x2160). This produced an 8.72 GB file!

No results yet. It's quite a challenge, looking through 20,000 frames for phantoms. I'm using a technique I've used before: re-rendering the video with an "Echo, Minimum, 120" effect, which keeps only the darkest value each pixel has had over the last 120 frames. I initially tried applying this (in After Effects) at full resolution, but the time estimate for completion was something like 50 hours! So I re-rendered at 1080p, and it's telling me it should be done in three hours.
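The "Echo, Minimum" effect amounts to a per-pixel running minimum. For reference, here is a small NumPy sketch of the same idea (the frame sizes and pixel values are toy data):

```python
from collections import deque
import numpy as np

# Sketch of the "darkest pixel over the last N frames" idea behind
# the "Echo, Minimum, 120" effect: a per-pixel running minimum that
# makes brief dark transients persist for N frames.
def min_stack(frames, window=120):
    buf = deque(maxlen=window)
    out = []
    for f in frames:
        buf.append(f)
        out.append(np.min(np.stack(list(buf)), axis=0))
    return out

# Toy demo: five bright "sky" frames with one transient dark pixel.
frames = [np.full((4, 4), 200, dtype=np.uint8) for _ in range(5)]
frames[2][1, 1] = 30
result = min_stack(frames)
print(result[-1][1, 1])   # the dark transient survives in later frames
```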
 

jarlrmai

Senior Member
1/1000 is quite a fast shutter for video. Each frame lasts 1/30th of a second, but the shutter is only open for a fraction of that. If something very fast crosses the frame in that interval, you don't capture it.
 

Mick West

Administrator
Staff member
A practical note. The slow-speed encoding was partly related to memory pressure. Closing other apps has sped this up quite a bit. But I could certainly do with more memory.

2022-09-15_09-18-24.jpg

2022-09-15_09-20-59.jpg
 

Mick West

Administrator
Staff member
1/1000 is quite a fast shutter for video. Each frame lasts 1/30th of a second, but the shutter is only open for a fraction of that. If something very fast crosses the frame in that interval, you don't capture it.
If it's IN the frame, then it will be captured. The objects they describe cross the frame over several frames, like flies do. 1/1000 prevents motion blur.


If something crossed the frame in 1/30th of a second then it would be a faint blur (if even visible) at 1/30 sec exposure, or a possible single image at 1/1000th

If something were crossing the entire frame in 1/1000th of a second then you would not see it either way (other than a possible faint blur at 1/1000th)
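To put rough numbers on this, assume (hypothetically) a 3840-pixel-wide frame and an object that crosses it in 1/30 s; the streak length during an exposure is just speed × shutter time:

```python
# Blur-length arithmetic for the scenario above. The 3840 px frame
# width and the 1/30 s crossing time are illustrative assumptions.
frame_px = 3840
crossing_s = 1 / 30
speed_px_per_s = frame_px / crossing_s      # ~115,200 px/s

for shutter_s in (1 / 30, 1 / 1000):
    blur_px = speed_px_per_s * shutter_s
    print(f"shutter 1/{round(1 / shutter_s)} s -> streak ~ {blur_px:.0f} px")
```

At 1/30 s the streak spans the whole frame (an extremely faint smear); at 1/1000 s it is roughly 115 px, i.e. a short but potentially visible streak in a single frame.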
 

jarlrmai

Senior Member
If it's IN the frame, then it will be captured. The objects they describe cross the frame over several frames, like flies do. 1/1000 prevents motion blur.


If something crossed the frame in 1/30th of a second then it would be a faint blur (if even visible) at 1/30 sec exposure, or a possible single image at 1/1000th

If something were crossing the entire frame in 1/1000th of a second then you would not see it either way (other than a possible faint blur at 1/1000th)
What shutter speed was used by them? What focal length?
 

Mick West

Administrator
Staff member
What shutter speed was used by them? What focal length?
1/1000th. Focal length is "6mm", and I'm not sure how to convert that. I may well be zoomed in more than them, based on the size of the moon. Here's one frame from my video.
2022-09-15_08-55-50.jpg

Sadly, the 12 minutes revealed no phantoms. I'm re-exporting it with the "max" filter to see if there are any bright "swift" objects.
 

Mick West

Administrator
Staff member
This paragraph seems nonsensical:
Helmholtz established that the eye does not fix phenomena lasting less than one-tenth of a second. It takes four-tenths of a second to recognize an event. Ordinary photo and video recordings will also not capture the UAP. To detect UAP, you need to fine-tune (tuning) the equipment: shutter speed, frame rate, and dynamic range (14 - 16 stops).
Content from External Source
Firstly, the Helmholtz point is irrelevant. It's about reaction times, not about whether something is visible or invisible. The flies in the reference videos I've used sometimes cross the frame in much less than a tenth of a second. Sure, you might not always notice them, but that does not mean such an object is permanently invisible.

Secondly, saying "Ordinary photo and video recordings will also not capture the UAP" seems nonsensical, as a shutter speed of 1/1000th of a second and a frame rate of 50 Hz are about what a phone will record at in daylight. I just took this 60 Hz video of a fly.



The exposure time is not recorded, but it's sufficiently fast to freeze the motion of the shadow of the fly as it traverses the frame faster than the Ukrainian objects.

Finally: the "dynamic range (14 - 16 stops)" is also irrelevant, and seems to be a mistranslation of bit depth, which they call dynamic resolution and mention later:
In practice, the exposure time was less than 1 ms, and the frame rate was no less than 50 Hz. Frames were recorded in the .ser format with 14 and 16 bits. Violation of these conditions leads to the fact that objects will not be registered during observations.
Content from External Source
14 and 16 bits (why both?) is great - normal photos use 8 bits. But it's not going to help unless the object is almost exactly the same color and intensity as the sky. If the relative intensity of a phantom were around 0.99, then yes, a higher bit depth would help. But it's not; it's around 0.4 (less than half the brightness), so 8-bit would do just fine (you'd get values like sky = 200 and object = 80 on a 0..255 scale), and neither higher dynamic range nor dynamic resolution is going to help.

And if your object's apparent color is very similar to the sky's color, you want a LOWER dynamic range, not higher, as all the detail is packed into a very small range.
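Here is the quantisation arithmetic behind the bit-depth point, with the sky placed at level 200 of 255 as in the example above (that placement is an assumption):

```python
# How many digital levels separate sky and object at each bit depth,
# for an object at 0.4 of the sky's intensity.
def levels(bits, sky_frac=200 / 255, obj_rel=0.4):
    full = 2 ** bits - 1
    sky = round(sky_frac * full)
    obj = round(sky_frac * obj_rel * full)
    return sky, obj, sky - obj

for bits in (8, 14, 16):
    sky, obj, sep = levels(bits)
    print(f"{bits:2d}-bit: sky={sky:6d} object={obj:6d} separation={sep:6d}")
```

Even 8 bits gives 120 levels of separation; the extra depth of 14 or 16 bits only matters when the object is within a percent or so of the sky's brightness.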

I could probably go on. But I think it's clear (as it was already) that this paper has many major problems.
 

Stingray

New Member
There's a chance that the people involved might have given interviews in Ukrainian that we could find and translate. I know some Russian, so I was able to find their full names with publicly available information. This might facilitate our search for some clarification:
(name, patronymic, family name)
  • B.E. Zhilyaev -> Boris Yukhimovych Zhilyaev -> Борис Юхимович Жиляєв
  • V. N. Petukhov -> Volodymyr Mykolayovych Petukhov -> Володимир Миколайович Петухов
  • V. M. Reshetnyk -> Volodymyr Mykolayovych Reshetnyk -> Володимир Миколайович Решетник

Note that in the paper they used an N for Petukhov's patronymic; I believe this was a mistake and that the correct letter is M, as shown in their staff list:
observatorylist.PNG
https://www.mao.kiev.ua/index.php/ua/observatoria/staff-ua
 

Lee100

New Member
Adding my 2 cents to the discussion: I believe this photo best illustrates what the paper is attempting to do. Notice that the more distant a shadow is, the brighter and bluer it appears.

img_3836.jpg

Now, as for the insect hypothesis, is it at least plausible? Let's see:

The paper mentions that the phantom moves at at least 52 degrees per second, roughly the angle of view of a 6mm lens.

1663293024590.png
Now, there are lots of insects with vastly different flying speeds and sizes, so I'll assume the object is a fly.
According to Google, a fly flies at about 7 km/h, or 74.8"/s, and is about 7 mm (0.25") long. If the phantom were a fly, it would have to be closer than 76 inches (about 1.9 m) to the camera to achieve 52 degrees per second.

At that distance, a 7 mm fly would be at least 0.17 degrees (or 612 arc seconds) long; of course, there is motion blur, softness etc. to consider. In the paper I can only find one reference to the size of the phantom: 400 arc seconds. Certainly the numbers are close enough that we cannot completely rule out insects.
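The same arithmetic in code, using the post's assumed values (7 km/h speed, 7 mm body length). Small rounding differences aside, it lands in the same ballpark:

```python
import math

v = 7 / 3.6                    # 7 km/h in m/s
omega = math.radians(52)       # required angular speed, rad/s
d = v / omega                  # distance at which a fly sweeps 52 deg/s
arcsec = math.degrees(math.atan(0.007 / d)) * 3600

print(f"distance ~ {d:.2f} m, angular size ~ {arcsec:.0f} arcsec")
```

About 2.1 m and ~670 arcsec here, versus the post's 1.9 m and 612 arcsec; either way it is the same order of magnitude as the paper's 400 arcsec.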


1663293151869.png

The next issue is whether a fly can be imaged at that distance. Even if we assume the lens is wide open, the DOF near limit is 0.74 m, so yes, it can be imaged at that distance if it were a fly.

1663294951856.png

I still want it to be aliens though
 

Lee100

New Member
There seem to be a variety of people trying to replicate it, with mixed results. So I thought I'd give it a go too, to get some reference.

I used a Sony A6400 camera with a 500mm lens, recording 4K video at 30 Hz and 1/1000th exposure (electronic shutter). I focused on the Moon and got about 12 minutes of video, so 12×60×30 = over 20,000 frames at fairly high resolution (3840x2160). This produced an 8.72 GB file!

No results yet. It's quite a challenge, looking through 20,000 frames for phantoms. I'm using a technique I've used before: re-rendering the video with an "Echo, Minimum, 120" effect, which keeps only the darkest value each pixel has had over the last 120 frames. I initially tried applying this (in After Effects) at full resolution, but the time estimate for completion was something like 50 hours! So I re-rendered at 1080p, and it's telling me it should be done in three hours.

Considering how time-consuming it is, may I suggest putting out some trash or sugar water or some such thing to entice insects to fly around your camera?

Also, can a telephoto lens even capture something close to the camera? Have you tried putting an insect-sized item 1-2 meters away from the camera to see if it's even possible to image it? And while we're at it, maybe paint the object with e.g. Musou Black to see if you can replicate the color effect?
 

jarlrmai

Senior Member
Equipment used:

ZWO ASI 178MC:

1661554301313.png
https://astronomy-imaging-camera.com/product/asi178mc-color

ZWO ASI 294MC-Pro:

1661554331432.png1661554354252.png1661554375601.png
https://astronomy-imaging-camera.com/product/asi294mc-pro-color

Computar lenses 6mm (they don't specify the exact model or how they are used, but it looks like the model below would attach to the ZWO ASI 294MC-Pro above):

1661554814362.png1661554888440.png1661554964841.png
https://www.ebay.co.uk/itm/134002114215
https://www.computar-global.com/

Software SharpCap 4.0 used for data recording in file format .SER:

https://www.sharpcap.co.uk/sharpcap/sharpcap-downloads/4-0-beta

https://astronomy.tools/calculators/field_of_view/

If I add a 6mm lens on the ZWO ASI 294MC-Pro, the box doesn't even appear in the field-of-view demonstrator using the Moon as a demo target (green is a 40mm lens).

6mm =

1663312700882.png

1663311969706.png
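A quick check of what a 6mm lens actually covers on that sensor. The IMX294 chip in the ASI 294MC-Pro is a 4/3" sensor, roughly 19.1 × 13.0 mm according to ZWO's published specs (treat these dimensions as approximate):

```python
import math

def fov_deg(sensor_mm, focal_mm):
    # Standard rectilinear field-of-view formula.
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

for axis, size_mm in (("horizontal", 19.1), ("vertical", 13.0)):
    print(f"{axis}: {fov_deg(size_mm, 6.0):.1f} deg")
```

That works out to roughly 116° × 95° - nothing like the 5° field of view claimed in the paper, consistent with the calculator result above.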
 

LilWabbit

Senior Member
The main fallacy in their 'colorimetric' methodology is treating digital photographs as accurate distance meters. They also seem to ignore camera settings, image edits, and the fact that the perceived brightness or colouring of an image or image detail does not always vary with distance. This pretty much renders their methodology pseudo-science.

Not too dissimilar from Travis Taylor's fallacy of treating image brightness as an accurate thermometer, discussed in another thread a while back.
 