Skinwalker Ranch S4 E11 Claims of "Wormhole" from LIDAR Scan with Gap

Well, if the red ring isn't caused by reflectivity but by some form of data reduction, then the label 'reflectivity' on that image is misleading. I suspect we won't know the reason unless we get hold of some other scans performed by the same equipment.
 
My earlier hypothesis in post #73 was that the red ring was caused by self-shadowing, also known as the opposition effect. If the sensor is placed on the drone so that the light from the centre of the scan is reflected directly back to the sensor, then reflections from that region could appear brighter than reflections at the edge of the scan, because the regions at the edge are seen at a more oblique angle and include more shadows. This could apply to the green region almost directly under the drone, since the scanner is scanning that region more obliquely.
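To make that geometry concrete, here is a minimal Python sketch (assuming a flat ground plane, an illustrative hover altitude, and an idealised Lambertian surface; none of these numbers come from the episode) showing how the incidence angle, and with it the shadowed fraction, grows toward the edge of the scan:

```python
import numpy as np

# A drone hovers at height h above flat ground. A ground point at
# horizontal distance r from the nadir point is hit at incidence
# angle theta from vertical, with cos(theta) = h / slant_range.
h = 35.0  # hover altitude in metres (illustrative, not from the episode)

for r in [0, 10, 20, 35, 60]:
    slant = np.hypot(h, r)                # sensor-to-ground distance
    theta = np.degrees(np.arctan2(r, h))  # incidence angle from vertical
    # An idealised Lambertian surface returns intensity ~ cos(theta);
    # self-shadowing (the opposition effect) further boosts the return
    # near theta = 0, where each clump of ground hides its own shadow.
    print(f"r = {r:5.1f} m  theta = {theta:5.1f} deg  cos(theta) = {h / slant:.2f}")
```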
 
This may be merely subjective, but I think it does. Remember to ignore the dark shadows.
I initially thought so, too, but drawing an ellipse over it showed that if there is any slight difference at any point, it just gets lost in the noise. There's no significant difference in radius between the two semi-ellipses.
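For anyone who wants to repeat that check, here is a rough sketch of the approach: fit a least-squares (Kasa) circle to each half of the ring separately and compare the radii. The synthetic noisy ring below stands in for point coordinates sampled from the actual scan image:

```python
import numpy as np

def fit_circle(x, y):
    # Kasa fit: solve x^2 + y^2 + A*x + B*y + C = 0 by least squares.
    M = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (A, B, C), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    cx, cy = -A / 2, -B / 2
    return np.sqrt(cx**2 + cy**2 - C)  # fitted radius

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 400)
x = 30 * np.cos(t) + rng.normal(0, 0.5, t.size)  # synthetic ring, r = 30
y = 30 * np.sin(t) + rng.normal(0, 0.5, t.size)

first, second = t < np.pi, t >= np.pi            # the two semicircles
print(f"radius of first half : {fit_circle(x[first], y[first]):.2f}")
print(f"radius of second half: {fit_circle(x[second], y[second]):.2f}")
```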
 
My earlier hypothesis in post #73 was that the red ring was caused by self-shadowing, also known as the opposition effect. [...]
I am pretty convinced the red band is likely the stray angular light effect similarly seen in Mick's post #97. It makes sense: the red is predominant in the parts of the scene that are "sandy", showing the effect of material/texture on light reflectance.
 
The paper Mick references is this one.
https://www.mdpi.com/2072-4292/12/17/2855
The raw intensities of twenty laser wavelengths are strongly influenced by changes in distance (Figure 8). With increasing distance (where the leaf is perpendicular to the laser beam at 0°), the raw intensities increase first and then decrease. We can see that the intensity at each laser wavelength of the measured targets increases more rapidly at a distance from 4 to 8 m and then decreases less rapidly greater than that. When the measured range is shorter than around 8 m, the instrumental properties, such as the near-distance reducer or the refractor telescope’s defocusing effect, have a significant influence on the raw intensities, and this makes the intensities disaccord with the radar range equation.
Ah, yes. I dismissed that because the distances (8 m vs. 35 m) didn't match up. But if the LiDAR sensor behaves significantly differently from the ones tested in that paper, then the light that reaches the ground in the green circle might all be slightly defocussed compared to the light further out, and that could cause more light to be lost and scattered away. This would lower the apparent albedo.
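As a toy illustration of what the paper describes, the sketch below multiplies the ideal 1/R² falloff of the lidar range equation by a made-up near-range suppression factor; the real correction curve is instrument-specific, and the 8 m crossover is just the figure quoted above:

```python
import numpy as np

# Ideal lidar range equation: received intensity ~ 1/R^2. At short
# range, instrument effects (defocusing, near-distance reducer) suppress
# the return, so the measured curve rises first and then falls, as in
# Figure 8 of the paper. The logistic factor here is purely illustrative.
R = np.linspace(1, 60, 12)                       # target range, metres
ideal = 1.0 / R**2
near_factor = 1.0 / (1.0 + np.exp(-(R - 8.0)))   # suppression below ~8 m
measured = ideal * near_factor                   # rises, peaks, then falls

for r, m in zip(R, measured):
    print(f"R = {r:5.1f} m  relative intensity = {m:.4f}")
```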
 
About halfway through the rotation, the viewing angle gets adjusted, but there is no shift in the radius of the corresponding reflective semicircle, which doesn't match what your hypothesis predicts.
I think the red/yellow circle is a function of the angle of incidence with the ground and/or the distance from the drone to the ground.

Neither of those is altered by the change in viewing angle, as they are purely based on the position of the drone, not the angle of the camera.
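A quick sketch of that point: if the ring is the locus of ground points seen at some fixed incidence angle theta from a drone hovering at height h, its radius is r = h·tan(theta), which depends only on the drone's position, not on where the rendering camera sits. Both numbers below are assumptions:

```python
import numpy as np

# Ring radius for a fixed incidence angle theta, drone at height h.
# Rotating the *view* of the point cloud changes neither h nor theta,
# so the ring radius cannot shift with viewing angle.
h = 35.0  # assumed hover altitude in metres
for theta_deg in [20, 30, 40]:
    r = h * np.tan(np.radians(theta_deg))
    print(f"theta = {theta_deg} deg  ->  ring radius r = {r:.1f} m")
```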
 
@Mick West Agreed. My math chops aren't good enough to make perfect sense of @jplaza's contribution about Gaussian beam focus (namely https://www.edmundoptics.es/knowledge-center/application-notes/lasers/gaussian-beam-propagation/), but as far as I can tell, those equations don't say anything about the waist distance from the origin of the beam, so we can't make any definitive statement regarding that. I sure hope some laser physicist is lurking about, ready to throw equations about that distance at us.

Another good lead is what you were on to with reflectivity over distance, though we need values for various soils rather than leaves. I think the red portion of the ring, though obvious, can reasonably be placed in software-issues land, while the smaller, less conspicuous gradient portion of it is more relevant to the phenomenon being observed.

I am satisfied with the overall debunking of the wormhole claims at SWR (they didn't even energize the atmosphere(?) like they did with superfly, so why would crazy stuff even happen?), but I think there's room for improvement in the potential explanation of the ring.
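For reference, the standard Gaussian-beam relations from that Edmund Optics page are easy to evaluate; the sketch below does so with a guessed wavelength and waist radius, since we don't know the actual beam parameters of this sensor. Note that z is measured from the waist, wherever the collimating optics happen to put it, which is exactly the gap noted above:

```python
import numpy as np

# Gaussian beam: w(z) = w0 * sqrt(1 + (z/zR)^2), with Rayleigh range
# zR = pi * w0^2 / lambda. Both input values below are assumptions.
lam = 905e-9  # a common lidar wavelength, in metres (guess)
w0 = 5e-3     # assumed waist radius of 5 mm

zR = np.pi * w0**2 / lam
print(f"Rayleigh range zR = {zR:.1f} m")
for z in [0, 10, 35, 100]:
    w = w0 * np.sqrt(1 + (z / zR)**2)
    print(f"z = {z:4d} m  beam radius w = {w * 1e3:.1f} mm")
```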
 
The only situation I can think of where the albedo of a surface would decrease when seen from directly above is if the soil is very dark and covered by sparse, light-coloured vegetation. When seen from a drone, looking down, the darker soil surface would be visible and absorb more light. When seen at an angle, the vegetation could merge together to conceal the darker soil beneath, so that only the lighter vegetation can be seen. I've seen this effect in a stand of short, dry dwarf wheat on dark soil. But it isn't very common.

Is the ground cover in this location made up of sparse, pale grass covering darker soil, perhaps?
 
@Eburacum
Not sure if I agree with your conclusion that a denser point cloud will show a higher "back reflection intensity". I am certain the calibration takes care of any system errors one can expect. But even so, a denser point cloud is just a cloud of individual measurements, one intensity reading per pulse, and the number of them does not influence the magnitude of the intensity.
Simply put, if you look at a point 12 times, it doesn't get brighter than if you look at it 2 times.
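A trivial numerical illustration of that point, with made-up intensity values: averaging more returns reduces the scatter of the estimate, but it does not raise the intensity itself.

```python
import numpy as np

# Each pulse yields an independent intensity reading around the same
# true value; 12 looks and 2 looks have the same expected brightness.
rng = np.random.default_rng(0)
true_intensity = 100.0
few  = rng.normal(true_intensity, 5.0, size=2)
many = rng.normal(true_intensity, 5.0, size=12)
print(f"mean of  2 returns: {few.mean():6.2f}")
print(f"mean of 12 returns: {many.mean():6.2f}")  # same level, less scatter
```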
 
About halfway through the rotation, the viewing angle gets adjusted, but there is no shift in the radius of the corresponding reflective semicircle, which doesn't match what your hypothesis predicts.
Compare:
[Attached image: the reflectivity scan, for comparison]
It looks to me that to create this scan, they fixed the drone in place and rotated it. The central black area is a "shadow", i.e. a place that the Lidar didn't reach. And the red data might be spurious data that is circular because the drone rotated. Both changed when they adjusted the angle of the Lidar.
 
I'd bet that if the Lidar is tilted up too far, there's a stray light reflection that confuses the reflectivity reading. But that's just a wild guess on my part.
 
That's right, it is spurious data. I'm just trying to figure out where it comes from. The operators of the drone have never seen it before, or so they say. Is that because they have never created a scan by rotating a drone in place before? Perhaps they should do it again, somewhere else?
 
Am I misunderstanding what LiDAR does? I thought it rotates internally, in other words with no need to rotate the drone at all. Shift position, yes, but not rotate. Am I incorrect in that?
 
Perhaps they should do it again, somewhere else?
You're absolutely right, of course.
And they should have critically reviewed their findings (as is being done by people here) before allowing themselves to be associated with any claims about finding a wormhole.
But I don't think that anyone connected with the Skinwalker Ranch, umh, investigations has shown any great interest in checking "anomalous" results or providing prosaic explanations.

I'm not sure that their top priority is the education of the viewing public.

Incidentally, since the extraordinary, unexpected and momentous discovery of a wormhole on or intersecting Earth's surface, have the team kept up their observations? Invited in better qualified and equipped researchers? Perhaps stuck a few warning signs around it? For some reason I doubt it.
 
Am I misunderstanding what LiDAR does? I thought it rotates internally, in other words with no need to rotate the drone at all. Shift position, yes, but not rotate. Am I incorrect in that?
Yes.
This Lidar sensor's FOV is a 70.4° wide circular cone. See the user manual:
Since it was found that they seem to use the sensor mentioned by @Beck in post #36, I looked into the manual of that sensor (Livox Mid-70). They use a radial-pattern, non-moving scanning method. The pattern the scanner makes looks like a daisy wheel / Lissajous curve, covering the FoV.


[Screenshot: Livox Mid-70 scanning-pattern diagram from the user manual]
Content from External Source
https://terra-1-g.djicdn.com/65c028...d-70/new/Livox Mid-70 User Manual_EN_v1.2.pdf
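To put numbers on that cone: a minimal sketch, assuming flat ground and the sensor pointed straight down, with made-up altitudes. Tilt the cone outward and a blind spot opens up directly beneath the drone, which would match the central black area in the scan:

```python
import numpy as np

# Livox Mid-70: 70.4 deg circular FoV, i.e. a cone of 35.2 deg
# half-angle. Pointed straight down from altitude h it illuminates a
# ground disc of radius h * tan(35.2 deg).
half_angle = np.radians(70.4 / 2)
for h in [15, 35, 50]:
    print(f"h = {h:3d} m  ->  footprint radius = {h * np.tan(half_angle):.1f} m")
```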
 
So the drone rotates to scan the countryside with a field of view of 70.4° at any one time. I suspect it scans gradually, rather than rotating in jerky segments, but the end result would probably look the same.

The mysterious part to me is why there is a green circle of lower 'reflectivity' underneath the drone, surrounding the black hole of no data. I don't think it is an albedo effect, which might be caused by the local geometry of the ground cover - it looks too even for that. I now think it is connected with the optics of the drone LiDAR system itself.

Looking at the data supplied by @jplaza, laser beams have a focal length which limits their usefulness within a certain distance. If the focal length of the laser in this system is something on the order of 35 metres, then readings inside that distance could be reduced by defocussing. A defocussed beam would return less light to the sensor and look like a less reflective surface.
 
[...] If the focal length of the laser in this system is something on the order of 35 metres, then readings inside that distance could be reduced by defocussing. A defocussed beam would return less light to the sensor and look like a less reflective surface.
It could be. According to this, the LiDAR should be operated at an altitude of 50-100 m:
DJI L1 Operation Guidebook
So they operated it well below its recommended altitude, which could very well lead to artifacts. I haven't read the whole guidebook, but it seems like there is a lot of great information in it.
 