In a paper titled "Deadly Ultraviolet UV-C and UV-B Penetration to Earth’s Surface:" J. Marvin Herndon, Raymond D. Hoisington, and Mark Whiteside write:
http://www.sciencedomain.org/abstract/23870

Our multifold measurements of solar irradiance spectra demonstrate conclusively that all wavelengths in the spectral range 200-400 nm reach Earth’s surface, contrary to the widespread perception that all UV-C and the majority of UV-B never reach the surface.
We demonstrate that satellite-based LISIRD solar spectra irradiance at the top of the atmosphere is badly flawed with some regions of the spectrum being less intense than measured at Earth’s surface. That calls into question any calculations made utilizing LISIRD data.
They get values for UV-C (their readings are the red and black lines in the graph above) that are much higher than those measured in space (the green line). The graph above is a comparison they make in their own paper; I've corrected for the rather misleading log scale they used:
Their readings obviously go wrong below 280 nm.
Herndon's answer to this problem is that his cheap device is right and NASA's satellite measurements for the last couple of decades are wrong.
We demonstrate that LISIRD solar spectra irradiance at the top of the atmosphere is badly flawed with some regions of the spectrum being less intense than measured at Earth’s surface. That calls into question any calculations made utilizing LISIRD data.
I asked the manufacturer of the instrument used, International Light Technologies, about this:
I'm interested in the use of the ILT950UV for measuring UV-C solar radiation. I see in this recent article: http://www.journalrepository.org/media/journals/JGEESI_42/2018/Mar/Herndon1422018JGEESI40245.pdf that an ILT950UV was used over the range 200-400nm. However it demonstrates an unexpected upwards curve below about 260nm (see Fig 3) where the expected UV from sunlight would be near zero. Is this result a problem with the ILT950UV at the lower range, or an issue with calibration or filtering?
Thank you for your interest in our ILT950.
All CCD array spectrometers are prone to effects of stray light in the UVC. The CCD is made of silicon which is very insensitive in the UVC, so it has a large calibration factor multiplier which magnifies the small levels of stray light reaching the pixels used to measure UVC. Basically the error starts as stray light but then during conversion from pixel to irradiance the calibration multiplier increases the error of the stray light. I also noticed that the graph is plotted in log scale which tends to magnify smaller numbers.
Though our new model ILT950 with a CMOS sensor has more UV sensitivity as compared to the Visible range, it will still suffer the effects of stray light.
The only way to truly measure very small UV signals in the [presence] of very intense white light is with a monochromator, or possibly even a double monochromator which cost in the 15-25,000 dollar range.
So the rise in UV at the bottom of the range is due to noise from stray sunlight being multiplied by a calibration function that ramps up steeply at the low end of the UV range.
i.e. it's a graph of the correction function made visible by not filtering out all the other light.
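That mechanism can be sketched numerically. The sketch below uses illustrative, made-up curves (not the ILT950UV's actual sensitivity or calibration data): a surface spectrum that is essentially zero below the ozone cutoff, a small constant stray-light leak, and a calibration multiplier that is the reciprocal of a silicon-like sensitivity curve falling off toward 200 nm.

```python
import numpy as np

wavelengths = np.arange(200, 401, 10)  # nm

# True surface irradiance: essentially zero below ~290 nm (ozone cutoff),
# rising through UV-B/UV-A. Illustrative numbers, arbitrary units.
true_signal = np.clip((wavelengths - 290) / 110, 0, None)

# Stray light: a small, roughly constant broadband leak onto every pixel
# (the ILT950UV spec is >99.7% rejection, i.e. up to ~0.3% leakage).
stray = 0.003

# Silicon sensitivity falls off steeply in the UV-C, so the calibration
# multiplier (1/sensitivity) grows steeply toward 200 nm. Illustrative shape.
sensitivity = np.clip((wavelengths - 180) / 220, 0.02, 1.0)
cal_multiplier = 1.0 / sensitivity

# Reported irradiance = (detected counts) * calibration multiplier,
# where counts = true irradiance * sensitivity + stray light.
reported = (true_signal * sensitivity + stray) * cal_multiplier

# Below 280 nm the "measurement" is just stray * cal_multiplier: a curve
# that rises toward short wavelengths, mirroring 1/sensitivity.
for wl, r in zip(wavelengths, reported):
    if wl <= 280:
        print(wl, round(r, 3))
```

The spurious UV-C "signal" in this toy model is purely stray light shaped by the calibration function, which is the mechanism ILT describe.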
Follow up from ILT:
So the curve seen there is essentially stray light multiplied by calibration multiplier? i.e. the shape of the curve actually reflects the function used to calculate the calibration multiplier?
Yes, the shape is the reverse of the sensitivity of a silicon CCD. As the sensitivity decreases, the calibration multiplier is larger, so the stray light effects are amplified by the cal factor. Then it is magnified by the use of a log scale.
ILT say the only way to measure UV-C in sunlight with this kind of instrument is with a monochromator, a specialized device that blocks out all stray light outside of one narrowly defined part of the spectrum. This was not done by Herndon et al., which is why they have the peak in UV-C. It's simply the calibration curve added in by the software. No UV-C was detected.
It is unfortunate that the authors of the paper were seemingly unfamiliar with the problems of stray light in measuring UV-C. This is actually a well-known problem in the field, and was in fact raised as an issue by one of the peer-reviewers of the paper:
Authors have measured solar irradiance spectrum in the spectral range 200-400 nm. This study is flawed in its physical component. The UV-C measurements are not credible because they were obtained with an inappropriate instrument and they do not agree with theoretical predictions (Planck’s law - blackbody emission). I suspect these UV-C fluxes to be spurious results of instrument limitations.

The problem is well known to the manufacturers and experienced users of these instruments. Here's a discussion from Instrument Systems, a division of Konica Minolta:
The impact of stray light correction on the spectrum of a calibration source is particularly distinct in the UV and IR spectral range, because the detector of an array spectroradiometer has only a low sensitivity at the edges. Due to the division of the measured spectrum by the reference spectrum, errors due to stray light contaminated measurement spectra are intensified in the areas of lower sensitivity. Thus, stray light correction leads directly to a higher precision in radiometric evaluations of UV LEDs.

They include this explanatory chart:
Notice the most accurate readings (green) with the scanning double monochromator (as suggested by ILT) are around 0.001. Basically zero.
Herndon et al. do mention "stray light":

This is a new line of investigation employing International Light Technologies ILT950UV Spectral Radiometer with fractional nanometer resolution in the short-wavelength portion of the ultraviolet (UV) spectrum with stray light rejection >99.7%.

What they fail to realize is that 0.3% of very bright sunlight is still a lot of stray light falling on the UV-C portion of the sensor.
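Some round-number arithmetic makes the point concrete (illustrative figures only, assuming roughly 1000 W/m² of total clear-sky solar irradiance at the surface; these are not values from the paper):

```python
# Illustrative, round-number figures (not measured values from the paper).
total_solar_irradiance = 1000.0  # W/m^2, typical clear-sky sunlight at the surface
stray_rejection = 0.997          # ILT950UV spec: >99.7% stray light rejection

# Broadband light leaking past the rejection onto the sensor:
stray_light = total_solar_irradiance * (1 - stray_rejection)
print(round(stray_light, 3))  # 3.0 W/m^2
```

Since the genuine UV-C flux at the surface is essentially zero, even this small broadband leak, once multiplied by the large UV-C calibration factor, can dominate the reported reading.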
[Note: this is a summary post of the thread below, original discussion follows]