Explained: "It's photoshopped because it has to be"

Mick West



Much has been made of Robert Simmon's statement about the Blue Marble image (above) that "it's photoshopped because it has to be".

I think the misunderstanding of this statement started out as trolling, but it has evolved so that now some people are convinced it means ALL images of Earth are fake.

Simmon is referring to the Blue Marble image from 2002, which isn't exactly the image you see above. It's actually this:

land_shallow_topo_2048.jpg
Plus this:
cloud_combined_2048.jpg
Source: https://visibleearth.nasa.gov/view_cat.php?categoryID=1484&order=asc&p=1

i.e. one image of the entire planet's surface with no clouds and all in daylight, plus a second image of the entire planet's surface with just the clouds. Combining the two allows you to render images like the one famously used on the iPhone lock screen, and also to render 3D animations from arbitrary positions, like the animation attached below.
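As a rough illustration of how those two layers combine - this is just a sketch, not NASA's actual rendering pipeline, and it assumes you've downloaded the two files above from the Visible Earth link:

```python
# Flatten the Blue Marble surface and cloud layers with a "screen" blend:
# the cloud layer is white clouds on black, so white pixels dominate the
# result and black (cloud-free) pixels leave the land layer untouched.
import numpy as np
from PIL import Image

land = np.asarray(Image.open("land_shallow_topo_2048.jpg").convert("RGB"),
                  dtype=np.float32) / 255.0
clouds = np.asarray(Image.open("cloud_combined_2048.jpg").convert("RGB"),
                    dtype=np.float32) / 255.0

combined = 1.0 - (1.0 - land) * (1.0 - clouds)  # screen blend

Image.fromarray((combined * 255).astype(np.uint8)).save("blue_marble_flat.png")
```

NASA's own renders wrap these layers around a sphere and light them, but the basic layering is the same idea.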



So of course this particular image has to be photoshopped. There are always clouds on the Earth, half the Earth is dark, and in any photo taken from space you can only see half (or less) of the planet. To create this image you have to stitch together hundreds of daylight images from all around the globe, and just use the bits that don't have any cloud on them.
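Here's a toy version of that per-pixel selection, assuming a hypothetical folder of co-registered daylight images of one region (the real Blue Marble, built from months of MODIS data, is of course far more sophisticated):

```python
# Clouds are bright and move between observations, so a low percentile
# taken per pixel across a stack of aligned images tends to keep the
# cloud-free ground and discard the clouds.
import glob
import numpy as np
from PIL import Image

frames = [np.asarray(Image.open(p).convert("RGB"), dtype=np.float32)
          for p in sorted(glob.glob("observations/*.png"))]  # same size and alignment assumed
stack = np.stack(frames)  # shape: (n_images, height, width, 3)

# 20th percentile: darker than most clouds, brighter than deep shadows.
mosaic = np.percentile(stack, 20, axis=0).astype(np.uint8)
Image.fromarray(mosaic).save("cloud_free_mosaic.png")
```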

So when Simmon said "it's photoshopped because it has to be" he was referring to this composite, edited image. He was NOT referring to all photos from space, of which many arrive every hour. For example, the DSCOVR satellite sends non-photoshopped images of the daylight side of the Earth many times a day:

Metabunk 2019-04-15 09-59-54.jpg

Or the GOES-16 satellite which has a geostationary orbit, and so sees night and day:
Metabunk 2019-04-15 10-04-18.jpg

And many more:
https://www.metabunk.org/full-disk-hd-images-of-the-earth-from-satellites.t8676/
 

Attachments

  • Animated Blue Marble - 32 seconds.mp4
    1.5 MB
The two images earlier in this thread, from DSCOVR and GOES, also remind me of claims that images are fake because of differences in colour. I have a three-monitor PC system (for photography work) and I have had to colour-profile the monitors with an X-Rite ColorMunki colorimeter (and also my printers); otherwise, at default settings, the same image will indeed differ in colour or contrast. Walk into any electrical store that sells TVs and see how much difference there is between different manufacturers' default settings. The same goes for any one image output to different media. Then, of course, there is the difference between different cameras and sensors. So even without Photoshopping, expect different results. That's the norm.
 
For the layman, would you describe Himawari-8 full-disk images as "composites"?

I think there's a danger of confusion in asking if something fits the meaning of a word without actually specifying the meaning of the word. This is especially true when you know that the Flat Earth and Fake Space folk want to claim that all space images are fake. If you say an image is a "composite" then that could be interpreted as saying it's fake.

All the satellites mentioned in this thread use a monochrome camera (or cameras) and filters (and/or beam splitters). For example, here's the GOES-16/17 setup:
Metabunk 2019-04-24 08-45-02.jpg
Source: https://www.goes-r.gov/downloads/resources/documents/GOES-RSeriesDataBook.pdf

The visible light images come from the VNIR (Visible & Near Infrared) sensor on the right, which is actually responsible for six bands.

So if you are seeing a color image then it's a "composite" of two or more channels (bands, wavelengths, colors).

Most of them "scan" the image in a couple of ways. Firstly, they divide a region into "swathes" which they image individually and then stitch into a set of single-band images (which can then be combined into color images, or processed to highlight or measure certain things, like cirrus clouds). This is somewhat analogous to the "panorama" mode on a phone camera (but with less overlap). This large-scale scanning is done by aiming with two mirrors, which obviously can be moved much more easily than the camera itself.
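In code, the stitching step is conceptually simple - something like this sketch, which assumes the swathes arrive as aligned arrays with a fixed row overlap (real processing instead resamples each swath onto a fixed geostationary grid):

```python
# Stitch north-to-south swathes into one single-band image by trimming an
# assumed fixed overlap and stacking the strips vertically.
import numpy as np

OVERLAP_ROWS = 8  # hypothetical overlap between adjacent swathes

def stitch_swathes(swathes):
    """swathes: list of 2-D arrays of equal width, imaged north to south."""
    trimmed = [swathes[0]] + [s[OVERLAP_ROWS:] for s in swathes[1:]]
    return np.vstack(trimmed)
```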

Then, many of the satellites "scan" the image in a way more like a simple flatbed scanner. Instead of the full image being registered on a single CCD, like in a conventional camera, the satellite imagers have "Focal Plane Modules" which have multiple columns of pixels, different columns for different bands. As the east/west mirror moves across the image, this registers one column at a time.
Metabunk 2019-04-24 08-55-58.jpg
Here's the resolution of each band's detector columns:
Metabunk 2019-04-24 08-57-03.jpg
While they have multiple columns per band, only one pixel per row is used (it picks the best one). The first two are visible wavelengths. The highest resolution one is the 0.64µm (640nm, red) channel at 1460 pixels high. The lower resolution 0.47µm channel is blue. Notably absent is green (which would be around 0.51µm), so you can't get a natural color image by combining red, green, and blue (which is what most cameras do). Green has to be calculated from the other channels - maybe including the 0.86µm IR band, which detects vegetation, as a partial green band. (Himawari-8, by contrast, has a real green band.)
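One simple, often-quoted approximation for the missing green band (not the operational GeoColor algorithm, which is more sophisticated) blends red, blue, and the 0.86µm "veggie" band, which is bright over vegetation just as a real green band would be. The weights below are the commonly circulated rough mix, so treat them as illustrative:

```python
# Synthesize a pseudo-green band from red, blue, and near-IR reflectance
# (all as arrays of 0-1 values resampled onto a common grid), then stack
# the three channels into a natural-color image.
import numpy as np

def pseudo_green(red, blue, nir_086):
    return np.clip(0.45 * red + 0.45 * blue + 0.10 * nir_086, 0.0, 1.0)

def natural_color(red, blue, nir_086):
    return np.dstack([red, pseudo_green(red, blue, nir_086), blue])
```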

Are the GOES or Himawari images "composites"? They produce full-disk single-band images that are scanned and stitched, but I would not describe those as composites unless you are being particularly pedantic. Here's the red image:
Metabunk 2019-04-24 09-21-53.jpg

I'd be quite happy in describing that as a full-disk image of the Earth from space.

Once the individual band data is on the ground, it is used to make composite images referred to as "products". For example, the "GeoColor" product:
Metabunk 2019-04-24 09-25-41.jpg

Notice that half the planet is in night, and you can see clouds and city lights in the dark half - here the day and night parts of the image come from different bands. The night clouds probably come from the longwave radiation (outgoing heat) bands. The city lights, I think, come from a single static image, possibly the 2012 Night Lights image from the Blue Marble project. So that's a composite image. Other products composite images in different ways, including compositing the longwave clouds with the Blue Marble land image to produce an image with no night time.

And if you really want a non-scanned, non-stitched, and certainly non-composite image, the best source is DSCOVR's EPIC (Earth Polychromatic Imaging Camera), which has a more conventional 2048x2048 pixel CCD, although it uses a filter wheel to get red, green, and blue bands. But a single band from EPIC is, in fact, a photo of the Earth from space by even the strictest definition (unless you only accept actual chemical film photos).
Metabunk 2019-04-24 10-09-05.jpg
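You can pull the latest EPIC image yourself through NASA's public API (see https://epic.gsfc.nasa.gov/about/api - the endpoint and archive URL layout below follow that documentation, but treat them as assumptions and check the docs if they've changed):

```python
# Fetch metadata for the most recent natural-color EPIC images, then
# download the newest frame from the image archive.
import requests

meta = requests.get("https://epic.gsfc.nasa.gov/api/natural", timeout=30).json()
latest = meta[-1]                                  # most recent entry in the list
y, m, d = latest["date"].split(" ")[0].split("-")  # date like "2019-04-24 00:31:45"
url = (f"https://epic.gsfc.nasa.gov/archive/natural/"
       f"{y}/{m}/{d}/png/{latest['image']}.png")

with open("epic_latest.png", "wb") as f:
    f.write(requests.get(url, timeout=60).content)
```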
 
I've copied the above post from https://www.metabunk.org/posts/230395/ because it pertains to this thread. Firstly, it explains how the images from the geostationary satellites are scanned in strips and then those strips are stitched together to form an image - somewhat analogous to a panorama photo taken with a phone.

Secondly, I briefly mentioned composite products, like the one with night lights and clouds visible in dark regions.


Full-disk images that show clouds even in the night half are another example of a type of image that HAS to be composite. Luckily, there's a full-disk band that always shows clouds - the longwave infrared. LWIR is kind of like a thermal camera, so even in the dark portions it can see the cold clouds against the warmer ocean or land.

Ten minutes ago the planet looked like this to Himawari-8 (with a map overlaid):
Metabunk 2019-04-29 12-23-37.jpg

But in LWIR it saw this:
Metabunk 2019-04-29 12-24-05.jpg
(notice the clouds match)

This can be combined with a map of the ground/ocean:
http://www.jma.go.jp/en/gms/largec.html?area=6&element=0&mode=UTC&time=201904291910&line=0
Metabunk 2019-04-29 12-25-10.jpg
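A rough sketch of that kind of blend, with placeholder file names (a real product would calibrate brightness temperature properly):

```python
# Overlay LWIR-detected clouds on a day/night-independent base map.
# Assumes a raw-style LWIR image where cold cloud tops are dark, so we
# invert it to get a white-cloud opacity mask (skip the inversion if the
# source image already shows clouds as white).
import numpy as np
from PIL import Image

lwir = np.asarray(Image.open("himawari_lwir.png").convert("L"),
                  dtype=np.float32) / 255.0
base = np.asarray(Image.open("base_map.png").convert("RGB"),
                  dtype=np.float32) / 255.0   # same size as the LWIR image

alpha = (1.0 - lwir)[..., None]               # cold (dark) -> opaque white cloud
composite = base * (1.0 - alpha) + alpha      # white clouds over the map
Image.fromarray((composite * 255).astype(np.uint8)).save("ir_composite.png")
```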

This one "has to be photoshopped" because it's dark over Australia. But the visible light image does not have to be:
Metabunk 2019-04-29 12-27-36.jpg

Nor can you make the visible light image from the IR image by using something like a Blue Marble layer. The visible light image contains all kinds of detail in the cloud layer that is not in the IR image: there are various layers of clouds that don't show up well in IR, and then there are the shadows of the clouds.
Metabunk 2019-04-29 12-33-42.jpg

There's also a lot of detail on Earth's surface that is not in the Blue Marble image and that can be verified from the ground - things like forest fires, volcanoes, algae blooms, dust storms, pollution, vegetation changes, ice cover on lakes, etc. All things that are essentially impossible to fake.
 