1. Teertskcab

    Teertskcab New Member

    Another thing that comes up often is the question: "Why don't we ever see any other satellites and space debris that are supposedly orbiting the Earth?"

    I have my thoughts on this, namely that I'm not sure we should expect to see any, given how small they are in comparison. But we CAN see satellites from the ground, in the form of reflected sunlight, in a clear night sky with minimal light pollution, right? So why don't we ever see the same thing in these full-disc Earth images? And on that topic, why don't we see stars in them?
     
  2. cloudspotter

    cloudspotter Senior Member

    You can see satellites from the ground because they are lit by sunlight against a dark night sky. Any satellites in these images would be lit by the sun against the disk of the Earth, which is also lit by the sun, so they won't stand out.

    Camera exposure is set to capture a brightly lit Earth, so the stars will be too dim to show up.
     
    • Like Like x 2
    • Agree Agree x 1
  3. Mick West

    Mick West Administrator Staff Member

    • Like Like x 1
  4. Mick West

    Mick West Administrator Staff Member

    The DSCOVR satellite is very far away, so it has a much narrower field of view (a long lens), which makes it more capable of photographing stars, etc. However, it's still designed just for Earth. It did, though, take this long exposure of Jupiter and its moons:
    20170623-073303-khizc.
    Source: https://epic.gsfc.nasa.gov/galleries/2016/imaging_jupiter

    You can boost that image to maybe see some stars. Might just be sensor noise though.
    20170623-073440-r44nl.
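    For anyone who wants to try that for themselves, here's a minimal level-stretch sketch in Python (NumPy + Pillow); the filename is just a placeholder for a locally saved copy of the EPIC frame, and the cutoff is an arbitrary starting value:

```python
import numpy as np
from PIL import Image

# Load a locally saved copy of the EPIC frame as greyscale (hypothetical filename).
img = np.asarray(Image.open("epic_jupiter.png").convert("L"), dtype=np.float32)

# Boost the shadows: everything at or above the cutoff becomes full white,
# so faint point sources (stars, or possibly just sensor noise) show up.
cutoff = 20.0  # out of 255; tune to taste
boosted = np.clip(img * (255.0 / cutoff), 0, 255).astype(np.uint8)

Image.fromarray(boosted).save("epic_jupiter_boosted.png")
```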
     
    • Like Like x 1
  5. Astro

    Astro Active Member

    The sunlit earth is very bright and the exposures required to properly expose it without over-exposing it prevent us from seeing stars and satellites in the images, which are far too dim. Here's some math on that which I did a while ago. At the time, DSCOVR was 967,970 miles from earth. ISS, being one of the largest and brightest satellites, is still only about 357 feet long, or about 0.07 miles. That means it would have an angular size of about 0.012785 arcseconds from DSCOVR. Earth has an equatorial diameter of about 7,926 miles, and in this image that length occupies about 1548 pixels, for a resolution of about 5 miles per pixel. That means even ISS would be very much a point-like source of light at best. We can therefore treat it as a star-like object.

    How bright would the "star" of the ISS be from this distance? Well, at a perigee of 402 km altitude and 100% illuminated, it has an apparent magnitude of -5.3.
    H = m - 2.5*log10( (distance to ISS)^2 * (distance of ISS from Sun)^2 / AU^4 )
    where H is the absolute magnitude and m is the apparent magnitude, with distances in the same units as the AU. (This assumes a phase function of 1, i.e. fully illuminated.)
    Plugging in, we get a solar-system absolute magnitude for the ISS of H = 22.55 (the same metric used to calculate the expected brightness and size of asteroids).

    Now we can reverse this and solve for the apparent magnitude of the ISS at a given distance, such as the distance of DSCOVR, assuming a best-case scenario where the station is fully illuminated. Given that DSCOVR is about 967,970 miles from Earth, that works out to 1,557,797 km. Plugging that in, we get an apparent magnitude for the ISS of 12.6. That's well over a hundred times dimmer than the dimmest star that can be seen with the naked eye. DSCOVR's EPIC camera uses fast, short exposures to properly expose the bright daylit Earth. That's too fast to detect stars, let alone satellites, even one as large and bright as the ISS. You would need a very long, deep exposure just to detect the brightest satellites, and the view would be filled with stars. The problem is that the Earth is too bright; the glare would blind the camera long before it could get down to magnitude 12.
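    To make those numbers easy to check, here's the same calculation as a short Python sketch. The AU length and the magnitude-6.5 naked-eye limit are my own inputs rather than values from the post, and the formula is just the algebraically simplified form of the one above:

```python
import math

AU_KM = 149_597_870.7  # kilometres per astronomical unit (assumed value)

def absolute_mag(m, d_obs_km, d_sun_au=1.0):
    """H from apparent magnitude m; same as m - 2.5*log10(d^2 * D^2 / AU^4)."""
    return m - 5 * math.log10((d_obs_km / AU_KM) * d_sun_au)

def apparent_mag(H, d_obs_km, d_sun_au=1.0):
    """Apparent magnitude of an object of absolute magnitude H, fully illuminated."""
    return H + 5 * math.log10((d_obs_km / AU_KM) * d_sun_au)

H_iss = absolute_mag(-5.3, 402.0)              # ISS at mag -5.3 seen from 402 km away
print(round(H_iss, 2))                         # ~22.55

m_dscovr = apparent_mag(H_iss, 1_557_797.0)    # ISS seen from DSCOVR's distance
print(round(m_dscovr, 1))                      # ~12.6

# Compared with an assumed naked-eye limit of magnitude 6.5, the flux ratio is:
print(round(100 ** ((m_dscovr - 6.5) / 5)))    # ~286, i.e. "well over a hundred times" dimmer
```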
     
    • Like Like x 2
    • Agree Agree x 1
  6. Mick West

    Mick West Administrator Staff Member

    There's a GOES-16 viewer up and running, and it's pretty awesome, although it's a bit slow generating the zoomed-in animations.
    http://rammb-slider.cira.colostate.edu/
    20170825-125404-sakx7.

    I anticipate this being very useful in contrail tracking and explanation. In particular it should help people understand why grids sometimes form.
     
    Last edited: Aug 25, 2017
    • Like Like x 2
  7. Mick West

    Mick West Administrator Staff Member

    The above is likely getting a lot of traffic because of Hurricane Harvey, which is making landfall in Texas and is about to cause some serious flooding. So the site may speed up in a couple of days.
    20170825-130221-jb2mb.
     
    • Like Like x 2
  8. StarGazer

    StarGazer Member

    I made a simulation of the Lunar Transit as seen from the DSCOVR Satellite



    Camera Distance: 1 609 344 km (1 000 000 miles).
    Diameter of the Earth: 12 742 km (7917 miles)
    Diameter of the Moon: 3474 km (2158 miles)
    Distance between Earth and Moon: 384 400 km (238 855 miles)

    I'll be working on visually showing the angular diameters of the Earth and the Moon from that simulation.
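    In the meantime, here's a quick Python sketch of those angular diameters, using the figures above and assuming, for simplicity, that the Moon sits directly on the camera-Earth line during the transit:

```python
import math

def angular_diameter_deg(diameter_km, distance_km):
    """Full angular diameter, in degrees, of a sphere seen from a given distance."""
    return 2 * math.degrees(math.atan((diameter_km / 2) / distance_km))

camera_to_earth = 1_609_344                     # km (1,000,000 miles)
camera_to_moon = camera_to_earth - 384_400      # Moon between the camera and Earth

print(round(angular_diameter_deg(12_742, camera_to_earth), 3))  # Earth: ~0.454 deg
print(round(angular_diameter_deg(3_474, camera_to_moon), 3))    # Moon:  ~0.162 deg
```

    So in such a view the Moon should appear a bit over a third of the Earth's apparent diameter.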
     
    Last edited: Jan 15, 2018
  9. Agent K

    Agent K Active Member

    Live views of SpaceX Starman

    [IMG]

    "Fisheye lens", right? Wrong. One minute later:

    [IMG]

    [IMG]

     
    Last edited: Feb 7, 2018
  10. dc_hatman

    dc_hatman Member

    Probably off topic, but has that car being launched into space set some kind of new speed record for electric-powered automobiles? I know it wasn't being powered by its electric motors at the time, but surely that's just nitpicking.
     
  11. Tesla Roadster image and Himawari 8 :)

    [IMG]
     
    • Like Like x 2
  12. Trailblazer

    Trailblazer Moderator Staff Member

    • Like Like x 1
  13. Rory

    Rory Senior Member

    I saw a notice that says GOES-17 is due to become operational this month, replacing GOES-15.

    I'm looking for source imagery for the Pacific Coast on July 6th, 2015. This was the date of the first DSCOVR photo of the full disk of the Earth: the one that contains the infamous 'sex in the clouds'.

    I've found this:


    Source: https://i.imgur.com/ul5DuJM.mp4


    But it'd be nice to trace it back to the root.
     
    Last edited: Jan 15, 2019
  14. This page should give you what you're looking for:

    https://www.ssec.wisc.edu/datacenter/goes-archive/#GOES15

    It will take you to a search page where you can be as broad or as specific as you like.
     
  15. Rory

    Rory Senior Member

  16. Tumeni

    Tumeni New Member

    Given that neither you nor I have actually travelled there, I would suggest you defer to those who have.

    In recent months, there has been a joint Chinese/Saudi mission which returned a new Earthrise photo.
    The Chinese have landed on the far side, and prior to this, placed a comms satellite in a halo orbit for signal relay purposes.

    If the accepted distance were wrong, dontcha think someone would have noticed, when their craft missed the Moon?


    Of course, there are other methods of confirming the distance to the Moon:

    - Laser ranging
    - Radio signal ranging
    amongst others

    Lastly, there are the Italian (?) schoolchildren who took the unedited Apollo voice transmissions and, using the delay in the signal and the speed of radio waves, calculated the accepted distance from those.
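    The arithmetic behind that last method is simply distance = (signal speed x round-trip delay) / 2. Here's a rough sketch in Python; the ~2.6 s delay is my own ballpark figure for illustration, not a value taken from those recordings:

```python
C_KM_S = 299_792.458       # speed of light / radio waves, km per second

round_trip_delay_s = 2.6   # assumed Earth -> Moon -> Earth signal time (illustrative)
distance_km = C_KM_S * round_trip_delay_s / 2

print(round(distance_km))  # ~389,730 km, in the ballpark of the accepted ~384,400 km
```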

    So many agreements with the "accepted distance", and just you out of step....
     
  17. Mendel

    Mendel Member

    It's not about the distance, it's about the legitimacy of the satellite images. If there is an image that's "impossible", it serves as evidence for the space conspiracy. There must be a reasonable explanation for it.

    First, some corrected values:
    R=3963 miles (semi-major axis, since we're on or close to the equator)
    b=222422 miles (Earth-Moon distance, reflecting the eccentricity of the lunar orbit; actual value for time/date, source: mooncalc)
    B'=B+0.24° (adding half the angular size of the moon, estimated by measuring the full disc at 9cm and the moon at 0.5cm on my screen)
    Note also that the latitude of Himawari 8 varies by +/- 0.04° (Satellite motion charts for HIMAWARI-8). I was unable to find the actual position for that date.

    But even with these corrections, angle A is still approximately 20 arcseconds too large.
    To take up the excess, the Moon needs to be further away from the satellite. Waiting 1 minute 40 seconds puts the Moon at 31.46°W, and everything matches (+/- 15 seconds for the orbital variation of Himawari 8). (central angle calculator)

    How could this occur?
    I know that Himawari 8 is on a 20 minute cycle, i.e. the data collection schedule repeats every 20 minutes. I also know that Himawari images are published with an apparent 20 minute delay. My hypothesis is that the image is stamped with the start time of the data cycle, but actually collected and transmitted after this starting time.

    How could this be corroborated?
    a) ask the Japanese Meteorological service
    b) find the sensor skew due to Earth's rotation in the full disc image, and compute the time it takes to collect
     
    Last edited: Feb 2, 2019
    • Like Like x 1
  18. Rory

    Rory Senior Member

    I was waiting for a Himawari-8 live image the other day and I think it was a 10 minute delay.

    Just checked now and it was 24/25 minutes.
     
    Last edited: Feb 2, 2019
    • Like Like x 1
  19. Agent K

    Agent K Active Member

    Last edited: Mar 24, 2019
  20. Rory

    Rory Senior Member

    Here's a great video from Scott Manley, talking about a large asteroid that burned up over the Bering Sea:


    Source: https://www.youtube.com/watch?v=fpaxvjFh-qA

    The event wasn't discovered until some time afterwards, given that it happened almost literally in the middle of nowhere, but the cool thing is that when satellite images from that timeframe were examined, the asteroid could be seen, as well as its trail. Himawari got some good shots, and other weather satellites captured it too.
     
  21. Learjet

    Learjet New Member

    Hi guys. Interesting thread. I'm an electronics tech, amongst other things. One of the fun things I like to do is receive weather satellite images. I live in cyclone territory and like to have live access to satellite weather data rather than relying on the internet; this is useful for tracking cyclones when the internet goes down. I have access to the NOAA APT sats, the Russian Meteor M2 and, more recently, Himawari 8.

    I've attached an image I received yesterday. You can see Cyclone Veronica in WA and ex-cyclone Trevor in the NT. This was received with a 2.3m dish and a Novra S300D DVB-S2 receiver, via Jcsat2B. The image is a 9-band composite, made from one visible and 8 infrared bands. Single bands, of course, are monochrome. Xrit2pic was used to combine the bands.

    Looks like it's spherical. ;) Anything you want to know just ask.
    190323_2320_J.
     
    • Like Like x 2
    • Informative Informative x 1
  22. Mendel

    Mendel Member

    @Learjet, if it's one visible band, why is the surface blue and green and brown? And also, I thought Himawari had three visible-band sensors?
     
  23. Learjet

    Learjet New Member

    This is a false-colour palette. Green is vegetation from one of the IR channels, land is from a different IR channel, and the visible band at 644nm is mapped to blue, even though 644nm is actually red. Confusing, isn't it? Himawaricast via Jcsat2B doesn't include green or blue sensor data for some reason, so a false-colour palette has to be created.
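    For anyone curious what that mapping looks like in practice, here's a minimal NumPy sketch of assigning monochrome bands to display channels. The band choices and the simple min/max scaling are illustrative assumptions, not the actual xrit2pic palette:

```python
import numpy as np

def normalise(band):
    """Scale one monochrome band to the 0..1 range for display."""
    band = band.astype(np.float32)
    return (band - band.min()) / (band.max() - band.min() + 1e-6)

def false_colour(vegetation_ir, land_ir, visible_644nm):
    """Map three monochrome bands onto the R, G, B display channels."""
    rgb = np.stack([
        normalise(land_ir),        # red channel   <- an IR "land" band
        normalise(vegetation_ir),  # green channel <- an IR band sensitive to vegetation
        normalise(visible_644nm),  # blue channel  <- the 644 nm visible band
    ], axis=-1)
    return (rgb * 255).astype(np.uint8)

# Stand-in data; real use would load the decoded HRIT band images instead.
bands = [np.random.rand(550, 550) for _ in range(3)]
print(false_colour(*bands).shape)  # (550, 550, 3)
```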
     
    • Informative Informative x 1
  24. Rory

    Rory Senior Member

    What's the time difference between you receiving the images and them being available on the Himawari website?
     
  25. Learjet

    Learjet New Member

    I have the images in about 15 minutes; it takes about that amount of time to receive all the data. For instance, for the 12pm image, data transfer starts at about 12:05 and ends at 12:15. There are 140-150 HRIT files, which adds up to nearly 500 MB! That's about 3 GB per hour. On top of that, LRIT files are generated, along with SATAID files.
     
  26. Mendel

    Mendel Member

    I probably should correct my calculation post above, where I stated that Himawari is on a 20-minute cycle; the full-disk observation actually repeats every 10 minutes.
    ( https://www.data.jma.go.jp/mscweb/en/himawari89/space_segment/spsg_ahi.html )
    image.

    This is the system @Learjet is using:
    image.
    https://www.data.jma.go.jp/mscweb/en/himawari89/himawari_cast/himawari_cast.php

    https://www.data.jma.go.jp/mscweb/e...ast/note/HimawariCast_dataset_20150624_en.pdf
     
  27. Learjet

    Learjet New Member

    Yes, Mendel, that's the method I use. I get parts of an image for each band first, which after 10 minutes make up a complete image.
     
  28. Rory

    Rory Senior Member

    Like a lot of people, I used to think that full-disk images from Himawari-8 were taken with a single shot, rather than being 'composites'. But what I've recently learned is that its camera takes twenty-three 500km-wide 'swaths' in order to make up a full disk:
    It also takes another eight swaths that zoom into specific areas, giving 50ish images every ten minutes, from 87 swaths:

    7387346_orig-1024x643.
    Source: http://spaceflight101.com/spacecraft/himawari-8-and-9/

    I say "50ish" because I'm not quite sure how they've tallied the numbers. The above link states "49 images every ten minutes" but the breakdown seems to show 53.

    There's a couple more links here detailing the process:

    https://www.data.jma.go.jp/mscweb/en/VRL/VLab_RGB/RGBimage.html
    https://www.wired.com/2015/08/americas-next-best-weather-satellite-japan-already/
     
    • Informative Informative x 1
  29. Mick West

    Mick West Administrator Staff Member

    Here's a video showing a possible scan sequence. It's interesting that it scans the smaller regions in the middle of the full-disk scan. But that makes sense, as it's doing the full-disk scan over ten minutes, and the smaller regions are scanned multiple times during those ten minutes.

    Source: https://www.youtube.com/watch?v=sFg_VpY3e0g


    The video was made before deployment, so the actual scan sequence could vary.

    I think most satellite cameras employ some kind of scanning technique, which makes sense as they can get a higher resolution that way, and they don't need to capture fast action. I suppose you'd get some objection that "it's all CGI" because it's scanned, but that's rather a semantic argument. Lots of phone cameras use a "rolling shutter" where they "scan" the image very rapidly. Rolling shutter is even used in some film cameras, where (for very short exposures) only one narrow strip of film is exposed at a time (a very short time, but still the same concept).

    The DSCOVR images, however, are "single images" using a single 2048x2048 CCD sensor.
    https://epic.gsfc.nasa.gov/about/epic

    So if there's some objection to scanned images, you can always show them DSCOVR: single images that show the full disk of a rotating Earth with changing weather.
    https://epic.gsfc.nasa.gov/
    Metabunk 2019-04-06 09-33-36.
    Photo from 2019-04-04 18:46:25 (UTC, I think)
     
  30. Agent K

    Agent K Active Member

    • Like Like x 3
  31. Learjet

    Learjet New Member

    About Himawari 8 scanning: with the data that comes down, I receive 10 north/south segment "swaths" per image in HRIT format for each band. There are 14-15 bands available from Himawaricast and 16 from the net version. I don't know why they leave out green and blue for Himawaricast, since I have to use a false-colour IR palette rather than RGB for a colour image, but I digress.

    14 x 10 = 140 HRIT files / part-images every 10 minutes for the full disc alone from Himawaricast. I can sometimes see a slight delay between segments: especially when the Moon finds itself straddling two segments, it is shifted slightly.

    So for a full-disc monochrome image, 10 segments are needed. For a 3-band colour image, 30 segments/strips/swaths/etc. are used. I sometimes use a 9-band composite with various transparency settings, as it gives more cloud layers, so that's like 90 segments to make a single image. The curve, of course, is clearly seen in every segment anyway.
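    Here's a minimal Python/NumPy sketch of that stitching, purely to illustrate the bookkeeping; the segment sizes are placeholders, not the real HRIT dimensions:

```python
import numpy as np

def stitch_band(segments):
    """Join the 10 north/south HRIT segments of one band, top to bottom."""
    return np.concatenate(segments, axis=0)

def three_band_colour(band_r, band_g, band_b):
    """Stack three stitched bands into an (H, W, 3) colour image."""
    return np.stack([band_r, band_g, band_b], axis=-1)

# Stand-in data: 10 segments of 55 rows x 550 columns per band.
def fake_segments():
    return [np.random.rand(55, 550) for _ in range(10)]

full_disc = stitch_band(fake_segments())                                       # one band
colour = three_band_colour(*(stitch_band(fake_segments()) for _ in range(3)))  # three bands
print(full_disc.shape, colour.shape)   # (550, 550) (550, 550, 3)
```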
     
  32. Rory

    Rory Senior Member

    For the laymen, would you describe Himawari-8 full-disk images as "composites"?
     
  33. Mick West

    Mick West Administrator Staff Member

    I think there's a danger of confusion in asking if something fits the meaning of a word without actually specifying the meaning of the word. This is especially true when you know that the Flat Earth and Fake Space folk want to claim that all space images are fake. If you say an image is a "composite" then that could be interpreted as saying it's fake.

    All the satellites mentioned in this thread use a monochrome camera (or cameras) and filters (and/or beam splitters). For example, here's the GOES-16/17 setup:
    Metabunk 2019-04-24 08-45-02.
    Source: https://www.goes-r.gov/downloads/resources/documents/GOES-RSeriesDataBook.pdf

    The visible light images come from the VNIR (Visible & Near Infrared) sensor on the right, which is actually responsible for six bands.

    So if you are seeing a color image then it's a "composite" of two or more channels (bands, wavelengths, colors).

    Most of them "scan" the image in a couple of ways. Firstly, they divide a region into "swathes" which they image individually and then stitch those swathes into a set of single-band images (which can then be combined into color images, or processed to highlight or measure certain things, like cirrus clouds). This is somewhat analogous to the "panorama" mode on a phone camera (but with less overlap). This large scale scanning is done by aiming with two mirrors which obviously can be moved much easier than the camera itself.

    Then, many of the satellites "scan" the image in a way that's more like a simple scanner. Instead of the full image being registered on a single CCD, as in a conventional camera, the satellite imagers have "Focal Plane Modules" which have multiple columns of pixels, different columns for different bands. As the east/west mirror moves across the image, this registers one column at a time:
    Metabunk 2019-04-24 08-55-58.
    Here's the resolution of the different channels:
    Metabunk 2019-04-24 08-57-03.
    While they have multiple columns, only one pixel per row is used (it picks the best one). The first two are visible wavelengths. The highest-resolution one is the 0.64µm (640nm, red) channel at 1460 pixels high. The lower-resolution 0.47µm channel is blue. Notably absent is green (which would be around 0.51µm). So you can't get a natural color image by combining red, green, and blue (which is what most cameras do). Green has to be calculated from the other channels, maybe including the IR 0.86µm band, which detects vegetation, as a partial green band. (Himawari-8 does have a green band.)
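    As a rough illustration of what "calculating green" can mean, here's a hedged Python/NumPy sketch that blends the red (0.64µm), blue (0.47µm) and "veggie" near-IR (0.86µm) bands into a synthetic green channel. The weights are illustrative placeholders; operational true-colour products use carefully derived coefficients (and Himawari's real green band) rather than these values:

```python
import numpy as np

def synthetic_green(red, blue, veggie_nir, w_red=0.45, w_blue=0.45, w_nir=0.10):
    """Approximate a green reflectance band as a weighted blend of other bands.
    The weights here are illustrative assumptions, not operational coefficients."""
    return w_red * red + w_blue * blue + w_nir * veggie_nir

def true_colour(red, blue, veggie_nir):
    """Build an RGB array from red, synthesised green, and blue reflectances (0..1)."""
    green = synthetic_green(red, blue, veggie_nir)
    return np.clip(np.stack([red, green, blue], axis=-1), 0.0, 1.0)

# Stand-in reflectance data in place of real calibrated band images.
r, b, nir = (np.random.rand(500, 500) for _ in range(3))
print(true_colour(r, b, nir).shape)   # (500, 500, 3)
```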

    Are the GOES or Himawari images "composites"? They produce full-disk single-band images that are scanned and stitched, but I would not describe those as composites unless you are being particularly pedantic. Here's the red image:
    Metabunk 2019-04-24 09-21-53.

    I'd be quite happy in describing that as a full-disk image of the Earth from space.

    Once the individual band data is on the ground, it is used to make composite images referred to as "products". For example, the "GeoColor" product:
    Metabunk 2019-04-24 09-25-41.

    Notice half the planet is in night, and you can see clouds and city lights in the dark half; here the day and night parts of the image come from different bands. The night clouds probably come from the longwave (outgoing heat) bands. I think the city lights are from a single static image, possibly the 2012 Night Lights image from the Blue Marble project. So that's a composite image. Other products composite images in different ways, including compositing the longwave clouds with the Blue Marble land image to produce an image with no night time.

    And if you really want a non-scanned, non-stitched, and certainly non-composite image, the best place is DSCOVR, with the EPIC (Earth Polychromatic Imaging Camera), which has a more conventional 2048x2048-pixel CCD, although it uses a filter wheel to get red, green, and blue bands. But a single band from EPIC is, in fact, a photo of the Earth from space by even the strictest definition (unless you only accept actual chemical-film photos).
    Metabunk 2019-04-24 10-09-05.
     
    • Informative Informative x 2