The Cooperative Institute for Research in the Atmosphere (CIRA) in Fort Collins, Colorado, the Cooperative Institute for Meteorological Satellite Studies (CIMSS) in Madison, Wisconsin, and the Naval Research Laboratory (NRL) in Monterey, California are developing and distributing the Simulated True Color product.
The Simulated True Color products are sent to the National Weather Service (NWS) Regional Headquarters from which they are distributed to Weather Forecast Offices (WFOs) for display on their local AWIPS systems. Imagery updates are available approximately two times per day from the MODIS sensors on board Terra (~10:30 AM local time) and Aqua (~1:30 PM local time).
The size of Simulated True Color product images is determined by the span and resolution of the AWIPS domain itself. Since current AWIPS system displays accommodate 1 byte per pixel, a good rule of thumb is that the size of the imagery (in bytes) corresponds roughly to the total number of pixels in a given AWIPS domain. For example, an AWIPS domain having dimensions of 1000 x 1000 pixels will require approximately 1 Megabyte (~10^6 bytes).
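As a back-of-the-envelope check, the sizing rule above can be expressed in a few lines. The function name and the Python framing are ours, purely for illustration:

```python
# Rough AWIPS image-size estimate: current displays use 1 byte per pixel,
# so size in bytes is approximately the total pixel count of the domain.
def image_size_bytes(width_px: int, height_px: int, bytes_per_pixel: int = 1) -> int:
    """Approximate storage for one image over an AWIPS domain."""
    return width_px * height_px * bytes_per_pixel

size = image_size_bytes(1000, 1000)
print(f"{size} bytes (~{size / 1e6:.1f} MB)")  # 1000000 bytes (~1.0 MB)
```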
True Color imagery approximates the response of normal human vision, providing a depiction of the satellite-observed scene. The purpose is to provide a visually intuitive depiction, useful to experts and non-experts alike, that improves the interpretation of features such as vegetation, water bodies, clouds, snow, and deserts by rendering them in their natural colors. For this reason, ‘true color’ and ‘natural color’ are often used interchangeably when referring to this kind of popular satellite imagery.
Simulated True Color demonstrates the kind of imagery that will be possible in the GOES-R era. GOES-R will feature the Advanced Baseline Imager (ABI) sensor which when combined with complementary data from the Visible/Infrared Imager/Radiometer Suite (VIIRS) on the National Polar-orbiting Operational Environmental Satellite System (NPOESS) satellites will be able to produce versions of the imagery shown here at much higher time resolution.
By combining satellite measurements from bands having responses similar to those of the human retina (namely, blue, green, and red bands), we can create an image that appears as a color photograph. The bands are corrected for atmospheric scattering (otherwise the imagery would have a milky appearance, due to the effects of Earth’s atmosphere, the same scattering that accounts for blue skies during daylight hours). The challenge for the GOES-R ABI is that it will not include one of the required bands: green. We can overcome this challenge by forming a relationship between the missing green band and nearby bands that we do have. The relationship is derived from a satellite dataset that contains all the ABI bands as well as the green band (here, from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument). Once the relationship is determined, we can use the bands the ABI will have to approximate the green band that it will not. This is why we call it “simulated” true color as opposed to simply “true color.” The basic concept of the approach is illustrated below.
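The regression idea described above can be sketched as follows. Everything here is illustrative: the reflectance values are synthetic, and the linear model merely stands in for whatever relationship an actual MODIS training dataset would yield.

```python
import numpy as np

# Sketch of the "simulated green" idea: fit a linear relationship between
# bands ABI will carry (hypothetical blue/red/NIR reflectances here) and
# the green band ABI lacks, using a training dataset (e.g., MODIS) that
# contains all four. Data and coefficients are illustrative only.
rng = np.random.default_rng(0)
n = 10_000
blue = rng.uniform(0.0, 0.6, n)
red = rng.uniform(0.0, 0.6, n)
nir = rng.uniform(0.0, 0.8, n)
# Stand-in "truth" for training: green correlated with the other bands.
green_true = 0.45 * blue + 0.45 * red + 0.10 * nir + rng.normal(0, 0.005, n)

# Least-squares fit: green ~ a*blue + b*red + c*nir + d
A = np.column_stack([blue, red, nir, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(A, green_true, rcond=None)

# Apply the fitted relationship wherever only the ABI-like bands exist.
green_sim = A @ coeffs
rms = np.sqrt(np.mean((green_sim - green_true) ** 2))
print(f"RMS error of simulated green: {rms:.4f}")
```

In practice the fit would be built from collocated MODIS observations and applied per scene; the simulated green is then composited with the real blue and red bands to form the RGB image.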
Among 3-color composite imagery, True Color requires perhaps the least amount of user training from the standpoint of interpreting colors. To reasonable approximation, these images resemble what an astronaut aboard the Space Shuttle or International Space Station might observe from a few hundred miles above the surface.
Minor exceptions to this statement are that the satellite sensor does not replicate the color response of human vision perfectly, and that we have taken the additional step of subtracting away the atmospheric signal, which would otherwise contribute a milky “haze” appearance due to the scattering of sunlight, an effect that is particularly strong along slant-path views through the atmosphere (e.g., toward the horizon).
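A minimal sketch of the haze-subtraction idea, assuming a simple first-order model in which the Rayleigh contribution scales with the slant path (airmass) through the atmosphere; the zenith-path reflectance values below are placeholders, not operational coefficients:

```python
import numpy as np

# Illustrative zenith-path Rayleigh reflectances: scattering is strongest
# at short (blue) wavelengths, which is what causes the milky haze.
RAYLEIGH_ZENITH = {"blue": 0.12, "green": 0.06, "red": 0.03}

def deshaze(reflectance, band, view_zenith_deg):
    """Subtract an approximate Rayleigh contribution for one band."""
    # Slant-path amplification: at 60 deg the path is twice the nadir path.
    airmass = 1.0 / np.cos(np.radians(view_zenith_deg))
    corrected = reflectance - RAYLEIGH_ZENITH[band] * airmass
    return np.clip(corrected, 0.0, 1.0)  # keep reflectance physical

print(deshaze(0.30, "blue", 0.0))   # nadir view: modest correction
print(deshaze(0.30, "blue", 60.0))  # slant view: much larger correction
```

Operational corrections use full radiative-transfer treatments that also account for solar geometry; this sketch only conveys why the haze signal grows toward the horizon.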
Despite the visually intuitive nature of the imagery, interpreting various cloud structures, snow cover, and complex topography is inherently challenging, due simply to the way these features appear from the satellite vantage point. The examples below demonstrate how true color depicts lush green deciduous forests over Pennsylvania, the tan sediment-laden waters of the Mississippi River delta, and turquoise shallow-water sand bars in the Caribbean.
The main advantage of true color imagery over various false color or gray-scale imagery enhancements is the intuitive familiarity of the colors as they relate to physical components of the scene, which reduces the amount of training required to make use of the imagery.
The main limitation at this time, one that will likely go away with the installation of AWIPS-II, is the restriction to 8-bit color (256 colors) when representing a scene that draws (potentially) from a 24-bit palette (256^3, or almost 17 million, possible colors). The best approach to dealing with this issue would be to use a unique 8-bit color palette for each new true color image, but this is impractical. The next best approach may be for developers to build palettes that are regionally and seasonally dependent, and install/update these palettes at specific forecast offices.
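To make the 24-bit versus 8-bit trade-off concrete, here is a sketch of the simplest scene-independent mapping, a fixed 3-3-2 bit allocation (8 red levels x 8 green levels x 4 blue levels = 256 colors). The function is our own illustration; a regionally and seasonally tuned palette of the kind suggested above would render the same scene far better.

```python
import numpy as np

# Map 24-bit RGB pixels onto a fixed 256-entry palette by keeping the
# top 3 bits of red, 3 of green, and 2 of blue (human vision is least
# sensitive to blue, so blue gives up the extra bit).
def rgb_to_332(rgb):
    """Map an array of 8-bit RGB triplets to single-byte 3-3-2 indices."""
    r = rgb[..., 0] >> 5          # 8 red levels
    g = rgb[..., 1] >> 5          # 8 green levels
    b = rgb[..., 2] >> 6          # 4 blue levels
    return (r << 5 | g << 2 | b).astype(np.uint8)

pixels = np.array([[255, 255, 255], [0, 0, 0], [200, 120, 40]], dtype=np.uint8)
print(rgb_to_332(pixels))  # one byte per pixel, 256 possible values
```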