Product Information:
Who is developing and distributing this product?
The Cooperative Institute for Research in the Atmosphere (CIRA) in Fort Collins, Colorado, together with the NOAA/NESDIS RAMM Branch, is developing and distributing the Snow/Cloud Discrimination product.
Who is receiving this product, and how?
The Snow/Cloud Discrimination product is created at CIRA and sent to NWS Regional Headquarters, from where it is distributed to the WFOs as a product on their AWIPS systems.
What is the product size?
The size of one east or west Snow/Cloud Discrimination product is ? MB, with updates available every 15 minutes at 4 km spatial resolution from current GOES imagery.
Product Description:
What is the purpose of this product?
The Snow/Cloud Discrimination product, developed at CIRA, is demonstrated on the RAMSDIS Online web page for both GOES-West and GOES-East. The product displays standard GOES Imager data in a unique way, using a Red-Green-Blue (RGB, or 3-color) combination of images and image products to produce a single daytime-only Snow/Cloud Discrimination product. Inputs are the 3.9 µm (shortwave) window band, the 10.7 µm (longwave) infrared window band, and the 0.7 µm (visible) window band from the GOES Imager. Only the infrared window band is used directly, whereas the visible and shortwave bands are first converted into the Visible Albedo and Shortwave Albedo products, respectively.
Why is this a GOES-R Proving Ground Product?
The Snow/Cloud Discrimination product demonstrates a unique kind of imagery that is already available but under-utilized, and that will continue as a product in the GOES-R era. GOES-R will feature the Advanced Baseline Imager (ABI) sensor, which will be able to produce a version of the Snow/Cloud Discrimination product at both higher spatial resolution (2 km) and higher temporal resolution (5 min).
How is this product created now?
Here is a brief description of how the Snow/Cloud Discrimination product is created from GOES Imager data:
The Snow/Cloud Discrimination product can be computed from three window bands (longwave infrared, shortwave infrared, and visible) available on nearly all operational geostationary and polar-orbiting satellite imagers, such as GOES and MODIS.
Since the Snow/Cloud Discrimination product is a combination of three images, two of which are derived image products, each of those images or products is explained separately as follows:
1) For the shortwave infrared window band, a simple algorithm is used to compute the reflected (only) component of the shortwave (3.9 µm) infrared window band, which normally consists of both emitted and reflected energies. The emitted component is removed through the use of the longwave (10.7 µm) infrared window band, by computing the shortwave-equivalent radiance from the longwave radiance through simple Planck function relationships. (This part of the Snow/Cloud Discrimination product treats the shortwave infrared window band in the same way as in the Low-Cloud/Fog product, another Proving Ground product that is also available from CIRA, for which a product description is available.)
The reflected component of the shortwave infrared window band is key to the product. In the shortwave portion of the spectrum, ice clouds and snow are not very reflective, unlike water-droplet clouds, which are highly reflective. This characteristic leads to an easy way to distinguish between ice and water clouds. Although this distinction is seen in the shortwave infrared window band alone, subtracting the emitted component of the shortwave highlights the reflected component for the user of this product. A zenith-angle correction is also applied to each pixel of the product, to further enhance the parts of the image with low sun angles. A minimal sketch of this computation is given below; an example of this shortwave component is given in Figure 2.
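To make the processing concrete, here is a minimal sketch of the shortwave-albedo step, assuming radiances expressed per micrometre; the solar-irradiance constant and the clipping bounds are illustrative assumptions, not the operational CIRA values.

```python
import numpy as np

# Physical constants (SI units)
H = 6.626e-34    # Planck constant [J s]
C = 2.998e8      # speed of light [m s^-1]
KB = 1.381e-23   # Boltzmann constant [J K^-1]

def planck_radiance_um(wavelength_um, temp_k):
    """Planck spectral radiance in W m^-2 sr^-1 um^-1."""
    lam = wavelength_um * 1e-6                       # micrometres -> metres
    b = (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * temp_k))
    return b * 1e-6                                  # per metre -> per micrometre

# Assumed top-of-atmosphere solar spectral irradiance near 3.9 um
# [W m^-2 um^-1]; an illustrative value, not an operational calibration.
E_SUN_39 = 9.7

def shortwave_albedo(rad_39, bt_107, solar_zenith_deg):
    """Reflected-only component of the 3.9 um band, expressed as an albedo.

    rad_39 : observed 3.9 um radiance [W m^-2 sr^-1 um^-1]
    bt_107 : 10.7 um brightness temperature [K], used to estimate the
             *emitted* part of the 3.9 um signal via the Planck function
    """
    emitted = planck_radiance_um(3.9, bt_107)        # emitted component
    reflected = np.maximum(rad_39 - emitted, 0.0)    # reflected component
    mu0 = np.clip(np.cos(np.radians(solar_zenith_deg)), 0.05, 1.0)
    return np.clip(np.pi * reflected / (E_SUN_39 * mu0), 0.0, 1.0)
```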

Figure 2. Example of the Shortwave Albedo product, as a component of the Snow/Cloud Discrimination product, during the daytime over the eastern United States. This product depicts how the shortwave infrared window band is treated prior to being combined into the Snow/Cloud Discrimination product. This product is also known as the Low-Cloud/Fog Product, another Proving Ground Product available from CIRA. In this image, high clouds are color-enhanced, but that enhancement is not part of the Snow/Cloud Discrimination product.
2) For the visible window band, a simple algorithm is used to compute the zenith-angle-corrected reflectance at each pixel of the image, resulting in a lighter/brighter image that appears as if the sun is overhead at each pixel. This part of the Snow/Cloud Discrimination product treats the visible window band in the same way as what is known at CIRA as the Visible Albedo product. For a detailed description of the Visible Albedo product, see Kidder et al. (2000).
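A minimal sketch of this correction, assuming the visible band has already been calibrated to reflectance; the floor on the cosine term (to avoid division blow-up near the terminator) is an illustrative choice.

```python
import numpy as np

def visible_albedo(reflectance, solar_zenith_deg, mu0_floor=0.05):
    """Zenith-angle-corrected visible reflectance: dividing by
    cos(solar zenith angle) makes the scene appear as if the sun
    were overhead at every pixel."""
    mu0 = np.clip(np.cos(np.radians(solar_zenith_deg)), mu0_floor, 1.0)
    return np.clip(reflectance / mu0, 0.0, 1.0)
```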
An example of this visible component is given in Figure 3.

Figure 3. Example of the Visible Albedo product, as a component of the Snow/Cloud Discrimination product, during the daytime over the eastern United States. This product depicts how the visible window band is treated prior to being combined into the Snow/Cloud Discrimination product. This product is also known as the Visible Albedo Product.
3) The longwave infrared window band is used directly; there is no intermediate product, as there is for the other two components of the Snow/Cloud Discrimination product. An example of the longwave infrared window band is given in Figure 4.

Figure 4. Example of the longwave infrared window image, as a component of the Snow/Cloud Discrimination product, during the daytime over the eastern United States. In this image, high clouds are color-enhanced, but that enhancement is not part of the Snow/Cloud Discrimination product.
At this point the three components described above are combined using Red-Green-Blue techniques that are common in most imagery-manipulation software. The Visible Albedo is used as the Red component, the Shortwave Albedo is used as the Green component, and the longwave infrared window band is used as the Blue component. Figure 5 is a flowchart of the RGB process.
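A sketch of this compositing step is given below; the infrared scaling bounds, and the choice to invert the infrared band so that cold scenes map to large Blue values, are assumptions for illustration rather than the operational scaling.

```python
import numpy as np

def snow_cloud_rgb(vis_albedo, sw_albedo, bt_107, ir_range=(180.0, 300.0)):
    """Stack the three components into an RGB image (rows x cols x 3).

    Red   = Visible Albedo (0..1)
    Green = Shortwave Albedo (0..1)
    Blue  = longwave infrared window band, linearly scaled to 0..1
            (inverted here so cold scenes appear bright; an assumption)
    """
    t_min, t_max = ir_range
    blue = np.clip((t_max - bt_107) / (t_max - t_min), 0.0, 1.0)
    return np.dstack([np.clip(vis_albedo, 0.0, 1.0),
                      np.clip(sw_albedo, 0.0, 1.0),
                      blue])
```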

Figure 5. Flowchart of the Red-Green-Blue processing of the three components of the Snow/Cloud Discrimination product.
Product Examples and Interpretations:
Figure 6 is another example of the Snow/Cloud Discrimination product, but for the western U.S., for the same time as the first example in Figures 1 through 4. The component images of this example are not shown here, only the RGB combination that forms the Snow/Cloud Discrimination product.

Figure 6. Example of the Snow/Cloud Discrimination product during the daytime over the western United States. Note the high-level ice clouds, presented in bright magenta, covering much of the western U.S., with a few low clouds in off-white colors off of Baja and in eastern Oklahoma and western Arkansas. The snow-covered land portions of the image are presented in red, as opposed to the magenta of ice clouds. The distinction between the red (of snow) and the magenta (of ice clouds) is subtle, but is easily made with minimal training.
Advantages and Limitations:
The Snow/Cloud Discrimination product provides a visual display of different types of clouds, as well as snow coverage for the land areas that are not covered by cloud. The color enhancement clearly shows higher, colder ice clouds in magenta, distinct from lower clouds, which appear off-white. Low clouds are clearly differentiated from snow, which appears red due to its low reflectivity in the shortwave region and its cold temperature in the infrared.
The main disadvantage of this product is that snow-covered land cannot be seen when clouds obscure the land below. Some of this may be alleviated by observing a temporal loop of the Snow/Cloud Discrimination product, to see the ground as the clouds move. Product images are currently available every 15 minutes at 4 km spatial resolution from current GOES imagery.
The Snow/Cloud Discrimination product is generated both day and night, but with major deficiencies at night due to the lack of visible radiation, which eliminates the visible band and the Visible Albedo component and changes the character of the shortwave band and the Shortwave Albedo component. There is also a transition in the product as the day-night terminator approaches or leaves the image. At night the lack of visible radiation eliminates the red color for snow-covered land and, along with it, the distinction between high-level ice clouds and snow-covered ground.
Interactions with NWS users via the Proving Ground will assist algorithm developers in improving this product, such as the desired enhancements, product scaling, etc., to best tailor this unique application to the end-user needs. The development of future, improved products will also benefit from user feedback.
Product Information:
Who is developing and distributing this product?
The Cooperative Institute for Research in the Atmosphere (CIRA) in Fort Collins, Colorado, in partnership with the Naval Research Laboratory (NRL) in Monterey, California, is developing and distributing the GeoColor product.
Who is receiving this product, and how?
The GeoColor products are sent to the National Weather Service (NWS) Regional Headquarters from where they are distributed to participating Weather Forecast Offices (WFOs) for display on their local AWIPS systems.
What is the product size?
The size of one CONUS-scale image is 22 MB; updates are available every 30 minutes.
Product Description:
What is the purpose of this product?
The GeoColor satellite imagery product, developed at NRL and first demonstrated on the NexSat web page (www.nrlmry.navy.mil/NEXSAT.html), displays standard GOES data in a new way that includes customized day/night backgrounds and makes a seamless transition from daytime (visible) to nighttime (infrared) imagery.
Why is this a GOES-R Proving Ground Product?
GeoColor demonstrates the kind of imagery that will be possible in the GOES-R era. GOES-R will feature the Advanced Baseline Imager (ABI) sensor which, when combined with complementary data from the Visible/Infrared Imager/Radiometer Suite (VIIRS) on the National Polar-orbiting Operational Environmental Satellite System (NPOESS) satellites, will be able to produce versions of the colorful imagery shown in GeoColor without the need for some of the special blending techniques used here, and at higher spatial and temporal resolution.
How is this product created now?

Figure 2 illustrates the various components of the GeoColor imagery blending technique. In the foreground of this image are the GOES E/W satellite visible and infrared datasets (upper-most left and right panels of Fig. 2, respectively). For this image, which spans the full Continental U.S., time-matched (here, 0000 Greenwich Mean Time (GMT) on 14 September 2005) imagery from the Geostationary Operational Environmental Satellites GOES-East (hovering over the equator at 75°W) and GOES-West (135°W) is stitched together along the 100°W meridian. In this example, the eastern half of the United States lies in total darkness, while the western half remains illuminated by late afternoon sun.
Natural or "true" color backgrounds require channels that are not available from the current GOES. To simulate what GOES-R will provide, we apply a daytime background to the GOES visible imagery based on the NASA "Blue Marble" dataset (a composite of cloud-cleared imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS)). On the nighttime side, we have created a background for the GOES infrared based on the National Geophysical Data Center's (NGDC) 2003 "Nighttime Lights of the World" dataset (which uses the low-light imaging capabilities of the Defense Meteorological Satellite Program (DMSP) Operational Linescan System (OLS) satellite sensors). These backgrounds, which are shown in the middle left/right panels of Fig. 2, are 'static' in the sense that they will not capture a change in vegetation (e.g., a burn scar) or a nighttime power outage (resulting in city lights disappearing). In the GOES-R/NPOESS era this will no longer be the case.
The data are first blended vertically (the dynamic GOES cloud imagery serving as the foreground, overlaid upon the static day/night backgrounds) using scaled and normalized versions of the GOES data to define a transparency factor. For example, for visible imagery (daytime side), the transparency factor is smallest (i.e., most opaque) for the brightest GOES visible channel (channel 1; 0.65 µm) pixels, which are presumably highly reflective cloud or snow cover, effectively blocking out the Blue Marble background. The transparency factor is largest for the darker GOES visible channel pixels (e.g., clear-sky scenes over land and ocean), allowing more of the Blue Marble background to "bleed through" the satellite imagery foreground layer. Similarly, for the nighttime side, transparency factors are specified according to the brightness temperatures of the GOES infrared window channel (channel 4; 11.0 µm) pixels (e.g., cold pixels indicative of high/thick clouds are assigned a low transparency, and warmer pixels indicative of clear-sky scenes are assigned a higher transparency). This has the effect of making the thick/deep (cold) clouds opaque, and relatively thin clouds semi-transparent. An undesirable side effect of this scaling is that low/thick clouds (e.g., marine stratus/stratocumulus) may also appear semi-transparent by virtue of their relatively warm cloud-top temperatures. This is overcome in part by introducing additional GOES channels that are capable of identifying these clouds and artificially increasing their opacity in the GeoColor product. An example of this method applied to GeoColor low clouds at night is demonstrated in the Product Examples and Interpretations section below.
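As a rough illustration, the sketch below implements the daytime half of the vertical blend; the lo/hi bounds that map visible brightness to opacity are placeholder assumptions, and the nighttime side would work analogously with an inverted brightness-temperature scaling.

```python
import numpy as np

def vertical_blend_day(vis_refl, background_rgb, lo=0.05, hi=0.6):
    """Overlay GOES visible imagery on the static Blue Marble background.

    Bright pixels (reflective cloud or snow) get high opacity, blocking
    the background; dark pixels (clear sky) get low opacity, letting
    the background 'bleed through'. lo/hi are assumed scaling bounds.
    """
    alpha = np.clip((vis_refl - lo) / (hi - lo), 0.0, 1.0)  # opacity, 0..1
    foreground = np.dstack([vis_refl] * 3)                  # grayscale -> RGB
    return alpha[..., None] * foreground + (1.0 - alpha[..., None]) * background_rgb
```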
After the "vertical blending" step is completed for both the visible (daytime) and infrared (nighttime) images independently, the day and night images are blended horizontally along the solar terminator, using the solar zenith angle as a weighting term (top-center panel of Fig. 2, with white areas corresponding to twilight conditions). This technique results in a graduated transition between the daytime and nighttime blended products, and represents one of the unique elements of this GeoColor product.
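A minimal sketch of the horizontal (terminator) blend, assuming the vertically blended day and night composites are already in hand; the ramp edges in solar zenith angle are illustrative assumptions.

```python
import numpy as np

def terminator_blend(day_rgb, night_rgb, solar_zenith_deg,
                     day_edge=80.0, night_edge=90.0):
    """Blend day and night composites using solar zenith angle as the
    weight: fully 'day' below day_edge, fully 'night' above night_edge,
    with a smooth ramp across the twilight zone in between."""
    w_night = np.clip((solar_zenith_deg - day_edge) / (night_edge - day_edge),
                      0.0, 1.0)[..., None]
    return (1.0 - w_night) * day_rgb + w_night * night_rgb
```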
Product Examples and Interpretations:

Figures 3 and 4 demonstrate the performance of the GeoColor product for daytime and nighttime scenes using imagery from Hurricane Katrina.
During the day (Figure 3), the Blue Marble true-color background depicts blue ocean water in the Gulf of Mexico as well as light green shallow-water features (highly reflective sand/shoals) near Key West, Florida, and Cayo Largo (south of Cuba). Green vegetation dominates the land portions of the background scene. Close inspection reveals semi-transparent thin cirrus over Cuba, in contrast to the opaque clouds associated with Katrina's rain bands over southern Louisiana.
Figure 4. Hurricane Katrina moves closer to the shore in this nighttime-only composite. Purple terrain has been turned off in the background of this example. City lights offer additional information on the proximity of storm features to major metropolitan areas.
At night (Figure 4), the Blue Marble background is replaced by nighttime city lights, shown as yellow/orange patches (to simulate the appearance of the sodium lighting which dominates most urban lighting) corresponding to the major metropolitan areas. Again, note the transparency (city lights shining through cirrus) to the north of Katrina, in contrast to the opacity (coastal cities obscured by deep, cold clouds) closer to the storm's rain bands. These city lights provide useful additional information to the imagery analyst in terms of relating the current location of weather features to population centers and major transportation corridors (e.g., small towns along interstates often manifest themselves as linear features traced out in the nighttime lights background).

One of the inherent limitations of the visible/infrared version of GeoColor lies in the scaling for assignment of transparency. As explained previously, low/thick clouds at night may appear erroneously transparent in GeoColor due to their temperatures residing at the lower end of the scaling bounds. To overcome this problem requires the introduction of additional spectral information available from GOES. It turns out that the difference between two infrared channels (3.9 and 11.0 µm; GOES channels 2 and 4, respectively) provides an ability to identify these low-cloud/fog layers by virtue of differences in scattering properties between the two channels. The differences give rise to an apparent difference in the cloud-top temperature due to these radiative property differences (sometimes referred to as a "radiometric temperature"), and we can then enhance areas where these differences exceed a certain threshold.
Once we have isolated the low-cloud/fog layer in this way, we can introduce it as yet another layer in the vertical blending of GeoColor. In this case, we have chosen to place it in between the surface background and the standard infrared imagery. It makes physical sense to do so, since this is where low clouds exist in the atmosphere (below the high/thick clouds, and above the surface). Furthermore, since we have isolated this as a ‘low-cloud/fog’ layer, we can point it out as such in the GeoColor imagery by assigning it a color other than white/gray (which would have made it ambiguous with the high clouds). Figure 5 demonstrates the concept of introducing the low-cloud/fog layer to GeoColor, with a comparison of what information is lost if we do not.
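A minimal sketch of the detection-and-layering idea, assuming calibrated brightness temperatures; the threshold, layer color, and opacity are illustrative assumptions, and for brevity the flagged layer is blended directly rather than composited strictly beneath opaque high clouds.

```python
import numpy as np

def low_cloud_mask(bt_39, bt_110, threshold_k=2.0):
    """Flag probable nighttime low cloud/fog where the 11.0 - 3.9 um
    brightness temperature difference exceeds a threshold (water-droplet
    clouds emit less efficiently at 3.9 um, so they look 'colder' there)."""
    return (bt_110 - bt_39) > threshold_k

def insert_low_cloud_layer(night_rgb, mask, layer_color=(0.7, 0.7, 0.5),
                           opacity=0.8):
    """Blend a distinct (non-gray) color over flagged pixels so the
    low-cloud/fog layer is unambiguous against high clouds. A full
    implementation would composite this layer between the surface
    background and the infrared cloud layer."""
    out = night_rgb.copy()
    out[mask] = (1.0 - opacity) * out[mask] + opacity * np.asarray(layer_color)
    return out
```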
Advantages and Limitations:
The GeoColor technique provides a simple yet visually powerful mechanism for transitioning seamlessly between multiple sources of information both in the vertical and horizontal dimensions. Behind the scenes in the GeoColor algorithm itself, tunable scaling factors provide developers the flexibility to adjust the relative strength of transparency in both dimensions (i.e. providing control over the amount of information retained/lost during the blending operation). This technique results in dramatic improvement to the presentation quality of standard visible and infrared satellite imagery. The smooth transition from visible to infrared-detected clouds across the day/night terminator is superior to previous methods which either use infrared exclusively (to avoid the terminator problem) or invoke discrete cutoffs between the two kinds of imagery at a developer-defined location. Noteworthy in this regard is the image in Figure 1, which demonstrates a transition from infrared to visible imagery for frontal clouds extending from Arizona through Wisconsin (for clouds over the terminator, a blend of visible/infrared is used according to the product creation discussion above).
The main caveat users must always be aware of when using this product is that the backgrounds (i.e., the Blue Marble and the NGDC nighttime city lights) in the current GeoColor product are static. As such, almost all dynamics pertaining to the background will not be captured in this product: unless some aspect of the change is captured in the real-time GOES visible/infrared observations, it will not be represented. For example, seasonal changes in vegetation, power outages, river plumes, variation of nighttime land brightness with lunar phase, or impacts from a natural disaster may not be represented in real-time imagery produced by this method. Examples of background changes that will be detected are snow cover and sea ice during the daytime, which manifest as bright targets in the current GOES imagery. Although technically part of the 'background', since they are not atmospheric features, they will be revealed in the GeoColor imagery due to their high reflectance in the GOES daytime visible channel imagery. As an example of poor performance due to static backgrounds, the city lights of New Orleans appeared to shine brightly the night after Hurricane Katrina's passage over the city.
When this product is available in the actual GOES-R era, the backgrounds will be updated more often thanks to the instruments on the NPOESS satellites. Specifically, the backgrounds will be produced more frequently, and at higher fidelity, by the Visible/Infrared Imager/Radiometer Suite (VIIRS) sensors (which include the Day/Night Band for nighttime lights). More frequent updates will enable the capture of some of the dynamic background components mentioned previously, improving the product's representation of all components of the scene. As for the nighttime side of GeoColor, the GOES-R ABI provides 'low light' sensitivity, but the light levels in this case refer to twilight conditions, as opposed to natural/artificial terrestrial light emissions and lunar reflection at light levels several orders of magnitude fainter.
Finally, in discussing the example shown in Fig. 5, we have been careful to use the language "low-cloud/fog" instead of distinguishing between the two. The reason is that the simple GOES channel 2-4 difference cannot by itself distinguish between a low cloud and a fog layer (where the latter is simply a low cloud whose base is in contact with the surface). Techniques do exist that attempt to distinguish between the two, and in principle this information could be introduced within GeoColor as yet another 'layer' of imagery. Other examples of additional layers that could be added are lofted dust, volcanic ash, smoke, and even snow cover at night detected by way of moonlight reflection. Provided that there exists an algorithm capable of isolating these features, they can be converted into a distinct layer and incorporated as a unique piece of information within GeoColor. In this sense, GeoColor can be regarded as a platform for the simultaneous display of multiple pieces of information originating potentially from many different sources. Interaction with NWS users via the Proving Ground will assist GeoColor algorithm developers in determining which fields to include, the desired colors, scalings, etc., to best tailor this powerful application to the end-user needs.
WFO/Testbed Feedback:
Feedback Product | Feedback Date | User Office | Feedback Source | Feedback Text
---|---|---|---|---
GeoColor | 10/1/2011 | BOU | AFD (morning) | … RGL GEO-COLOR IR SATELLITE CURVE INDICATES A THIN VEIL OF CIRRUS OVER THE AREA ATTM …