
Regional and Mesoscale Meteorology Branch


GeoColor Imagery

Product Information:


The Cooperative Institute for Research in the Atmosphere (CIRA) in Fort Collins, Colorado, in partnership with the Naval Research Laboratory (NRL) in Monterey, California, is developing and distributing the GeoColor product.

The GeoColor products are sent to National Weather Service (NWS) Regional Headquarters, which distribute them to participating Weather Forecast Offices (WFOs) for display on their local AWIPS systems.

One CONUS-scale image is 22 MB; updates are available every 30 minutes.

Product Description:


The GeoColor satellite imagery product, developed at NRL and first demonstrated on the NexSat web page, displays standard GOES data in a new way that includes customized day/night backgrounds and makes a seamless transition from daytime (visible) to nighttime (infrared) imagery.

GeoColor demonstrates the kind of imagery that will be possible in the GOES-R era. GOES-R will carry the Advanced Baseline Imager (ABI) sensor; when combined with complementary data from the Visible/Infrared Imager/Radiometer Suite (VIIRS) on the National Polar-orbiting Operational Environmental Satellite System (NPOESS) satellites, the ABI will be able to produce versions of the colorful imagery shown in GeoColor at higher spatial and temporal resolution, without the need for some of the special blending techniques used here.


Fig. 2. Illustration of the five primary components contributing to the blended GeoColor imagery.

Figure 2 illustrates the various components of the GeoColor imagery blending technique. In the foreground of this image are the GOES East/West visible and infrared datasets (upper-left and upper-right panels of Fig. 2, respectively). For this image, which spans the full Continental U.S., the time-matched (here, 0000 Greenwich Mean Time (GMT) on 14 September 2005) Geostationary Operational Environmental Satellite (GOES) East (positioned over the equator at 75°W) and West (135°W) data are stitched together along the 100°W meridian. In this example, the eastern half of the United States lies in total darkness, while the western half remains illuminated by late afternoon sun.


Natural or “true” color backgrounds require channels that are not available from the current GOES. To simulate what GOES-R will provide, we apply a daytime background to the GOES visible imagery using the NASA “Blue Marble” dataset (a composite of cloud-cleared imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS)). On the nighttime side, we have created a background to the GOES infrared based on the National Geophysical Data Center’s (NGDC) 2003 “Nighttime Lights of the World” dataset, which uses the low-light imaging capability of the Defense Meteorological Satellite Program (DMSP) Operational Linescan System (OLS) sensors. These backgrounds, shown in the middle left/right panels of Fig. 2, are ‘static’ in the sense that they will not capture a change in vegetation (e.g., a burn scar) or a nighttime power outage (resulting in city lights disappearing). In the GOES-R/NPOESS era this will no longer be the case.

The data are first blended vertically (the dynamic GOES cloud imagery serving as the foreground, overlaid upon the static day/night backgrounds) using scaled and normalized versions of the GOES data to define a transparency factor. For example, for visible imagery (the daytime side), the transparency factor is smallest (i.e., most opaque) for the brightest GOES visible channel (channel 1; 0.65 µm) pixels, which are presumably highly reflective cloud or snow cover, effectively blocking out the Blue Marble background. The transparency factor is largest for the darker GOES visible channel pixels (e.g., clear-sky scenes over land and ocean), allowing more of the Blue Marble background to “bleed through” the satellite imagery foreground layer. Similarly, on the nighttime side, transparency factors are specified according to the brightness temperatures of the GOES infrared window channel (channel 4; 11.0 µm) pixels: cold pixels indicative of high/thick clouds are assigned a low transparency, and warmer pixels indicative of clear-sky scenes are assigned a higher transparency. This has the effect of making thick/deep (cold) clouds opaque and relatively thin clouds semi-transparent. An undesirable side effect of this scaling is that low/thick clouds (e.g., marine stratus/stratocumulus) may also appear semi-transparent by virtue of their relatively warm cloud-top temperatures. This is overcome in part by introducing additional GOES channels that are capable of identifying these clouds and artificially increasing their opacity in the GeoColor product. An example of this method applied to low clouds at night is shown in the Product Examples and Interpretation section below.
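The vertical blending step can be sketched per pixel as standard “over” compositing, with the opacity derived from scaled GOES data. The scaling bounds below (reflectance 0.05–0.60; brightness temperatures 230–290 K) are illustrative placeholders, not the operational GeoColor values:

```python
def day_alpha(vis_reflectance, lo=0.05, hi=0.60):
    """Opacity of the daytime (visible) foreground: bright pixels
    (cloud/snow) are opaque, dark clear-sky pixels are transparent.
    The lo/hi scaling bounds are illustrative, not operational values."""
    a = (vis_reflectance - lo) / (hi - lo)
    return min(1.0, max(0.0, a))

def night_alpha(bt11_kelvin, cold=230.0, warm=290.0):
    """Opacity of the nighttime (infrared) foreground: cold, high/thick
    clouds are opaque, warm clear-sky pixels are transparent."""
    a = (warm - bt11_kelvin) / (warm - cold)
    return min(1.0, max(0.0, a))

def blend_over(fg_rgb, bg_rgb, alpha):
    """Composite one foreground pixel over the static background pixel."""
    return tuple(alpha * f + (1.0 - alpha) * b
                 for f, b in zip(fg_rgb, bg_rgb))
```

A bright cloud pixel (`day_alpha(0.8) == 1.0`) fully hides the Blue Marble beneath it, while a dark clear-sky pixel lets the background show through entirely.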

After the “vertical blending” step is completed for both the visible (daytime) and infrared (nighttime) images independently, the day and night images are blended horizontally along the solar terminator, using the solar zenith angle as a weighting term (top-center panel of Fig. 2, with white areas corresponding to twilight conditions). This technique results in a graduated transition between the daytime and nighttime blended products, and represents one of the unique elements of the GeoColor product.
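The horizontal blend can be sketched as a per-pixel weight that ramps from 1 to 0 across the twilight band. The 80°/90° zenith-angle edges used here are illustrative, not the operational values:

```python
def day_weight(solar_zenith_deg, day_edge=80.0, night_edge=90.0):
    """Weight of the daytime composite as a function of solar zenith
    angle: 1 on the full daytime side, 0 on the full nighttime side,
    and a linear ramp through the twilight band along the terminator.
    The 80/90 degree edges are illustrative values."""
    if solar_zenith_deg <= day_edge:
        return 1.0
    if solar_zenith_deg >= night_edge:
        return 0.0
    return (night_edge - solar_zenith_deg) / (night_edge - day_edge)

def blend_terminator(day_px, night_px, solar_zenith_deg):
    """Blend the day and night vertically-blended composites for one pixel."""
    w = day_weight(solar_zenith_deg)
    return tuple(w * d + (1.0 - w) * n for d, n in zip(day_px, night_px))
```

Because the weight varies smoothly with zenith angle, adjacent pixels across the terminator differ only slightly, which is what produces the graduated day/night transition rather than a hard seam.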

Product Examples and Interpretation:


Fig. 3. Daytime composite of Hurricane Katrina advancing on the city of New Orleans, LA. Note cirrus transparency near Cuba, in contrast to the relative opacity of the primary hurricane cloud formations (e.g., spiral rainbands).

Figures 3 and 4 demonstrate the performance of the GeoColor product for daytime and nighttime scenes using imagery from Hurricane Katrina.

During the day (Figure 3), the Blue Marble true color background depicts blue ocean water in the Gulf of Mexico as well as light green shallow-water features (highly reflective sand/shoals) near Key West, Florida, and Key Largo (south of Cuba). Green vegetation dominates the land portions of the background scene. Close inspection reveals semi-transparent thin cirrus over Cuba, in contrast to the opaque clouds associated with Katrina’s rain bands over southern Louisiana.

Fig. 4. Hurricane Katrina moves closer to the shore in this nighttime-only composite. Purple terrain has been turned off in the background of this example. City lights offer additional information on the proximity of storm features to major metropolitan areas.

At night (Figure 4), the Blue Marble background is replaced by nighttime city lights, shown as yellow/orange patches (to simulate the appearance of the sodium lighting that dominates most urban lighting), corresponding to the major metropolitan areas. Again, note the transparency (city lights shining through cirrus) to the north of Katrina in contrast to the opacity (coastal cities obscured by deep, cold clouds) closer to the storm’s rain bands. These city lights provide useful additional information to the imagery analyst in terms of relating the current location of weather features to population centers and major transportation corridors (e.g., small towns along interstates often manifest themselves as linear features traced out in the nighttime lights background).

Fig. 5. A comparison between GeoColor without (upper) and with (lower) an additional ‘low-cloud/fog detection’ layer, shown in red/pink tonality. The Proving Ground version of the product includes the low-cloud/fog detection capability.

One of the inherent limitations of the visible/infrared version of GeoColor lies in the scaling used to assign transparency. As explained previously, low/thick clouds at night may appear erroneously transparent in GeoColor because their temperatures reside at the lower end of the scaling bounds. Overcoming this problem requires the introduction of additional spectral information available from GOES. It turns out that the difference between two infrared channels (3.9 and 11.0 µm; GOES channels 2 and 4, respectively) provides an ability to identify these low-cloud/fog layers by virtue of differences in scattering properties between the two channels. These radiative property differences give rise to an apparent difference in cloud-top temperature (sometimes referred to as a “radiometric temperature”), and areas where this difference exceeds a certain threshold can then be enhanced.
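The channel-difference test reduces to a simple per-pixel threshold. The -2 K threshold below is an illustrative placeholder, not the operational value:

```python
def is_low_cloud_or_fog(bt39_kelvin, bt11_kelvin, threshold=-2.0):
    """Nighttime low-cloud/fog test from the 3.9 um minus 11.0 um
    brightness temperature difference (GOES channels 2 and 4).

    At night, water-droplet clouds appear colder at 3.9 um than at
    11.0 um owing to their differing radiative properties, so the
    difference goes distinctly negative over stratus/fog decks.
    The -2 K threshold is illustrative, not the operational value."""
    return (bt39_kelvin - bt11_kelvin) < threshold
```

Pixels flagged this way can then have their opacity artificially increased (and be given a distinct color) before the vertical blend.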

Once we have isolated the low-cloud/fog layer in this way, we can introduce it as yet another layer in the vertical blending of GeoColor. In this case, we have chosen to place it in between the surface background and the standard infrared imagery. It makes physical sense to do so, since this is where low clouds exist in the atmosphere (below the high/thick clouds, and above the surface). Furthermore, since we have isolated this as a ‘low-cloud/fog’ layer, we can point it out as such in the GeoColor imagery by assigning it a color other than white/gray (which would have made it ambiguous with the high clouds). Figure 5 demonstrates the concept of introducing the low-cloud/fog layer to GeoColor, with a comparison of what information is lost if we do not.
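The layer ordering described above can be sketched as bottom-to-top compositing over the surface background; the colors and opacities here are illustrative, not the product's actual palette:

```python
def composite_stack(surface_rgb, layers):
    """Composite imagery layers bottom-to-top over the surface background.

    `layers` is an ordered list of (rgb, alpha) pairs. For the nighttime
    GeoColor blend the order is: low-cloud/fog layer (e.g., a pink tint)
    first, then the standard infrared cloud layer on top, so high/thick
    clouds still obscure any fog beneath them."""
    out = surface_rgb
    for rgb, alpha in layers:
        out = tuple(alpha * c + (1.0 - alpha) * o for c, o in zip(rgb, out))
    return out

# Example pixel: a pink fog tint under a fully opaque cold-cloud layer;
# the opaque top layer wins, as it should.
pixel = composite_stack(
    (0.1, 0.1, 0.3),             # city-lights/surface background
    [((1.0, 0.6, 0.7), 0.8),     # low-cloud/fog tint (hypothetical color)
     ((1.0, 1.0, 1.0), 1.0)])    # opaque high/thick cloud on top
```

Placing the fog layer below the infrared cloud layer is what makes the stacking physically sensible: deep convection overhead hides the fog, while fog in clear areas shows through in its assigned color.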

Advantages and Limitations:

The GeoColor technique provides a simple yet visually powerful mechanism for transitioning seamlessly between multiple sources of information both in the vertical and horizontal dimensions. Behind the scenes in the GeoColor algorithm itself, tunable scaling factors provide developers the flexibility to adjust the relative strength of transparency in both dimensions (i.e. providing control over the amount of information retained/lost during the blending operation). This technique results in dramatic improvement to the presentation quality of standard visible and infrared satellite imagery. The smooth transition from visible to infrared-detected clouds across the day/night terminator is superior to previous methods which either use infrared exclusively (to avoid the terminator problem) or invoke discrete cutoffs between the two kinds of imagery at a developer-defined location. Noteworthy in this regard is the image in Figure 1, which demonstrates a transition from infrared to visible imagery for frontal clouds extending from Arizona through Wisconsin (for clouds over the terminator, a blend of visible/infrared is used according to the product creation discussion above).

The main caveat users must always be aware of when using this product is that the backgrounds (i.e., the Blue Marble and the NGDC nighttime city lights) in the current GeoColor product are static. As such, changes to the background will not be captured in this product unless some aspect of the change is visible in the real-time GOES visible/infrared observations. For example, seasonal changes in vegetation, power outages, river plumes, variation of nighttime land brightness with lunar phase, or impacts from a natural disaster may not be represented in real-time imagery produced by this method. Background changes that will be detected include snow cover and sea ice during the daytime, which manifest as bright targets in the current GOES imagery; although technically part of the ‘background’ (they are not atmospheric features), they are revealed in the GeoColor imagery due to their high reflectance in the GOES daytime visible channel. As an example of poor performance due to static backgrounds, the city lights of New Orleans appeared to shine brightly the night after Hurricane Katrina’s passage over the city.

When this product becomes available in the actual GOES-R era, the backgrounds will be produced more often and at higher fidelity by the Visible/Infrared Imager/Radiometer Suite (VIIRS) sensors on the NPOESS satellites (which include the Day/Night Band for nighttime lights). More frequent updates will enable the capture of some of the dynamic background components mentioned previously, improving the representation of all components of the scene. As for the nighttime side of GeoColor, the GOES-R ABI provides ‘low light’ sensitivity, but the light levels in this case refer to twilight conditions, as opposed to natural/artificial terrestrial light emissions and lunar reflection, whose light levels are several orders of magnitude fainter.

Finally, in discussing the example shown in Fig. 5, we have been careful to use the language “low-cloud/fog” instead of distinguishing between the two. The reason is that the simple GOES channel 2-4 difference cannot by itself distinguish between a low cloud and a fog layer (the latter being simply a low cloud whose base is in contact with the surface). Techniques do exist that attempt to distinguish between the two, and in principle this information could be introduced within GeoColor as yet another ‘layer’ of imagery. Other examples of additional layers that could be added are lofted dust, volcanic ash, smoke, and even snow cover at night detected by way of moonlight reflection! Provided that an algorithm exists that is capable of isolating such a feature, it can be converted into a distinct layer and incorporated as a unique piece of information within GeoColor. In this sense, GeoColor can be regarded as a platform for the simultaneous display of multiple pieces of information originating, potentially, from many different sources. Interaction with NWS users via the Proving Ground will assist GeoColor algorithm developers in determining which fields to include, the desired colors, scalings, etc., to best tailor this powerful application to end-user needs.

