The Cooperative Institute for Research in the Atmosphere (CIRA) in Fort Collins, Colorado, and the Naval Research Laboratory (NRL) in Monterey, California, are developing and distributing the Normalized Difference Vegetation Index (NDVI) imagery product.
NDVI imagery products are sent to the National Weather Service (NWS) Regional Headquarters from which they are distributed to Weather Forecast Offices (WFOs) for display on their local AWIPS systems. Imagery updates are available approximately two times per day from the MODIS sensors on board Terra (~10:30 AM local time) and Aqua (~1:30 PM local time).
The size of a given NDVI image is determined by the span and resolution of the domain itself. Since current AWIPS system displays accommodate 1 byte per pixel, a good rule of thumb is that the size of the imagery (in bytes) corresponds roughly to the total number of pixels in a given AWIPS domain. For example, an AWIPS domain having dimensions of 1000 x 1000 pixels will require approximately 1 Megabyte (~10⁶ bytes).
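As a rough illustration of this rule of thumb, the short Python sketch below (purely illustrative; the function name is ours, not part of any operational system) estimates the image size for an arbitrary AWIPS domain:

    def ndvi_image_size_bytes(width_px, height_px):
        """Approximate size of a 1-byte-per-pixel NDVI image for an AWIPS domain."""
        return width_px * height_px  # ~1 byte per pixel

    # Example: a 1000 x 1000 pixel domain -> 1,000,000 bytes (~1 MB)
    print(ndvi_image_size_bytes(1000, 1000))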
Knowledge of vegetation coverage and health has numerous applications to land management, including large-scale monitoring of croplands, forest health, and the impact of droughts. Looking at the big picture, the carbon dioxide uptake of plants is an important player in the “carbon cycle,” with implications for the current and future states of the Earth’s climate. Knowledge of vegetation boundaries is also an important consideration for boundary layer moisture patterns (i.e., evapotranspiration), which can influence meteorological fields. Changes in NDVI can be indicative of seasonal variations in vegetation. NDVI can also be used to identify the extent of burn scars resulting from forest/grass fires – information that can be useful in forecasting debris flows resulting from heavy rains.
NDVI demonstrates the kind of imagery that will be possible in the GOES-R era. GOES-R will feature the Advanced Baseline Imager (ABI), which will be able to produce versions of the imagery shown here at much higher temporal resolution.
The imagery is currently created using the MODIS sensors on board the Terra and Aqua satellites. The specific target of the algorithm is the sensing of chlorophyll. The presence of chlorophyll in plants is key to the process of photosynthesis (whereby plants convert sunlight into vital nutrients, “breathing” in carbon dioxide and “exhaling” oxygen in the process). Chlorophyll is a strong absorber of blue light (~0.45 µm) and also strongly absorbs red light (~0.65 µm), while moderately reflecting green light (~0.55 µm) and strongly reflecting in the near infrared (~0.86 µm). This explains why most plants appear green.
The vegetation product is based on this unique spectral behavior of chlorophyll. It is basically a visualization of the NDVI, defined as the normalized difference between the measured solar reflectance, R, in a satellite band very sensitive to chlorophyll (here, 0.86 µm) and in a band in the red part of the visible spectrum (0.65 µm). The mathematical formulation is as follows:
NDVI = [R(0.86) – R(0.65)] / [R(0.86) + R(0.65)]
This formulation shows why it is called a “normalized difference”: we take the difference between the reflectances at two wavelengths and then normalize it by (divide it by) the sum of those reflectances. The normalization is important because it describes the relative strength of the chlorophyll signal regardless of the overall brightness of a given scene. A green color palette is used to display the NDVI imagery. Low values of NDVI (below 0.15) are not shown; instead, these pixels are replaced by true color imagery that typically reveals barren desert landscapes.
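As an illustration of the formulation above, the minimal Python sketch below (not part of the operational processing; the function and variable names are ours) computes NDVI from the 0.86 µm and 0.65 µm reflectances:

    import numpy as np

    def ndvi(nir_reflectance, red_reflectance, eps=1e-6):
        """NDVI from 0.86 um (near-infrared) and 0.65 um (red) reflectances."""
        nir = np.asarray(nir_reflectance, dtype=float)
        red = np.asarray(red_reflectance, dtype=float)
        return (nir - red) / (nir + red + eps)  # eps guards against division by zero

    # Example: a healthy-vegetation pixel (bright NIR, dark red) vs. a sparsely vegetated one
    print(ndvi(0.45, 0.05))  # ~0.80, strongly vegetated
    print(ndvi(0.25, 0.20))  # ~0.11, below the 0.15 display threshold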
The NDVI is actually a quantitative product (each pixel of the image has a characteristic NDVI value) but is displayed here as qualitative imagery for illustration and spatial interpretation. In general, bright green regions correspond to high vegetation content, while dark regions correspond to sparse vegetation. In regions of very low or zero NDVI, the imagery is replaced with “true color” imagery in order to reveal the nature of the surface features. Most of the time, the true color imagery will appear as brown tones, as one might expect for land surfaces having very low vegetation content. True color is displayed over all water surfaces.
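A hedged sketch of how such a qualitative display could be assembled is given below; the green colormap, the flat tan fill standing in for true color, and the random test field are all illustrative assumptions, not the operational AWIPS rendering:

    import numpy as np
    import matplotlib.pyplot as plt

    ndvi_field = np.random.uniform(-0.1, 0.9, size=(100, 100))  # stand-in for a real NDVI grid

    # Mask pixels below the 0.15 display threshold; the real product fills these
    # with true color imagery rather than a flat color.
    masked = np.ma.masked_less(ndvi_field, 0.15)

    cmap = plt.get_cmap("Greens").copy()
    cmap.set_bad(color="tan")  # placeholder for the true color fill

    plt.imshow(masked, cmap=cmap, vmin=0.15, vmax=1.0)
    plt.colorbar(label="NDVI")
    plt.title("Illustrative NDVI display (green palette, low values masked)")
    plt.show()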
Figure 1 presented a continental U.S. view of the NDVI product from late winter. Figure 3, above, shows a similar view of the CONUS during the late summer. The darker green values on the central plains in Figure 1 can be contrasted with the much brighter green values in Figure 3, owing to the seasonal variation of vegetation. At the current scale only a well-trained eye can discern the bright green linear features, oriented roughly east-west, leading from the Rocky Mountains to the Great Plains. These are rivers with enhanced vegetation along their banks. Regional and local scales of the NDVI product will capture these and other fine-scale vegetation spatial features.
Viewing examples of the biomass product over a long time period can give indications of seasonal trends in vegetation over large domains. The strong contrast provided by highly vegetated backgrounds can also aid the detection of fine-scale details (for example, cleared forest patches in Brazil and road networks) as well as cloud and smoke features.
An important caveat of the current product has to do with the effects of cloud cover. Cloud cover inherently produces low NDVI (flattening out the spectral structure of the signal across the VIS/NIR bands being used as a proxy for biomass content). Because of this undesired effect, these areas are replaced by true color imagery in the NDVI product whenever clouds are detected in the scene. However, corrections have not been applied to account for sub-pixel scale cloud and thin cirrus contamination, resulting in an NDVI suppression that presents itself in the imagery as a “ring effect” surrounding clouds. Some of these artifacts are pointed out in Figure 4 above, cross-referenced with true color imagery.
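The screening described here can be sketched conceptually as follows (the actual cloud mask and true color compositing are more sophisticated; the function and array names are assumptions):

    import numpy as np

    def screen_clouds(ndvi_field, cloud_mask):
        """Mask NDVI wherever the cloud mask flags a pixel (True = cloudy).

        In the product, masked pixels are filled with true color imagery.
        Sub-pixel cloud and thin cirrus are NOT caught by this step, which is
        what produces the NDVI-suppression "ring effect" around clouds.
        """
        return np.ma.masked_array(ndvi_field, mask=cloud_mask)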
Users are advised to cross-reference the NDVI product with the true color and cirrus detection Proving Ground products, as well as with recent passes over the same area. In the future, other channels, such as the 1.38 µm channel that is very sensitive to thin cirrus clouds, will be introduced to this algorithm to mitigate some of these effects by filtering out additional parts of the image that currently show up as artificially suppressed values of NDVI.
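Conceptually, such a cirrus screen might look like the sketch below; the 0.01 reflectance threshold is a placeholder assumption, not a value from the algorithm:

    import numpy as np

    def screen_thin_cirrus(ndvi_field, r138_reflectance, threshold=0.01):
        """Additionally mask NDVI where the 1.38 um band suggests thin cirrus.

        The 1.38 um channel appears bright for high cirrus but nearly dark for
        clear scenes, because water vapor absorbs most of the surface signal.
        The threshold here is illustrative only.
        """
        cirrus = np.asarray(r138_reflectance) > threshold
        combined = np.ma.getmaskarray(np.ma.asarray(ndvi_field)) | cirrus
        return np.ma.masked_array(ndvi_field, mask=combined)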