Applications of Data Fusion Techniques to Fire Detection

Dan Bikos, Jim LaDue, Lexy Elizalde-Garcia, Katy Christian

Duration: 40 min

1.2.3, 3.3.4, 4.1.2, 5.1.6

Introduction:


The goal of this training session is to apply the data fusion techniques covered in this module to detect wildland fires as quickly and accurately as possible.  A variety of datasets are used to increase confidence in fire detection given the known limitations of each observation type.  The products include GOES satellite imagery, radar data, webcams, and other environmental data, with an emphasis on synthesizing them to overcome individual dataset limitations and build on their collective strengths.  NWS meteorologists provide valuable information on the initial detection of wildland fires, a key component of fire mitigation efforts.  This training provides in-depth instruction on fusing datasets for fire detection, along with examples and AWIPS procedures that can assist in operational implementation.  It builds on the data fusion methodologies from the Data Fusion section of the WOC Severe Course (updated in FY21).

Training Session Options:


NOAA/NWS students – to begin the training, use the web-based video, YouTube video, or audio playback options below (if present for this session). NOAA/NWS employees can obtain certificates of completion by accessing the session via the Commerce Learning Center.

References / Additional Links:


This course is at the Advanced level.

The Data Fusion section of the WOC Severe Track is a prerequisite to this training. Under the Data Fusion section, see the two “Practice and Applications from Multiple Data Sources” training sessions.

Contact:

Dan Bikos

Dan.Bikos@colostate.edu


Unless otherwise noted, all content on the CIRA RAMMB: VISIT, SHyMet and VLab webpages is released under a Creative Commons Attribution 3.0 License.