How we used satellite data to track California wildfires

By Eric Sagara / March 9, 2016

As different satellites scan the globe, they gather a variety of information. That data gathering, known as remote sensing, is regularly used by scientists, and the data is free and publicly available.

Satellite data is a fairly new tool in most newsrooms.

We used it last year to show how California’s drought affected vegetation health and how that, in turn, affected the state’s wildfire season. For that project, we worked with the MODIS sensor on NASA’s Terra satellite. This time, we turned to Landsat 8, a joint satellite mission between NASA and the U.S. Geological Survey. Both satellites capture imagery from a wide spectrum of light stretching beyond what the eye can see. But Landsat 8 has a much finer spatial resolution than MODIS, which means you can see more details in the images.

Landsat 8 scans Earth every 16 days. We selected imagery from four dates between late July and September, chosen primarily based on how much cloud cover obscured the area where fires were burning.
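Each Landsat scene comes with a plain-text metadata file (the MTL file) that includes a scene-wide CLOUD_COVER percentage, which is a quick way to narrow down candidate dates before inspecting them by eye. The snippet below is a minimal sketch of that kind of triage in Python; the directory layout and file names are assumptions, not our exact workflow.

```python
# Sketch: rank downloaded Landsat 8 scenes by the CLOUD_COVER value in each
# scene's MTL metadata file. The "scenes/" layout below is an assumption.
import glob
import re

def cloud_cover(mtl_path):
    """Pull the scene-wide CLOUD_COVER percentage out of an MTL text file."""
    with open(mtl_path) as f:
        for line in f:
            match = re.match(r"\s*CLOUD_COVER\s*=\s*([\d.]+)", line)
            if match:
                return float(match.group(1))
    return None

# Assume each scene sits in its own directory containing a *_MTL.txt file.
scenes = []
for mtl in glob.glob("scenes/*/*_MTL.txt"):
    cover = cloud_cover(mtl)
    if cover is not None:
        scenes.append((cover, mtl))

# Least-cloudy scenes first. The final pick still needs a visual check,
# since the percentage covers the whole scene, not just the fire area.
for cover, mtl in sorted(scenes):
    print(f"{cover:5.1f}%  {mtl}")
```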

The data is available online for free from the U.S. Geological Survey. Landsat 8 captures multiple bands of light – both what the naked eye can see and what it cannot – and each band is stored in its own file, for a total of 12 images per scene. By combining these bands in different ways, we could highlight different aspects of the story.

Here is what each band captures and what it is typically used for:

Band 1 (Coastal aerosol): 0.43 – 0.45 micrometers, 30-meter resolution. Studies looking at coastal areas or focusing on aerosols such as dust or ash.
Band 2 (Blue): 0.45 – 0.51 micrometers, 30-meter resolution. Bathymetric mapping; separating some vegetation types and distinguishing soil from vegetation.
Band 3 (Green): 0.53 – 0.59 micrometers, 30-meter resolution. Vegetation health.
Band 4 (Red): 0.64 – 0.67 micrometers, 30-meter resolution. Vegetation slopes.
Band 5 (Near infrared): 0.85 – 0.88 micrometers, 30-meter resolution. Shorelines and biomass.
Band 6 (Shortwave infrared 1): 1.57 – 1.65 micrometers, 30-meter resolution. Moisture content of soil and vegetation; also penetrates thin clouds.
Band 7 (Shortwave infrared 2): 2.11 – 2.29 micrometers, 30-meter resolution. Better moisture content analysis and cloud penetration.
Band 8 (Panchromatic): 0.50 – 0.68 micrometers, 15-meter resolution. Sharper image in the red, green and blue wavelengths.
Band 9 (Cirrus): 1.36 – 1.38 micrometers, 30-meter resolution. Used to detect cirrus clouds.
Band 10 (Thermal infrared 1): 10.60 – 11.19 micrometers, 100-meter resolution resampled to 30. Heat mapping and soil moisture.
Band 11 (Thermal infrared 2): 11.50 – 12.51 micrometers, 100-meter resolution resampled to 30. Improved heat mapping and soil moisture.
Band 12 (Quality assurance): no wavelength, 30-meter resolution. Provides metadata on each pixel.
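Because each band lives in its own GeoTIFF, combining them starts with reading the individual files and stacking the arrays. Here is a rough sketch using rasterio, one common Python library for this kind of work; the scene ID and file paths are placeholders, not our actual data.

```python
# Sketch: each Landsat 8 band ships as its own GeoTIFF, so combining bands
# starts with reading the individual files. The scene ID below is hypothetical.
import numpy as np
import rasterio

SCENE = "LC80440342015221LGN00"  # hypothetical scene ID / directory name

def read_band(number):
    """Read a single band file into a float array, keeping its georeferencing."""
    path = f"{SCENE}/{SCENE}_B{number}.TIF"
    with rasterio.open(path) as src:
        return src.read(1).astype("float32"), src.profile

red, profile = read_band(4)
green, _ = read_band(3)
blue, _ = read_band(2)

# Stack into a single 3-band array for later compositing.
rgb = np.stack([red, green, blue])
print(rgb.shape, profile["crs"])
```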

Combining the red, green and blue bands allowed us to produce a true-color image of what the land looks like to the naked eye. Adding the panchromatic band helped us increase the resolution of the images, because it captures the red, green and blue wavelengths at 15 meters per pixel instead of 30.
[Image: True-color image of fire area]
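In code, that roughly means stacking bands 4, 3 and 2 and then scaling them against band 8. The sketch below uses a simple Brovey-style ratio for the pan-sharpening step; it illustrates the idea rather than our exact processing chain, and the file names are placeholders.

```python
# Sketch: build a true-color composite from bands 4 (red), 3 (green), 2 (blue)
# and sharpen it with band 8 (panchromatic, 15 m) using a simple Brovey ratio.
# Simplified illustration: no radiometric correction or nodata masking.
import numpy as np
import rasterio
from rasterio.enums import Resampling

SCENE = "LC80440342015221LGN00"  # hypothetical scene ID

def read_resampled(number, shape):
    """Read one band, resampled to a target (height, width)."""
    with rasterio.open(f"{SCENE}/{SCENE}_B{number}.TIF") as src:
        return src.read(
            1, out_shape=shape, resampling=Resampling.bilinear
        ).astype("float32")

# Use the 15 m panchromatic band to define the output grid.
with rasterio.open(f"{SCENE}/{SCENE}_B8.TIF") as src:
    pan = src.read(1).astype("float32")
shape = pan.shape  # full scenes are large; crop first if memory is tight

red, green, blue = (read_resampled(b, shape) for b in (4, 3, 2))

# Brovey-style pan-sharpening: scale each visible band by pan / mean(R, G, B).
ratio = pan / np.maximum((red + green + blue) / 3.0, 1.0)
rgb = np.stack([red * ratio, green * ratio, blue * ratio])

# Stretch to 8-bit for display (2nd-98th percentile contrast stretch).
lo, hi = np.percentile(rgb, (2, 98))
rgb8 = np.clip((rgb - lo) / (hi - lo) * 255, 0, 255).astype("uint8")
```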

To examine vegetation health, we used a second technique that combines imagery captured in the near infrared, red and green bands, often called a false-color or color-infrared composite. Because healthy vegetation strongly reflects near infrared light, it shows up bright red in these images, while burn scars appear dark. The method was developed during World War II to detect camouflage.
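A sketch of that composite looks much like the true-color one, except the near infrared band takes the red channel's place; again, the paths are placeholders.

```python
# Sketch: a color-infrared composite maps near infrared (band 5) to the red
# channel, red (band 4) to green and green (band 3) to blue, so healthy
# vegetation shows up bright red. File names are placeholders.
import numpy as np
import rasterio

SCENE = "LC80440342015221LGN00"  # hypothetical scene ID

def read_band(number):
    with rasterio.open(f"{SCENE}/{SCENE}_B{number}.TIF") as src:
        return src.read(1).astype("float32")

nir, red, green = read_band(5), read_band(4), read_band(3)
false_color = np.stack([nir, red, green])

# Percentile stretch to 8-bit so burn scars (dark) and live vegetation (red)
# are easy to compare visually across the four dates.
lo, hi = np.percentile(false_color, (2, 98))
img = np.clip((false_color - lo) / (hi - lo) * 255, 0, 255).astype("uint8")
```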
