Washington State University: A better picture of vegetation to reduce fire risk

Getting a more accurate picture of vegetation across the Western U.S. could improve fire agencies’ understanding of the true wildfire risks to communities.

Ji Yun Lee, an assistant professor in the Department of Civil and Environmental Engineering, has received a one-year grant from Google to develop a tool that will use satellite imagery along with artificial intelligence and machine learning to get a better picture of vegetation across the West to help agencies develop better risk assessments and management.

Wildfire risk has increased significantly in recent years, particularly in the West, and with a changing climate, the risk is expected to continue growing across many parts of the United States.

Unlike other types of natural hazards, such as hurricanes or earthquakes, humans have significantly more control over how much their communities are affected by wildfires, says Lee. About 90% of wildfires are started by people, and people can also take mitigation measures, such as removing fuels, to control how a fire spreads.

Community wildfire protection plans are used in more than 3,000 communities around the U.S. to assess risk and plan for wildfires. The plans, however, often rely on overly simplified information about vegetation, says Lee. That information typically comes from static maps that are outdated and offer only rough estimates of actual vegetation. While fire agencies might have excellent real-time weather information, such as current and predicted wind speeds and temperatures, the vegetation data often fails to provide an up-to-date indication of how much fuel hazard is present. The currently used vegetation maps, for instance, won’t reflect that a large landowner has recently cleared their land.

The information on vegetation, in particular, is critically important to help agencies better predict the likelihood of ignition or to know how fast a fire might spread.

“Using these overly simplified representations of vegetation mapping can lead to some inaccurate characterization of the wildfire risk and ultimately to some inefficient investment in wildfire risk management actions,” Lee said.

Lee and her colleagues will use near real-time satellite imagery, artificial intelligence, and machine learning to get a better picture of vegetation. Recent advances in remote sensing technologies, such as from satellite imagery or unmanned aerial vehicles (UAVs), have made it possible to get very specific data about vegetation in fine detail.

The researchers hope to eventually provide a software tool for agencies that will allow them to classify the vegetation density and type at very fine scales to help them accurately estimate the risk.
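The article doesn’t detail the methods behind the planned tool, but a common starting point in remote sensing is to derive a vegetation index from satellite bands and bucket it into density classes. The sketch below uses NDVI (Normalized Difference Vegetation Index), a standard index computed from near-infrared and red reflectance; the thresholds and the tiny toy "scene" are illustrative assumptions, not part of Lee’s project.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Higher values indicate denser, healthier vegetation.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    # Guard against division by zero on dark pixels
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def classify_density(ndvi_map):
    """Bucket NDVI values into coarse vegetation-density classes.

    Thresholds here are illustrative, not calibrated for any sensor.
    """
    classes = np.full(ndvi_map.shape, "bare", dtype=object)
    classes[ndvi_map >= 0.2] = "sparse"
    classes[ndvi_map >= 0.5] = "dense"
    return classes

# Toy 2x2 "scene": near-infrared and red reflectance per pixel
nir = np.array([[0.8, 0.5], [0.3, 0.1]])
red = np.array([[0.1, 0.3], [0.25, 0.1]])
print(classify_density(ndvi(nir, red)))
```

A production tool of the kind described would replace this hand-rolled index with learned classifiers over many spectral bands and time steps, but the idea of mapping per-pixel reflectance to a fuel-relevant vegetation class is the same.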

“By using these deep learning methods, artificial intelligence, and machine learning techniques, we can have a better understanding of the vegetation mapping and incorporate that information into the wildfire risk assessment, and we can support wildfire resilient communities,” she said.