Build an automated AI and cloud pipeline to predict wildfires

Last year, the world lost 6.7 million hectares of tropical forest (about 18 soccer fields every minute). A lot of this loss is due to wildfires.

The traditional way to track these fires is slow, but organizations using Google Cloud and AI can fully automate the process, from spotting the fire to predicting its path. Here’s how. 

Finding fires when they are just small

You cannot stop a fire if you do not know it is there, and traditional weather satellites either revisit too rarely or lack the resolution to see small fires. To fix this, Google and the fire community built FireSat.

Instead of waiting for a fire to get huge, FireSat uses infrared sensors to spot fires as small as 5 by 5 meters. More importantly, it updates globally every 20 minutes. This gives systems the raw data needed to spot an ignition fast, rather than waiting hours for the next satellite pass.
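To see why the 20-minute refresh matters, consider the average delay between ignition and first detection. If ignitions are equally likely at any point in a revisit window, a fire waits half a window on average before it can be seen. The 6-hour figure below is a hypothetical revisit time for a traditional polar-orbiting satellite, used purely for comparison:

```python
def expected_detection_delay_minutes(revisit_minutes: float) -> float:
    """On average, a fire ignites halfway through a revisit window,
    so the expected wait for the next pass is half the revisit time."""
    return revisit_minutes / 2

firesat_delay = expected_detection_delay_minutes(20)        # ~10 minutes
traditional_delay = expected_detection_delay_minutes(360)   # ~3 hours

print(f"FireSat: ~{firesat_delay:.0f} min vs. traditional pass: ~{traditional_delay / 60:.0f} h")
```

A ten-minute expected delay versus three hours is the difference between catching a 5-by-5-meter ignition and responding to an established fire front.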


FireSat is the first set of satellites built specifically to spot small fires quickly using high-quality images. (Source: Google Research)

 

Calculating speed and direction

Knowing where a fire is right now is not enough; you need to know where it is going. Researchers at USC built a generative AI model that combines different satellite feeds to figure this out.

 

A model developed by USC researchers showing how combining satellite feeds can predict a fire’s path and arrival time. (Credit: USC / Remote Sensing)

 

They take high-resolution maps that update twice a day and merge them with geostationary satellites that update every five minutes. That five-minute update is important because it tells the system exactly when the fire started. If you know the exact ignition time, you can accurately calculate the fire’s speed.

The model also factors in terrain (like the fact that fire moves faster uphill) to map the exact path.
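The core arithmetic here is simple: once the geostationary feed pins down the ignition time, speed is just distance traveled divided by time elapsed. The sketch below illustrates that, plus a rough slope adjustment based on the common rule of thumb that fire spread roughly doubles for every 10 degrees of upslope; the specific numbers and the doubling rule are illustrative assumptions, not the USC model's actual formulation:

```python
from datetime import datetime


def spread_speed_kmh(ignition: datetime, observed: datetime, distance_km: float) -> float:
    """Average rate of spread: distance the front has moved over elapsed time."""
    hours = (observed - ignition).total_seconds() / 3600
    return distance_km / hours


def slope_adjusted_speed(base_kmh: float, slope_deg: float, doubling_deg: float = 10.0) -> float:
    """Rule-of-thumb terrain factor: spread roughly doubles per ~10 degrees of upslope."""
    return base_kmh * 2 ** (slope_deg / doubling_deg)


# Ignition time pinned down by the 5-minute geostationary feed (hypothetical values):
ignition = datetime(2025, 1, 7, 10, 30)
observed = datetime(2025, 1, 7, 14, 30)

base = spread_speed_kmh(ignition, observed, distance_km=6.0)  # 1.5 km/h on flat ground
uphill = slope_adjusted_speed(base, slope_deg=10)             # ~3.0 km/h on a 10° slope
```

Without the exact ignition time, the denominator in that first calculation is a guess, which is why the five-minute geostationary updates are so valuable.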

Automating the workflow in Google Cloud

Building a warning system that works at scale requires a fully automated pipeline. By connecting satellite feeds directly to Google Cloud, you remove the delay between spotting a fire and alerting the people who need to know.

  • Data collection: In the morning, a tool called Cloud Composer pulls open-source weather data into BigQuery. If it detects severe drought criteria, it flags the region.
  • AI analysis: Vertex AI automatically scans satellite imagery of the high-risk regions stored in Cloud Storage, assigning a probability score to any heat signatures.
  • Instant alerts: If the score is high, Cloud Pub/Sub sends out an alert.
  • Automated delivery: Cloud Functions then handle the final output. For existing clients, the system makes an HTTP call to the client’s system to deliver the data and log the transaction. For new prospects, it uses SendGrid to email a blurred sample image and a link to buy the full data.
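The routing logic in the last two steps can be sketched as a single decision function, the kind of thing a Cloud Function would run when a Pub/Sub alert arrives. The message schema, threshold, and endpoint pattern below are all hypothetical placeholders, and the real system would make actual HTTP and SendGrid calls rather than return a plan:

```python
ALERT_THRESHOLD = 0.8  # hypothetical probability cutoff from the Vertex AI scoring step


def handle_alert(message: dict, known_clients: set) -> dict:
    """Decide how to deliver a fire alert: push to existing clients,
    email a sample to new prospects, or drop low-confidence detections."""
    if message["probability"] < ALERT_THRESHOLD:
        return {"action": "ignore"}
    if message["recipient"] in known_clients:
        # Existing client: deliver full data via an HTTP call and log it.
        return {"action": "http_push", "endpoint": f"https://{message['recipient']}/fires"}
    # New prospect: send a blurred sample and a purchase link via SendGrid.
    return {"action": "email_sample", "via": "sendgrid"}


plan = handle_alert(
    {"probability": 0.92, "recipient": "client.example"},
    known_clients={"client.example"},
)
```

Keeping the decision in one pure function like this makes the pipeline easy to test independently of the Pub/Sub and HTTP plumbing around it.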

Why predicting deforestation is so difficult

Deforestation is incredibly hard to predict because so many factors feed into it: it is driven by a complex mix of human activities (like farming, logging, mining, or building towns) together with environmental factors like wildfires.

The traditional way to forecast this loss meant manually combining scattered maps of local roads, economic indicators, and population density. This is hard to do at scale because the information is often incomplete, differs from one region to the next, and becomes outdated very quickly.

Forecasting years in advance

Google DeepMind also built a model called ForestCast to predict where trees might be lost in the future.

Instead of trying to use old maps of roads or towns, it focuses entirely on satellite images. By looking at how the landscape has changed over time (essentially seeing where forest has already been lost) it can accurately predict which areas are likely to be burned or cut down next. Since it only needs satellite data, the tool works the same way anywhere in the world.


Forest loss and land-use changes make up about 10% of global greenhouse-gas emissions. By using high-resolution satellites, AI, and automated cloud pipelines, we can remove the manual steps that slow everything down, so that data is sent out as soon as something happens, without waiting for a person to process it.

If you need help building automated data pipelines on Google Cloud, the team at Revolgy, a Google Cloud Partner, is here to help you design and set up the right architecture. Contact us today for a free consultation.