

What climate change looks like from outer space

By Mark Johnson

Co-founder and CEO, Descartes Labs

In January, New Mexico governor Michelle Lujan Grisham signed an executive order mandating a statewide reduction in greenhouse gas emissions “of at least 45% by 2030 as compared to 2005 levels.”

How much is that, you may ask? Good question.

The thing is, we don’t actually know definitively what greenhouse gas emission levels were in 2005. And governments and industries can’t make a plan to reduce emissions by 45% until they know how much was being produced in 2005, how much is being produced now, and where, exactly, those emissions come from.

As business management pioneer Peter Drucker said, “You can’t manage what you don’t measure.”

You also can’t regulate what you don’t measure. Insufficient, slow-to-verify data remains one of the largest problems when it comes to fighting climate change. But the information is out there.

Where do we find this kind of data? The answer is: literally everywhere.

Every second, petabytes of information—a petabyte equals 1,024 terabytes, or roughly 1 million gigabytes—are captured by sensors around the world. These sensors are mounted on satellites, airplanes, ground detectors, and more.

If we can harness the many ways information is being collected around the world already, we can use data refineries—software platforms that turn large amounts of raw, un-ordered data into useful information—to answer the big sustainability questions we face on our planet.

Companies do this all the time; think about the level of information Google gathers every minute to predict what we want to read and buy. By using machine intelligence to store, prepare, and utilize the data being collected all around us, we can harness these huge datasets to predict the future of our planet.

The space age of climate monitoring


Governments have been sending satellites into space to monitor the planet since the 1970s. The European Space Agency’s Sentinel program ushered in a new generation of Earth observation with satellites such as Sentinel-1, launched in 2014, whose radar can see through clouds and detect deforestation. Sentinel-5P, launched in 2017, measures atmospheric pollutants and greenhouse gases such as methane, nitrogen dioxide, and sulfur dioxide.

In addition to government satellites, which make their data free and open to the public, hundreds of commercial satellites and sensors orbit the planet, ranging in size from a shoebox to a dump truck.

Combine outer space data with information from sensors on the ground and we have a much more effective way to monitor the amount of greenhouse gases being emitted into the atmosphere. If emissions at a leaking natural gas compressor station exceed a given threshold, for example, the data will unequivocally reflect that.
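The kind of threshold check described above can be sketched in a few lines. Everything here is invented for illustration: the station names, the methane readings, and the alert level are hypothetical, not actual regulatory values or New Mexico data.

```python
# Hypothetical example: flag monitoring stations whose methane readings
# exceed an assumed alert threshold. All names and numbers are invented.
readings = {
    "compressor_station_A": [1850, 1900, 4100, 4350],  # ppb methane
    "compressor_station_B": [1820, 1845, 1860, 1870],
}
THRESHOLD_PPB = 2000  # assumed alert level, for illustration only

def flag_exceedances(readings, threshold):
    """Return each station's above-threshold readings, omitting clean stations."""
    return {
        station: [r for r in series if r > threshold]
        for station, series in readings.items()
        if any(r > threshold for r in series)
    }

print(flag_exceedances(readings, THRESHOLD_PPB))
# station A's spike is flagged; station B stays clear
```

A real pipeline would of course work from calibrated satellite and ground-sensor feeds rather than hard-coded lists, but the core logic, comparing observed concentrations against a limit, stays this simple.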

It’s only been in the past 10 or so years that geospatial analytics technology has evolved to let us see where specific greenhouse gas emissions are being released. We’re still learning more about how to measure certain gases, especially methane, but this technology is progressing rapidly.

By using advanced satellite imagery analysis and sensor data, along with other AI-powered measures and forecasters, we can create computer-simulated models that can help governments and industries actually determine how to reach goals like a 45% reduction in 2005 levels of greenhouse gas emissions by 2030.

Creating a model like this, which is actually useful, requires a data refinery capable of combining multiple sensor datasets, such as satellite data, latitude and longitude locations of oil and gas well pads, historical wind models sourced from the US National Oceanic and Atmospheric Administration, and eventually aerial- and land-based methane sensors from public and private energy market stakeholders.
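One small piece of the data-combining step above can be sketched as follows: attributing each satellite methane reading to its nearest known well pad. All coordinates, pad names, and concentration values here are invented; a real refinery would use gridded Sentinel-5P retrievals, surveyed well-pad locations, and NOAA wind fields, with proper geodesic distances.

```python
import math

# Toy "data refinery" join: assign each satellite methane reading to the
# nearest oil-and-gas well pad. All data below is hypothetical.
well_pads = {
    "pad_1": (32.50, -103.90),  # (latitude, longitude)
    "pad_2": (32.80, -103.40),
}
satellite_readings = [
    {"lat": 32.52, "lon": -103.88, "ch4_ppb": 1950},
    {"lat": 32.79, "lon": -103.42, "ch4_ppb": 2400},
]

def distance_deg(a, b):
    """Crude planar distance in degrees; adequate for a toy example."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_pad(reading):
    """Name of the well pad closest to a satellite reading."""
    point = (reading["lat"], reading["lon"])
    return min(well_pads, key=lambda pad: distance_deg(well_pads[pad], point))

for r in satellite_readings:
    r["pad"] = nearest_pad(r)
```

After this join, each methane observation carries a candidate source, which is what lets downstream models and regulators ask where, not just how much.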

When states like New Mexico start to deploy sensors across all oil and gas infrastructure, and send that sensor data to a state data repository that ties it together with satellite and other datasets, regulation can become a precise, data-driven process.

While data-driven policy is new territory in many ways, we’re already familiar with data-driven regulation in simple forms such as electronic bridge tolls that read cars’ transponders. Government agencies use sensor data to manage natural resources all the time: stream gauges that record water levels, city water system sensors that test water quality, and weather sensors that inform forecasts.

The United Nations calls climate change “the defining issue of our time.” With its Sustainable Development Goals, it’s calling for all nations to take action to limit the increase in global average temperatures to 2°C (3.6°F) by 2030. According to the Intergovernmental Panel on Climate Change, this requires “far-reaching and unprecedented changes in all aspects of society.”

Four days before the UN Climate Action Summit convened in New York this week to decide on global actions, the 2030 Governor’s Energy and Environmental Technology Summit took place in Santa Fe, New Mexico, to tackle the same issues at the state level.

Governor Lujan Grisham, along with members of my own team at Descartes Labs, invited representatives from the largest oil and gas extraction companies to sit side by side with leaders from tech companies and vocal environmental advocacy groups in the state to create a viable, shared vision for New Mexico’s climate future.

The most exciting outcome of the 2030 New Mexico summit is a new plan to create mapping and modeling capabilities to monitor methane emissions in the state, starting in the Permian Basin, which is the highest-producing oilfield in the world. Descartes Labs will build a comprehensive data refinery for the state, making New Mexico the first state to formally leverage data from the Sentinel-5P satellite in a methane-monitoring solution.

Show me the data


While the groups we brought together at the 2030 Summit may be diametrically opposed on energy issues and resource extraction policies, there’s one unifying factor that neither can deny: data.

For members of both the oil and gas extraction industry and environmental advocacy groups, data is the neutral territory—the bridge across the roadblocks where we typically get stuck on polarized issues when tackling large-scale challenges like climate change.

Data tells us where the problems are and can help point us to solutions. More institutional investors, such as BlackRock, are using geospatial data and machine learning models as part of their environmental, social, and governance (ESG) criteria to predict the future risk and return of potential financial investments, such as those planned by energy companies.

Today, data collected from space and ground-based sensors hold the most accurate and valuable information about our planet’s climate. If governments and industries have any hope of meeting the UN’s development goals by 2030, this is what they need to be tracking.

The conversation happening in New Mexico, a state that is deeply reliant on oil and gas as a primary source of funding, gives us a model for bridging traditional divides as we all work together to plan a future that is both sustainable and economically stable.
