How can Big Data help combat catastrophes?

Researchers from Cornell University have led the effort to use Big Data to combat catastrophes by identifying the risk factors for events such as blackouts and solar flares.

Funded by the National Science Foundation (NSF), the Predictive Risk Investigation System for Multilayer Dynamic Interconnection Analysis (PRISM) aims to harness data to identify risk factors for catastrophic events across domains.

With a team of experts in fields including data science, statistics, computer science, finance, energy, agriculture, ecology, hydrology, climate and space weather, PRISM will integrate data across different areas to improve risk prediction.

“We want to focus our attention on these worst-case scenarios and the risks associated with them, and how we might measure their likelihood,” said David Matteson, who is a principal investigator on the two-year, $2.42m grant, which emerged from the NSF’s Harnessing the Data Revolution Big Idea activity.

Matteson commented: “Our hope…is that by identifying systemically important critical risks – those that tie together different domains and have the biggest spillover potential – we will have the most widespread impact in terms of controlling those risks.”

Assembling large datasets across all sectors

The multidisciplinary approach is essential, Matteson said, because today’s world is composed of highly interconnected and interdependent systems, and no single expert is equipped to identify the signs of risk or the full impact of catastrophes. Using data science – which is one of the provost’s Radical Collaboration initiatives – will help integrate information to find patterns.

“We want to pull information from these diverse domains and put them together, to quantify when critical systems are stressed and strained, and figure out how to prepare,” Matteson said.

The researchers plan to assemble large datasets across sectors such as agriculture, climate and energy to create an interactive data library. Once they’ve developed this library, they’ll use cutting-edge data analysis to identify what they’ve called critical risk indicators – quantifiable information associated with risk exposure, particularly for potential catastrophes. They’ll also employ machine learning to look for anomalies in the data that might lead to new insights.
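To make the idea of data-driven anomaly detection concrete, here is a minimal sketch that flags outliers in an indicator series using a simple z-score rule. The data, domain name, and threshold are all hypothetical illustrations, not drawn from the PRISM project, whose actual methods will be far more sophisticated.

```python
import numpy as np

def flag_anomalies(series, threshold=3.0):
    """Return indices of points whose z-score exceeds the threshold.

    A stand-in for the machine-learning anomaly detection the
    project describes; real work would use much richer models.
    """
    series = np.asarray(series, dtype=float)
    z = (series - series.mean()) / series.std()
    return np.flatnonzero(np.abs(z) > threshold)

# Hypothetical daily grid-load indicator with one injected spike.
rng = np.random.default_rng(0)
load = rng.normal(100.0, 5.0, size=365)
load[200] = 160.0  # simulated extreme event

print(flag_anomalies(load))  # the spike at index 200 stands out
```

Even this toy rule shows the principle: quantify what "normal" looks like, then surface the observations that deviate enough to merit a closer look.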

The researchers will then focus their efforts on identifying risk interconnections, and systemically important risk indicators across the different domains, both to predict potential hazards and to lessen possible system-wide losses once disasters have occurred. They plan to examine known risk indicators and apply data science to identify new ones, Matteson said.
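One simple way to picture "risk interconnections" is pairwise correlation between indicator series from different domains. The sketch below, with invented data and invented domain names, flags strongly coupled pairs; it is a toy proxy for the cross-domain analysis the team plans, not the project's actual methodology.

```python
import numpy as np

def coupled_pairs(indicators, min_corr=0.7):
    """Return domain pairs whose indicator series are strongly correlated."""
    names = list(indicators)
    data = np.array([indicators[n] for n in names])
    corr = np.corrcoef(data)
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if abs(corr[i, j]) >= min_corr:
                pairs.append((names[i], names[j], round(corr[i, j], 2)))
    return pairs

# Hypothetical series: heat stress drives both crop risk and grid load,
# so those three indicators move together; space weather does not.
rng = np.random.default_rng(1)
heat = rng.normal(size=200)
indicators = {
    "climate_heat_index": heat,
    "energy_grid_load": heat * 0.9 + rng.normal(scale=0.3, size=200),
    "agriculture_yield_risk": heat * 0.8 + rng.normal(scale=0.4, size=200),
    "space_weather_flux": rng.normal(size=200),
}
print(coupled_pairs(indicators))
```

An indicator that correlates with many others across domains is a candidate "systemically important" risk signal, since stress there is likely to spill over.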

As part of the project, the researchers will work with stakeholders in the relevant fields so that policymakers can incorporate their findings. Their goal is to help create early warnings for catastrophes and improve preparedness for devastating events worldwide.

“Ultimately we hope to use this information to identify systemically important risk indicators as holistic targets for risk mitigation,” Matteson said, “and in identifying these we hope that policymakers would incorporate them into their planning.”
