Machine learning helps predict the next global disaster
Predicting extreme events such as earthquakes, pandemics or “rogue waves” is notoriously difficult, and computational modelling often falls short: statistically speaking, these events are so rare that there’s just not enough data for predictive models to accurately forecast when they’ll happen again.
A team of researchers from Brown University and the Massachusetts Institute of Technology say they may have solved this problem. In a new study in Nature Computational Science, the scientists describe how they combined statistical algorithms, which need less data to make accurate, efficient predictions, with a powerful machine learning technique developed at Brown.
The scientists trained the combined system to predict rare events despite the scarcity of historical records, and found that the new framework can circumvent the massive amounts of data traditionally required for these kinds of computations.
“You have to realise that these are stochastic events,” says George Karniadakis, a Professor of Applied Mathematics and Engineering at Brown and a study author. “An outburst of pandemic like COVID-19, environmental disaster in the Gulf of Mexico, an earthquake, huge wildfires in California, a 30-meter wave that capsizes a ship — these are rare events and because they are rare, we don't have a lot of historical data. We don't have enough samples from the past to predict them further into the future. The question that we tackle in the paper is: What is the best possible data that we can use to minimise the number of data points we need?”
Active learning helps DeepONet think like a human
The researchers found the answer in a sequential sampling technique called active learning. Algorithms of this kind not only analyse the data fed into them, but also use what they learn to select new, relevant data points for labelling that are equally or even more important to the outcome being calculated.
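An active-learning loop of this kind can be sketched in a few lines. The toy example below is an illustration, not the paper's method: a bootstrap ensemble of polynomial fits stands in for the model, and its disagreement across candidate points serves as the acquisition score that decides which point to label next.

```python
# Toy active-learning loop: instead of sampling training points at
# random, each round queries the point where an ensemble of simple
# models disagrees most. All model choices here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Toy "rare event" signal: mostly flat, with a sharp localised spike.
    return np.exp(-200 * (x - 0.7) ** 2)

def fit_ensemble(x, y, n_models=10, degree=4):
    # Bootstrap an ensemble of polynomial fits to estimate uncertainty.
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))
        models.append(np.polyfit(x[idx], y[idx], degree))
    return models

def acquisition(models, candidates):
    # Disagreement (std. dev.) across the ensemble at each candidate.
    preds = np.array([np.polyval(m, candidates) for m in models])
    return preds.std(axis=0)

# Start with a handful of labelled points, then actively add more.
x_train = rng.uniform(0, 1, 5)
y_train = target(x_train)
candidates = np.linspace(0, 1, 201)

for _ in range(15):
    models = fit_ensemble(x_train, y_train)
    scores = acquisition(models, candidates)
    x_new = candidates[np.argmax(scores)]        # most informative point
    x_train = np.append(x_train, x_new)
    y_train = np.append(y_train, target(x_new))  # "label" it

print(len(x_train))  # 20 labelled points total (5 initial + 15 queried)
```

The key idea is that each new label is spent where the model is least certain, rather than wasted on regions already well understood, which is what makes the approach frugal with data.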
That’s critical to the machine learning model the researchers used in the study. Called DeepONet, the model is a type of artificial neural network, which uses interconnected nodes in successive layers that roughly mimic the connections made by neurons in the human brain. In the paper, the research team shows that, combined with active learning techniques, the DeepONet model can be trained even when data points are scarce.
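For intuition, a DeepONet-style forward pass can be sketched in plain numpy. Everything below (layer sizes, activations, random untrained weights) is an assumption for illustration, not the trained model from the study: a branch network encodes the input function sampled at fixed sensor points, a trunk network encodes the query location, and the prediction is the dot product of the two embeddings.

```python
# Minimal, untrained DeepONet-style forward pass (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
m, hidden, p = 20, 32, 16   # sensor points, hidden width, embedding size

def mlp(params, x):
    # Two-layer network with tanh activation.
    (W1, b1), (W2, b2) = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

def init(n_in, n_out):
    # Small random weights; a real model would learn these from data.
    return [(rng.normal(0, 0.1, (n_in, hidden)), np.zeros(hidden)),
            (rng.normal(0, 0.1, (hidden, n_out)), np.zeros(n_out))]

branch_params = init(m, p)  # encodes the sampled input function
trunk_params = init(1, p)   # encodes the query coordinate

def deeponet(u_sensors, y):
    b = mlp(branch_params, u_sensors)     # (p,) function embedding
    t = mlp(trunk_params, np.array([y]))  # (p,) location embedding
    return float(b @ t)                   # G(u)(y) ~ sum_k b_k * t_k

# Example: evaluate the operator on a sine input at query point y = 0.5
u = np.sin(np.linspace(0, np.pi, m))
print(deeponet(u, 0.5))
```

Because the branch and trunk are separate networks, the model learns a mapping from whole input functions to output functions, which is what lets it generalise from precursor signals rather than memorising individual events.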
“The thrust is not to take every possible data and put it into the system, but to proactively look for events that will signify the rare events,” says Karniadakis. “We may not have many examples of the real event, but we may have those precursors. Through mathematics, we identify them, which together with real events will help us to train this data-hungry operator.”
In the paper, the researchers apply the approach to pinpointing parameters and different ranges of probabilities for dangerous spikes during a pandemic, finding and predicting rogue waves, and estimating when a ship will crack in half due to stress.
The researchers found their new method outperformed more traditional modelling efforts, and they believe it presents a framework that can efficiently discover and predict all kinds of rare events.
In the paper, the research team outlines how scientists should design future experiments to minimise costs and increase forecasting accuracy. Karniadakis is already working with environmental scientists to use the method to forecast climate events including hurricanes.