When disaster looms, who you gonna call? It could increasingly be a mathematician if IBM scientists succeed in one of their current research efforts.
IBM announced last week that its scientists have created specialized maths algorithms to help model and manage disasters. The "Stochastic Optimization Model" is being used to address disaster scenarios including the management of resources to battle giant wildfires and grapple with pandemics. Eventually the model could be applied both to large, seemingly intractable problems, such as improving the American healthcare system, and to more modest business challenges, such as scheduling limousines. The result, according to IBM, is more accurate insight into what needs to be done to survive the disaster or work through the business problem.
The Stochastic Optimization Model provides a framework for addressing problems that involve uncertainty or randomness, similar to game theory and discrete event simulation, according to Gartner Research Director John Morency. All three approaches involve examining reasonable probabilities based on current and past events, but they add a measure of randomness to examine how participants may respond as the situation changes or evolves.
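The core idea — evaluating plans against many randomly sampled futures rather than a single forecast — can be sketched in a few lines. The scenario names, probabilities and loss figures below are illustrative assumptions, not data from IBM's model:

```python
import random

# Hypothetical fire-season scenarios: (name, probability, loss in $M if unprepared)
SCENARIOS = [
    ("mild season",    0.50,  20),
    ("average season", 0.35,  80),
    ("severe season",  0.15, 300),
]

def sample_season(rng):
    """Draw one season at random, weighted by its probability."""
    r = rng.random()
    cumulative = 0.0
    for name, prob, loss in SCENARIOS:
        cumulative += prob
        if r < cumulative:
            return name, loss
    return SCENARIOS[-1][0], SCENARIOS[-1][2]

def expected_loss(n_trials=100_000, seed=42):
    """Monte Carlo estimate of the expected loss across sampled seasons."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        _, loss = sample_season(rng)
        total += loss
    return total / n_trials

# The exact expectation is 0.50*20 + 0.35*80 + 0.15*300 = 83,
# and the Monte Carlo estimate converges toward it.
print(round(expected_loss(), 1))
```

With enough sampled trials, the estimate settles near the true expected loss of 83 — the same convergence that lets a planner compare strategies without knowing which future will actually occur.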
IBM began developing the model in 2003 for a consortium of government agencies that were responsible for fighting forest fires. As Baruch Schieber, senior manager of IBM's Optimization Center, explains, the government's fire fighting budget was easily in the hundreds of millions of dollars. The agencies in charge -- including the Bureau of Land Management, the Bureau of Indian Affairs, the US Forest Service and others under the auspices of the US Department of Agriculture -- wanted help deciding where to position their resources, including personnel and equipment, without knowing in advance where the fires would be.
The project, called Fire Program Analysis, involves developing scenarios and plans for a myriad of possible situations, focusing on those where the probability of occurrence is high and the potential loss is greatest.
"The question is, what's the best way to do this planning so you get the biggest bang for your buck?" says Schieber, whose group is part of the Business Analytics and Mathematical Sciences Department at the T.J. Watson Research Center in the US.
IBM's Global Business Services division worked with its maths scientists, the government agencies and Colorado State University's Warner College of Natural Resources to create a sampling of scenarios. A probability was applied to each scenario, taking into account variables such as seasonal rainfall levels and how much damage a fire might cause to agriculture or structures.
But that was only part of the challenge, says Schieber. "If I just give the government agency or budget planner this list of all these probabilities, it would be hard for them to deduce what needs to be done." The next step, he says, is to design the optimal policy for allocating resources.
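In miniature, designing such a policy means committing resources before the uncertainty resolves, then judging each choice by its expected loss across all weighted scenarios. A brute-force sketch with made-up numbers (two regions, four crews, assumed per-fire losses — none of these figures come from the actual project):

```python
# Hypothetical: split 4 fire crews between two regions before knowing
# where fires will strike. Each scenario: (probability, fires per region).
SCENARIOS = [
    (0.6, (2, 1)),   # most likely: more fires in region A
    (0.4, (0, 3)),   # less likely: an outbreak in region B
]
CREWS = 4
LOSS_PER_UNFOUGHT_FIRE = 100  # $M, an assumed figure

def expected_loss(allocation):
    """Expected loss of a fixed crew allocation, averaged over scenarios."""
    total = 0.0
    for prob, fires in SCENARIOS:
        # Each crew can handle one fire; the rest go unfought.
        unfought = sum(max(f - a, 0) for f, a in zip(fires, allocation))
        total += prob * unfought * LOSS_PER_UNFOUGHT_FIRE
    return total

# Enumerate every way to split the crews and keep the cheapest plan.
best = min(((a, CREWS - a) for a in range(CREWS + 1)), key=expected_loss)
print(best, round(expected_loss(best)))  # → (2, 2) 40
```

Here the hedged bet of splitting crews evenly beats loading up on the most likely region — the kind of non-obvious answer a planner could not easily read off a raw list of probabilities. The real model replaces this toy enumeration with large-scale optimization over hundreds of thousands of variables.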
Even then, flaws in the model can surface that require scientists to reconsider their predictions. Tarun Kumar, an optimization researcher at IBM, recounts how during the testing of the model for the Fire Program Analysis project, results didn't make sense to government users in Mississippi. "[In the model] we had deployed a lot of water tenders -- [vehicles] that carry water from the base station to the fires," Kumar recalls. "They said, 'We don't use water tenders.' We went back to the model and realized that it shouldn't allow water tenders, because you have the Mississippi River flowing right next to you."
The math calculations are performed by commercial and open-source software, including applications in the repository maintained by the Computational Infrastructure for Operations Research (COIN-OR), a site heavily supported by Schieber's department. The math itself, as he describes it, consists of a set of mathematical variables -- about 200,000 of them in the Fire Program Analysis problem.
A scientific approach to risk management
The way in which IBM applies its Stochastic Optimization Model is still uncommon in business, Gartner's Morency says. "Organizations want to know how to do it. But very often they use improvised approaches to address it." The problem, he says, is that there is no single standard or set of generally accepted industry practices, so most companies simply use ad hoc statistics and make best guesses in order to predict the outcome of future disasters.
The limitations of the Stochastic Optimization Model are the same as those of any predictive-analysis method: garbage in, garbage out, Morency says. "The quality of what you come up with is only as good as the fundamental assumptions and data that you're basing the model on."