Thanks to advances in computing power and storm surge modeling systems, Louisiana officials bracing for Hurricane Isaac's arrival last month had more detailed data about the storm's potential impact than they had seven years earlier when they were preparing for Hurricane Katrina.
Researchers at university supercomputing centers in Texas and Louisiana used real-time data to inform emergency workers about what would happen once the hurricane sent water into canals, levees and neighborhoods.
When Katrina hit in 2005, tools for modeling storm surges, while good, were rudimentary compared with what's available today. Back then, Louisiana used computer models with up to 300,000 "nodes," and it took six hours to run a simulation.
For each node, which represents a particular location on a map, algorithms run computations to determine what will happen during a hurricane. The number of nodes represented is roughly analogous to the number of dots per square inch in a photograph: The higher the number, the more detail that's available.
Today, simulations with some 1.5 million nodes can be completed in an hour and a half, said Robert Twilley, an oceanographer and executive director of the Louisiana Sea Grant Program.
Louisiana now uses an unstructured grid: to provide neighborhood-level detail about potential flooding, nodes can be concentrated in the areas that are most vulnerable. The system also helped officials identify the best staging areas for recovery efforts.
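The idea behind an unstructured grid can be illustrated with a small sketch. The example below is hypothetical, not the model Louisiana actually runs; it simply shows how node spacing can be coarse along most of a coastline and refined inside a designated vulnerable region, so that computing effort is spent where detail matters.

```python
def build_nodes(xmin, xmax, coarse_step, fine_regions, fine_step):
    """Place nodes along a 1-D coastline: coarse spacing everywhere,
    fine spacing inside designated vulnerable regions."""
    nodes = []
    x = xmin
    while x <= xmax:
        step = coarse_step
        for lo, hi in fine_regions:
            if lo <= x <= hi:
                step = fine_step  # concentrate nodes in the vulnerable area
                break
        nodes.append(x)
        x += step
    return nodes

# Hypothetical coastline from km 0 to km 100, with a levee-protected
# neighborhood between km 40 and km 50 that needs finer resolution.
nodes = build_nodes(0.0, 100.0, coarse_step=5.0,
                    fine_regions=[(40.0, 50.0)], fine_step=0.5)
inside = [x for x in nodes if 40.0 <= x <= 50.0]
print(len(nodes), len(inside))
```

Most of the grid's nodes end up inside the 10-km vulnerable stretch, even though it is a small fraction of the domain. A uniform grid fine enough to resolve that neighborhood would need ten times the spacing everywhere, which is why unstructured grids let today's models reach 1.5 million nodes while keeping runtimes low.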
This version of this story was originally published in Computerworld's print edition. It was adapted from an article that appeared earlier on Computerworld.com.