Contractors see a growing role for high-performance computers as investigative tools
- By Doug Beizer
- Sep 15, 2007
Supercomputers may help investigators determine the cause of the I-35 bridge disaster in Minneapolis Aug. 1. In the wake of the incident, experts on disasters are calling on the federal government to invest more heavily in high-performance computer systems.
When an airplane slammed into the Pentagon in 2001, some destruction was inevitable. As horrific as the day was, the loss of life could have been greater.
Hundreds of lives were spared because of the modifications that had been made to the building, said Dave Perry, senior vice president and product general manager at high-performance-computing company Silicon Graphics Inc. "So while there was a lot of damage in the lower floors, the upper floors didn't collapse on top of them."
Supercomputers played a key role in developing the Pentagon's modifications, Perry said. The high-performance machines are ideally suited for complex work such as modeling how a building will handle various types of attacks.
And in the wake of the I-35 bridge collapse in Minneapolis, supercomputers will play an important role in determining what caused the disaster. Experts say the need for supercomputers to investigate the bridge disaster is a good example of why federal and private dollars should be invested in developing the high-performance systems.
Physics-based modeling
Investigating accidents, preparing for terrorist attacks or designing better body armor are just some of the ways government agencies use supercomputers.
When engineers design a bridge, a high-end PC generally has enough computing power to do the job. With decades of experience designing certain structures, finding the specifications for things such as the size of beams can be straightforward.
"But what we find in a lot of these cases, when something extraordinary happens, then you need more than just what a lookup table or what a low-fidelity piece of software can get you," said John West, a senior fellow at the Defense Department's High-Performance Computing Modernization program. "You need real physics-based modeling because the assumptions you made to build the quick-and-dirty answer failed, and you don't know where to look for the failure."
A supercomputer is capable of doing large-scale, physics-based modeling to determine why a structure failed. For example, the machines were able to determine why the levees failed in New Orleans when Hurricane Katrina struck.
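The difference between the lookup-table approach and physics-based modeling can be illustrated with a toy example. The sketch below is only a single-beam hand calculation, nowhere near the large-scale models the article describes, and every dimension, load and material grade in it is a hypothetical assumption:

```python
# Illustrative sketch only: one simply supported beam under a uniform
# load, checked against a nominal yield strength. Real investigations
# use full finite-element models, not closed-form formulas like this.
# All numbers below are made up for illustration.

def max_bending_stress(load_per_m, span_m, section_modulus_m3):
    """Peak bending stress (Pa) in a simply supported beam under a
    uniform load: sigma = M_max / S, with M_max = w * L^2 / 8."""
    m_max = load_per_m * span_m ** 2 / 8.0
    return m_max / section_modulus_m3

YIELD_STRENGTH_PA = 250e6  # nominal structural steel (hypothetical grade)

stress = max_bending_stress(load_per_m=50_000.0,   # 50 kN/m
                            span_m=30.0,
                            section_modulus_m3=0.03)
print(f"peak stress: {stress / 1e6:.0f} MPa, "
      f"utilization: {stress / YIELD_STRENGTH_PA:.2f}")
```

A formula like this is exactly the kind of "quick-and-dirty answer" West refers to; it assumes an idealized beam and intact material, which is why an extraordinary failure calls for full physics-based simulation instead.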
"The idea with a supercomputer is you can bring thousands and thousands of times more computational capability to bear on a single problem and really shine a light in the dark corners that you haven't been able to see into before," West said.
After terrorist attacks in other countries in the 1990s, supercomputers were used to help investigators understand how and why buildings fail when hit with explosives. The computers take into account factors such as the type of explosive used, where the truck bomb was parked and which way it was oriented.
That kind of data helped with the retrofit at the Pentagon. The project was supercomputer-based civil engineering that went beyond engineering standards and rules of thumb. Precise, physics-based simulations were run to test the limits of how the structure would behave under various types of stress.
For the investigation in Minneapolis, a supercomputer will create detailed materials models with knowledge of the properties of steel beams. The models will factor in age and environmental degradation to determine how those factors would result in a beam deviating from accepted specifications.
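The idea of folding age and environmental degradation into a materials model can be sketched very simply. The snippet below is a hypothetical illustration, not the investigators' method: it assumes uniform corrosion thinning a steel plate at a made-up rate, which in turn reduces the member's capacity:

```python
# Hypothetical sketch: how degradation might enter a materials model.
# Corrosion uniformly thins a plate, so its cross-sectional area, and
# with it the axial capacity, shrinks. The corrosion rate and plate
# dimensions are illustrative assumptions, not data about the bridge.

def corroded_thickness(t0_mm, years, rate_mm_per_year=0.05):
    """Remaining plate thickness after uniform corrosion loss."""
    return max(t0_mm - rate_mm_per_year * years, 0.0)

def capacity_ratio(t0_mm, years, rate_mm_per_year=0.05):
    """Fraction of original axial capacity left; for a plate, area
    (and thus capacity) scales linearly with remaining thickness."""
    return corroded_thickness(t0_mm, years, rate_mm_per_year) / t0_mm

# A 12 mm plate after 40 years of service at the assumed rate:
print(f"remaining capacity: {capacity_ratio(12.0, 40):.0%}")
```

A real materials model would capture pitting, fatigue cracking and temperature effects in far more detail; the point is only that age enters the model as a quantified reduction from the as-built specification.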
"In this application they could use supercomputers to go back and understand exactly what the mechanism of failure was and whether materials degradation had anything to do with it," West said. "Or once the failure started, what are some theories on how it could have been transmitted."
The physical investigation in Minneapolis will also play an important role. Investigators will try to piece together as much as they can from the failed components. They'll look at how materials sheared and twisted. That work should help them develop a handful of theories about the causes of the bridge collapse. From there, they can use supercomputers to model those theories to determine which one is correct.
Scientific visualization
To make sense of all the data the supercomputers will produce, scientific visualization, a related discipline, will also likely be used in the Minneapolis investigation.
High-performance computers don't provide a single answer to a problem but rather a series of numbers describing the motion at every grid point of a structure. The result is trillions of numbers that are hard to decipher on their own.
Scientific visualization can animate the data so it makes sense to nonengineers and lay people.
"We take that data and use it in software like Hollywood uses for special effects," West said. "What that allows us to do is show physically accurate computed data in a way that resembles the real world. Anyone can look at it and understand. It's not just a cartoon, it is a technical computer simulation."
Although a supercomputer is a perfect tool for post-accident investigations, Perry said, the machines will play a bigger role in ensuring accidents don't happen in the first place.
After the Space Shuttle Columbia crash, supercomputers were used to model how much damage a foam chunk from the fuel tank could inflict on the spacecraft's wing. Those models led to an experiment where foam was shot out of a cannon at a simulated wing to reproduce the failure.
In the future, those kinds of supercomputer models will be done during a mission to avoid a disastrous accident.
"The future of high-performance computing is being able to use it in real time," Perry said. "They will be an interactive part of a process in order to inform decisions in real time."
Staff writer Doug Beizer can be reached at firstname.lastname@example.org.
Doug Beizer is a staff writer for Washington Technology.