Big weather needs big computers
- By Doug Beizer
- Aug 14, 2005
A computer-generated NOAA satellite image shows Hurricane Emily (right) in the Gulf of Mexico and Tropical Storm Eugene (left) off the western coast of Mexico July 19.
Using wind tunnels to test airframe designs was a pivotal scientific breakthrough. Moving from wind tunnels to computer modeling was another leap.
But as modeling has become more complex, scientists have turned to supercomputers. Nowhere was that more evident than in NASA's return-to-flight effort after the space shuttle Columbia accident in 2003.
"In the wake of the Columbia disaster, NASA undertook a pretty substantial effort of doing a lot of computer analysis to try and understand what went wrong," said Dave Parry, senior vice president and general manager for Silicon Graphics Inc.'s server and platform group. "They wanted to find out exactly what the malfunction had been and whether a piece of foam breaking off could cause damage sufficient to ultimately cause the problems on re-entry."
NASA officials tackled the problem using Columbia, a supercomputer built on SGI's Altix architecture, with 10,240 processors in a shared-memory environment.
"It is as though you have a PC with tons and tons of microprocessors in it, but all sitting on a single, shared-memory system, so that you can look at the entire problem as one, holistic thing," Parry said.
A less powerful computer would force analysts to break a model into pieces, so that rather than analyzing a whole plane, they could study only its rudder.
Supercomputing power is also essential to the climate research the Commerce Department's National Oceanic and Atmospheric Administration is doing, said Walter Brooks, chief of NASA's advanced supercomputing division at NASA Ames Research Center, Moffett Field, Calif.
In the past, computer models had to focus on specific regions, such as the Caribbean. The modeling application then made assumptions about what was going on elsewhere on the planet. Sometimes it worked, sometimes it didn't.
SGI's Altix system solves that problem by modeling much larger areas at a much higher resolution.
"We basically run the whole Earth simultaneously, so while we're tracking something such as Hurricane Emily, we're also seeing the typhoons that are evolving near Japan," Brooks said. "Sometimes the effects are important, and sometimes they're not, but if you ignore them, then you don't know when these global effects drive what's going on closer to home."
Using supercomputers, NOAA officials in the Geophysical Fluid Dynamics Laboratory can better study climate issues such as El Niño and global warming, said Ellis Bailey, Raytheon Co.'s program manager for the lab. Raytheon, ranked No. 7 on Washington Technology's Top 100 prime contractors list, is the systems integrator working with NOAA at the lab.
The studies are important because climate change can lead to severe storms, floods or droughts, which can have significant economic impacts. Under its contract with NOAA, Raytheon officials help define the computational requirements needed for computer modeling.
"Numerical models of this nature are so complex and have to look with a significant amount of detail over such large periods of time, they can only be performed on the most powerful computers available," Bailey said. "Without a supercomputer, this work would be virtually impossible."
The biggest challenge for a systems integrator working with a supercomputer is maintaining a balance among all the machine's components. Raytheon can run fast models, such as tracking a hurricane, as long as file access, processor speed and memory access are all balanced, Bailey said.
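Bailey's point about balance can be seen in a back-of-the-envelope calculation: a model run proceeds only as fast as its slowest subsystem, so adding processor speed alone buys nothing if data cannot be fed to the processors fast enough. The figures below are invented for illustration, not measurements of any real machine:

```python
# Toy bottleneck model: a simulation's sustained rate is capped by whichever
# subsystem -- compute, memory or file I/O -- is the slowest feeder.
# All numbers are hypothetical.

def sustained_rate(compute_gflops, mem_bw_gbs, io_bw_gbs,
                   flops_per_byte_mem, flops_per_byte_io):
    """Effective GFLOP/s: the minimum of what each subsystem can sustain."""
    return min(compute_gflops,
               mem_bw_gbs * flops_per_byte_mem,   # work memory traffic supports
               io_bw_gbs * flops_per_byte_io)     # work file access supports

# A machine with fast processors but slow file access is I/O-bound:
print(sustained_rate(6000, 200, 2, flops_per_byte_mem=4, flops_per_byte_io=100))
# Doubling only the processors changes nothing; the bottleneck is elsewhere:
print(sustained_rate(12000, 200, 2, flops_per_byte_mem=4, flops_per_byte_io=100))
```

Both calls return the same 200 GFLOP/s, which is why an integrator tunes file access, memory and processors together rather than any one in isolation.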
In the past, SGI built supercomputers using microprocessors it had designed and developed along with its own operating system. The new Altix systems use Intel Itanium 2 processors and the Linux operating system.
Use of industry-standard components, such as Intel Itanium 2 microchips, speeds building of the supercomputers. SGI has built systems that use as few as two processors and as many as 10,240 in NASA's Columbia supercomputer.
NASA investigators needed that kind of power to effectively study the Columbia accident. With it, SGI's Parry said, "they were able to determine both that it was possible for a piece of foam the size of what did break off to cause the damage it did, as well as provide an accurate model so they only had to run one physical test."
Scientists used a cannon to shoot foam at shuttle tiles to verify the model. A simulation accurately replicated what had gone wrong with the shuttle, Parry said.
Other agencies and system integrators also are turning to supercomputers to help solve complex problems, he said. Homeland Security Department officials are using one to manage air space. The system uses data from radar sites around the country to track aircraft.
In the future, supercomputers will be common in all federal agencies, Parry said.
"We have virtually all of the big systems integrators as our partners and customers," he said. "They are working with us to use the Altix system architecture as a baseline on which they can add their value and their vertical integration capability."
If you have an innovative solution that you recently installed in a government agency, contact Staff Writer Doug Beizer at email@example.com.
Doug Beizer is a staff writer for Washington Technology.