The National Oceanic and Atmospheric Administration has awarded a contract potentially worth $317 million over nine years to build a high-performance computing system to support environmental modeling.
Now that the 2010 hurricane season has begun, there will be plenty of talk of severe weather in the coming months. And whether it's tornadoes, heat waves or floods, there is little that can be done about the weather itself.
But improved forecasting would go a long way toward ameliorating some of the more devastating effects.
That's a big reason why the National Oceanic and Atmospheric Administration signed Computer Sciences Corp. to a nine-year contract worth as much as $317 million to build a supercomputer for modeling weather patterns.
Modeling and understanding how weather patterns change is about more than rainfall totals and average temperatures; weather shapes manufacturing, industry, politics and the economy, reaching into every corner of life on the planet, now and for generations to come.
The new high-performance computing system that CSC has been tasked with designing, building and implementing to support NOAA's environmental modeling program will need to be a major machine.
It will be, said Mina Samii, vice president and general manager of CSC’s North American Public Sector business services division.
In fact, it will represent “the next generation of high-performance computing, a new architecture, a new design and new hardware,” she said. The agency wants it to be among the top 10 fastest supercomputers in the world.
How big and how fast hasn’t been nailed down yet, “but it’s going to be several orders of magnitude larger than anything they have.”
Size and speed matter greatly. “When you’re doing weather or climate prediction, the larger your model, the more accurate your results,” Samii said. Over the coming year, CSC will use NOAA’s requirements as a starting point. “Our team will look at them and at how we can get the most compute power.”
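The link between model size and compute demand helps explain why NOAA needs so much hardware. As a rough illustration (not from the article): in grid-based weather models, halving the horizontal grid spacing quadruples the number of grid points, and numerical-stability constraints also roughly halve the time step, so cost grows by about a factor of eight each time resolution doubles. A back-of-the-envelope sketch in Python, with purely illustrative grid spacings:

```python
# Back-of-the-envelope scaling for a grid-based weather model.
# Assumption: cost ~ (1/dx)^2 for the horizontal grid, times (1/dt)
# for the time step, with dt proportional to dx (CFL-style stability),
# so cost ~ (1/dx)^3. The 25 km baseline is illustrative, not NOAA's.

def relative_cost(dx_km: float, baseline_km: float = 25.0) -> float:
    """Compute cost relative to a baseline horizontal grid spacing."""
    return (baseline_km / dx_km) ** 3

for dx in (25.0, 12.5, 6.25):
    print(f"{dx:5.2f} km grid -> ~{relative_cost(dx):.0f}x baseline cost")
# 25.00 km grid -> ~1x baseline cost
# 12.50 km grid -> ~8x baseline cost
#  6.25 km grid -> ~64x baseline cost
```

Under these assumptions, the “finer scales” Klimavicz's plan calls for translate directly into much larger machines.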
It goes without saying that more computing power is a high priority for NOAA scientists. But it’s also a high priority for the agency as a whole, and it figures prominently in NOAA CIO Joe Klimavicz’s NOAA IT Strategic Plan for 2010.
“NOAA stakeholders have been calling for the agency to improve the quality and management of its environmental observations, data, monitoring, forecasts and predictions,” Klimavicz wrote in his 2010 plan.
“Federal, regional, state and local decision-makers rely on NOAA for credible climate information at finer scales to support strategies to mitigate and adapt to climate variability and change, including long-term resource management practices and public infrastructure decisions.”
Additionally, “the public relies on NOAA for accurate and timely hurricane, tropical cyclone and tropical storm monitoring and forecasting that provides the lead-time and storm intensity information needed to make decisions regarding coastal evacuations and mobilization of resources.”
That’s no idle talk, Samii said. “We talked with the CIO, and his vision is that he wants to take availability of the hardware out of the equation for predicting climate change.” As with any computer technology, the hardware for high-performance computing continually evolves, she said.
Additionally, “the hardware is not nearly as expensive as it was,” she said. Klimavicz’s “vision is: If they need compute power, it should be there for them. They should be able to concentrate on running their models and analyzing the results and getting their data out.”
“Because NOAA is at the leading edge of climate research, they want to make this [supercomputer] state of the art,” Samii said. And her team has the expertise and experience to deliver, she said.
CSC has a substantial record in supercomputing: It maintains a high-performance computing center of excellence with about 150 subject-matter experts and dedicated facilities.
This summer, it will hold two-week summer school sessions in scientific and high-performance computing for graduate students in the computational sciences, such as computational chemistry, physics, biosciences or engineering, teaching them how to use supercomputers to solve scientific problems.
The company also holds supercomputer design contracts for NASA high-performance computing centers at Ames Research Center and Goddard Space Flight Center. NASA in March extended CSC’s contract (for an additional $57 million) for a year to support the Ames Advanced Supercomputing Division.
“We just did a big upgrade to Pleiades, their latest system,” Samii said. Pleiades operates at 544 teraflops (544 trillion floating point operations per second) and was No. 6 in November’s Top 500 world supercomputer rankings.
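For context on those figures: one teraflop is 10^12 floating-point operations per second, so Pleiades’ 544 teraflops is 5.44 × 10^14 operations per second, and “several orders of magnitude larger” (as Samii put it earlier) is just the base-10 logarithm of the ratio between two systems. A quick sanity check in Python; the 5-teraflop comparison system below is hypothetical, not a figure from the article:

```python
import math

# Pleiades' rating, from the article: 544 teraflops.
TERA = 1e12
pleiades_flops = 544 * TERA

print(f"{pleiades_flops:.2e} floating-point ops/sec")  # 5.44e+14

# "Orders of magnitude" between two systems is log10 of their ratio.
# The 5-teraflop baseline is purely illustrative.
baseline_flops = 5 * TERA
orders = math.log10(pleiades_flops / baseline_flops)
print(f"~{orders:.1f} orders of magnitude faster")  # ~2.0 orders of magnitude faster
```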
The new NOAA machine will be comparable to NASA’s Pleiades at Ames, she said, but her team will also be drawing from its work supporting high-performance computing for climate research at Goddard.
NOAA currently has research and development supercomputers at three locations. Part of CSC’s design work must take into account that the agency’s strategic IT plan calls for consolidating and migrating those operations onto the new supercomputer at a single location, plus an additional backup site, Samii said.
Centralizing the hardware “will make it a lot more efficient, easier to access and upgrade, and add compute capability,” she said. Researchers will run the software and get the results remotely.
Working with such mega-power users has its own challenges. “They’re very sophisticated about the technology, so we have to stay on top of things to be able to bring value to them,” Samii said. “We had a kick-off meeting with NOAA last week. They’re very knowledgeable, but they’re also very, very nice. I think we’ll have a good time.”
Another challenge isn’t primarily CSC’s but NOAA’s, she said. “Part of their strategic plan is ending up with a system that makes the transition from research to operations easier and more seamless.”
In addition to its R&D high-performance computing system, NOAA has an operations high-performance computing system, which figures prominently in the agency’s strategic plan. Klimavicz plans to “apply virtualization and cloud computing principles to HPC optimization and HPC availability,” he said in the plan.
His vision is “to have a research system that can move smoothly into development and integration and then move smoothly into operation so they don’t have to rewrite their software or have systems that look completely different,” Samii said. Effecting that transition is not part of CSC’s contract, she said, “but we have to be aware of it when we design their research system.”
That’s a little trickier than it may seem on the surface, because the way the two systems are used is so different.
“For the research system the researchers want something that gives highly reliable, accurate results, but they might not worry if the system is down for an hour or two,” she explained.
Not so for the operations side. “They can’t afford anything like that when they have to predict the weather.”
How the dichotomy in use will be resolved in CSC’s design is something for the future, like most of the project — “we don’t even know where the primary location will be,” Samii said. “Come back in six months and I can tell you more.”
The Research and Development High Performance Computing System contract is an indefinite-delivery, indefinite-quantity contract that has a four-year base period, one four-year option and one one-year transition option.
The first year will be funded at $49.3 million by NOAA using funds from the American Recovery and Reinvestment Act of 2009, CSC said.