IBM Sees Future in Grid Computing

While the new concept of grid computing has started to flourish in the academic community, officials with IBM Corp. already are eyeing corporate and government applications that could follow recent experimental projects the company has won.

On Aug. 9, along with Qwest Communications International Inc., Denver, IBM was chosen to support the National Science Foundation's three-year, $53 million grid computing project. This news came shortly after the company announced it was developing two grids abroad: one for the British government's e-Science Core Programme, a $72 million package to develop a global scientific collaboration infrastructure; and a second one for the Netherlands.

David Turek, IBM vice president of emerging technologies, said IBM of Armonk, N.Y., will build $4 billion worth of grid-related projects over the next few years.

"It especially has attracted a lot of interest by large companies that can use it to connect geographically diverse offices and unify the supply chain," he said.

It can also be used to create virtual databases that pool the content from different offices, and take on large engineering and modeling projects beyond the scope of any single computer.

Computational grids aggregate computers to pool data, applications and processing power, said Richard Hirsh, a National Science Foundation program officer of the Distributed Terascale Facility project. For some time now, servers have been networked in clusters close to each other to offer virtual-supercomputer levels of performance.

Grid computing goes to the next step: It links computers over geographically remote areas to perform as a single, virtual supercomputer.
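The scheduling idea behind that next step can be shown with a small, purely illustrative sketch: independent work units are farmed out to separate sites and the partial results are pooled back into one answer. The site names and the trivial "work" below are hypothetical stand-ins; real grid middleware such as Globus also handles authentication, data movement and fault tolerance.

```python
# Illustrative sketch of grid-style work distribution. The sites and the
# workload (squaring numbers) are hypothetical; real grids coordinate
# heterogeneous machines over wide-area networks.

def split_work(tasks, sites):
    """Round-robin independent tasks across participating sites."""
    assignment = {site: [] for site in sites}
    for i, task in enumerate(tasks):
        assignment[sites[i % len(sites)]].append(task)
    return assignment

def run_site(tasks):
    """Each site computes its share; here the 'work' is just squaring."""
    return [t * t for t in tasks]

sites = ["san_diego", "caltech", "ncsa", "argonne"]  # four cooperating sites
tasks = list(range(8))
assignment = split_work(tasks, sites)

# Pool the partial results back into a single virtual result set.
results = []
for site in sites:
    results.extend(run_site(assignment[site]))
print(sorted(results))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of the sketch is only that the tasks must be independent enough to survive wide-area latency; that constraint is what separates grid-friendly workloads from tightly coupled cluster jobs.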

The National Science Foundation's network will link four high-performance computers into a single, virtual entity, allowing researchers to tackle problems in weather modeling, protein folding, nuclear modeling and other areas previously beyond reach.

This project is a joint undertaking of the San Diego Supercomputer Center, the California Institute of Technology in Pasadena, Calif., the National Center for Supercomputing Applications (NCSA) in Urbana-Champaign, Ill., and the Argonne National Laboratory near Chicago.

"Anyone can build a smaller scale version of a cluster like this," said Hirsh, pointing out that the NCSA already offers free basic software called "Grid-in-a-Box" to build demonstration grids.

IBM's role in the National Science Foundation grid is primarily one of systems integrator, Hirsh said. IBM will provide clustered Linux servers based on next-generation 64-bit Intel Itanium family processors. IBM supercomputing software will handle cluster and file management tasks.

The grid networking protocols and application program interface tools were developed by the open source Globus Project, an effort by the Department of Energy, the University of Southern California and independent developers.

IBM also will use cluster-scalable fiber interconnects from Myricom Inc., Arcadia, Calif., switching equipment from Cisco Systems Inc., San Jose, Calif., database software from Oracle Corp., Redwood Shores, Calif., and its own TotalStorage products. Qwest will provide a dedicated 40-gigabit/sec optical network to connect the four sites.

The facility will go online by mid-2002 and be fully operational by April 2003. According to George Strawn, executive officer for the National Science Foundation's Directorate for Computer & Information Science and Engineering, the grid will be stable enough for everyday use by scientists. And as researchers draw on the resource, developers will be able to work out the bugs in the design.

Future Use

Work on new technology-based projects, such as grid networking, often has a value that goes beyond the dollar amounts for some companies.

Through cooperative agreements and contracts such as the grid network, companies can place themselves on the developmental edge of a new technology, allowing them to build expertise that can be used later in the commercial marketplace, Strawn said.

For instance, Strawn said Sprint Corp. and WorldCom Inc.'s UUNET and MCI divisions worked with the National Science Foundation during the early 1990s to enable widespread Internet access. As a result, they "put themselves in the position to be market leaders" in Internet delivery, he said.

The National Science Foundation sees a similar commercial adoption for grid networking. "We wouldn't mind if it gets picked up by the commercial community," Hirsh said.

Hirsh said future users of grid technology could include law enforcement agencies, which could monitor an entire city for gunshots and other disturbances via a citywide network of sensors. Weather could be predicted more accurately with larger-scale computations. Engineering teams could run large-scale design tests without purchasing a supercomputer.

Even agencies not needing the power of a supercomputer can benefit. Those with a vast number of outposts, such as the U.S. Customs Service, could greatly simplify operations by using a central computing infrastructure with distributed nodes, Hirsh said.

Grid networks can aggregate information into single, virtual databases, Turek said. A pharmaceutical company or a health agency could use a computer grid to do protein folding, using genomic information that might otherwise be scattered across multiple sites in different formats.

"It can work across six, eight, 10 or 20 organizations," he said.

Products On the Street

IBM has announced it will grid-enable many of its own products in the coming years.

Sun Microsystems Inc. of Palo Alto, Calif., offers Grid Engine, software that manages spare processor cycles on Sun workstations. The company claims an animation-rendering job that normally takes nine and a half hours on a single workstation can be cut to 10 minutes when spread among 20 workstations.

In December 2000, Silicon Graphics Inc. of Mountain View, Calif., announced that NASA's grid team had successfully integrated an SGI Origin 2000 server cluster into NASA's grid infrastructure. A Silicon Graphics spokesperson said the NASA grid project used Origin 2000 clusters almost exclusively.

On the academic front, Strawn said that around the end of September the National Science Foundation will announce the winner of a $10 million project to develop network-centric middleware to allow non-research-related applications to be used in gridlike networks. Technologies such as these may shift grid computing into everyday use.

"Grid computing has possibilities that go far beyond use in the academic community," Turek said.

Recent IBM Supercomputing Contracts

In addition to its grid contracts, IBM Corp. has been busy on the old-fashioned supercomputing front. Announcements from August include:

• Blue Gene Research Project: IBM is undertaking a $650,000 cooperative research and development agreement with the Department of Energy's Oak Ridge National Laboratory to apply advanced protein science to IBM's next-generation cellular architecture supercomputer design. Unlike today's computers, cellular servers will run on chips that combine processors with memory and communications circuits. In this field, IBM also signed a partnership with biological modeling software maker Physiome Sciences Inc., Princeton, N.J. Physiome will use IBM's supercomputers for biological research, and IBM will license the company's biological modeling technology for its internal use.

• ASCI White: Commissioned by the Department of Energy's National Nuclear Security Administration's Accelerated Strategic Computing Initiative, this IBM supercomputer went online this month with what the company claims is more speed than the next three most powerful supercomputers on Earth combined: more than 12.3 trillion floating point operations per second. The $110 million supercomputer, the third of five successively faster machines to be built for the project, will help scientists maintain the safety and reliability of the U.S. nuclear stockpile by simulating in three dimensions the aging and operation of nuclear weapons.

• Department of Energy's Weather Modeling Project: Oak Ridge National Laboratory will install an IBM eServer supercomputer to better predict long-range climate trends, as well as tackle a wide spectrum of other scientific projects, particularly in studying how global warming affects agricultural output and water supply levels.

About the Author

Joab Jackson is the senior technology editor for Government Computer News.
