SGI's Linux superclustering to open new fed markets

Silicon Graphics Inc. is trying to break open a new government market for high-end computers with technology that aggregates the memory banks of multiple Linux machines.

On Jan. 7, the Mountain View, Calif., company released a family of Linux-based servers, the SGI Altix 3000 line, which incorporates the company's memory-sharing technology. Up to 64 of these computers can be linked together under a single operating system.

The computers can share up to 512 gigabytes of pooled memory. Multiple 64-machine clusters can also be linked, said Addison Snell, who is SGI's manager for high-end marketing.

This advance will open a market for SGI among labs and other government offices that need more computational power than traditional Linux clusters can offer but can't afford, or don't need, the full power of a supercomputer, said Jan Silverman, the company's senior vice president for global marketing.

"Within technical and creative users, there is a huge market that has been trying to get the greater performance out of standards-based environments as a low-cost alternative to proprietary environments. That's the biggest growth area in high-productivity computing," Silverman said.

Clustering Linux computers to aggregate processing power is a growing market, but SGI's solution is unique because only one image of the operating system is used across all machines, he said. All the memory is also shared across all machines. This allows data to be transferred at a speed 200 times faster than that of traditional Linux clusters.

The company also sees it as a lower-cost alternative to other 64-processor high-performance solutions offered by IBM Corp. of Armonk, N.Y., and Hewlett-Packard Co. of Palo Alto, Calif.

Although the memory-sharing technology, called NumaLink, has been available for SGI's proprietary Irix system, its use within a Linux architecture will introduce the technology to a wider audience, according to SGI.

"The biggest interest in the federal space has been in government labs. Certain government labs have been tasked to find scalability in standards-based environments for economic reasons," Snell said.

Silverman said applications might include global climate prediction or laboratory work that requires immense amounts of number crunching, work that cannot be divided up and solved by a number of less powerful machines.

Staff Writer Joab Jackson can be reached at

About the Author

Joab Jackson is the senior technology editor for Government Computer News.
