The Latest Word in Storage Area Networks ... Or Is It?

A Whole New Game For Integrators

Understanding SAN

A storage area network is a high-speed subnetwork of shared storage devices, which are machines that contain nothing but a disk or disks for storing data.

Virtualization is the pooling of physical storage from multiple network storage devices into what appears to be a single storage device that is managed from a central console. Using virtualization, administrators can easily allocate as much or as little storage as desired to any application or user on the network.

When the storage area network first came onto the scene, its proponents promised that managing storage would be faster, simpler, easier and less resource-intensive.

As many industry observers see it, however, the technology hasn't always lived up to its promise, and very often ends up adding extra cost, complexity and headaches for storage administrators.

Virtualization, the next step toward automated data storage, has finally arrived, and not surprisingly, its proponents are beginning to make their own lofty promises.

Virtualization is a software management tool. It improves upon traditional storage area networks, or SANs, by allowing administrators to pool all or portions of a SAN's disks, and then hand out logical slices to application servers on an as-needed basis, without having to re-cable or rezone the SAN.
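The pool-and-slice model described above can be sketched in a few lines of Python. This is a conceptual illustration only, not any vendor's implementation; the class, server names, and capacities are all hypothetical.

```python
# Conceptual sketch of SAN storage virtualization: physical disks
# are pooled into one logical reservoir, and logical slices are
# handed to application servers on demand, with no recabling or
# rezoning. All names and sizes here are hypothetical.

class StoragePool:
    def __init__(self):
        self.capacity_gb = 0    # total pooled physical capacity
        self.allocated_gb = 0   # capacity already sliced out
        self.volumes = {}       # server name -> list of slice sizes

    def add_disk(self, size_gb):
        """Add a physical disk's capacity to the pool, regardless of vendor."""
        self.capacity_gb += size_gb

    def allocate(self, server, size_gb):
        """Hand a logical slice to a server on an as-needed basis."""
        if self.allocated_gb + size_gb > self.capacity_gb:
            raise ValueError("pool exhausted")
        self.allocated_gb += size_gb
        self.volumes.setdefault(server, []).append(size_gb)

pool = StoragePool()
pool.add_disk(500)                    # disks from different arrays
pool.add_disk(500)                    # join a single pool
pool.allocate("app-server-1", 300)    # logical slices, sized to need
pool.allocate("app-server-2", 450)
print(pool.capacity_gb - pool.allocated_gb)  # unallocated capacity: 250
```

The point of the abstraction is the last line: servers see only the slices they were given, while the administrator sees one pool of remaining capacity rather than fifty separate devices.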

"Virtualization is simply a technique to hide the undesirable properties of physical storage," said Augie Gonzalez, director of product marketing for DataCore Software Corp., Fort Lauderdale, Fla. "When you do that successfully, it makes disks more manageable, and it presents a more robust storage subsystem to the application."

With a traditional SAN, administrators have to take down the server when they want to move storage capacity from a SAN to a server. With a virtualized SAN, they simply drag and drop storage onto servers that require it.

Proponents say companies get a significant drop in total cost of ownership, need fewer personnel to manage more storage capacity, can easily integrate and use storage products from multiple vendors, and see a surge in performance.

A recent benchmark test performed by Fujitsu Softek, Sunnyvale, Calif., found that its DataCore-powered product, Softek Virtualization, outperformed standalone hardware arrays two to one, more than doubling input/output rates and data transfer speeds.

What's more, the product configuration, which includes a single terabyte Fibre Channel storage subsystem and uses an entry-level Intel-based server, costs just $150,000. By contrast, a traditional SAN, using one terabyte of storage, would cost $1 million.

"Bottom line is virtualization is the nirvana of storage," said Spencer Sells, manager of product marketing for Gadzoox Networks Inc., San Jose, Calif. "It's what the storage industry has been working toward for years."

A small number of bleeding-edge firms, such as DataCore Software, TrueSAN Networks Inc., San Jose, Calif., and StorageApps (recently acquired by Hewlett-Packard Co.), Bridgewater, N.J., already have virtualization products on the market.

Over the next year, a number of major players will jump into the market, according to Bob Passmore, research director for storage at Gartner, an IT research firm in Stamford, Conn. These include Compaq Computer Corp., Houston; Veritas Software, Mountain View, Calif.; IBM Corp., Armonk, N.Y.; and Hitachi Data Systems, Santa Clara, Calif.

The new technology couldn't come at a more opportune time. The data storage market, driven by the use of Web-based applications, multimedia data, and data warehouse and business intelligence implementations, is expected to grow from $6.6 billion in 2001 to $16.7 billion in 2005, according to Gartner Dataquest, Stamford, Conn.

Data replication products have more recently been driving the growth in storage infrastructure, said Carolyn DiCenzo, chief analyst for Gartner Dataquest's storage management software and SAN appliances group. "But [now] virtualization software will further enhance that growth, as new vendors enter the market to support storage resource optimization via pooling of disk storage in SANs," she said.

Passmore said customer demand for easier SAN implementation and management is one of many drivers behind the market.

"I think customers will grab hold of virtualization pretty enthusiastically, because it seems like a basic thing you ought to be able to do," he said.

In fact, federal agencies are already showing keen, if still guarded, interest in the technology, as they are faced with an exploding need for storage capacity, offsite backup and disaster recovery, as well as shrinking personnel budgets.

Dave MacRae, president of InfraStor Technologies Corp., a systems integrator in Princeton, N.J., said his firm recently made virtualization presentations to several agencies. A research laboratory within the Defense Department, in fact, required very little convincing before moving forward to implement this cutting-edge solution, he said.

"They simply could not do what they needed to do without virtualized storage," MacRae said.

With 10 terabytes of storage and a complex mix of nearly 50 storage devices, the Defense Department research laboratory, which MacRae declined to identify at the laboratory's request, needed users in various locations to get at the stored data without sacrificing bandwidth.

What's more, officials were in the process of expanding their storage capacity sixfold, adding even more complexity to their storage problems.

To give the laboratory the functionality it required, InfraStor implemented SANsymphony, the virtualization product from DataCore Software. This solution allows multiple devices to see the same storage volumes at the same time, and allows data to flow on the SAN without being forced to move to the corporate LAN. The product also features a drag-and-drop mechanism that allows an administrator to carve up storage and allocate it to multiple devices at the same time.

MacRae said one of his other federal clients is implementing a virtualized SAN, and another one is leaning toward doing so in the near future.

He said several factors sparked the government's interest, but the biggest one is cost reduction. Not only can virtualization bring down the cost of storage management by as much as 85 percent over direct-attached devices, but it allows organizations to use more of what they've already got in the way of storage capacity.

"You're never going to reach 100 percent, but with virtualization, you're going to increase your efficiency of utilization by anywhere from 30 percent to 50 percent," MacRae said. "With large systems, that translates into a whole lot of dollars, because you don't have to buy additional storage devices."
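MacRae's point can be made concrete with some back-of-the-envelope arithmetic. The figures below are hypothetical, and the quoted 30-to-50-point gain is read here as an increase in percentage points of utilization; only the 30 percent low end comes from the article.

```python
# Hypothetical illustration of avoided purchases from better utilization.
# Assume a site stores 4 TB of data on 10 TB of installed capacity
# (40% utilization), and virtualization lifts utilization by 30
# percentage points, the low end of MacRae's quoted range.

data_tb = 4
utilization_before = 0.40
utilization_after = utilization_before + 0.30   # 0.70

# Installed capacity required to hold the same data at each level:
needed_before = data_tb / utilization_before    # 10.0 TB
needed_after = data_tb / utilization_after      # ~5.7 TB

avoided_tb = needed_before - needed_after
print(round(avoided_tb, 1))  # ~4.3 TB of new purchases avoided
```

Scaled up to the multi-terabyte arrays the article describes, that difference is the "whole lot of dollars" MacRae refers to.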

Still, for all its benefits, virtualization is not a panacea. It faces a number of hurdles, such as overcoming inertia with such a young, unproven technology, and effectively showing the likely return on investment to skeptical buyers.

The term virtualization can also be confusing to potential buyers. Virtualization in this context refers to the ability to pool devices from storage arrays made by various manufacturers into a common storage pool, and then allocate portions of that storage to servers attached to a SAN.

By contrast, some storage vendors will tout virtualization when they are referring to storage provisioning within their specific storage box rather than across boxes. There is also virtualization in tape, which is a completely different technology.

Proponents must be prepared to clearly explain the distinctions.

"I would venture to guess that less than 10 percent of current or potential customers have even heard the word 'virtualization,' " said Erich Flynn, vice president of worldwide marketing for Fujitsu Softek. "It's really new, so it's going to take a fairly visionary IT manager to explore it and, ultimately, put money down on it."

For systems integrators, virtualization promises to remove one of the chief obstacles to playing in the storage area network space: lack of expertise in Fibre Channel technology.

Fibre Channel is a technology for transmitting data between computer devices at a high data rate. Up until now, the necessity for in-depth knowledge of Fibre Channel has forced many vendors to shy away from the market. Virtualization allows integrators to sidestep this barrier.

"One of the most frustrating points I hear from integrators is, 'I've got some SAN implementations, but I don't have or I can't find enough people who know Fibre Channel,' " Flynn said. "But with a relatively basic understanding of Fibre Channel, you can implement a virtualized SAN."

Fujitsu Softek, for example, has already announced that the next release of its Softek Virtualization product will support storage over Internet Protocol in addition to Fibre Channel.

"There is a significant role for integrators when it comes to virtualization," Gonzalez said. "It's no longer a matter of pushing commodity products for the lowest price. Customers need consultation on how to rearrange storage management processes to take advantage of the virtualized storage pool, and how to move data through this well-managed environment."

Although virtualization will allow additional integrators to enter the SAN space, they still need to possess a number of traditional SAN qualifications. For example, they should be familiar with storage arrays from major vendors, such as Hitachi Data Systems, Santa Clara, Calif.; EMC Corp., Hopkinton, Mass.; and LSI Logic Corp., Milpitas, Calif.

Integrators also must understand SAN software pieces such as shared file access software, networking software and hierarchical storage software, and they need a fairly sophisticated knowledge of server technology.

Dave MacRae, president of InfraStor Technologies Corp., a systems integrator in Princeton, N.J., said government integrators need to be prepared to engage in integration testing upfront.

"You can't do it just by sitting at a desk," he said. "You've got to actually try this stuff out, and be able to show the customer and model for the customer what virtualization is able to do on their storage systems."

About the Author

Heather Hayes is a freelance writer based in Clifford, Va.
