Virtualization: The Latest Word in Storage Area Networks ... Or Is It?
- By Heather Hayes
- Nov 30, 2001
When the storage area network first came onto the scene, its proponents promised that managing storage would be faster, simpler, easier and less resource-intensive.
As many industry observers see it, however, the technology hasn't always lived up to its promise, and very often ends up adding extra cost, complexity and headaches for storage administrators.
Virtualization, the next step toward automated data storage, has finally arrived, and not surprisingly, its proponents are beginning to make their own lofty promises.
Virtualization is a software management tool. It improves upon traditional storage area networks, or SANs, by allowing administrators to pool all or portions of a SAN's disks, and then hand out logical slices to application servers on an as-needed basis, without having to re-cable or rezone the SAN.
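The pooling idea can be illustrated in a few lines of code. This is a hypothetical sketch, not any vendor's actual API: a pool object folds in physical disks and hands out logical slices to servers, with no knowledge of which array a given gigabyte came from.

```python
# Hypothetical sketch of storage pooling. Class and method names are
# illustrative only; real virtualization products expose this through
# management consoles, not a Python API.

class StoragePool:
    def __init__(self):
        self.capacity_gb = 0   # total capacity contributed by all disks
        self.allocated_gb = 0  # capacity already handed out as volumes
        self.volumes = {}      # server name -> list of volume sizes (GB)

    def add_disk(self, size_gb):
        """Fold a physical disk's capacity into the common pool."""
        self.capacity_gb += size_gb

    def allocate(self, server, size_gb):
        """Hand a logical slice to a server -- no recabling, no rezoning."""
        if self.allocated_gb + size_gb > self.capacity_gb:
            raise ValueError("pool exhausted")
        self.allocated_gb += size_gb
        self.volumes.setdefault(server, []).append(size_gb)

pool = StoragePool()
pool.add_disk(500)   # disks from different arrays join one pool
pool.add_disk(300)
pool.allocate("app-server-1", 200)
pool.allocate("app-server-2", 450)
print(pool.capacity_gb - pool.allocated_gb)  # 150 GB still free
```

The point of the abstraction is visible in the last line: capacity is tracked as one number for the whole pool, so free space on any physical disk can satisfy any server's request.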
"Virtualization is simply a technique to hide the undesirable properties of physical storage," said Augie Gonzalez, director of product marketing for DataCore Software Corp., Fort Lauderdale, Fla. "When you do that successfully, it makes disks more manageable, and it presents a more robust storage subsystem to the application."
With a traditional SAN, administrators have to take down the server when they want to move storage capacity from a SAN to a server. With a virtualized SAN, they simply drag and drop storage onto servers that require it.
Proponents say companies get a significant drop in total cost of ownership, need fewer personnel to manage more storage capacity, can easily integrate and use storage products from multiple vendors, and see a surge in performance.
A recent benchmark test performed by Fujitsu Softek, Sunnyvale, Calif., found that its DataCore-powered product, Softek Virtualization, outperformed standalone hardware arrays by two to one, more than doubling input/output rates and data transfer speeds.
What's more, the product configuration, which includes a single terabyte Fibre Channel storage subsystem and uses an entry-level Intel-based server, costs just $150,000. By contrast, a traditional SAN, using one terabyte of storage, would cost $1 million.
"Bottom line is virtualization is the nirvana of storage," said Spencer Sells, manager of product marketing for Gadzoox Networks Inc., San Jose, Calif. "It's what the storage industry has been working toward for years."
A small number of bleeding-edge firms, such as DataCore Software, TrueSAN Networks Inc., San Jose, Calif., and StorageApps (recently acquired by Hewlett-Packard Co.), Bridgewater, N.J., already have virtualization products on the market.
Over the next year, a number of major players will jump into the market, according to Bob Passmore, research director for storage at Gartner, an IT research firm in Stamford, Conn. These include Compaq Computer Corp., Houston; Veritas Software, Mountain View, Calif.; IBM Corp., Armonk, N.Y.; and Hitachi Data Systems, Santa Clara, Calif.
The new technology couldn't come at a more opportune time. The data storage market, driven by the use of Web-based applications, multimedia data, and data warehouse and business intelligence implementations, is expected to grow from $6.6 billion in 2001 to $16.7 billion in 2005, according to Gartner Dataquest, Stamford, Conn.
Data replication products have more recently been driving the growth in storage infrastructure, said Carolyn DiCenzo, chief analyst for Gartner Dataquest's storage management software and SAN appliances group. "But [now] virtualization software will further enhance that growth, as new vendors enter the market to support storage resource optimization via pooling of disk storage in SANs," she said.
Passmore said customer demand for easier SAN implementation and management is one of many drivers behind the market.
"I think customers will grab hold of virtualization pretty enthusiastically, because it seems like a basic thing you ought to be able to do," he said.
In fact, federal agencies are already showing keen, if still guarded, interest in the technology, as they are faced with an exploding need for storage capacity, offsite backup and disaster recovery, as well as shrinking personnel budgets.
Dave MacRae, president of InfraStor Technologies Corp., a systems integrator in Princeton, N.J., said his firm recently made virtualization presentations to several agencies. A research laboratory within the Defense Department, in fact, required very little convincing before moving forward to implement this cutting-edge solution, he said.
"They simply could not do what they needed to do without virtualized storage," MacRae said.
With 10 terabytes of storage and a complex mix of nearly 50 storage devices, the Defense Department research laboratory, which MacRae declined to identify at the laboratory's request, needed to let users in various locations get at the stored data without sacrificing bandwidth.
What's more, officials were in the process of expanding their storage capacity sixfold, adding even more complexity to their storage problems.
To give the laboratory the functionality it required, InfraStor implemented SANsymphony, the virtualization product from DataCore Software. This solution allows multiple devices to see the same storage volumes at the same time, and allows data to flow on the SAN without being forced to move to the corporate LAN. The product also features a drag-and-drop mechanism that allows an administrator to carve up storage and allocate it to multiple devices at the same time.
MacRae said one of his other federal clients is implementing a virtualized SAN, and another one is leaning toward doing so in the near future.
He said several factors sparked the government's interest, but the biggest one is cost reduction. Not only can virtualization bring down the cost of storage management by as much as 85 percent over direct-attached devices, but it allows organizations to use more of what they've already got in the way of storage capacity.
"You're never going to reach 100 percent, but with virtualization, you're going to increase your efficiency of utilization by anywhere from 30 percent to 50 percent," MacRae said. "With large systems, that translates into a whole lot of dollars, because you don't have to buy additional storage devices."
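A quick back-of-the-envelope calculation shows why that utilization gain matters. The figures below use the laboratory's 10 terabytes from the article; the 40 percent baseline utilization is an assumption for illustration, sitting in the range typical of direct-attached storage at the time.

```python
# Illustrative arithmetic only. The 40% baseline utilization is an
# assumed figure; the 40-point gain is the midpoint of the 30-50%
# improvement MacRae cites in the article.
total_tb = 10           # the laboratory's installed capacity
baseline_util = 0.40    # assumed utilization before virtualization
improved_util = baseline_util + 0.40  # midpoint of the cited gain

usable_before = total_tb * baseline_util  # TB effectively usable today
usable_after = total_tb * improved_util   # TB usable from the same disks
print(usable_after - usable_before)       # extra capacity without new purchases
```

On these assumptions, the same 10 terabytes of hardware yields roughly 4 more usable terabytes, capacity that would otherwise have to be bought.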
Still, for all its benefits, virtualization is not a panacea. It faces a number of hurdles, such as overcoming inertia with such a young, unproven technology, and effectively showing the likely return on investment to skeptical buyers.
The term virtualization can also be confusing to potential buyers. Virtualization in this context refers to the ability to pool devices from storage arrays made by various manufacturers into a common storage pool, and then allocate portions of that storage to servers attached to a SAN.
By contrast, some storage vendors will tout virtualization when they are referring to storage provisioning within their specific storage box rather than across boxes. There is also virtualization in tape, which is a completely different technology.
Proponents must be prepared to clearly explain the distinctions.
"I would venture to guess that less than 10 percent of current or potential customers have even heard the word 'virtualization,' " said Erich Flynn, vice president of worldwide marketing for Fujitsu Softek. "It's really new, so it's going to take a fairly visionary IT manager to explore it and, ultimately, put money down on it."
Heather Hayes is a freelance writer based in Clifford, Va.