Online extra: Utility computing -- IT on a meter

On the last day of 2003, IBM Corp. won a 10-year deal with retail chain Target Corp. to provide IT services. The deal was unusual in that Target would pay only for the services it actually used, under what IBM called a variable, on-demand pricing model. Could government agencies be far behind in signing such deals?

Large enterprises have long relied on systems administrators to service computers, freeing employees from fretting about their machines' well-being. Many agencies have outsourced the job, most notably the Navy, which gave the responsibility to Electronic Data Systems Corp. under the $6.9 billion Navy-Marine Corps Intranet contract.

Utility computing takes consolidation a step further. In much the same way organizations get their electricity through a wall socket and then get a bill based on metered usage, a utility computing customer draws computing power through a network, using personal computers only as gateways to stronger resources elsewhere.

"You pool your computing resources as opposed to running a number of isolated systems, which tend to have lower utilization and higher management and software costs," said Bill Mooz, senior director of utility computing for Sun Microsystems Inc., Santa Clara, Calif.

The industry hasn't defined utility computing clearly, but the idea is taking root with agencies. "We're seeing more customers consider the idea of us running their custom applications," said Tim Hoechst, senior vice president of technology for the public-sector unit of Redwood Shores, Calif.-based Oracle Corp. "It speaks to the idea that a lot of services from a product suite ought to be served up like a utility."

A systems integrator can profit from the buzz surrounding utility computing in many ways. It can provide an agency with these services, or use them to simplify management of its own large government contract jobs.
Or an integrator can help agencies build their own utility services, in which one office provides IT services for the rest of the agency.

INTO THE GREAT UNKNOWN

Utility computing came to wide recognition in October 2002, when IBM of Armonk, N.Y., announced it would spend $10 billion to develop what it called on-demand services, in which it would offer IT services on a metered basis.

"A lot of companies asked, 'How is IBM making money out of this?'" said Jeremy Burton, senior vice president and chief marketing officer of Veritas Software Corp., Mountain View, Calif. "The reason is you can do certain things you couldn't do before, such as sharing storage and service."

Utility computing promises to solve several problems long associated with IT spending, though it requires a new billing model to work, one that isn't commonly used in government.

Traditionally, agencies have faced a cyclical investment model for IT spending, said Peter Marshall, assistant vice president at Candle Corp., Camberley, U.K. Every few years, an agency must replace computers, so IT budgets spike in those years. "For the life of the IT industry, buying computing resources has been an upfront capital expense," Marshall said.

Purchasing involves taking stock not only of what an agency needs now, but of what it will need for the entire life cycle of the equipment, which is always a chancy bet. An agency may not buy enough equipment, then pay premium rates to get more once greater demand hits. Or it overpurchases and winds up wasting money on unused goods.

By buying computing power on a pay-as-you-go basis, agencies can both smooth the spikes in spending and purchase more accurately to fit their needs. The challenge is in shifting more of their IT budgets from capital expenses to operating expenses. Likewise, service providers still must determine how to bill for computing. One company with some work in this area is Sun Microsystems.
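The budget-smoothing argument can be sketched numerically. The figures below are invented for illustration, not drawn from any actual agency budget: an up-front purchase sized for peak demand versus metered spending that tracks what is actually used.

```python
# Hypothetical comparison of up-front capital purchasing vs. metered
# pay-as-you-go spending over one four-year refresh cycle.
# All dollar figures and rates are illustrative assumptions.

REFRESH_COST = 400_000                   # buy peak capacity up front, year 1
ANNUAL_DEMAND = [0.4, 0.55, 0.7, 0.9]    # fraction of peak capacity used
METERED_RATE = 130_000                   # cost of full peak capacity per year

# Capital model: one large spike, then nothing until the next refresh.
capex_plan = [REFRESH_COST] + [0] * (len(ANNUAL_DEMAND) - 1)

# Metered model: each year's bill follows actual demand.
opex_plan = [round(d * METERED_RATE) for d in ANNUAL_DEMAND]

print("capex by year:", capex_plan, "total:", sum(capex_plan))
print("opex by year: ", opex_plan, "total:", sum(opex_plan))
```

Under these assumed numbers the metered plan spends less in total and avoids the year-one spike, but note the trade-off the article describes: if demand had hit peak every year, the metered bills would have exceeded the one-time purchase.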
Sun offers a pay-per-use package in which bills are calculated by the amount of processor and storage use per month, Mooz said. The contracts carry a minimum monthly fee. For these contracts, the company installs servers at the customer site. Sun then bills per central processing unit-minute used. It bases the charge for storage on the average number of gigabytes used per month, and Sun uses its own Net Connect software tool to track usage. The packages include installation and maintenance.

For systems integrators, Sun will bill on what Mooz called a "back-to-back basis." "If an integrator bills its customer on a per-transaction basis and outsources the computation to Sun, Sun will bill the integrator per transaction as well," he said.

Thus far, Sun has signed with Affiliated Computer Services Inc., Dallas, and SchlumbergerSema, the IT consulting division of Schlumberger Ltd., New York. SchlumbergerSema does a lot of international work in government-run transportation and public utility systems. Both integrators "are using our infrastructure to build out a specific solution for their customers to use on a pay-per-use basis," Mooz said. "We're sharing the commitment and sharing the risk."

IN-HOUSE SERVICE

Utility computing is not necessarily the same thing as outsourcing IT services. A large agency can use many of the same tools as a utility computing service provider to simplify its IT architecture by centralizing services under one roof.

Walter Reed Army Medical Center took this approach. Its Directorate of Information Management office wanted to respond faster to Army security bulletins, which often meant reconfiguring or applying patches to large numbers of computers, usually on short notice. "It was a very difficult task," said Jeffrey Goldberg, director of enterprise management systems for Management Solutions and Systems Inc., Capitol Heights, Md.
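A bill along the lines Mooz describes, CPU-minutes plus average monthly storage, subject to a contract minimum, can be sketched as below. The rates, the minimum, and the `monthly_bill` function itself are invented for illustration; the article does not disclose Sun's actual pricing.

```python
# Sketch of a metered monthly bill: per-CPU-minute charge plus a charge
# on the average gigabytes of storage held over the month, with a
# minimum monthly fee. All rates are hypothetical assumptions.

def monthly_bill(cpu_minutes, daily_storage_gb, *,
                 cpu_rate=0.05,      # assumed $ per CPU-minute
                 gb_rate=1.50,       # assumed $ per average GB per month
                 minimum=5_000.0):   # assumed contract minimum
    """Return the charge for one month under the metered model."""
    avg_gb = sum(daily_storage_gb) / len(daily_storage_gb)
    metered = cpu_minutes * cpu_rate + avg_gb * gb_rate
    return max(metered, minimum)     # the minimum fee acts as a floor

# A heavy month is billed on usage; a light month pays the minimum.
heavy = monthly_bill(120_000, [800, 850, 900])
light = monthly_bill(1_000, [50, 50, 50])
print(heavy, light)
```

The same structure extends to Mooz's "back-to-back" arrangement: an integrator billing its customer per transaction would simply substitute a per-transaction rate for the CPU and storage rates here.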
Walter Reed oversees the North Atlantic Medical Command, including 12,000 desktop computers and 300 servers across 22 states. Administrators in each office would have to walk around a large facility, applying patches to each computer. "All these sites had local staff, and everyone had the same tasks. It was being done 40 times in 40 different places," said Goldberg, whose company provides IT support to Walter Reed.

The directorate also has to report the status of the repairs back up the chain of command, so getting updates from each office was crucial. To automate this process, Walter Reed bought Unicenter software from Computer Associates to remotely install the patches and reconfigure desktop computers and servers over the network. "Now, we can disseminate a patch to all those locations without having any remote people involved," Goldberg said.

Government Computer News Associate Editor Joab Jackson can be reached at jjackson@postnewsweektech.com.
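The centralized pattern Walter Reed adopted, pushing one patch to every site from a single console and rolling the results back up the chain of command, can be sketched as follows. This is a toy model: the `apply_patch` stand-in, the site names, and the summary format are all invented for illustration and do not represent Unicenter's actual interfaces.

```python
# Toy sketch of centralized patch dissemination with status roll-up.
# apply_patch is a stand-in for a remote install; here it simulates
# one unreachable site so the failure path is visible.

def apply_patch(site, patch_id):
    """Stand-in for a remote install; returns True on success."""
    return site != "site-07"   # pretend one site is unreachable

def disseminate(sites, patch_id):
    """Push one patch everywhere and summarize the outcome."""
    results = {site: apply_patch(site, patch_id) for site in sites}
    failed = [s for s, ok in results.items() if not ok]
    # A single summary flows back up the chain of command,
    # replacing 40 separate status reports from 40 places.
    return {"patch": patch_id,
            "applied": len(sites) - len(failed),
            "failed": failed}

sites = [f"site-{n:02d}" for n in range(1, 41)]   # "40 different places"
report = disseminate(sites, "bulletin-2003-11")   # hypothetical bulletin ID
print(report)
```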

BladeLogic Operations Manager (BladeLogic Inc., Bedford, Mass.): Software for provisioning and configuring large numbers of servers running Unix, Linux and Microsoft Corp.'s Windows operating systems. It is designed for large data centers, where configuration changes can be enacted across many computers at once. ( http://www.bladelogic.com )

InCharge (System Management Arts Incorporated, White Plains, N.Y.): InCharge monitors network equipment, systems, applications and services, alerting administrators when problems arise and ranking problems by priority. ( http://www.smarts.com )

N1 (Sun Microsystems Inc., Santa Clara, Calif.): Virtualization and provisioning software that allows administrators to view and share Sun-based servers and storage. ( http://wwws.sun.com/software/solutions/n1/index.html )

Oracle 10g (Oracle Corp., Redwood Shores, Calif.): The latest version of Oracle's relational database, scheduled for release at press time, includes improved abilities for sharing work across servers, including better diagnostics, load balancing and monitoring. The company has also released the 10g application server, which uses emerging grid computing standards to pool server power to handle Oracle-based Web services. ( http://www.oracle.com )

Platform JobScheduler (Platform Computing Inc., Basingstoke, U.K.): Graphical software for scheduling multiple jobs on many different types of computers located in multiple locations. ( http://www.platform.com/products/Symphony/ )

Tivoli Orchestrator (IBM Corp., Armonk, N.Y.): IBM's enterprise management software for its blade servers, allowing customers to turn on or off additional servers to meet heavy periods of traffic. ( http://www-306.ibm.com/software/tivoli/features/web-svr-prov/ )

Unicenter (Computer Associates Inc., Islandia, N.Y.): Unicenter is software for managing many computers remotely over a network. It can monitor computers and send out alerts when servers, network equipment and storage devices aren't working properly, and shift work away from troubled areas. Unicenter can also update software and apply security patches to computers. ( http://www3.ca.com/solutions/solution.asp?id=315 )

Veritas OpForce (Veritas Software Corp., Mountain View, Calif.): OpForce balances the load on multiple servers on a network. It automatically detects heavy use and moves work from overworked servers to those underutilized. ( http://www.veritas.com/van/products/itautomation.jsp )

How is utility computing different from grid computing? From on-demand computing? From autonomic computing? With so many buzzwords flying about these days, it's difficult to keep the concepts straight. Nonetheless, distinctions do exist, vendors say.

In utility computing, computational power is drawn from a central resource, and its usage is metered. In the same way that electricity is generated at a power station and distributed, computational power can be generated at one central location, and users can tap into it for large jobs, said Tim Hoechst, senior vice president of technology for the public-sector unit of Redwood Shores, Calif.-based Oracle Corp.

Just as building a large power plant is cheaper than equipping each office and home with its own electric generator, pooling computational resources could cut the costs and management duties for everyone.

In contrast, grid computing is a more distributed approach, allowing computers to share resources with one another instead of having a system draw computational cycles from one main location.

"Rather than have one big machine, you have lots of resources," Hoechst said. Likewise, when your machine sits idle, other selected parties can use the resources.

Autonomic, or self-healing, computing is the ability of computer systems to automatically route around internal problems, said Alan Ganek, vice president at Armonk, N.Y.-based IBM Corp., who gave a presentation on the subject at a conference in Washington last October. An autonomic system monitors its own resources and diverts jobs when it senses equipment faltering. If a hard drive starts to go bad, data moves to another drive. If a server becomes infected with a virus, another server assumes the load.
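The monitor-and-divert behavior Ganek describes can be reduced to a minimal sketch. The server names, health states, and `place_job` routine below are invented for illustration; a real autonomic system would act on live telemetry rather than a hand-set status table.

```python
# Minimal sketch of the autonomic pattern: track component health and
# route new work away from anything that is faltering.

servers = {"db-1": "healthy", "db-2": "healthy", "web-1": "healthy"}

def mark_failing(name):
    """Record that monitoring has detected trouble on a component."""
    servers[name] = "failing"

def place_job(job):
    """Route a job to the first healthy server, skipping faltering ones."""
    for name, status in servers.items():
        if status == "healthy":
            return name
    raise RuntimeError("no healthy servers available")

mark_failing("db-1")                    # e.g. a drive starts to go bad
assignment = place_job("nightly-report")
print(assignment)                       # job lands on a healthy server
```

The same skeleton covers the article's examples: "data moves to another drive" and "another server assumes the load" are both this loop applied to a different kind of resource.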

Like grid computing, autonomic computing requires the ability to quickly shift work to areas where it can be best executed.

Some companies have coined their own unique terms to describe solutions that draw on these technologies. Last month, Hewlett-Packard Co. announced its Adaptive Enterprise initiative. The Palo Alto, Calif.-based company has teamed with several integrators and vendors, such as Accenture Ltd., BearingPoint Inc. and PeopleSoft Inc., to offer enterprise software and hardware support with the ability to contractually scale up or down, according to demand.

IBM is marketing on-demand services, a phrase often used interchangeably with utility computing. However, Roland Harris, managing partner of IBM's public-sector global and Americas business consulting services, insists the two are different. Utility computing describes IT services specifically; on-demand is oriented more toward business process management. In particular, it looks at how IT software and hardware can be developed and procured such that it can handle a variety of conditions, some not even thought of yet.

These terms may seem like a muddle, but the direction they all point to is clear: Agencies and the contractors that support them see the importance of consolidating resources to make them run as smoothly and efficiently as possible.