Client/Server: Computing Architecture Pendulum Swings Again
By Ed McKenna
A decade after the networked desktop began its assault on the supposedly doomed mainframe, the market is shifting back to a compromise hybrid that combines features of both computing styles.
"The pendulum, which started out with mainframe computing and then swung all the way to the other extreme, which was the client/server environment, is now moving back toward the middle," said John Leahy, group manager for government affairs and public relations at Sun Microsystems Federal, McLean, Va.
The mainframe is making a return as a mega-server for networks of Internet desktops and of high-powered, low-cost desktops. However, this centralized architecture, which is being revived by the promise of the emerging Java programming language and of low-cost smart terminals dubbed network computers, faces stiff competition as the purchase cost and management burden of client/server systems continue to drop month by month.
Just what the compromise solution will be is still in dispute. Leahy foresees a distributed computing environment incorporating a range of systems, including mainframes, a variety of client/servers and the Internet.
"Although client/server technology is by no means new, many firms, including government organizations, are still struggling with enterprise-scale, mission-critical systems deployment and will continue to do so," said Steve Bonadio, an analyst with the Sentry Group Inc., a Westborough, Mass.-based market research firm.
Issues of cost, reliability and scalability have scuttled many attempts to move from the mainframe to the client/server environment, according to industry experts. This is especially true of the public sector, with only 21 percent of federal and state government organizations having deployed their enterprisewide, mission-critical systems on client/server technology, compared to 29 percent for nongovernment firms, Bonadio said.
Despite the difficulties and cost of implementing client/server networks, these systems offer users several big advantages. "The major reason for going to client/server is to provide flexibility and capability for all of your people, giving them access to the resources they need to get their jobs done without having to be geographically right next to the system," said Bill Dwyer, chief technologist for government business at Hewlett-Packard Co. of Palo Alto, Calif.
Seeing the advantages of client/server over older-style mainframes, many government organizations are opting for such systems, creating prime opportunities for vendors.
The Navy-sponsored Super-Minicomputer Program, valued at up to $2.5 billion, is one example of a major government client/server project under way. The project, managed by the General Services Administration, includes a roster of about 50 software, hardware and service providers, including such industry heavyweights as Microsoft, Hewlett-Packard and Oracle. Litton-PRC Inc., McLean, Va., is the prime contractor on this indefinite-delivery, indefinite-quantity contract begun in 1993 and slated to run through 2002.
On a smaller scale, client/server specialist RMS Information Systems Inc., Lanham, Md., helped NASA migrate its huge Center for Aerospace Information data repository from a mainframe to client/server system built on large IBM and Sun Microsystems servers. RMS also has been enlisted by the Federal Aviation Administration, the Department of Defense and Coast Guard to help those agencies devise and deploy distributed systems.
But high costs and technical complexity have made these systems difficult to install and manage, some industry officials said. Research by Standish Group International Inc. of Dennis, Mass., indicates that only 27 percent of client/server projects in the United States have been completed on time and on budget.
"A lot of people that have tried to implement the client/server model have found out that, first of all, they're costly to implement and then they're even more expensive to maintain," said Jerry Sheridan, an analyst with Dataquest, San Jose, Calif.
Chip Bumgardner, senior vice president and chief technical officer at BTG Inc. of McLean, Va., said there are a lot of costs associated with client/server systems that customers don't realize at first. For example, they may discover that they need 64 megabytes of memory and Pentium II microprocessors on every desktop to run planned applications, he said.
"On a per-user basis, PC/LAN installations and distributed Unix systems are typically between three and six times more expensive than centralized computing," said Robert Simko, executive director of the International Technology Group, Mountain View, Calif.
Beyond the initial investments in the servers, which can cost up to $1 million, and in the PCs or workstations, there are considerable costs involved in managing the dispersed and often disparate systems on the network and in ensuring that the system can grow with increasing demand.
The issue of system availability is a downside for the client/server approach. Mainframes are consistently available more than 99 percent of the time, said Mike Miller, senior vice president of federal sales at Computer Associates International Inc., Islandia, N.Y.
"When you deploy critical applications and systems in the distributed [client/server] world, the reality of being able to guarantee a 99 percent plus availability just doesn't happen," he said. "In fact, a lot of our clients within DoD and the civilian agencies strive for 80 percent availability."
In addition, the explosive growth of World Wide Web-based technologies is straining the capacity of client/server systems, especially those linked to small or midrange servers. "In the past you could go to a distributed environment and use a [Windows] NT server because you were accessing maybe a couple of hundred users internally in the corporation," Simko said. "Now you're opening it up through Internet, intranet and extranet, and suppliers and customers will all become users."
"You must have high availability," he said. "If I hit your Web site and I can't get to you because you ran out of steam, I'm an irritated customer."
Trends in Client/Server
In the last year, Computer Associates, Sun Microsystems and Hewlett-Packard have introduced network management programs that have eased the difficulties of administering client/server systems for many government agencies.
In late 1997, for example, the U.S. Air Force Air Mobility Command selected Computer Associates' Unicenter TNG to manage all IT resources of the AMC Command and Control Information Processing Systems, a worldwide network of computer sites for which Computer Sciences Corp. of El Segundo, Calif., is prime contractor. The network is used by AMC to plan, monitor, and control airlift and tanker aircraft operations. The Social Security Administration and the U.S. Army are among other users of Unicenter, said Miller.
Another trend picking up steam over the last year has been a movement to consolidate or re-centralize client/server networks to cut costs and boost the scalability of systems.
"We're seeing a lot of technology and activity built around supporting that activity," said Sheridan. At the end of last year, for example, major computer makers, such as NCR and Hewlett-Packard, rushed so-called eight-way systems with eight CPUs to the market to boost the capacity of the Windows NT environment.
Availability and scalability are critical issues, said Dwyer, who noted, however, that many Unix systems are rapidly approaching, if not surpassing, the availability associated with the older mainframe environment.
Both Hewlett-Packard and Sun offer enterprise servers that are much more powerful than the NT machines. The HP 9000 V2200 server scales up to 32 CPUs, and Sun's Enterprise 10000 up to 64 CPUs. Both systems can then be clustered to boost their capacity to 256 CPUs.
The consolidation movement also seems to have boosted the fortunes of the mainframe, now looked at as a potential super server.
"The mainframe has had a history of managing large workloads with high availability, mission-critical type applications, and we think that suits it very well to be the super server in this new consolidation role," said Wendy Culberson, large systems brand manager for IBM Federal.
The mainframe has clearly not been put out of business by network computing, as many pundits in the late 1980s said it would. A 1996 survey of information system spending by Computer Economics, an IT consulting company based in Carlsbad, Calif., shows across-the-board increases in mainframe spending for all market sectors, with government spending up 25 percent.
Agencies such as NASA are consolidating legacy applications back onto mainframes as they strategically deploy client/server systems.
"When you consolidate, you can't consolidate on a small computer, you have to increase the hardware," said Steve Kooms, RMS program manager for NASA.
"Frankly, the mainframes have been very reliable over the years," he said. Under its Computing, Communications and Networking Support contract with NASA, RMS provides a broad range of computing infrastructure and applications support for the Computer Services Division of NASA's Lewis Research Center.
IBM's Culberson said 1997 was a positive year for her division because of the resilience of mainframes. "It's the resurgence of the mainframe in general," she said. "People are keeping their applications on the mainframe and those applications that were there are growing, and they're Web-enabling their mainframes."
Despite its longevity, the mainframe's future remains in question.
"It is very much limited and will continue to decline over time," countered Dwyer. "The reason being there is essentially only one company now, IBM, that's supporting that kind of environment, and the new applications are not being developed for those kinds of environments."
For now, many organizations are adopting multitiered systems incorporating a number of different systems, including the mainframe.
"When people first started formulating the client/server paradigm, it was a series of clients and one server; and probably in early 1996, we saw a phenomenon where some of the installations were incorporating legacy systems and then certainly Web servers," Sheridan said. "Now you might have a mainframe with all of the data on it, a Web server fronting that, and that Web server would be attached to the multitude of clients."
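The three-tier layout Sheridan describes — thin clients talking to a Web server, which in turn fronts a mainframe holding all the data — can be sketched in a few lines. This is an illustrative toy, not code from any system named in this article; the function and data names are invented for the example.

```python
# A minimal sketch of a three-tier architecture: many thin clients call
# a web tier, which alone is allowed to touch the back-end data tier.

# Stands in for the mainframe's data store (all names are hypothetical).
DATA_TIER = {"flight_1234": "on time", "flight_5678": "delayed"}

def data_tier_lookup(key):
    """Back-end tier: owns all the data, as the mainframe would."""
    return DATA_TIER.get(key, "unknown")

def web_tier_handle(request):
    """Middle tier: fronts the data tier and formats replies for clients."""
    status = data_tier_lookup(request)
    return f"{request}: {status}"

def client(request):
    """Thin client: talks only to the web tier, never to the data tier."""
    return web_tier_handle(request)

print(client("flight_1234"))  # -> flight_1234: on time
```

The point of the split is that the "multitude of clients" never holds data or business logic; the data tier can scale or be swapped out behind the web tier without touching a single desktop.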
Java in the Future
Some in the industry, especially IBM and Sun, believe that eventually the Java programming language will tie this heterogeneous environment together. "It is ideal for the network environment and incorporates object-oriented technology and it supports the distribution of software," said Leahy. Developed by Sun about two years ago for the Internet, Java spawned the network computer, or NC.
Out of the limelight, a number of government organizations are opting to bypass the client/server model altogether and go to Web-based systems using the Internet and intranets, Bumgardner said.
"There are some distinct advantages of the Web-based technology over client/server," he said. For example, in the client/server network, there is a sizable amount of software on client machines. Therefore, "every time you want to update software, you have to roll that out to each of your users, and when things break you have to fix them at the users' locations," he said.
"With the Web-based approach, the only thing that's running on your local desktop is the browser and most of your application software is running back in the server, the same way as in the mainframe days," he said. "In that environment, you can update your software off the server rather than having to send upgrades to thousands of potential users."
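Bumgardner's point — that server-resident application logic can be upgraded once, centrally, instead of being rolled out to thousands of desktops — can be illustrated with a small sketch. The names and the "application" here are invented for illustration only.

```python
# Sketch of the thin-client idea: application logic lives in one place
# on the server; desktops run only a thin "browser" call against it.

# Version 1 of the server-side application (hypothetical logic).
SERVER_APP = {"logic": lambda x: x * 2}

def browser_request(value):
    """Every desktop runs only this thin call; no local app code to patch."""
    return SERVER_APP["logic"](value)

print(browser_request(21))  # -> 42 under version 1

# Upgrading the application touches only the server's copy...
SERVER_APP["logic"] = lambda x: x * 2 + 1  # version 2

# ...and every user sees the new behavior on their next request,
# with nothing shipped to the desktops.
print(browser_request(21))  # -> 43 under version 2
```

Contrast this with the client/server case described above, where each desktop carries its own copy of the application and every upgrade must be distributed and installed at the users' locations.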
Because of cost, complexity and system availability issues, many organizations, especially in the public sector, have been slow to migrate their key enterprisewide applications onto distributed systems, industry officials said.
The government has been slow to adopt client/server technology, but recent initiatives reforming procurement and information system management have provided momentum for change, Dwyer said. Unlike companies in the private sector, where the bottom line rules, the government typically is not under a lot of competitive pressure to do things differently, he said. Complicating the situation, he said, has been a lack of central management and control in many government departments when it came to IT decisions.
New General Services Administration multiple award schedules and blanket purchasing agreements have helped to eliminate long procurement cycles, making it easier for agencies to acquire up-to-date systems, Leahy said.
In addition, each agency has been required to draw up a strategic plan for its use of IT and appoint a chief information officer under the Clinger-Cohen Act, Dwyer said.
Dwyer and others point to the need to solve the year 2000 software flaw in current systems as another impetus for both government and commercial entities to address their client/server and enterprise needs.