On the outskirts are the continuing moves toward full convergence of data, voice and video communications at the desktop, replete with Internet telephony, messaging and videoconferencing. Then there are moves like Sprint's purchase of a network service provider and a local area network integrator.
These changes have expanded the range of responsibilities for public sector network managers across a diverse landscape of platforms and processes, from Unix and client/server systems to extranets.
One reflection of the new network management order is the array of niche products that serve both the narrowest of network operations and the broadest of administrative assignments - from monitoring system traffic in pulsating packets to maintaining system security beyond the fierce firewall.
Another reflection is the range of demands made on systems integrators and vendors servicing the government marketplace, from outsourcing to turnkey support.
One person close to the frantic action is Saverio Merlo, senior vice president of marketing and a 17-year veteran with Boole & Babbage, San Jose, Calif., who has broad experience guiding both North American and European marketing strategies. His company provides enterprise automation solutions for distributed systems management.
"There is no good answer to the question about the current state of network management," declares Merlo. "It lost its identity with the encroachment of other disciplines. The traditional evolution of the network from voice and data to open networks and standards has now come to the point where we have to differentiate between network environments and client/server environments."
Not only has the client/server environment eroded the traditional network, but managing the network is also more expensive now, Merlo says. Part of that cost increase is attributable to the difficulty individuals and organizations are having in adapting the workplace to the new paradigm that technology has created.
"Quite simply, organizational change is not occurring as rapidly as technology is developing," explains Merlo. "With different divisions pursuing solutions at their own pace, distributed systems management is on a collision course."
The solution, he suggests, is to bridge the departmental gap with technology and systems management - the space in which Boole & Babbage competes. Viewing the overall management of information technology, he segments the corporate workspace into people, processes and tools, and the management functions into operations, production and administration.
"Our approach is to deliver an end-to-end solution, to bridge any IT element and bring it under a central point of control," Merlo says. "We focus strictly on the top end of the market, whether industry or government, where big customers confront big problems. Here you find large organizations with thousands and thousands of objects whose businesses will suffer immediately if the network falters."
Currently there is strong demand for turnkey solutions, both implementation and operation, especially in the federal government, he says. The focus on open systems and standards is a plus for customers who are trying to come to grips with the dizzying array of network management issues, Merlo adds. Also good news is the fact that World Wide Web technology, including browser programs, has basically solved the graphical user interface issue, decoupling it from a specific technology or platform.
His own company's flagship product, Command/Post, gives managers a pilot house from which to control fault management, end-to-end availability and automated operations throughout the enterprise. And its MainView product line provides performance management and automation in Multiple Virtual Storage systems.
The view of network management, across the industry as well as inside the government, is changing almost as rapidly as network product life cycles, which in some cases have shrunk from two or three years to six months.
One of the clearer statements of the transformation shows up in the recently published report, "Information Systems Trustworthiness: Interim Report," issued by the National Research Council's Commission on Physical Sciences, Mathematics, and Applications, Computer Science and Telecommunications Board, Committee on Information Systems Trustworthiness.
"Our nation's infrastructures are undergoing profound change," the report notes. "Networked information systems are becoming critical to the daily operation of increasingly large segments of government, industry, and commerce. Moreover, in responding to the needs of subscribers, critical infrastructures like the electric power utilities and public switched telephone network are increasing their dependence on computers and communications networks.
"But this growing dependence on networked computers is accompanied by increased risk. In short, our nation's infrastructures could well evolve into an interdependent system of fragile and vulnerable subsystems. Understanding how to ensure that they will operate reliably is thus vital."
This message is not lost on Stephen M. Cohen, a computer specialist with the U.S. Department of Labor's Employment Standards Administration, the home of four major program offices: Office of Federal Contract Compliance Programs; Office of Labor-Management Standards; Office of Workers' Compensation Programs; and Wage and Hour Division.
Ahead of the curve of many of his colleagues, Cohen returned to school for an MBA after getting his undergraduate degree in information science. His approach to network issues reflects that decision.
"I consider where I work not just an [information systems] department anymore; it's now a department that solves business problems. More and more, IT staff will be called upon to serve as business consultants offering guidance in applying appropriate technology to help the organization maintain its competitive edge. To help staff make decisions, you will need to focus the standard business school mind-set on organizational mission, strategy and direction," says Cohen.
"You can see how far we've progressed in just this decade. The basic end user today can almost perform the functionality of an IS department, building his own applications and database and making ad hoc queries."
More and more, network management means keeping up with user requirements, which change almost daily, and monitoring the growing use of the network, including the Internet, to present content and graphics to target audiences. It's all part of extending reach and expanding services to ever-demanding customers.
The advent of HTML (hypertext markup language) and the desire to publish, along with developments in Java applications, imaging technology and multimedia tools such as VRML (virtual reality modeling language for 3-D), should keep user demand on the network on a continuing growth curve.
Cohen echoes the comments of Boole & Babbage's Merlo with his own observation that the push had been to distributed computing with "servers all over the place." He sees that changing, with a new thrust more toward centralization with greater demand placed on network reliability. In fact, Cohen notes that additional redundant lines had to be installed in his division to support the increase in wide area network traffic.
"Clearly, there's more demand on the network, which makes sense from an administrative standpoint. However, there is chaos on the server end. We are now in the process of cleaning that up," says Cohen.
Asked about the software he's using for network management tasks, he cites as one example FireWall-1 from Check Point Software Technologies Ltd., Redwood City, Calif., an enterprise security solution that integrates access control, authentication, encryption, network address translation, content security, auditing and connection control.
"We purchased it to address Internet security issues and are pleased with it. Among the key features for us were how easy it was to deploy and that it really provides good information about what's going on with our interface to the Internet," says Cohen.
Still, with all the hardware, middleware and software available, he feels that network management and the present crop of tools are basically in a state of chaos, demanding continuing re-evaluation of products.
"You could end up with five, six, seven packages depending on what devices you are monitoring, whether concentrators, routers, bridges, whatever," admits Cohen.
One product that's caught his eye and which he plans to evaluate soon is the Spectrum Enterprise Manager, an integrated systems and network management platform, from Cabletron Systems of Rochester, N.H. Promising not only to reduce network downtime and associated costs, but to simplify and distribute network operations throughout the organization, the product offers a client/server architecture with different levels of scalability and flexibility. It also allows, according to company literature, everyone from the manager of information services to less-technical, high-level executives to have the tools to run a network in line with the goals of the business.
A different viewpoint of network management is offered by Jay Sriram, a senior systems administrator and database administrator for a division of the U.S. Department of Education. Experienced in both the mainframe and client/server environments, he admits that the changes brought about by distributed computing and increased end-user activity have not been wholeheartedly accepted by all.
"As you might expect, the transformation has been painful for some. Many users are really happy that they can do more. On the other hand, there are those who don't want to do a lot of computing and prefer the old dumb-terminal technology," says Sriram.
His role in the Education Department is unique in that he is responsible for an entire isolated system that he helped develop and which he now operates.
"Running on a Hewlett-Packard machine, the system contains the latest data on post-secondary education for which we are totally accountable. That data includes institutional information from school business offices, for example, the name of the school, when it was started, when accredited, its default rate for loans. Covered are barber shop schools, for instance. It's all public domain data and owned by the business end of the Education Department. We are simply the custodians of the data, responsible, for example, for keeping it up to speed with upgrades. It's a turnkey operation, the first type on this scale," says Sriram.
As he points out, this effort originated from soul-searching about why the department was doing everything on the mainframe when users were demanding to share data. One reason the issue was even raised is explained by how quickly client tools have evolved, according to Sriram. "It has made it easier for someone to say they need this kind of facility for this kind of system."
What he installed was Unicenter TNG (The Next Generation) from Computer Associates International Inc. of Islandia, N.Y., which had all the modules that Sriram wanted. The suite offers enterprise management solutions through a set of functions built on an object-oriented architecture and a scalable manager/agent infrastructure operating across heterogeneous networks. It integrates the management of systems, networks, databases and applications and provides a view of this environment.
For example, according to company literature, a management function in Unicenter TNG, such as performance management, covers network devices and networks as a whole, the systems and databases running on them, and client/server and Internet applications. The integration permits a picture of the performance of all the IT resources involved in a business process.
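The manager/agent pattern behind suites like this can be reduced to a toy: a central manager polls an agent on each managed object and merges the answers into one view, recording failures rather than letting one dead agent break the whole picture. The sketch below is illustrative only; the object names and status values are invented, and real products speak SNMP or proprietary protocols rather than calling local functions.

```python
# Illustrative manager/agent polling loop. Agent names and statuses
# are hypothetical; real managers poll over the network.
def collect_status(agents):
    """Poll each agent and merge the results into one enterprise-wide view."""
    view = {}
    for name, poll in agents.items():
        try:
            view[name] = poll()
        except Exception as exc:
            # A failed poll is itself useful management data
            view[name] = f"unreachable: {exc}"
    return view

def poll_router():
    return "up"

def poll_database():
    return "up"

def poll_app_server():
    raise TimeoutError("no response")

agents = {
    "router-1": poll_router,
    "oracle-db": poll_database,
    "app-srv": poll_app_server,
}
status = collect_status(agents)
```

The single dictionary returned is the "central point of control" idea in miniature: heterogeneous resources, one consolidated view.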
While he intended to use it on multiple nodes, Sriram wound up hosting it on the single system.
He feels this visibility has improved the quality of the data; the information dissemination the system facilitates has helped users learn how to tailor their demands. A new breed of user has made that possible. Once solely business analysts, they are now much more computer literate, able, for example, to perform ad hoc queries from their own systems. In many cases that has improved the quality of their work.
"There is no question that all these developments have raised the level of expectations among users. Many would now like to see increased interactivity, with information returned after they supply input via forms. And many are now offering their input on how the forms should be designed," says Sriram.
With all this change, he admits it is hard to say where the network system will evolve. With broad experience in different environments, he remains philosophical about the future direction.
"There is no right way or wrong way to set up a system to manage information. I don't think it will go in one direction. You just have to align yourself with the technology current at that time. In my experience, every time you centralize or decentralize operations, you have to replace technology to modernize. What's clear is that the online systems have changed the ways that people think about how they can do business. Now different parties to a transaction can call an 800 number and see that a payment has been taken care of," says Sriram.
A third perspective on network management from inside the public sector comes from Ken Wong, an electronics engineer who functions as a network engineer in the National Institutes of Health's Network Systems Branch in the Division of Computer Research and Technology. His role is to help design and maintain the NIH backbone, the high-speed network connection.
Seeking finer granularity for network management - that is, an in-depth look at how traffic was flowing, which hosts were active and what protocols were operating - Wong began using RMON (remote monitoring) products from NetScout Systems Inc. of Chelmsford, Mass., last December. One of the key features for him was the clear presentation of the data through highly graphical management software.
For example, NetScout Expert Visualizer presents a customizable three-dimensional view of the network. Alongside physical views, it offers logical views for traffic, for internetworking and for applications.
The NetScout Manager software provides over 40 integrated diagnostic and analysis tools, letting users view the enterprisewide traffic on any network segment at the physical, protocol or application level. It displays alarms for problem conditions, analyzes the data for troubleshooting, produces reports for policy compliance and generates reports for long-range planning.
The tools also help network managers deliver the highest possible quality of service, meaning the availability and reliability of distributed client/server applications.
Users are able to view business-critical LAN/WAN segments without regard to protocol or topology; supported are switched LANs, VLANs, frame relay, T1/T3, Ethernet, Fast Ethernet, FDDI (fiber distributed data interface) and token ring. Ethernet is a baseband LAN specification invented by Xerox Corp. and developed jointly by Xerox, Intel and Digital Equipment Corp. Token ring is a token-passing LAN developed and supported by IBM.
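The per-protocol, per-host visibility described above amounts to tallying observed traffic along several dimensions, much as RMON's protocol-distribution and host-table groups do. A minimal sketch under stated assumptions: the packet records here are hard-coded, whereas a real probe would capture them from the wire, and none of this is NetScout's actual interface.

```python
# Illustrative RMON-style traffic tally. Hosts, protocols and byte
# counts are invented sample data standing in for captured packets.
from collections import Counter

packets = [
    ("host-a", "http", 1500), ("host-b", "smtp", 400),
    ("host-a", "http", 900),  ("host-c", "ftp",  3000),
]

bytes_by_protocol = Counter()
bytes_by_host = Counter()
for host, proto, nbytes in packets:
    bytes_by_protocol[proto] += nbytes
    bytes_by_host[host] += nbytes

# "Top talkers", analogous to RMON's hostTopN group
top_talkers = bytes_by_host.most_common(2)
```

Two small counters already answer Wong's questions - where the traffic is going and what is generating it - which is why a misbehaving segment can often be diagnosed from summary tables alone.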
While pleased with the product, Wong quickly adds that the limitation of monitoring the backbone is the enormous amount of data that is collected. He repeats the now familiar refrain about the status of network management.
"Network management is in chaos. There is just so much information about the network you can collect. Over time, with so much data coming at you, you tend to become desensitized to key parameters that are important to collect. There's also too much information to sort through," says Wong.
One solution, he suggests, is to have network managers protected from this flood of data and only notified when there is important information on the network. This is a function that RMON should be able to perform.
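The filtering Wong asks for is essentially what RMON's alarm group defines: a rising threshold fires an event once, then re-arms only after the sampled value falls below a lower falling threshold, so one sustained spike produces one notification instead of a flood. The sketch below shows that hysteresis mechanism in miniature; the threshold values and the utilization samples are invented for illustration.

```python
# Illustrative rising/falling threshold alarm with hysteresis,
# in the spirit of the RMON alarm group. Values are hypothetical.
class RisingAlarm:
    def __init__(self, rising, falling):
        self.rising = rising    # notify when a sample reaches this level
        self.falling = falling  # re-arm once samples drop below this level
        self.armed = True

    def sample(self, value):
        """Return True only when a newly armed alarm crosses the threshold."""
        if self.armed and value >= self.rising:
            self.armed = False  # stay quiet until the spike subsides
            return True
        if not self.armed and value < self.falling:
            self.armed = True   # re-arm for the next spike
        return False

# E.g. alarm on link utilization percentages
alarm = RisingAlarm(rising=80, falling=50)
events = [alarm.sample(v) for v in [40, 85, 90, 88, 45, 82]]
```

Six samples with two separate spikes yield exactly two notifications; without the falling threshold, the manager would be paged on every sample above 80.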
Even so, he finds the software valuable, both to answer questions that occur from rising expectations about network performance and to reduce costs. For instance, he often receives complaints that the network is slow. With RMON, he can put instrumentation on the network to see why it is slow.
"Here's a good example. Several months ago, an end user was saying the network was slow. He proposed a solution that would have cost about $250,000. I told him, 'Let's take a look at it.' I put NetScout probes on the network to monitor performance. We saw lots of intranet traffic, where the traffic was going and what was generating it. After that, we recommended an Ethernet switch for $25,000 and it solved the problem. Admittedly, it's not perfect data, but it offers excellent clues about how to spend money and upgrade the network," says Wong.
The popular RMON market is served by a number of companies, among them SolCom Systems Inc. of Livingston, Scotland, which has a suite of network management products for managers of Ethernet, token ring and FDDI LANs. Its SNMP/RMON standards-based products support all 20 groups of RMON1 and RMON2. As enhancements to current versions, the company plans to release an updated Automatic Data Gatherer and a Web-based interface with expanded reporting capabilities, including both RMON and RMON2 reports. Probes for frame relay and asynchronous transfer mode (ATM) will be introduced later this year.
Another company with an enterprisewide network management solution is Tivoli Systems of Austin, Texas, part of IBM Corp. The company offers an open, end-to-end management solution from mainframes to the desktop with network and systems management in between. This month the firm introduced TME 10 LAN Access, which it claims is the first product to manage disparate work group environments from an enterprise level.
According to Tivoli literature, TME 10 LAN Access lets customers manage enterprise desktops directly from the TME 10 console. LAN management solutions now supported by TME 10 LAN Access include Intel LANDesk Management Suite, Microsoft Systems Management Server and IBM NetFinity. The product is also the first implementation of the Multi-Platform Manager Application Programming Interface, the initiative announced by Tivoli and Intel in September 1996 to establish integration between work group and enterprise management solutions.