Getting Connected

By 1998, nearly 80 percent of all new computer application systems will be based on client/server technology

A Navy pilot climbs into the cockpit of his F/A-18 on the deck of an aircraft carrier somewhere in the Indian Ocean. He turns on the on-board computer and views a series of intelligence reports from the Defense Intelligence Agency and the Central Intelligence Agency describing the target of tonight's mission. The pilot scrolls through the documents and clicks on a link that looks much like a hot link on a World Wide Web page. Audio sound bites of the enemy's commanding general, intercepted by the National Security Agency, begin to play. Then the pilot clicks on another link, this one a visual image: satellite data downloaded just a few minutes ago, showing the enemy's troop positions and their path of movement during the last hour. Finally, the pilot reads a written report by a DIA analyst that interprets the entire package of data.

Such a scenario is under development in the military today thanks to advanced client/server computing. Though on-board PCs in aircraft for such purposes are not yet available, plans call for such capabilities in the coming years. For now, pilots must settle for listening to intercepted sound bites, viewing fresh satellite data and scanning reports aboard ship, according to Dave Nahmias, federal product marketing manager at the government group of Informix Corp., a Menlo Park, Calif., database technology company. Its DataBlade software modules are being used by an "unnamed military intelligence service" to develop such capabilities. The modules plug into a database and extend it to store, index and query whatever types of data a customer requires. Informix has a contract for this technology, but other database companies such as Oracle are developing tools to access different types of data.

"Effectively managing advanced data means going beyond the limited capabilities of traditional database systems," said Nahmias. "You need to be able to plug intelligence into your database management system for your specific kinds of data."

Client/server computing advances are becoming essential to the operations of major government agencies -- especially the military. By 1998, nearly 80 percent of all new computer application systems will be based on client/server technology, according to the Gartner Group, Stamford, Conn.

While large enterprises continue to run legacy applications and back-office systems such as payroll and accounting on mainframe computers, the ease of use and fast information delivery typical of client/server make it the technology of choice for new information management initiatives, Gartner indicates.

Within three to seven years, most of today's legacy applications will disappear, Gartner says. However, hundreds of millions of dollars are invested in these aging systems, some of which are approaching their third decade. As a result, the switch to client/server tends to occur gradually, with operational requirements driving the change process.

Client/server computing inherently supports process re-engineering on a local level and makes it possible to use legacy systems and data throughout the transition to new technology, and even beyond, Gartner indicates. After several years, management will be positioned to review the legacy applications that are still running. If the surviving systems still deliver enough economic value to justify their existence, companies may decide to migrate the remaining applications or simply to leave them running on the old mainframe platforms. In any event, the decision should be driven by economic rather than technology considerations. This perspective is imperative because client/server computing is typically more expensive in terms of information processing costs than the mainframe alternative. While client/server system maintenance costs tend to be relatively low, the expense of new application development and the concurrent business process re-engineering is fairly high.

"Long-term, client/server technology is a real cost-saver for companies," said Ronald Sammons, a senior vice president at Technology Solutions Co., a Chicago-based business and technology consulting firm specializing in delivering client/server solutions to Fortune 1000 organizations. "They just have to bear with the initial expense. Eventually, there will be improved productivity due to greater access to information. You can learn more about your customers or about your target. The data is there somewhere on your mainframe. You just have to find a way to find it. And client/server leads you there."

"These results of the re-engineering process, often accompanied by radical organizational changes, are where the true payback and savings made an option by client/server technology are finally realized," according to Sammons.

Now the military services are positioning themselves to migrate older legacy systems from central mainframe platforms to today's client/server environments. And they are moving rapidly to take advantage of the flexibility, ease of use and advanced capabilities of client/server technology. The Pentagon's ADMAPS system is one example: CD-ROM players containing visuals, maps, charts and more are connected via a network server to end users with PCs. This commitment is essential to competitive survival, which depends on adding new and greater value to the services delivered.

If client/server represents the technology of the future, the first question for most government executives, according to Sammons, is: "How do we get there from here? How can we migrate rigidly structured, highly centralized, mission-critical legacy systems -- 'smokestack' applications that automate isolated functions? How do we most effectively perform the process re-engineering needed to shorten development cycles, measurably improve productivity, achieve vastly improved service and position ourselves to incorporate new technologies rapidly as they become available? How can we ensure we make the best decision in prioritizing which areas should first receive the benefits of client/server approaches?"

With each client/server application, finding the best answer to that question depends on the following elements:

- Enterprise Data Model. The information sharing that is an inherent characteristic of client/server approaches makes a business-oriented, top-down data view essential. What are the data elements required to satisfy requests? Odds are that most of this data already resides in the agency's mainframe systems, scattered across multiple databases in multiple formats, but isn't readily available to distributed knowledge workers.

- Technical Architecture. Client/server computing comes in many sizes, shapes and forms. Every company, therefore, must clearly specify the technology enablers that will be used for client/server applications. Specialists in the information systems organization already know -- and those on the business side of the fence soon learn -- that the products and tools selected to begin the process of moving to client/server probably will change throughout implementation and even after the project is complete. Given the fact that typical technology product cycles are approximately 18 months, the impact of change is unavoidable. However, it is essential that each project begin with a clear statement of the technical architecture that can be modified on an ongoing basis.

- Applications Architecture. Specification of the technical architecture primarily is the responsibility of the information systems organization. The applications architecture, however, is a framework used to specify all of the functions needed to enable business to proceed. Since the company already is performing most of these functions, this is a relatively simple process of taking an inventory of what is in place. However, this is also where the interface points of different systems are identified to ensure that information users in every area can access needed data. Therefore, ongoing communication among representatives from operations and technical areas is an enterprise imperative for definition of the applications architecture.

- Network Architecture. The essence of client/server computing is communication between the PCs and the enterprise data that may be stored in several locations. Proper planning is essential to ensure current and anticipated needs will be met in the complex networks typical of client/server environments.
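The network-architecture point above can be sketched in a few lines: in a client/server exchange, the client PC sends a query over the network and the server returns only the answer, not the whole data set. The sketch below is a minimal illustration using Python's standard sockets; the sample data and field names are invented and do not correspond to any system described here.

```python
import socket
import threading

# Server side: holds the "enterprise data" and answers queries.
def serve(sock):
    data_store = {"troops": "grid 42N 68E", "supplies": "3 days"}
    conn, _ = sock.accept()
    with conn:
        query = conn.recv(1024).decode()
        conn.sendall(data_store.get(query, "unknown").encode())

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # ephemeral port, for illustration
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server,), daemon=True).start()

# Client side: the PC issues a request over the network and
# receives only the answer, not the underlying data set.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"troops")
answer = client.recv(1024).decode()
client.close()
print(answer)
```

The division of labor is the essential feature: the data and the logic to search it stay on the server, while the client carries only the request and the reply across the network.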

Defense Investigative Service

Military intelligence agencies aren't the only ones benefiting from the technology trend. The Defense Investigative Service and the Joint Chiefs of Staff are using client/server to re-engineer operations. The former conducts all personnel security checks of civilians working at the Defense Department, as well as military contractors. This typically involves 615,000 national agency checks and 210,000 personnel investigations a year.

The Defense Investigative Service is trying to modernize the process through client/server computing. It currently stores millions of files on microfiche-based legacy systems linked to a mainframe. Most files have been in that format since 1972, when the Defense Investigative Service was authorized by Congress. To meet its modernization goals, and to grow its on-disk data from 100 gigabytes today to 2.6 terabytes by the turn of the century, the agency is using several client/server technologies.

For the hardware, the Defense Investigative Service chose Digital Equipment Corp.'s 8400 5/300 TurboLaser computer, a 64-bit symmetric multiprocessing server that runs the UNIX operating system. The hardware was integrated with Oracle Corp.'s Oracle 7 Very Large Memory database product, which is optimized for 64-bit computing. Together, the technologies form what the Defense Investigative Service calls the Case Control Management System.

"We needed to migrate to something that reflects the business needs of the agency while supporting the federal agencies that require our services," says Irwin Becker, director of the Defense Investigative Service's national computing center in Washington.

To accomplish the goal, Digital developed a solution based on a model developed by the Defense Investigative Service.

In testing, the client/server configuration supported 200 concurrent users running Empower emulation software against the 8400, configured with two gigabytes of memory. When testing was complete, the average query time to find records was 1.8 seconds; three of five queries returned in 0.1 second, the lowest time that could be measured. The Defense Investigative Service estimates the investment in the new technologies should save it $900 million over a five-year period once modernization is completed later in the decade.
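The figure of merit quoted for the Case Control Management System -- average query time over a batch of record lookups -- is easy to reproduce in miniature. The sketch below times indexed point queries against a small in-memory sqlite3 table; the table name, row count and query pattern are invented for illustration and say nothing about the agency's actual schema.

```python
import sqlite3
import time

# Build a small indexed table standing in for a case-file index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO cases VALUES (?, ?)",
                 [(i, f"subject-{i}") for i in range(100_000)])
conn.execute("CREATE INDEX idx_name ON cases (name)")

# Time a batch of point lookups and report the average -- the same
# figure of merit quoted for the benchmark above.
timings = []
for i in range(0, 100_000, 20_000):
    start = time.perf_counter()
    conn.execute("SELECT id FROM cases WHERE name = ?",
                 (f"subject-{i}",)).fetchone()
    timings.append(time.perf_counter() - start)

avg = sum(timings) / len(timings)
print(f"average query time over {len(timings)} lookups: {avg * 1000:.3f} ms")
```

Averaging over a batch, as the agency's test did, smooths out cache effects that make any single query time misleading.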

Joint Chiefs of Staff

While the Defense Investigative Service is finishing installation of its cutting-edge client/server system, another military organization, the Joint Chiefs of Staff, has some impressive client/server plans as well.

According to Mike Miller, senior vice president of federal sales at Computer Associates International Inc., the Joint Chiefs is employing technology Computer Associates developed for object-oriented searches in databases via client/server computing. The technology is called Jasmine and was developed with Japan's Fujitsu. Jasmine features a small-footprint multimedia execution system, which operates alone or as an add-in. Going beyond conventional add-ins, the applications execute in a distributed architecture.

"It is being used at the Joint Chiefs and at the Unified, Specified Commands. One of the things that the Defense Modeling and Simulation Organization has done is to look at the stovepipe systems and new technologies being developed and find how to integrate the new technologies into the system," Miller said.

Another Computer Associates technology, the Open Object Rapid Application Development tool, allows the Joint Chiefs to incorporate object-oriented views into the database. "If you have a tank and you have all the attributes that describe the tank in a relational database, you can present that tank as an object to the end user on the screen," Miller said. "It provides a much better understanding of the object and its attributes. This is very powerful in the modeling and simulation community. The Central Command has done a lot of development in this arena. And they have been able to pass that on to the European Command and the Pacific Command."
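Miller's tank example -- presenting a relational row to the end user as a single object -- is the core idea of object-relational mapping, and a minimal version can be sketched without any special tool. The code below is an illustration of that general technique, not Computer Associates' Open Object product; the `tanks` table and its columns are hypothetical.

```python
import sqlite3

# Relational storage: a tank's attributes live in ordinary columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tanks (id INTEGER, model TEXT, speed_kmh REAL, crew INTEGER)")
conn.execute("INSERT INTO tanks VALUES (1, 'M1A1', 67.0, 4)")

# Object view: wrap the row so the end user works with one "tank"
# object instead of scattered fields.
class Tank:
    def __init__(self, row):
        self.id, self.model, self.speed_kmh, self.crew = row

    def __repr__(self):
        return f"Tank({self.model}, {self.speed_kmh} km/h, crew of {self.crew})"

def fetch_tank(tank_id):
    row = conn.execute(
        "SELECT id, model, speed_kmh, crew FROM tanks WHERE id = ?",
        (tank_id,)).fetchone()
    return Tank(row)

tank = fetch_tank(1)
print(tank)   # the relational row presented as one object
```

The attributes still live in relational columns; the mapping layer simply gathers them into one entity at the point of use, which is what makes the object view convenient for modeling and simulation.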

These tools allow simultaneous access to many databases -- Ingres, Oracle, Informix and others -- as well as proprietary, custom-built mainframe databases.

"Government has been reluctant to use object-oriented technologies because they are too bleeding edge," said Miller. "But now, with these developments you have a common look and feel across the environment."

With these tools, the military can use animation, graphics, audio and video, and organizations can construct a new breed of applications that mirror their business environment. These applications define a new level of intuitiveness and ease of use that lets Defense Department data reach a new community of users, according to a spokesman for the Defense Modeling and Simulation Organization. "The application can be deployed over common client/server environments and over intranets and the Internet, using Web browser add-ins," said the spokesman.
