Information Assurance Keeps Data Rolling
Third-party information assurance service providers can help keep a government agency's data safe and available
- By Edmund X. DeJesus
- Mar 16, 2001
Information assurance is an important and growing need for government agencies. One estimate by the Government Electronics and Information Technology Association pegged information assurance spending in the federal government at more than $2 billion in 2000, growing to more than $6 billion by 2005.
"Information assurance developed out of the concept of high-confidence computing," said Terry Benzel, director of NAI Labs and vice president of advanced research for Network Associates Inc., Santa Clara, Calif. NAI Labs does funded research for government agencies including the Defense Advanced Research Projects Agency, the National Security Agency and the Central Intelligence Agency on topics such as adaptive network defense, cryptography and security architecture.
Commonly called IA, information assurance traditionally has included three main areas: data integrity, access control and availability. Plus, a new area is emerging: responding to and managing intrusions when they occur.
It is no longer sufficient to hide data in hardened bunkers with strict control over who gains entry to the inner sanctum. No system is perfect in thwarting intruders, and every headline detailing the latest horror story hints at the many successful intrusions that never reach the public's attention.
For this reason, information assurance has grown more pragmatic. Intrusions and failures will occur, so what should the response be?
The new direction in information assurance deals with how to survive and recover from intrusions and failures in the traditional IA system. In a sense, this is an emergency management role to deal with unfortunate eventualities.
The Defense Department is probably the prime customer for information assurance. However, few IA procurement opportunities in the Defense Department or civilian agencies are stand-alone contracts. Most are part of larger proposals. For example, the Defense Enterprise Integration Services III contract for the Defense Information Systems Agency will include information assurance in its $3 billion proposal.
Similarly, the General Services Administration's $1.5 billion contract for smart-card systems, the Transportation Department's $1 billion Nexcom contract, the Veterans Affairs $1 billion PCHS II project, and the Justice Department's $750 million Justice Management Division project all contain information assurance components.
For these and other agencies, using third-party service providers can ease the difficulties of implementing an information assurance system. Ideally, the provider becomes part of the process early in the game, because trying to fit the provider piece into a nearly finished puzzle can be painful rather than helpful.
In addition, different providers offer different suites of services. Some, such as BMC Software Inc., cover all aspects of IA, while others, such as a partnership between SAS Institute Inc. and Veritect, specialize in a few.
For agencies that handle secret or proprietary information, it may be difficult to contemplate trusting any outsider with any part of the mission. It may be reassuring to learn that some providers have a considerable history of aiding government agencies, particularly those engaged in highly classified work.
As an example, NAI Labs, formerly Trusted Information Systems, has participated with DARPA in fundamental research into many aspects of information assurance. For instance, NAI Labs is researching cryptography in high-speed environments, middleware for sharing information securely and network defense technologies.
Agencies may be tempted to use existing IT staff to implement information assurance solutions. This is usually a mistake.
"The sheriffs of the system should be separate from the operations personnel who run the system," said John Casciano, senior vice president for enterprise security for Litton TASC Inc., Chantilly, Va., and former director of Air Force intelligence, surveillance and reconnaissance. Litton TASC provides services for government agencies, including the Air Force, Army, Navy and the National Institutes of Health. For example, Litton TASC helps the Land Information Warfare Activity at Fort Belvoir, Va., set information assurance policies for the Army.
The problem, said Casciano, is that the information assurance staff and the systems administration staff have different goals. For example, in an effort to ameliorate a system slowdown, a system administrator may decide to take down a firewall. Since jurisdictional disputes such as this can get messy, it often makes sense to bring in an outside party to handle the information assurance duties.
For new projects, it is wise to include information assurance in plans right from the start. Providers can assist with design and architecture decisions that will lay the proper base for future work.
"Phased solutions often make sense, both as a means of minimizing the impact on existing systems and to build trust with the agency," said Valerie Clayton, regional sales manager for BMC Software's Federal Operations in Bethesda, Md. BMC Software, based in Houston, provides software, training, management and support for agencies including the FBI, the Internal Revenue Service, the Interior Department and various defense agencies.
Another way to ease into the third-party approach is to try passive solutions. Many providers offer monitoring solutions that only watch what is going on in an organization, without interfering at all.
"The system learns the normal patterns of usage, so that it can recognize abnormal behavior when it occurs," said Don Walker, president and CEO of Veritect, a subsidiary of Veridian Corp., Arlington, Va.
Veritect works with data mining technology from SAS Institute of Cary, N.C., to provide defense and intelligence agencies the ability to search for new kinds of attacks beyond the known techniques. This kind of monitoring can look for known patterns of attacks and alert administrators to possible threats. Using advanced data mining techniques on the monitored information is a notch up in sophistication.
"Data mining can look for complex behavior by intruders, turning up new types of attacks," said Jeff Mudd, an SAS Institute official who manages the company's alliance with Veritect.
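The pattern-learning approach Walker and Mudd describe can be sketched in a few lines: baseline a normal activity metric, then flag observations that deviate sharply from it. The data, metric and threshold below are invented for illustration, not drawn from any vendor's product.

```python
import statistics

# Hypothetical hourly login counts recorded during a "normal" training period.
baseline = [42, 38, 45, 40, 39, 44, 41, 43, 37, 46, 40, 42]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(count, threshold=3.0):
    """Flag any count more than `threshold` standard deviations from the mean."""
    return abs(count - mean) > threshold * stdev

print(is_anomalous(41))   # typical activity -> False
print(is_anomalous(400))  # e.g., a brute-force login burst -> True
```

Real monitoring products model many metrics at once and update their baselines continuously, but the core idea of learning normal behavior and flagging deviations is the same.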
An overall assessment by an independent service provider can identify what important information an agency handles, what threats to that data exist, what vulnerabilities there are in the agency's systems and what the possible solutions are.
"Since all these items are constantly changing, dynamic solutions work best. You can't just put up a wall and expect nothing to get by," said Walker. Many providers are able to suggest such adaptive solutions to present an active defense to intruders.
Some areas seem like a no-brainer decision for a third-party provider. For example, no agency wants to use its resources to catalog and combat computer viruses. There are vendors with vast expertise in this area, such as Symantec Corp. of Cupertino, Calif., and McAfee.com, a part of Network Associates. It makes more sense to turn this over to the experts. The same is true for encryption.
The greater an agency's availability to the public it serves, the greater the risk of intrusion and compromise. This is an unfortunate corollary of today's move toward openness and electronic government. Agencies that expose their systems in this way must take extra precautions against external attacks.
Conversely, agencies need to demonstrate to citizen users that they can have confidence in the government system. They must also plan for surviving and recovering from intrusions, because there will be intrusions despite all precautions.
Lack of public access is not much of a safeguard. Casciano estimated that most system security breaches come from current and former authorized users. These may be inadvertent mistakes and oversights, or they may be deliberate attempts to stray beyond established borders.
Nor is the careless or curious insider the only threat. The disgruntled former employee is a dangerous enemy.
"Too often, security does not fully remove all the privilege and access of a user when the user leaves the agency. In fact, most systems have no mechanism for tracking all the access privileges a given individual has," Clayton said.
That's an open door to mischief. Such problems usually don't involve national security as much as privacy: peeking into confidential tax or medical records, for instance. Outside providers can address this in a number of ways, within both the computer system and the business rules of an organization.
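One way to close the tracking gap Clayton describes is a single registry that records every privilege a user is granted, so that all of them can be enumerated, and revoked at once when the user leaves. This is a minimal sketch with invented names, not a description of any agency's actual system.

```python
from collections import defaultdict

class PrivilegeRegistry:
    """Tracks every (system, privilege) pair granted to each user."""
    def __init__(self):
        self._grants = defaultdict(set)

    def grant(self, user, system, privilege):
        self._grants[user].add((system, privilege))

    def privileges_of(self, user):
        """Enumerate everything a given individual can touch."""
        return sorted(self._grants[user])

    def revoke_all(self, user):
        """On departure, remove every recorded privilege in one step."""
        removed = self._grants.pop(user, set())
        return sorted(removed)

registry = PrivilegeRegistry()
registry.grant("jdoe", "tax-db", "read")
registry.grant("jdoe", "mail", "login")
print(registry.revoke_all("jdoe"))    # [('mail', 'login'), ('tax-db', 'read')]
print(registry.privileges_of("jdoe")) # []
```

The point is the single source of truth: if every grant flows through one registry, departure processing becomes one call instead of a hunt across systems.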
An agency's major problem may not be with security as much as with availability. It needs to keep its servers up and running. It is possible to define the desired response to service outages in advance.
The information assurance provider can tailor recovery plans to the client's need. Once in place, this solution can be set so the agency's preferences automatically go into effect, ensuring uninterrupted access.
An outside provider's response to incidents can run a gamut of possibilities, depending on the agency's needs and preferences. The provider may simply be available as an adviser during an incident, or the provider may actually manage the system and actively defend against incidents.
Even here, the agency can control whether the provider is on the premises or operating remotely. Post-incident analysis is also valuable, since many providers are experts in the forensic skills necessary to establish responsibility.
"The [Defense Department] has established its own institute for computer forensics, specifically to address evidence after an incident," Casciano said. Reviewing and improving procedures after an incident can make the overall system better.
Many information assurance service providers advise an end-to-end solution. This approach has benefits, since it avoids compatibility issues that may be present with piecemeal solutions. However, it is also in the agency's best interests to seek best-of-breed solutions in the specific areas that concern them most, and these may not be part of a comprehensive solution.
The end-to-end solutions of providers may not offer the flexibility an agency may require for integrating with existing systems, possible mergers with other entities, future change or other possibilities.
New technologies in information assurance will find their place in field solutions. For example, the mandate to use off-the-shelf software of questionable security with sensitive data is a common problem.
An emerging technology called secure execution puts a layer of security between the data and the application. Similarly, dynamic restructuring capabilities are finding their way into network components to improve the chances of surviving disruptions. Data mining of system information is already proving itself valuable in recognizing patterns of attack.
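The mediation idea behind secure execution can be illustrated, very loosely, as a guard object that interposes a policy check between an application and the data it requests. Everything below is a simplified stand-in for illustration, not an actual secure-execution product.

```python
class GuardedStore:
    """Interposes a policy check between the application and the data."""
    def __init__(self, data, allowed_keys):
        self._data = dict(data)        # the protected data
        self._allowed = set(allowed_keys)  # the policy: what may be read

    def read(self, key):
        if key not in self._allowed:
            raise PermissionError(f"access to {key!r} denied by policy")
        return self._data[key]

# The application never holds the raw data; every read passes the guard.
store = GuardedStore({"budget": 100, "payroll": 200}, allowed_keys={"budget"})
print(store.read("budget"))  # 100
```

An off-the-shelf application handed only the guard, rather than the data itself, is constrained by the policy layer regardless of how trustworthy its own code is.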
Information assurance will remain important in computer systems for government agencies. The Defense Department's commitment to information assurance is reflected in a recent General Accounting Office report that noted the department has established an agencywide Information Assurance Program under the jurisdiction of its chief information officer. The program will monitor Defense Department computer networks and defend against hacker attacks and other unauthorized access.
"Protecting against today's threat is not enough. Many agencies are addressing security concerns two to five years out," said Network Associates' Benzel. The right foundation in information assurance can make those long-range goals achievable.
Information Assurance: Four Main Pieces

- Data Integrity. Many who speak of information assurance tend to focus on data integrity as the primary, if not only, goal. Data integrity certainly involves data storage in suitable formats with sufficient backup, possibly in physically secure areas. Encryption of data can play a role, as may firewalls and virtual private networks. Physical and logical separation of systems is one simple and effective way to achieve different levels of security.
- Access Control. This means deciding who gets to see what. Logically, this starts with distinguishing different kinds of data, then defining roles for who should have access to each kind of data. Assigning those roles to people may require training specific to that role. Eventually, there must be ways for the computer systems to recognize users and their access limits. Usually this involves at least user names and passwords, may include smart cards or other proofs of identity and may even extend to biometrics, such as fingerprint, voice or retinal identification. Use of physically and logically separate systems may overlap here.
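At its simplest, the role-based approach described above reduces to a mapping from roles to data categories that is consulted on every access. The roles and categories here are hypothetical.

```python
# Hypothetical role-to-data-category mapping.
ROLE_ACCESS = {
    "clerk":   {"public-records"},
    "auditor": {"public-records", "tax-records"},
    "admin":   {"public-records", "tax-records", "system-logs"},
}

def can_access(role, category):
    """Return True if the given role may see the given data category."""
    return category in ROLE_ACCESS.get(role, set())

print(can_access("clerk", "tax-records"))    # False
print(can_access("auditor", "tax-records"))  # True
```

Keeping the policy in one table, separate from application code, makes it auditable: the question "who can see tax records?" has a single place to look.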
- Availability. Availability of data is often more oriented toward hardware than human aspects. Networks with redundant pathways, servers with automated failover and smart routing schemes all play a role. Fault-tolerant hardware may be required to satisfy mandatory availability requirements. However, even here there may be overlap, since protecting the hardware from interference can fall within the area of data integrity.
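Automated failover of the kind mentioned above can be as simple as trying redundant servers in priority order until one answers. The host names below are placeholders, not real endpoints.

```python
import socket

# Hypothetical redundant servers, listed in priority order.
SERVERS = [("primary.example.gov", 443), ("backup.example.gov", 443)]

def connect_with_failover(servers, timeout=2.0):
    """Return a socket to the first reachable server, or raise if none answer."""
    last_error = None
    for host, port in servers:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError as err:
            last_error = err  # server down or unreachable: try the next one
    raise ConnectionError("all servers unreachable") from last_error
```

Production failover happens in load balancers and routing layers rather than in every client, but the priority-ordered retry is the underlying behavior either way.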
- Survival and Recovery. This refers to how an organization responds to a security intrusion. Forensics, for example, may involve acquiring evidence of tampering and safeguarding that chain of evidence for use in criminal, disciplinary or even foreign relations proceedings. Backup and redundant systems may parallel efforts to ensure availability. Analysis of situations is vital, not only for responding in the moment but for planning proactive changes to the system.