GAO: Agencies fall behind goals to certify IT systems

Federal agencies are making little progress on mandates to certify and accredit their information systems, and the poor showing is causing some lawmakers and IT security experts to be leery of agencies' efforts to secure federal IT systems.

The Government Accountability Office found in a recent study of 24 major federal agencies that 63 percent of their systems were reported as certified and accredited for operation in the first half of 2004. GAO's report, dated June, was released late last month. The 63 percent figure is only slightly higher than last year's 62 percent and falls far short of the Office of Management and Budget's goal that 80 percent of federal IT systems be certified and accredited by the end of 2003.

An Aug. 9 OMB report offered a better assessment of information security, however. OMB's quarterly report on the President's Management Agenda said 70 percent of the federal government's IT systems are certified and accredited, compared with 26 percent three years ago. Karen Evans, OMB's administrator for e-government and IT, said the Aug. 9 report reflects more current data than GAO's assessment, and she said agencies will meet the 80 percent goal this year. "We put a lot of focus into this," Evans said, adding that OMB is helping lagging agencies plan for better information security.

Even so, GAO said its 63 percent figure might be high. Though agencies claimed their certification and accreditation processes met requirements that include risk assessment and security control testing, that wasn't always the case, GAO found.

OMB requires agencies to certify and accredit their information systems and to report the number of systems authorized to operate following certification and accreditation. These processes support requirements of the Federal Information Security Management Act of 2002. Accrediting an information system means an agency official authorizes its operation and accepts the risk it places on agency operations or assets based on an agreed-upon set of security controls, according to the National Institute of Standards and Technology, which publishes certification and accreditation guidance for federal agencies. To decide whether to accredit an information system, an agency certifies the system by completing a security review of its management, operational and technical security controls.

The GAO report was prepared for Reps. Tom Davis (R-Va.), chairman of the House Committee on Government Reform, and Adam Putnam (R-Fla.), chairman of the Government Reform subcommittee on technology, information policy, intergovernmental relations and the Census.

Putnam "continues to be disappointed in the progress of some agencies," said Bob Dix, subcommittee staff director. "Agencies that are at 90 percent certification and accreditation are showing evidence of some commitment. But still, some agencies are not doing that."

The GAO report showed spotty agency progress in certification and accreditation. Seven agencies reported that more than 90 percent of their systems were certified and accredited, but six agencies said fewer than half of their systems were.

Dix said Putnam's subcommittee would work with OMB and agency inspectors general to identify obstacles to more rapid progress. According to the GAO report, some agencies said they did not have enough money or staff to certify and accredit their systems.
Certification and accreditation can cost $25,000 to $1 million per system, according to Alan Paller, director of research at the SANS Institute, a Bethesda, Md., systems administration, networking and security research and education organization.

Paul Bello of Dallas IT security firm Entrust Inc. said a lack of funds and other resources is a familiar refrain among his federal customers. But Dix and OMB officials said more money isn't the solution.

"We have not seen any empirical evidence that folks don't have adequate resources to accomplish the requirements of the law," Dix said. "This is about leadership commitment and getting the job done."

Agencies that don't comply with FISMA requirements could find themselves with less money to spend. OMB monitors the certification and accreditation of major systems through the budget process and can withhold funding for projects that don't meet security requirements. In fact, OMB has already done so, Evans said.

Both the White House and Putnam have established scorecards measuring agencies' progress. The quarterly President's Management Agenda scorecard rates agency progress on a red-yellow-green scale in areas such as financial management and budget and performance integration. In the e-government area of the scorecard, agencies must certify and accredit 90 percent of their operational IT systems to earn the top rating, a green score.

Putnam's scorecard, issued annually by the subcommittee, is based on how well agencies carry out FISMA requirements. FISMA requires agencies to implement agencywide risk management programs to secure their information and information systems. The 2003 scorecard gave the government an overall "D" grade, up from an "F" the previous year. Eight of the 24 major agencies failed, but for the first time, two agencies earned "A" grades.

Several factors identified by GAO indicate that agencies' reported levels of certification and accreditation may not be accurate. Only 13 of the 24 agencies said they had completed system inventories, and incomplete inventories skew upward the percentage of systems reported as certified and accredited.

Paller said it's obvious that agencies "are putting things in the certified and accredited category that don't belong there." He estimates that about $2 billion is spent on certification and accreditation every three years, and because much of that work lacks complete security testing, much of that money is wasted, he said. About half of federal agencies merely gather the reports required by FISMA and never actually check the systems to see whether they are secure, Paller said. The other agencies verify system security, using either contractors or staff.

At the Transportation Department, for example, a small contractor team has handled certification and accreditation partly by automating vulnerability scanning. "They cut the cost of it by almost 80 percent, and they did a better job than everybody [else]," Paller said.

Lisa Schlosser, deputy chief information officer at the Transportation Department, said the department has now certified and accredited 90 percent of its systems by pursuing a multipronged strategy that includes baseline configuration standards and strong monitoring, detection and response capabilities. In addition, even after certification and accreditation is complete, the department regularly tests the security of all its systems, she said.

Bello, vice president of U.S. federal government business at Entrust, said continual IT security testing should be a priority at all agencies. Evans said OMB requires it.

"Doing the initial certification and accreditation testing is fine, but systems and processes change. Performing regular audits is essential," Bello said.

Staff Writer Gail Repsher Emery can be reached at gemery@postnewsweektech.com.

Falling short on C&A

The Government Accountability Office studied certification and accreditation practices at four federal agencies: the departments of Commerce and Energy, the Environmental Protection Agency and NASA, which had 32 information systems among them. GAO found gaps in the agencies' security practices in all seven areas it studied.

Criterion                                      Systems meeting criterion   Percentage
Current risk assessment                        23                          72%
Current security plan                          26                          81%
Controls tested                                22                          69%
Contingency plan                               19                          59%
Contingency plan tested                         8                          42%*
Plan with milestones prepared for weaknesses   17                          81%**
Residual risk identified                       17                          53%

*Percentage based on the 19 systems with contingency plans.

**Percentage based on the 21 systems for which plans were required to correct identified weaknesses.

Source: GAO
Adam Putnam (R-Fla.)