IT Performance Guide Takes a Page From Business

IT Performance Guide Takes a Page From Business

By Dennis McCafferty

Staff Writer

Federal decision makers are studying scorecards that look like they were lifted from a Business Management 201 textbook. They toss around terms like "value chain" and "investment targeting" like accountants happily cramming before a tax audit.

What to make of this infusion of Wharton School-isms into the federal mindset?

Hopefully, government officials say, a better sense of whether technology performance matches its investment.

"We're trying to gauge how investment in technology contributes to bottom line results," said Patrick Plunkett, program manager for performance measurement at the General Services Administration's office of governmentwide policy. "In organizations, it's difficult to see how buying a PC and putting it on your desktop [will] help you meet performance goals. It's difficult to link actions with results."

To overcome this, Plunkett has compiled a guidebook called "Performance-Based Management: Eight Steps to Develop and Use Information Technology Performance Measures Effectively." At 106 pages, the free guidebook is concise and reader-friendly (for a government product, at least). And if much of what's in there sounds like something plucked from the pages of Harvard Business Review, well, that's because it was.

The issue of performance measurement is gaining steam. It was among the hot topics discussed at last week's Virtual Government '97 conference, sponsored by the Armed Forces Communications and Electronics Association.

"Who is watching the actual work and who is going to guarantee that the millions being spent on these systems is helping?" asked Renny DiPentima, a former infotech honcho at the Social Security Administration and now the vice president and CIO of SRA International of Arlington, Va.

Measuring infotech's value is far easier to describe than to accomplish.

At the Department of Health and Human Services, for example, it is relatively easy for food inspectors to estimate how many cans of tainted salmon will be detected by a new computer system. However, for disease researchers to figure out whether a prospective computer system can help in fighting cancer would be a more difficult exercise, said Neil Stillman, the deputy assistant secretary for information resources management at HHS.

Plunkett's office has received more than 300 orders for the guidebook since it was first advertised last month. About 40 percent of the interest has come from private companies, with the rest coming from the federal agencies for which the guidebook was designed. Plunkett's office is now in the process of putting the guidebook on the World Wide Web.

Much of the push for performance measurement was prompted by recent procurement reform. The Clinger-Cohen Act of 1996 requires agencies to weigh the benefits against the cost of information technology projects and requires agencies to tie investments to specific goals. While Plunkett said documents have cropped up occasionally on the subject, nothing has gone into the kind of detail that's in the guidebook.

Mainly, the book depicts how user surveys, performance logs and other compiled data can show whether high-priced hardware and all its trimmings do the job. Success stories are included, with examples coming from the Department of Defense, the Federal Aviation Administration, the Immigration and Naturalization Service and other federal agencies.

One method offered by the INS distinguishes wide-ranging goals (promote public safety) from specific tasks (deter illegal alien-smuggling) to help bureaucrats evaluate the agency's computerized detection system.

The INS suggests that one way to measure technology's performance is to compile the number of alien and drug-related apprehensions that the computer system directly aided.

Technology consultant Paul Strassmann said the government spends far too much time defining a technology system's virtues without getting a handle on what it produces. He said the GSA guidebook is an admirable effort to confront this, but lacks enough specifics on measuring output.

"It's not the first time that lots of institutions have offered all kinds of prescriptions," said Strassmann, a professor at the National Defense University in Washington, D.C., and author of "The Squandered Computer," to be published this spring. "If there's anything to be said about the federal computer environment, it's overdefined."

At the Social Security Administration, telecommunications systems are evaluated with respect to how many calls are lost and how many calls can be handled before a busy signal kicks in.

But the measurement of technology performance is still relatively new at the SSA, so there aren't any letter grades as to how the systems are working.

"You do it one step at a time," said Fritz Streckewald, deputy director for strategic management at SSA. "You have to characterize this as another step in a series for our investment policy."