Contractors will have a chance to play a significant role in the government's move to make data more open to the public and other agencies, which likely will involve cloud computing.
As part of the push from the White House for federal agencies to better share data with the public, NASA posted a dataset called “Land Surface Temperature at Night” on Data.gov.
Although that collection of data might seem obscure, the hope is that technologists, researchers and public interest groups will take all this new information from agencies and mash it into useful applications.
Although it is a potential boon for the public, the open-government mandates create challenges for federal agencies.
One way agencies might be able to adhere to the requirement is by moving to cloud computing, Chris Kemp, chief information officer at NASA Ames Research Center, said at a recent conference.
For example, cloud computing could make it easier for NASA to share huge volumes of photographs of the surface of Mars. Rather than maintaining one database for using the photos internally and another database for pushing the data to the public, NASA could host all the information on a cloud computing infrastructure.
With the cloud, updates would be available in real time, both internally and to the public.
Also, cloud computing is elastic, so the agency could increase resources when needed and scale them back when that need dissipates. So if someone outside NASA built an application with the Mars data and usage spikes, the cloud would be able to deliver more computing capacity.
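The elastic behavior described above can be sketched in a few lines. This is a hypothetical illustration of the scaling decision, not NASA's actual configuration; the thresholds and server counts are made up for the example.

```python
# Sketch of elastic scaling: grow the pool when load per server spikes,
# shrink it when demand dissipates. Thresholds are illustrative only.

def scale(servers: int, requests_per_server: float,
          high: float = 80.0, low: float = 20.0) -> int:
    """Return a new server count based on current load per server."""
    if requests_per_server > high:                  # usage spike: add capacity
        return servers * 2
    if requests_per_server < low and servers > 1:   # demand dropped: scale back
        return max(1, servers // 2)
    return servers

# An external Mars-photo app drives a spike, doubling capacity...
print(scale(4, 95.0))   # -> 8
# ...and a quiet period scales it back down.
print(scale(8, 10.0))   # -> 4
```

Real cloud platforms make this decision automatically against metrics such as CPU load or request rate, but the contract is the same: capacity follows demand in both directions.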
The example demonstrates that agencies need to consider all information technology demands in a holistic way and not individually, Kemp said.
“As a federal agency, we really need to start thinking about overall enterprise architecture because I think open government has profound implications on federal IT enterprise architectures,” Kemp said.
Rethinking IT infrastructures at federal agencies could be a boon for contractors and systems integrators, industry sources say.
Because agencies have security concerns about public cloud computing infrastructures, many government organizations are creating private clouds protected behind firewalls. Agencies also are moving to server virtualization, a close cousin to cloud computing, said Russ Fromkin, director of Intel Federal.
Although existing hardware can support virtualization and cloud computing, newer technology is more efficient and capable, Fromkin said. Industry should help agencies understand what technology they need to set up an efficient cloud computing environment, he said.
“If you take servers from five years ago and compare them to the servers today that have hardware virtualization technology built in, you’re talking about 20-times performance improvements and a reduction of 90 to 95 percent of your power,” he said.
“If you add that all up, it means by replacing that infrastructure today, you can save within a year what you would have spent by continuing to run cloud on older hardware or basic virtualization,” Fromkin said.
With an infrastructure tailored for cloud computing, it will be easier for agencies to start using the public cloud when that becomes possible, he said.
Cloud computing is closely related to, though not identical to, software as a service. Cloud computing generally means that a provider develops a multitenant service delivered via the Web, and customers pay a recurring fee for the platform and its operation rather than building their own.
For example, instead of the Environmental Protection Agency building one data-sharing platform and the Interior Department and the Health and Human Services Department building another, they all invest a little in a shared platform.
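The multitenant idea behind that shared investment can be shown in miniature. The class and API below are hypothetical, purely to illustrate one platform serving several agencies while keeping each agency's data under its own tenant key; the agency names are the ones from the example above.

```python
# Minimal sketch of a multitenant shared data platform: one service,
# jointly funded, with per-tenant data partitions. Hypothetical API.

class SharedDataPlatform:
    def __init__(self) -> None:
        self._tenants: dict[str, list[dict]] = {}

    def publish(self, tenant: str, record: dict) -> None:
        """Each agency writes to its own partition of the shared service."""
        self._tenants.setdefault(tenant, []).append(record)

    def records(self, tenant: str) -> list[dict]:
        """Read back one tenant's records without seeing the others'."""
        return list(self._tenants.get(tenant, []))

platform = SharedDataPlatform()            # one platform instead of three
platform.publish("EPA", {"dataset": "air-quality"})
platform.publish("HHS", {"dataset": "hospital-stats"})
print(len(platform.records("EPA")))        # -> 1
```

The point of the design is that the fixed cost of building and operating the platform is paid once and shared, while each tenant still sees only its own data.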
The benefits of cloud computing work internally and externally for an agency, said Kevin Merritt, chief information officer at Socrata, a provider of social data platforms for government organizations.
When agencies adopt cloud computing, they will likely see agency-to-agency collaboration improve immediately, Merritt said.
“If I'm the CIO of EPA and you're the CIO of HHS and I call you asking for some data, there might be all sorts of reasons why that's hard to accomplish,” he said. “But if your data is online, I can access it as though I'm a consumer even though I represent an agency.”
Every agency needs to share data, but each faces different restrictions, Merritt said. Those differences have spawned many redundant IT systems, and it is not realistic to apply a shared-services model behind firewalls across all agencies. To make cloud computing work for the federal government, there needs to be a mechanism for smoothing out those requirements, he said.
“The cloud greatly reduces the political, bureaucratic and technical challenges that prevent sharing of services and instead makes a compelling economic case for sharing,” Merritt said.
For example, government organizations could continue to run their systems of record internally and then push their data to the cloud-based dissemination platform. The data could be pushed manually or automatically. In that model, agencies would still own and control the data.
“The agencies can keep the data up-to-date at the schedule that makes the most sense for the data,” Merritt said. “Some agencies like statistics bureaus probably will want to snapshot the data monthly; other agencies may want to push on a less regular recurring schedule; other agencies may want to update their systems of records to push to the cloud data platform in real time.”
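The push model Merritt describes can be sketched as a scheduled job: the system of record stays internal, and a snapshot is uploaded to the cloud dissemination platform. The endpoint, dataset ID, and upload API below are all hypothetical, invented for illustration; Socrata's actual API is not shown here.

```python
# Hedged sketch of the "push to a cloud dissemination platform" model.
# The internal system of record is the source of truth; this job builds
# an upload request for one snapshot. Endpoint and names are made up.

import json
import urllib.request

def push_snapshot(records: list[dict], dataset_id: str,
                  endpoint: str = "https://data.example.gov/api"):
    """Build a PUT request that replaces the published copy of a dataset."""
    body = json.dumps(records).encode("utf-8")
    return urllib.request.Request(
        f"{endpoint}/datasets/{dataset_id}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="PUT",   # replace the public snapshot; the agency retains ownership
    )

# A statistics bureau might run this monthly from a cron job; a real-time
# feed would instead call it on every update to the system of record.
req = push_snapshot([{"station": "A1", "temp_c": 12.4}], "land-surface-temp")
```

Because the request replaces the published snapshot rather than migrating the source data, the agency controls both the content and the cadence, which is what the quoted scheduling flexibility depends on.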
Although the idea of letting a third party host government data might sound scary, it is just part of a natural evolution, Fromkin said. Virtualization and data center consolidation are a couple of the links in the cloud computing evolutionary chain, he said.
“The cloud basically brings all these technologies together that we’ve been working on for years,” he said. “It enables us to combine all the different things together and make it optimal.”
Despite the existing groundwork for cloud computing, agencies still face many challenges when implementing cloud computing for open government or any other task, said Dale Wickizer, chief technology officer at NetApp.
Much of the hype around cloud computing centers on saving the government money, Wickizer said, but there are no specific cost-saving targets.
“The scary thing is it almost looks like they are doing cloud just to do cloud without some clear metrics, and that's kind of frightening,” Wickizer said.
Before considering a cloud computing project, agencies need to do their homework, Wickizer said. First, an agency should determine how much it costs to perform the function in-house, so it has a baseline to compare against a cloud provider's price. Second, it should prepare to manage a service level agreement, which is a different arrangement from building and hosting an IT system itself.
For example, after an agency determines how much it costs to store and serve video in-house, it might be able to find a vendor that can do the same thing for less money. But until the agency figures out the cost baseline, it can only guess whether the cloud option is a good deal.
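The baseline step is simple arithmetic once the in-house cost components are gathered. All figures below are invented for illustration; the point is only that the comparison becomes a subtraction once the baseline exists.

```python
# Illustrative cost-baseline arithmetic for the video-hosting example:
# total the in-house cost of the function, then compare a vendor quote.
# Every figure here is made up.

def annual_in_house_cost(hardware: float, power: float,
                         staff: float, facilities: float) -> float:
    """Sum the yearly cost components of running the function internally."""
    return hardware + power + staff + facilities

baseline = annual_in_house_cost(hardware=120_000, power=30_000,
                                staff=200_000, facilities=50_000)
vendor_quote = 310_000

# Without the baseline the agency can only guess whether the cloud option
# is a good deal; with it, the answer is explicit.
print(baseline - vendor_quote)  # -> 90000 saved per year by switching
```

The hard part in practice is the first line, not the last: fully loaded staff, power, and facilities costs are often not tracked per function, which is exactly the homework Wickizer is recommending.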
To get used to working with service level agreements, Wickizer recommends that agencies provide cloud-like services to smaller departments and offices in the agency.
“View the departments and offices as customers, and get experience being accountable and managing service level that way,” he said. “Then you might be a more informed consumer when looking at some of those things you can put out in the cloud.”