How long will cloud's reality trail the hype?

Detractors and supporters agree momentum is building

Whether you love cloud computing or hate it, the one point of agreement seems to be the inevitability of its adoption by federal agencies. Only the speed of its arrival is in question.

“Cloud is moving faster than any other breakthrough technology I’ve seen over the past 10 years,” said Thom Rubel, research vice president for IDC’s government insight division. “The combination of potential cost savings and the business case have been powerful drivers.”

At Computer Sciences Corp., a big player among the new “cloud integrators,” companies that port organizations’ networks, applications and business processes to the cloud, the view is that “the horse has left the barn, the train has left the station, use whatever metaphor you want — this is very real,” said Ronald Knode, CSC director of global security solutions and adjunct professor at Towson University.


Even skeptic Michael Daconta, chief technology officer at Accelerated Information Management LLC and former Homeland Security Department metadata program manager, concedes that “over the short term, there are some non-mission-critical applications that would represent a good use of cloud technology.”

Cloud has even gotten crucial buy-in from the top; in his data center consolidation memo in February, federal CIO Vivek Kundra urged agencies to consider implementing cloud solutions.

So where are all the cloud implementations?

“Federal agencies have been doing a great job of expressing interest in cloud,” said Chris Bell, vice president of enterprise project management at project management software developer Deltek Inc. “Adoption of cloud computing has not been as common.”

Business users “get” the potential value of cloud; software as a service puts applications on their desktops when they need them and whisks them away when they’re done. And they pay only for what they use.

“But often they’re not ready for or they’re surprised by the amount of work, of cost and the time to deal with the security and the procurement that’s involved in getting to that point,” Bell said. “They figure it should take one to three months, when it’s more like six months to a year.”

Some cloud procurements, which began as calls from agencies following up on federal mandates, are beginning to show up, said William Perlowitz, vice president of advanced technology at communications integrator Apptis Inc.

“But a lot of what we end up doing now is educating [agencies] about how they would move into cloud in compliance with the federal mandates and law,” he said.

Cloud procurements will come, said Susan Zeleniak, president of Verizon Federal Business. Verizon provides cloud services under the General Services Administration’s Networx telecommunications acquisition. “The economies it offers government are too good to be ignored.”

Cloud’s savings can be significant, and that’s not unimportant, said CSC’s Knode. “But in concentrating too much on saving money, I think we’re setting our sights too low.”

The better measure is what it can do for the business, how it can advance an agency’s mission. “Ask: ‘Does it solve your business problem?’ ” IDC’s Rubel advised. “If it does, then look at costs. If the two add up, then it’s a good fit.”


Cloud leaders emerge

NASA had been on the brink of spending $1.5 billion for a traditional data center, Kundra said, but has stepped back to consider what the next-generation computing environment will look like and “how to allocate that capital more intelligently, in the context of cloud computing.”

Of course, NASA is far from new to cloud. As Ames Research Center CIO, Chris Kemp in 2008 spearheaded development of Nebula, a cloud infrastructure built to let NASA scientists make massive calculations with massive data. It can handle an individual file system of 100 terabytes; Amazon’s EC2 tops out at 1 terabyte.

Nebula’s core technology will be a component in OpenStack, a new open-source cloud initiative that NASA hopes “will form the foundation of a new open-source cloud ecosystem,” Kemp said. “NASA will be uniquely positioned to drive standards that will ensure products and services powered by OpenStack will meet federal interoperability, portability and security requirements.

“We’ve been looking at cloud, not for a particular application but for use across the enterprise and how we might benefit from it,” said Kemp, now NASA’s first chief technology officer. SaaS is a strong candidate, as is e-mail. (The Interior Department is moving 80,000 e-mail addresses to the cloud.) Platform as a Service (PaaS) also “has a lot of potential for private clouds,” which is where the first enterprise cloud implementation is most likely to be, he said.

NASA is not alone among agencies in remaining leery of public clouds but receptive to private ones.

For one thing, “it lets them more effectively use their infrastructure,” Rubel said. “They’ve got huge infrastructure; a private cloud lets them use its untapped value without really making a new investment.”

Infrastructure may also be the lever that gets agencies out of strictly private clouds. “It’s a way of avoiding having to build new infrastructure to handle your peaks,” Zeleniak said.

SaaS is one of the handful of cloud implementations Daconta supports, along with some storage applications. “For some kinds of user transactions where you don’t know how many processes you’ll need at one time — whether for one user or 5,000 users — cloud is a good way to go,” he said.


Security remains a challenge

A second point of agreement about cloud: The No. 1 impediment to cloud adoption is security. There’s a certain irony to that statement, as cloud potentially offers improved security.

“I’m not ready to say empirically it is better,” Rubel said, but it does give a better enterprise view, a singular view of the whole ecosystem.

It’s not only the state of security impeding cloud among agencies. “Today, each time an agency wants to adopt a cloud solution they have to go through individual security assessments; it takes time and it costs,” which is why FedRAMP’s work is so important, Deltek’s Bell said.

Working with GSA, the Federal CIO Council’s FedRAMP, the Federal Risk and Authorization Management Program, will perform security authorizations and monitoring of an agency’s cloud implementation. Other agencies that want to use the same application will be able to shortcut the certification and authorization process.

“We’re excited about FedRAMP and the work [GSA Associate Administrator] Dave McClure and others are doing at GSA,” NASA’s Kemp said. “It’s going to give us more confidence in the capability of the cloud offerings to deliver security.

“We’ll be able to look at a particular system inside [NASA] and see how its security characteristics compare with those of [a commercial cloud provider] and make a determination whether it meets our needs or not,” he said.

Key to agencies’ openness to private clouds is control; key to expanding agencies’ acceptance of other cloud solutions is transparency, CSC’s Knode said. When applications leave an agency’s infrastructure, “we lose some transparency, the ability to verify your control plan, to generate evidence-based confidence, and you’ve lost trust,” he said.

CSC looked for a way to restore that trust, and answer what Knode calls the three big questions: Where is my data? Is my data on separate platforms? Are you applying all the processes I told you to apply or not apply?

“So we’ve offered to the community a very high-level, ask-and-answer asynchronous protocol, which we refer to as the Cloud Trust Protocol, which anyone can implement,” Knode said.

FedCloud, an Apptis portal, allows visibility into machine status, service usage, system availability, and system performance, Perlowitz said. When agencies go to FedCloud to select the company’s cloud services, which run on Amazon’s cloud platform, “they can only select services that will meet FIPS 199 medium compliance,” he said.
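Knode’s three questions suggest a simple request-and-response exchange. The sketch below is purely illustrative and is not the actual Cloud Trust Protocol; every identifier and value in it is hypothetical, assuming only the ask-and-answer pattern he describes.

```python
# Hypothetical sketch of an ask-and-answer transparency exchange, loosely
# modeled on Knode's three big questions. NOT the real Cloud Trust Protocol;
# all names and values here are invented for illustration.
import json

def handle_trust_query(question, provider_state):
    """Provider side: answer one of the three 'big questions'."""
    handlers = {
        "where_is_my_data": lambda s: s["data_locations"],
        "is_data_isolated": lambda s: s["dedicated_platforms"],
        "processes_applied": lambda s: s["active_controls"],
    }
    if question not in handlers:
        return {"status": "unsupported", "question": question}
    return {"status": "ok", "question": question,
            "answer": handlers[question](provider_state)}

# Illustrative provider state an agency client might query against.
state = {
    "data_locations": ["us-east-datacenter-1"],
    "dedicated_platforms": True,
    "active_controls": ["encryption-at-rest", "access-logging"],
}

print(json.dumps(handle_trust_query("where_is_my_data", state)))
```

A real protocol would of course need authentication and signed, auditable responses; the point here is only the shape of the exchange, evidence on demand rather than blind trust.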

Verizon recently partnered with Terremark Worldwide Inc. to build a data center in Virginia intended to provide high-level security cloud services for government.


Cloud projects will come

However standards and policies, technological approaches and advances all shake out, the cloud is coming, Zeleniak said.

“Look at the Internet years ago,” she said. “How would people ever have considered doing business on the Internet, putting their personal information, credit card information, their tax information out there? Today we’re all doing it and we barely think about it.

“We’re going to have some of those same challenges with cloud, and we’re going to have to prove that it can be done,” she said, “but I don’t think you can hold it back.”

Daconta is less sanguine. “Look at any of the existing clouds — they all come as a full proprietary stack, from the low-level to the high-level APIs,” he said.

“All of those layers have to come together perfectly to run the application. In fact, moving an application from one cloud to another is the best acid test there is to test cloud interoperability. An application will either be able to be moved, without code changes, to another cloud or it won’t,” Daconta said.

Mostly, they won’t, at least not yet and not for the foreseeable future, he said. “Every cloud is going to change over the next five years, I guarantee you. Then you’d have to completely redo the whole thing.”
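Daconta’s acid test, moving an application to another cloud without code changes, can be pictured with a toy sketch. Both provider classes below are invented stand-ins for incompatible vendor APIs, not any real SDK; the adapter pattern is one common way integrators keep application code provider-neutral.

```python
# Hypothetical sketch of cloud lock-in and the portability "acid test."
# Both provider classes are invented stand-ins for incompatible vendor APIs.

class CloudAStorage:
    """Stand-in for provider A's SDK: put_object/get_object verbs."""
    def __init__(self):
        self._blobs = {}
    def put_object(self, bucket, key, data):
        self._blobs[(bucket, key)] = data
    def get_object(self, bucket, key):
        return self._blobs[(bucket, key)]

class CloudBStorage:
    """Stand-in for provider B's SDK: different verbs, same capability."""
    def __init__(self):
        self._store = {}
    def upload(self, container, name, payload):
        self._store[(container, name)] = payload
    def download(self, container, name):
        return self._store[(container, name)]

# One thin adapter per provider isolates the incompatible APIs.
class CloudAAdapter:
    def __init__(self):
        self._api = CloudAStorage()
    def save(self, bucket, key, data):
        self._api.put_object(bucket, key, data)
    def load(self, bucket, key):
        return self._api.get_object(bucket, key)

class CloudBAdapter:
    def __init__(self):
        self._api = CloudBStorage()
    def save(self, bucket, key, data):
        self._api.upload(bucket, key, data)
    def load(self, bucket, key):
        return self._api.download(bucket, key)

# The application passes the "acid test": it runs unchanged on either cloud.
def run_app(storage):
    storage.save("reports", "q3.txt", b"quarterly data")
    return storage.load("reports", "q3.txt")

for adapter in (CloudAAdapter(), CloudBAdapter()):
    assert run_app(adapter) == b"quarterly data"
```

Application code written directly against either provider class would fail the test; keeping the provider-specific calls behind one interface is the same goal Kemp describes for open interoperability standards, pushed down into a single program.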

Like the technology itself, cloud standards and requirements are evolving, Zeleniak countered. They could all change in two years, but “because they won’t have invested a lot of capital to do it, it won’t be such a financial burden to migrate to whatever the new security requirements are,” she said. “Really, much of the financial burden will be incumbent on the vendors.”

Daconta remains almost reassuringly skeptical of cloud. “It’s a great technology,” he said. “But it’s early days, and until the standards are in place, it’s too early to get into it too deeply.”

GSA intends to award multiple blanket purchasing agreements to let agencies buy infrastructure as a service. A GSA spokesman said the agency anticipates the awards coming in the fourth quarter of fiscal 2010.

Reader Comments

Fri, Jul 23, 2010 MAC Metro East St. Louis

I appreciate Michael Daconta’s perspective on page 3 that begins: “Look at any of the existing clouds — they all come as a full proprietary stack…” What that means to the procurement community is that buying into a proprietary solution may create a significant barrier to switching from one cloud to another, once an initial contract is awarded.

My Agency recently hosted a video conference where Microsoft, Google, and IBM discussed their cloud offerings and strategy with respect to the Federal Government. While most of the discussion focused on requirements and capabilities, toward the end someone asked “What happens when it comes time to re-compete? What happens when we want to consider moving everything to one of your competitor’s offerings, in order to get a better price?” The silence in the room was profound. Finally one of the vendors cleared his throat, gurgled a little, and said “Well… uh, if at some point… uh… for some reason… a client were to make the decision to move to one of our competitors… we could, uh… I guess provide the appropriate consulting to help them make the switch, for an appropriate fee.”

That comment spells it out pretty clearly: Getting into a particular cloud will be much easier (and potentially less costly) than getting out later on. Agencies should tread very carefully in the proprietary cloud universe, and include an “exit strategy” as part of initial contract considerations.

Government technology and procurement leaders need to show an above average sensitivity to guarding the public trust as they edge toward the cloud. This isn’t one of those cute, trendy technologies that can demonstrate how cool, sophisticated, and eager to embrace change the Government has become. Proprietary clouds have the potential to create situations where a single vendor, without meaningful competition, could wind up owning an agency’s IT for a very long time.

Thu, Jul 22, 2010

The computing industry keeps cycling between centralized services with terminals (i.e., mainframes) and client-server at such speed that the fluff they spew, without solving any problems, has made me dizzy for the past 25 years. I guess it's time to cycle back to computer-as-a-centralized-utility. This will, of course, fail again after, oh, 5 years. Then back again to decentralized computing and its fluff. Without solving the fundamental problems -- all the software has been and is crap! You all know the reasons why.

Thu, Jul 22, 2010

NHTSA has a methodology called the Readiness for Cloud Methodology (RCM) that helps them walk through a formal study on whether they are organizationally ready for using cloud services and what adjustments need to be made. The RCM helps them figure out if they have the right budgeting and provisioning processes, the right organizational model and performance plans for employees, and the right management decision making and reporting processes and capabilities to successfully transition to being a receiver of cloud services rather than a provider of infrastructure services.


