Four cloud hurdles and how to clear them
- By David Blankenhorn
- Jun 19, 2014
Today, you can’t attend a technology-focused conference in the Washington, DC area without hearing at least one panel or presentation on cloud computing.
The benefits have been touted loudly and often: reduced costs, increased efficiency and agility, and enhanced productivity. According to a 2013 Gartner report, between fiscal 2012 and fiscal 2013 the value of awarded cloud contracts in the federal government nearly quadrupled. By 2018 the market is expected to hit $10 billion.
Although a few agencies were early adopters of cloud computing, many remain slow to leverage the technology.
I see four major roadblocks to accelerated adoption across the federal government: education on the different cloud computing models, security, procurement, and industry noise.
Over the past few years, the conversation I have had most often with customers centers on one question: what cloud computing options are available? Definitions abound, and understanding how the service models (Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS)) and deployment models (public, private, hybrid, and community) fit together to meet an agency's objectives can be challenging.
Conversely, many agencies that have been quick to adopt the technology have taken a "cloud hosting" rather than a "cloud native" approach, simply forklifting their existing applications into the cloud without any adjustment.
Taking full advantage of a cloud platform requires new design patterns built on services native to that platform. Forklifting an on-premises architecture into the cloud will work, but it is not until agencies fully leverage a platform's native capabilities that they realize all of the elasticity, agility, scalability, and resilience the platform offers.
The key lesson is that cloud computing comes with a steep learning curve, and agencies must commit to educating themselves before settling on a strategy and a path to implementation.
I’ve seen firsthand how agencies that have taken the time to understand the intricacies of cloud are already making great strides with adoption and seeing tangible benefits.
For instance, I have seen several customers use public cloud platforms for high performance computing and research. The scalable, flexible, and agile nature of the cloud has allowed them to significantly cut down on infrastructure costs and eliminate traditional procurement and deployment lag time.
The ability to instantly scale resources up and down, paying for only the resources used, is having a dramatic impact on the way research is conducted and helping to support a more rapid path of innovation. For the research community, less money spent on infrastructure means more money spent on actual research.
The second major roadblock is security.
Significant progress has been made with programs like FedRAMP, but agencies still face challenges around data privacy and shared responsibility. Cloud computing shares many characteristics with legacy "contractor owned, government operated" and "government owned, contractor operated" models, and many of the lessons learned there can be applied to the cloud.
It is important to understand clearly which control objectives fall to the agency and which fall to the service provider, and to abandon the notion that moving to the cloud means everything is taken care of.
Agencies can apply much of their prior learning from traditional IT models, but must adapt it to the unique aspects of the cloud. The key here, again, is education and hands-on experience. Agencies that invested in learning the nuances of cloud-based shared responsibility security models are now using those platforms successfully, and securely. It is important to remember that agencies will need to delegate certain responsibilities to the cloud provider, but ultimately an agency cannot abdicate responsibility.
The third roadblock is procurement.
A traditional firm-fixed-price (FFP) approach simply won't work for purchasing cloud services, because they are nothing like traditional hardware and software. Cloud computing isn't an asset, nor is it the same as the consulting services that have historically been purchased through FFP contracts.
The answer is for agencies to revisit annuity-based procurement models and examine how they can be applied to cloud services to simplify the process. Another procurement challenge agencies face is allocating the correct amount of services.
The on-demand, self-service nature of the cloud allows users to spin up resources whenever and wherever they need them. This can be tricky when those who are actually consuming the services are unaware of contract ceiling limits. Agencies need to ensure they have the proper controls in place to better understand which services have been budgeted for and which have not.
The final hurdle to rapid acceleration of cloud in the federal government is industry noise.
Today, we see a lot of cloud "washing" in the industry, especially as it relates to SaaS models. It is important to examine a proposed service and ensure it maps closely to the cloud definition and reference architecture offered by the National Institute of Standards and Technology (NIST).
Some SaaS providers are simply repackaging conventional enterprise licenses rather than offering a truly flexible, elastic service. To be clear, hosting a private installation of a software package on a public IaaS is a legitimate model, and it does have some short-term benefits relating to security, control, and compliance. Ultimately, though, you'll want to cut through all of the noise and hype from vendors, panelists, presenters, reporters, and yes, people like me, and gain an understanding of the reality of what is being offered.
The power and promise of cloud is real. And if agencies and industry can work together to overcome the remaining roadblocks outlined above, I believe we’ll see rapid adoption of the technology across the federal government.
David Blankenhorn joined DLT Solutions in early 2011 as Chief Cloud Technologist and today serves as Chief Technology Officer. As the leader of the Office of the CTO, David and his team drive the DLT portfolio strategy and messaging, assist customers and partners with the rapidly evolving technology landscape, and provide market intelligence to the company's vendors and partners.
As an executive with a proven record of leadership and business execution in professional and managed services, David applies his extensive experience to assisting public- and private-sector customers with IT strategy, design, delivery, and management. His expertise lies in cloud technologies, virtualization, data center optimization, and IT service management.
David began his career as a system administrator on a Defense Data Network (DDN) contract before moving into multiple executive positions at various value-added reseller companies. He draws on a diverse set of experiences across the information technology landscape, including manufacturing, system integration, value-added reselling, and aggregation. Prior to joining DLT, he spent more than a decade at Sun Microsystems, where he was a Chief Technologist, Principal Engineer, and Global Manager in the Professional Services and Advanced Services divisions.