Can you move from buzz to revenue?
If one reflects upon technology adoption by government agencies, a pattern emerges: conversational volume (i.e., buzz) around hot, emerging technologies often precedes tangible use cases.
Cloud computing, big data, and bring your own device (BYOD) dominated public-sector rhetoric well before strong demand existed for related products and services. This is not surprising, nor is it different from what occurs in the private sector, but the hype can nonetheless obscure the true impact these technologies have on agency efficiency, productivity, and cost savings.
2014 witnessed the emergence of numerous buzz technologies and processes, but four in particular have the potential to evolve from concept and early-stage usage to more widespread adoption if government contractors and federal IT providers can effectively communicate the benefits these technologies can deliver.
Internet of Things
Data is becoming a cornerstone of enterprise-wide operations at the federal, state and local government level. The internet of things (IoT) challenges agencies with an exploding volume of sensors, while also opening up worlds of potential to collect data from sources never previously imagined. Applications for the IoT are far reaching: monitoring public infrastructure for signs of decay and required repair; low-cost, low-power, multifunctional sensor devices that can be affixed to military vehicles, satellites and soldiers; and body cameras increasingly worn by police officers at the local level that transmit hours of video footage that must be stored.
Transforming the promise of IoT to reality depends on enabling agencies to derive actionable insights from raw data, then rapidly disseminating that information to key users across the public sector. Budgets will never be able to keep pace with data volume, requiring solutions to become more efficient and cost-effective in storing data. An inability to analyze and store data as volume explodes reduces sensor devices to little more than “window dressing” that offers little value to decision makers.
Not only are new techniques and technologies needed to store this immense amount of data, but agencies must act as "digital archivists" in classifying the data, ensuring it is stored in the right location, and preserving it for years, decades, or even centuries. Increasingly, agencies are looking at object-based storage as a natural complement to the internet of things. Object-based storage allows for the storage of structured and unstructured data as flexible-sized objects rather than fixed block sizes – ultimately leading to the ability to store massive datasets that the IoT creates with reduced costs and secure data management.
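To make the contrast with fixed block sizes concrete, here is a minimal, hypothetical Python sketch of the object-storage idea: each object is a variable-sized blob keyed by a name and carrying its own metadata, which can be inspected without fetching the payload. The class, method names, and the sample body-camera key are illustrative assumptions, not any vendor's API.

```python
import hashlib

class ObjectStore:
    """Toy in-memory object store: variable-sized objects keyed by name,
    each carrying its own metadata (the 'digital archivist' labels)."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data, metadata=None):
        # Objects can be any size; there are no fixed block boundaries to manage.
        self._objects[key] = {
            "data": data,
            "metadata": metadata or {},
            # Checksum for integrity, in the style of S3-like object stores.
            "etag": hashlib.md5(data).hexdigest(),
        }

    def get(self, key):
        return self._objects[key]["data"]

    def head(self, key):
        # Metadata is readable without pulling the (possibly huge) payload.
        obj = self._objects[key]
        return {"size": len(obj["data"]), "etag": obj["etag"], **obj["metadata"]}

store = ObjectStore()
store.put("bodycam/2014-12-01/unit42.mp4", b"\x00" * 1024,
          metadata={"retention": "7y", "classification": "evidence"})
print(store.head("bodycam/2014-12-01/unit42.mp4")["retention"])  # -> 7y
```

The per-object metadata is what lets an agency classify data, route it to the right storage tier, and enforce retention over decades without restructuring the underlying blocks.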
Multi-Vendor Hybrid Cloud Deployments
It is safe to say that every federal agency used the cloud in 2014…in some form. 2015 is poised to be the year when the cloud evolves from a point solution for individual agency projects to a true piece of shared infrastructure that is utilized across multiple agencies as needed.
Agencies seeking a balance between scalability and security have gravitated toward hybrid cloud environments, using public compute, networking, and servers alongside private cloud storage that lets an agency retain control over its most sensitive asset – data. As a result, advancing the hybrid cloud in 2015 will come down to a data-centric approach that gives agencies full flexibility to pursue a multi-cloud provider strategy.
Government contractors and federal IT providers, for their part, must deliver a data fabric that allows agencies to keep the right data on-site, move other data to cloud service providers, and take advantage of the tremendous capabilities offered by hyperscale cloud providers. Just as most agencies were reluctant to bet on a single vendor for their on-premise IT, they will choose to work with multiple cloud providers if a reliable data fabric can be assured. The data fabric allows agencies to control, integrate, move, and consistently manage their data across the hybrid cloud while taking full advantage of the economics and elasticity of the cloud.
Avoidance of lock-in, leverage in negotiations, or simply a desire for choice will drive agencies toward hybrid clouds that do not tie them to any single provider. Software-as-a-service vendors that offer no way to extract data will suffer in this scenario in 2015, while software technologies that can be deployed on-premise and in a range of clouds will find favor with customers thinking strategically about their IT model.
Acquisition of Services, Not Products
The shift from acquiring products to acquiring cloud services will continue. To date, however, the mechanisms to acquire these services in a flexible, expedient way are not all in place, as CIOs sort out which applications they want to control fully (on-premise private cloud), control partially (enterprise-grade public cloud), or hand over to a cloud service provider.
We are witnessing a transition of CIOs and agency IT decision makers from being builders and operators to brokers of services. That doesn't mean that agencies won't continue to build data centers and utilize on-premise computing. But increasingly for each new initiative, agencies are breaking down the requirements to service level agreements rather than to individual products like compute, networking and storage.
Ensuring success in 2015 as acquisition shifts from products to services requires building out private clouds on top of leading-edge integrated infrastructures, moving to cloud service providers, or utilizing the scope and scale of hyperscalers like Amazon Web Services, Microsoft Azure, or SoftLayer.
One key transition is the development of the previously referenced data fabric that enables dynamic data portability across all clouds and that can enable extensive agency choice for application, technology, and cloud partner options – all while allowing agencies to maintain control of their data and provision it across the fabric as appropriate.
Software Defined Storage
Software-defined storage (SDS), which can be deployed on different hardware and supports rich automation capabilities, will extend its reach into cloud deployments and help build a data fabric that spans on-premise and public clouds. SDS provides a means for applications to access data uniformly across clouds and simplifies the data-management aspects of moving existing applications to the cloud. SDS offerings that deliver storage efficiencies and reduce the cost of moving data to and from the public cloud – and of storing active data in the public cloud for long periods – will be best positioned to gain broader adoption.
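The "uniform access across clouds" idea can be sketched as a thin abstraction layer: application code talks to one interface, and the SDS layer decides whether a given dataset lives on-premise or with a public cloud provider. This is a hypothetical Python illustration of the pattern, not any particular SDS product's API; all class and function names are assumptions.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Uniform data-access interface; backends are interchangeable."""

    @abstractmethod
    def write(self, path, data): ...

    @abstractmethod
    def read(self, path): ...

class OnPremiseBackend(StorageBackend):
    """Stands in for an on-premise array or private cloud tier."""
    def __init__(self):
        self._data = {}
    def write(self, path, data):
        self._data[path] = data
    def read(self, path):
        return self._data[path]

class PublicCloudBackend(StorageBackend):
    """Stands in for a hyperscale object store tier."""
    def __init__(self):
        self._objects = {}
    def write(self, path, data):
        self._objects[path] = data
    def read(self, path):
        return self._objects[path]

def archive_report(backend: StorageBackend, name, payload):
    # Application logic is identical regardless of where the data lives;
    # placement is a policy decision in the SDS layer, not a code change.
    backend.write(f"reports/{name}", payload)
    return backend.read(f"reports/{name}")
```

Because `archive_report` depends only on the interface, moving an application's data from the on-premise tier to a public cloud tier requires swapping the backend, not rewriting the application – which is the portability argument behind the data fabric.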
Government contractors and federal IT providers have an opportunity to convert the buzz around these government technologies to tangible use cases and revenues – if the right solutions are delivered and the benefits of these solutions are effectively communicated to the market.
Rob Stein is vice president, U.S. public sector at NetApp, a leading cloud storage provider.