New data center policy could be industry boon
OMB's new Data Center Optimization Initiative should drive more opportunities for energy monitoring, data integration, middleware and virtualization. Here is what an immixGroup expert says you need to know.
The newest regulation in government IT infrastructure is daunting to many who sell tech solutions to federal agencies. But unlike other IT reforms that go nowhere, the new Data Center Optimization Initiative (DCOI) offers several opportunities for solution providers, especially those in energy monitoring, data integration and middleware, and virtualization.
Let me explain how.
Opportunities stem from DCOI’s broader definition of a data center and a requirement that federal departments finalize their data center plans by fiscal 2018. Among their options are moving IT to private cloud services or making existing data centers more efficient.
A LOOK AT THE PAST
Before we start sharpening our sales pitches, let’s look at government data center history.
Data centers have had a reputation among policy makers and legislators as a cost drain. That reputation took hold with the government’s first budget-cut-inspired memorandum in 2010, the Federal Data Center Consolidation Initiative (FDCCI). It called for the closure of hundreds of data centers government-wide and forced departments and agencies to define timelines and performance metrics for their infrastructure reduction plans.
As a consequence, government agencies have been forced to get creative in adopting cutting-edge data center technologies that help their infrastructure handle the growing volume of data, applications, and systems.
WHAT’S NEW
In March, the Office of Management and Budget issued DCOI to wring more cost savings and efficiency out of agencies than previous mandates did. Its goals are similar to the old FDCCI’s, but it carries tougher data center reduction requirements. The mandate specifically says that any room with a single server counts as a data center, and it further divides data centers into tiered and non-tiered facilities, exempting high-performance computing systems.
The new regulation means the number of data centers subject to consolidation is increasing. Data centers with a separate space, uninterruptible power supply, cooling system, and a backup generator are considered tiered data centers, while ones without these features are now known as non-tiered data centers.
As part of the mandate, each agency must identify the number of data centers it will close or consolidate by the end of fiscal 2018. Agencies have three options for meeting their targets: transition to shared services, start using private cloud services, or consolidate existing data centers and make them more efficient. This last bucket is where vendors will see the most opportunities.
INDUSTRY TARGETS
The following technologies are key to DCOI’s success and will be in hot demand for the next few fiscal years:
Virtualization—This is perhaps the most significant part of the new memorandum, since it greatly reduces the number of physical servers, networking technologies (routers and switches), and wires that make up so much of data center costs. Technologies such as hyper-converged computing are rapidly gaining attention in public sector data centers, given their tremendous impact in the private sector. Agencies that want to store and keep their own data in-house must virtualize their machines. Data centers now have a goal of four or more virtual machines for every physical server. Data centers that haven’t met that ratio will be scrambling over the next two years to acquire commercial virtualization solutions and retire physical infrastructure. Tech companies should reach out to data center managers and directors and start talking about how their products can help data centers achieve or exceed the DCOI virtual machine ratio by 2018.
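To make that target concrete, here is a minimal sketch in Python of the kind of ratio check a data center manager might run against the DCOI goal of at least four virtual machines per physical server. The inventory numbers are made up for illustration, not drawn from any agency.

```python
# Hypothetical sketch: checking a data center inventory against the DCOI
# virtualization target of at least 4 virtual machines per physical server.
# The inventory figures below are illustrative only.

DCOI_TARGET_RATIO = 4.0  # VMs per physical host, per the memorandum


def virtualization_ratio(vm_count: int, physical_host_count: int) -> float:
    """Return the average number of VMs per physical server."""
    if physical_host_count == 0:
        raise ValueError("Inventory must include at least one physical host")
    return vm_count / physical_host_count


# Example inventory for one (fictional) agency data center
vms = 180
hosts = 60

ratio = virtualization_ratio(vms, hosts)
print(f"Current ratio: {ratio:.1f} VMs per host")
print("Meets DCOI target" if ratio >= DCOI_TARGET_RATIO else "Below DCOI target")
```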
Automation and Monitoring—DCOI establishes power usage effectiveness guidelines, requiring energy metering and monitoring tools for all tiered data centers. Agencies are also required to replace manual collection of system data and inventory with automation technologies. That means if you offer network and server monitoring and management tools, it’s a good time to target data center leads to help them optimize their environments and meet their 2018 deadline. Piggybacking on the new policy, the General Services Administration (GSA) is establishing a new acquisition vehicle that offers automated monitoring and management tools. Tech companies should look at becoming a schedule holder for GSA’s planned vehicle or teaming with a partner who will be.
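Power usage effectiveness itself is a simple ratio: total facility energy divided by the energy consumed by IT equipment, with values closer to 1.0 indicating a more efficient facility. The short sketch below, using hypothetical meter readings, shows the calculation that automated metering and monitoring tools ultimately feed.

```python
# Minimal sketch of the power usage effectiveness (PUE) calculation that
# DCOI's metering requirement supports. The meter readings here are
# hypothetical placeholders, not real measurements.


def power_usage_effectiveness(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (ideal value approaches 1.0)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh


# Example: monthly readings from the facility meter and the IT-load meter (illustrative)
pue = power_usage_effectiveness(total_facility_kwh=450_000, it_equipment_kwh=300_000)
print(f"PUE: {pue:.2f}")  # 1.50 in this made-up example
```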
Consolidation—Technologies like data integration and middleware are going to be key as data centers consolidate. Legacy infrastructure environments, including their systems and applications, will be top targets for reduction. To cut back on infrastructure, agencies are going to need data integration to reduce the number of databanks and warehouses that take up physical server space. Middleware will also play a bigger role as systems and applications migrate to prime data centers that may be running different operating systems and software components. The key to success here is showing agencies’ enterprise architects and system owners how your solution lets migrated IT function and communicate with other systems.