A two-part strategy for getting value from data center consolidation

A lot of work is going into consolidating data centers, but how do you know it's worth it? Here's how to find out.

Let’s face it. Being a government CIO in the age of Big Data is a daunting job, no matter what.

After all, more apps arrive every day, and there will never be enough storage in the cloud or in a physical facility to satisfy all of the stakeholders involved.

Hence, it is no wonder that the new Congressional Research Service report, which asserts that the Defense Department “must find new ways to curb energy cost and satisfy its ever growing needs for data center services,” was received with mixed feelings.

But in my opinion, the meat of the report is a strategically insightful underlying observation that all government CIOs and government contractors should take under advisement: almost two years into this endeavor, our collective community still doesn’t have clear business-impact metrics that could promote transparency and operational efficiency across the multiple layers of IT governance and operations involved in data center consolidation and virtualization.

In other words, lots of work is being done, but not a lot of accurate and mutually acceptable measurement standards are in place. The outcome: misalignment in perceptions about the most important IT undertakings in government.

Just take a look at the bottom line. According to the CRS report, the federal data center consolidation initiative requires plans with explicit cost-benefit analysis. But alas, such analysis requires either metering of data-center energy use or accurate estimates derived from modeling.

Neither source of information “seemed” available for most federal data centers. The CRS report further challenged the Environmental Protection Agency’s estimates, citing “the lack of accurate metering by data centers and the aggregation of center electricity costs with their host facilities.”

In other words, they don’t know if the estimates are too high or too low. Moreover, CRS claims that “even if the energy-use projection is accurate, the cost estimate might be low given increases in electricity rates.”
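To make that measurement gap concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of modeling estimate the report says is missing. The IT load, PUE and electricity rates below are hypothetical assumptions, not figures from any federal data center.

```python
# Hypothetical back-of-the-envelope model of annual data center energy cost.
# All input figures are illustrative assumptions, not measured federal data.

HOURS_PER_YEAR = 8760

def annual_energy_cost(avg_it_load_kw: float, pue: float, rate_per_kwh: float) -> float:
    """Yearly electricity cost = IT load x PUE x hours x rate."""
    return avg_it_load_kw * pue * HOURS_PER_YEAR * rate_per_kwh

baseline = annual_energy_cost(avg_it_load_kw=500, pue=1.8, rate_per_kwh=0.10)

# The report's caveat about rising electricity rates: the same facility,
# re-costed at a 30 percent higher rate.
rate_hike = annual_energy_cost(avg_it_load_kw=500, pue=1.8, rate_per_kwh=0.13)

print(f"Baseline estimate:     ${baseline:,.0f} per year")
print(f"With 30% higher rates: ${rate_hike:,.0f} per year")
```

Even a crude model like this makes the report’s point: without metered inputs, the estimate for a single facility swings by hundreds of thousands of dollars on one assumption.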

So here is the question: if an agency consolidates data centers to save money, will its business mission be accomplished over the long haul, or will consolidation turn into a bigger management nightmare than the current inefficient enterprise?

Because the original impetus of the mission is predominantly economic, the optics of the “getting it done” approach may cause some decision makers to focus on the present without truly analyzing the long-term implications for governance, operational frameworks and resource allocation that ultimately define the overall business performance of your data center.

Therefore, thinking long-term from the start is a strategic and practical way to ensure critical decisions are not “penny wise and pound foolish,” but instead meet the federal CIO’s mission without compromising quality, efficiency, and operational excellence.

So what can be done to help meet the data center consolidation initiative's mission? Here is a proposed two-pronged approach.

1. Proactively mitigate long-term business risk with a strategic IT baseline plan.

To mitigate the business risk of making short-sighted decisions, put special emphasis on establishing a strategic IT asset inventory baseline: a comprehensive inventory of all hardware and software assets at the data center, together with the proper utilization and energy metrics for each.

Keep in mind that stakeholders have differing opinions about which factors are controllable at the baseline level. In optimization, for example, IT controls decisions related to network bandwidth, server count and the like, but it does not control the total cost of operations, which depends on internal client applications.

Identifying this disconnect over what is controllable up front allows you to account for potential risk areas and strategically accommodate baseline activities that would otherwise be taken for granted. That builds situational awareness and promotes proactive decision-making.
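As a rough illustration of what one record in such a baseline might capture, here is a hedged Python sketch. The field names, asset IDs and sample values are hypothetical, and the controllability flag simply reflects the disconnect described above.

```python
# Hypothetical sketch of a strategic IT asset baseline record.
# Field names and sample values are placeholders, not a prescribed schema.
from dataclasses import dataclass, field

@dataclass
class AssetRecord:
    asset_id: str
    asset_type: str                    # e.g. "server", "storage array", "switch"
    location: str                      # data center and rack
    owner: str                         # responsible program office
    software: list = field(default_factory=list)
    avg_utilization: float = 0.0       # fraction of capacity, 0.0 to 1.0
    avg_power_kw: float = 0.0          # metered or modeled draw
    it_controllable: bool = True       # can IT change this factor directly?

inventory = [
    AssetRecord("srv-0001", "server", "DC-East/R12", "Finance",
                ["RHEL 8", "Oracle DB"], avg_utilization=0.07, avg_power_kw=0.45),
    AssetRecord("srv-0002", "server", "DC-East/R14", "HR",
                ["Windows Server", "SharePoint"], avg_utilization=0.12, avg_power_kw=0.50),
]

# Two baseline metrics a CIO can defend later: total power draw and average utilization.
total_power_kw = sum(a.avg_power_kw for a in inventory)
avg_util = sum(a.avg_utilization for a in inventory) / len(inventory)
print(f"Baseline: {len(inventory)} assets, {total_power_kw:.2f} kW, {avg_util:.0%} average utilization")
```

A real inventory would be driven by the agency’s own configuration management and metering tools; the point here is that utilization, energy and controllability are captured together, per asset, before any consolidation decision is made.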

2. Apply ‘what if’ simulation scenarios to select the consolidation approach most feasible for your business situation.

Once the moving parts in a data center environment have been fully identified, management must analyze the strategic investment decisions. Since there are multiple approaches that can be used to consolidate a data center, a detailed energy and cost evaluation helps you work through several “what if” scenarios and intelligently select the most feasible one for your center’s environment.
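Purely for illustration, here is what such a “what if” comparison might look like in Python. The scenario names, server counts, per-server power, PUE and rates are all assumed values.

```python
# Hypothetical "what if" comparison of consolidation scenarios on annual energy cost.
# Every figure below is an assumption for illustration, not agency data.

HOURS_PER_YEAR = 8760

def scenario_cost(servers: int, kw_per_server: float, pue: float, rate_per_kwh: float) -> float:
    """Annual electricity cost for one scenario's footprint."""
    return servers * kw_per_server * pue * HOURS_PER_YEAR * rate_per_kwh

scenarios = {
    "status quo":              dict(servers=400, kw_per_server=0.45, pue=2.0, rate_per_kwh=0.10),
    "virtualize in place":     dict(servers=160, kw_per_server=0.55, pue=2.0, rate_per_kwh=0.10),
    "consolidate to one site": dict(servers=160, kw_per_server=0.55, pue=1.5, rate_per_kwh=0.12),
}

for name, params in scenarios.items():
    print(f"{name:>25}: ${scenario_cost(**params):,.0f} per year")
```

The value is not in the specific numbers but in the discipline: every scenario is priced with the same model, from the same baseline, so the comparison is defensible.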

No two agencies’ requirements are alike, so mapping how a center’s applications operate and interrelate with its server/mainframe, database/platform, security, usage, dependencies and architecture creates the bigger picture in which each piece of the puzzle depends on the others.
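One way to hold that bigger picture together is a simple application-to-infrastructure map. The sketch below is hypothetical, with made-up application, server and database names.

```python
# Hypothetical application-to-infrastructure dependency map.
# Application, server and database names are placeholders.
app_map = {
    "grants-portal": {
        "servers":    ["srv-0001", "srv-0003"],
        "database":   "Oracle on db-cluster-2",
        "security":   ["FISMA Moderate"],
        "depends_on": ["identity-service"],
    },
    "identity-service": {
        "servers":    ["srv-0002"],
        "database":   "PostgreSQL on db-cluster-1",
        "security":   ["FISMA High"],
        "depends_on": [],
    },
}

def consolidation_impact(app: str, mapping: dict) -> dict:
    """What moves with an application, and which other applications feel the move."""
    entry = mapping[app]
    dependents = [name for name, e in mapping.items() if app in e["depends_on"]]
    return {"moves_with": entry["servers"] + [entry["database"]], "affects": dependents}

print(consolidation_impact("identity-service", app_map))
```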

To meet the federal data center consolidation initiative’s mission, federal CIOs must first establish a baseline that accounts for all of the fluid factors at play, then build plausible scenarios up from that baseline and analyze the outcomes to chart the most appropriate course.

If this process is completed effectively, many otherwise unpredictable factors will be accounted for. If it is not, agencies risk the very management nightmare they set out to avoid.