Debunking the 5 myths of Big Data

Accenture's Chris Smith takes on the five most common myths that undermine the value of Big Data.

As the new era of Big Data steams ahead, federal agencies have to quickly come to terms with how to access, prioritize, manage, analyze, store and exchange the crush of data coming their way.

That crush is straining their ability to keep up. Structured, unstructured and semi-structured data now reside in documents, websites, social networks, mobile channels and relational databases.

Data is culled from countless sources, including internal file networks, data centers, sensors, other agencies and the cloud, and it is being created at a rapid-fire pace. In fact, according to IDC, the amount of data created around the world more than doubles every two years.

It is obviously important for government to get its arms around this enormous influx of data, but it is even more important to realize the potential that managing and using data effectively holds for making good decisions and providing necessary services to citizens.

A flurry of Big Data activity has begun across the federal community. The Obama administration has announced a $200 million Big Data initiative, the Defense Department is investing $250 million annually, and a host of other federal agencies – from the Homeland Security and Education departments to NASA and the Food and Drug Administration – have Big Data programs under way.

Harnessed correctly, Big Data is poised to truly change the way government works. But first, agencies will have to break through the buzz – and clear up a number of myths about Big Data – in order to create strategies that can unlock new opportunities.

Myth #1
Big Data is someone else’s problem

No matter what kind of system a federal organization has – legacy, in-house, public or private cloud – the organization has a Big Data problem. Those who subscribe to this myth assume that because they are ultimately moving to the cloud, Big Data will be their managed services provider’s problem. The truth is, however, that moving to the cloud requires agencies to understand the data going into the cloud, integrate cloud-based data with other systems, and retain a snapshot of their entire data picture.

Myth #2
Big Data is only about size

Size does matter, but Big Data is about much more than size. It is an entirely new paradigm composed of data that keeps growing from seemingly infinite sources and locations, all of which must be accounted for in the processes, tools and technologies used to collect, store, analyze and share it.

Myth #3
Big Data is solely a technology issue

Although federal organizations must adapt technology to handle the influx of Big Data, technology is only part of the solution. Advanced analytics will help agencies make sense of data, extract insight from it and provide the bridge required to shift from reactive to proactive solutions. Agencies must commit to building the internal capabilities to make insight-based decisions; secure people with the right skills to collect, store, structure and format data from analytical and statistical perspectives; and develop governance and workflow structures to support an entirely new way of working.

Myth #4
Government cannot learn from private sector Big Data solutions

Although government data management is often centered on securing citizens’ personal information, administering government services and sharing intra- and inter-agency data, there are lessons government can learn from business. Industries such as financial services and retailing have been on the front lines of the Big Data revolution and have great experience to offer as government seeks to challenge the status quo and work in new ways. The U.S. Postal Service, for example, has been harnessing Big Data to create a new revenue stream by selling a variety of licenses, for up to $175,000 annually, giving organizations access to its National Change of Address database. Commercial enterprises conduct similar licensing transactions regularly.

Myth #5
There is a silver bullet for solving the Big Data problem

The complexity of managing unprecedented amounts and types of data from every quarter makes it impossible to find a single solution to the Big Data challenge. It is clear that agencies need a suite of tools and platforms, processes and personnel to effectively collect, store, manage and draw insights from this data. And in such a dynamic environment, agencies must adopt a continuous improvement mindset in which the Big Data roadmap is regularly reassessed and recalibrated.

Big Data is here to stay. While the move from massive information overload to insight-powered operations will be difficult, federal agencies that are grounded in the reality of today and take a holistic approach to tackling Big Data will reap bigger rewards – and so will the citizens and businesses they serve.
