IBM, Microsoft Build Presence at Federal Data Warehousing Table
By John Makulowich
In broad-based efforts that may signal what is ahead for mass customization in data warehousing, two major information technology players are fine-tuning intriguing wrinkles to gain a greater foothold in the federal market.
Already among the market leaders on a number of IT fronts, both Microsoft and IBM hope to increase share of mind as well with their initiatives.
Armonk, N.Y.-based IBM Corp. opens its so-called Teraplex Integration Centers to company customers and business partners at no charge for proof-of-concept testing of IBM and non-IBM vendor applications and products. These centers represent a $63 million investment.
There are four centers at three sites, one for each of its business intelligence server platforms: S/390 and RS/6000 in Poughkeepsie, N.Y., AS/400 in Rochester, Minn., and Netfinity in Raleigh, N.C.
One of the recent users of a Teraplex Integration Center, or TIC, was the Health Care Financing Administration, an agency of the Department of Health and Human Services. HCFA is responsible for operating the Medicare and Medicaid national health care programs that benefit about 75 million Americans.
The agency also works with the Health Resources and Services Administration to run the Children's Health Insurance Program, which covers many of the 10 million uninsured children in the United States.
For Joseph Catucci, Teraplex Integration Centers manager at IBM and an expert in large-scale databases, what makes the HCFA case interesting is that it shows the future challenges federal and commercial organizations will face as they start to build the groundwork for large-scale data warehouses.
When starting to build a claims data warehouse on a System/390 server that would amount to some 10 to 12 terabytes, HCFA sought assurance that its system could manage this volume of data and the number of users likely to access it.
The agency planned to use the warehouse to provide value-added information on health care to Congress and the public. Among the HCFA business requirements were the ability to load the data from claims sources, let users access key information for research and policy development, and refresh the data.
The agency also needed assurance that a multiterabyte warehouse could be refreshed in the nearly 64-hour window between the end of the workday Friday and the start of work Monday morning.
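The "nearly 64-hour" figure follows from simple clock arithmetic. As a minimal sketch, assuming an illustrative 5 p.m. Friday close of business and a 9 a.m. Monday start (the article does not state the exact hours):

```python
from datetime import datetime

# Illustrative bounds for the weekend refresh window; the specific
# dates and times here are assumptions, not stated by HCFA.
end_of_friday = datetime(1999, 10, 1, 17, 0)   # Friday, 5 p.m.
start_of_monday = datetime(1999, 10, 4, 9, 0)  # Monday, 9 a.m.

# Width of the window during which the warehouse must be refreshed.
window_hours = (start_of_monday - end_of_friday).total_seconds() / 3600
print(window_hours)  # 64.0
```

Every load, transformation, and index rebuild for the multiterabyte refresh has to fit inside that fixed window, which is why HCFA wanted the timing proven before going live.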
The Teraplex Integration Center in Poughkeepsie offered the technical resources to pre-test the data warehouse before taking it live at HCFA headquarters. That testing at the IBM S/390 TIC allowed HCFA to sidestep some of the potholes connected with large-scale databases and revealed some of the ways HCFA could maximize the use of the data warehouse, according to Catucci. It also showed HCFA how new business intelligence solutions could support research, policy development and strategic decision-making as well as continued data growth.
"We have done a lot of testing with pure test data. The difference in the Teraplex Integration Centers is the use of live data and queries from the customer. We deal directly with customers and account teams to understand and work with either subsets or entire databases, which we recreate on our hardware," said Catucci.
In performing its proof-of-concept test, HCFA built a 1.5 terabyte database and produced a set of health-care-related queries by surveying business users on their needs. One example of a typical query was the following: Of the total number of hip fractures in a given year, how many patients had surgery? Of those who had surgery, how many had infections?
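A query of that shape is a straightforward filtered aggregation. As a minimal sketch, using a hypothetical, simplified claims table (the actual HCFA schema and data are not public):

```python
import sqlite3

# Hypothetical, simplified claims table -- illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE claims (
        patient_id    INTEGER,
        diagnosis     TEXT,     -- e.g. 'hip fracture'
        had_surgery   INTEGER,  -- 1 if a surgical procedure was billed
        had_infection INTEGER,  -- 1 if a postoperative infection was coded
        claim_year    INTEGER
    )
""")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?, ?, ?)",
    [(1, "hip fracture", 1, 0, 1998),   # fabricated sample rows
     (2, "hip fracture", 1, 1, 1998),
     (3, "hip fracture", 0, 0, 1998)],
)

# Of all hip fractures in a given year, how many patients had surgery,
# and of those who had surgery, how many had infections?
row = conn.execute("""
    SELECT COUNT(DISTINCT patient_id),
           SUM(had_surgery),
           SUM(CASE WHEN had_surgery = 1 THEN had_infection ELSE 0 END)
    FROM claims
    WHERE diagnosis = 'hip fracture' AND claim_year = 1998
""").fetchone()
print(row)  # (3, 2, 1): 3 fractures, 2 surgeries, 1 infection
```

At HCFA's 1.5-terabyte test scale, the hard part is not writing such a query but proving it returns in acceptable time against the full table, which is exactly what the TIC workload simulations measured.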
A test at any given TIC can last from four weeks to several months. Using a phased approach, TIC staff work with the customer to identify success criteria and realistic goals and prepare a step-by-step plan.
Execution amounts to setting up the hardware and software, creating and loading the real data from the customer, creating the test scenarios and doing the tests. That might include generating queries and simulating user workloads. The system then is reconfigured and tested again to optimize the settings.
To Catucci, one profile of a TIC user is the customer who has a previous generation of hardware and expects to buy new equipment because she expects her data warehouse to grow in both the amount of data and number of concurrent users. That user seeks assurance that she will be able to carry on her business and conduct the necessary business intelligence.
For IBM, like most other IT firms in the data-warehousing field, business intelligence covers the collection, management and analysis of data to transform it into useful information. That information is then distributed throughout the enterprise or organization.
Using such intelligence can help reduce operating costs through more effective financial analysis, risk management, fraud management, distribution and logistics management and sales analysis.
"Clearly, business intelligence will grow by default. In banking, for example, branches will want to check if a customer has other accounts that it could service for added revenue," Catucci said. "And there is more analysis of data just by default. More companies are looking to uncover patterns in more and more data."
While Catucci said he sees many companies continuing to demand quicker response times, he said many of those customers do not get optimal performance from their existing data warehouse.
The reason is that many are reluctant to change, working on the mental model that if it works today, it will work tomorrow.
For Catucci, while such a system may keep working, it will not necessarily perform as well.
The reluctant executive and the application of business intelligence are the drivers behind a new partnership among CIBER Inc. of Englewood, Colo.; Data General, Westboro, Mass., a division of EMC Corp.; and Microsoft Corp., Redmond, Wash., in developing the Business Intelligence Solution Center, or BISC, in McLean, Va.
Data General supplies enterprise and departmental computing solutions worldwide. Its products include Windows NT as well as UNIX AViiON servers. CIBER provides management consulting and IT solutions for e-business.
The BISC is pitched by its collective marketing mavens as a unique combination of technology solutions that will let organizations gain competitive advantage by quickly designing and deploying a business intelligence platform. It combines Microsoft's SQL Server 7.0 and OLAP services technology, Data General's AViiON Windows NT server solutions, EMC's CLARiiON Fibre Channel storage solutions and CIBER's data warehousing expertise.
The partners hope that could amount to a turnkey solution that lets organizations make better decisions faster, fine-tune business processes more quickly and improve customer service. The goal of the BISC is to aid the functional decision-maker and to deepen awareness of what benefits OLAP and business intelligence can bring.
Similar to the IBM Teraplex Integration Centers concept, though on a smaller scale, the key BISC service is the so-called Business Intelligence Boot Camp. It covers data warehouse design, development and training and lets customers use and test SQL Server 7.0 with their own data.
According to Carol Kerins, Microsoft business intelligence program manager, the business model for the BISC is to introduce business intelligence solutions to the federal and commercial communities.
"The BISC allows us to jump-start a program that lets federal customers see what goes into building a data mart," said Kerins. "It is an opportunity for potential users to see how structured data can be used to gain knowledge and improve performance, how data can be placed in a framework that will work for them."
Among the BISC targets of opportunity are agencies such as the Census Bureau, the Defense Department and the Internal Revenue Service, which share a need for more refined data in analyzing acquisitions, in decision support systems, in customer relationship management and in business intelligence.
For Brian Moran, CIBER database solutions practice leader and an avowed evangelist for technology, the BISC exists to show customers how they can leverage technology to make business intelligence a reality for their organizations.
"The BISC was formed as an idea to make it a little easier and safer for federal and commercial customers to experiment with business intelligence, to see how to turn mountains of data into information for competitive advantage," Moran said.
That is done at the BISC through training, briefings and boot camps. The training covers use of software and hardware, while executive briefings are educational opportunities targeted to customers. As an example, Moran cited an unnamed civilian agency that needed to support litigation. The task was to receive information from a defendant and then perform an analysis that could be used by its attorneys to make better decisions in pursuing the case.
For the serious customer, the BISC has a data warehouse assessment, which can take from three to six weeks. It is geared, as Moran put it, to reality checking and state of readiness.
"The CIBER 5-step methodology helps an organization perform a readiness check in key areas to ensure the most common [data warehousing] project mistakes are avoided," said Moran.
In the first step, a business review ensures the data warehousing project is designed to measure and analyze so-called key performance indicators that are necessary to make better business decisions. The next step, the organizational review, ensures both the technical and the functional project teams are operating on the same frequency. Here, Moran said, it is important that both sides of an organization are teamed to support the same business drivers.
During the third step, the project planning and methodology review helps an organization understand the complexities of a full life-cycle data-warehousing project. This step helps start an organization down the path of designing for the enterprise by building incrementally.
In the fourth step, the information architecture review, there is an analysis of logical and physical data models and the extraction, transformation and loading processes necessary to produce the key performance indicators that will be used to improve decision-making.
Finally, the technical architecture review analyzes the current hardware, software and network infrastructure. This step identifies technical risks and constraints on performance, maintenance, scalability, data distribution, disaster recovery and sizing.
Wally Birdseye, president of e-government services for CIBER, said the major difference between the public and private sectors is that the government wants to increase services, while the private sector wants to increase profits. What they have in common is the desire to pursue their goals with leaner resources.
He said many senior executives and chief information officers lack an understanding of how they can exploit data warehousing. Yet knowledge management and the search for good, solid data within the organization are driving the explosion of interest.
"It is helpful to classify data into structured and unstructured sources," Moran said. "Structured data lives nicely in a database or OLAP cube, while unstructured data tends to live in corporate messaging systems and collaborative document environments. Both sources of data have tremendous knowledge potential that can be unleashed to achieve competitive advantage."
Beneath and beyond the IBM centers and the Microsoft-inspired teams, both of which aim to expose more executives and organizations to the power and promise of data warehousing and knowledge management, lies the important role of data warehousing in customer relationship management, or CRM, specifically in the federal government.
CRM is defined as a process in which information gathered about customer transactions is used to improve the business relationship with that customer.
Steven Taylor, NCR Corp. industry consultant, believes CRM can provide enhanced relationship quality results to government. He is co-author, with Douglas Friedman, a senior data warehouse consultant, of a recent paper entitled, "Citizens Are Customers, Too: Customer Relationship Management in the Government."
An effective CRM initiative can help government agencies better understand and respond to large populations of constituents desiring personalized treatment, Taylor said. And while government is not measured by earnings, constituent satisfaction makes a difference to elected officials and dedicated public servants. "Part of the drive for CRM in the government is the declining staffs and the search for ways to be more effective and efficient," said Taylor.
Every organization, whether public or private, needs a metric to measure the impact of the data warehouse initiative on the bottom line. For private-sector firms, greater revenue or deeper savings in overhead are possible metrics. But the case is different for the public sector. "The harder situations are those where there really is no revenue or financial benefit to measure. Then, increases in efficiency and effectiveness become the metrics," Taylor said.
In his paper, Taylor noted that one thing the government can learn from studying the private-sector model is that to make any CRM program work effectively, a centralized data warehouse is imperative. Not only does a data warehouse allow a company or agency to get a better pulse on its customers, it also gives it an opportunity to interact with customers in a one-on-one, personalized manner.
For example, Taylor cited state and local tax agencies that are considering techniques such as market segmentation analysis to uncover problems that often affect taxpayers with similar profiles. By identifying problems such as tax underreporting, agencies can take steps to correct the problem through education, legislation, simplification or even auditing.
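At its simplest, segmentation analysis of this kind groups taxpayers by profile and compares a problem rate across the groups. As a minimal sketch, with fabricated, illustrative records (the profiles and flags are assumptions, not agency data):

```python
from collections import defaultdict

# Fabricated taxpayer records -- illustrative only.
taxpayers = [
    {"profile": "self-employed", "underreported": True},
    {"profile": "self-employed", "underreported": True},
    {"profile": "self-employed", "underreported": False},
    {"profile": "wage-earner",   "underreported": False},
    {"profile": "wage-earner",   "underreported": False},
]

# Segment by profile and tally the underreporting rate per segment.
counts = defaultdict(lambda: [0, 0])  # profile -> [underreported, total]
for t in taxpayers:
    counts[t["profile"]][0] += t["underreported"]  # True counts as 1
    counts[t["profile"]][1] += 1

for profile, (flagged, total) in sorted(counts.items()):
    print(f"{profile}: {flagged}/{total} underreported")
```

A segment with a markedly higher rate is a candidate for the targeted responses Taylor describes, such as education or simplification, before any individual auditing begins.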