IBM Information Server, Version 8.1

WebSphere Information Analyzer capabilities

IBM® WebSphere® Information Analyzer automates the task of source data analysis by expediting comprehensive data profiling and minimizing overall costs and resources for critical data integration projects.

WebSphere Information Analyzer represents the next generation of data analysis tools, characterized by these attributes:

End-to-end data profiling and content analysis
Provides standard data profiling features and quality controls. A shared repository holds data analysis results and project metadata, such as project-level and role-level security settings and function administration.
Business-oriented approach
Provides a task-based user interface that helps business users easily review data for anomalies and changes over time, and gives developers key functional and design information.
Adaptable, flexible, and scalable architecture
Handles high data volumes with common parallel processing technology, and leverages common services such as connectivity to access a wide range of data sources and targets.

Scenarios for information analysis

The following scenarios show how WebSphere Information Analyzer helps organizations understand their data to facilitate integration projects.

Food distribution: Infrastructure rationalization
A leading U.S. food distributor had more than 80 separate mainframe, SAP, and JD Edwards applications supporting global production, distribution, and CRM operations. This infrastructure rationalization project included customer relationship management, order-to-cash, purchase-to-pay, human resources, finance, manufacturing, and supply chain planning. The company needed to move data from these source systems to a single target system.

The company uses WebSphere Information Analyzer to profile its source systems and create master data around key business dimensions, including customer, vendor, item (finished goods), and material (raw materials). It plans to migrate data into a single master SAP environment and a companion SAP BW reporting platform.

Financial services: Data quality assessment
A major brokerage firm had become inefficient by supporting dozens of business groups, each with its own applications and IT groups. Costs were excessive, regulatory compliance was difficult, and targeting low-margin, middle-income investors was impractical. When the federal government mandated T+1, a regulation that changed industry-standard practices, the firm had to reduce the time to process a trade from 3.5 days to 1 day, a reduction of 71.4 percent.

To meet the federal mandate, the brokerage firm uses WebSphere Information Analyzer to inventory its data, identify integration points, remove data redundancies, and document disparities between applications. The firm now has a repeatable and auditable methodology that leverages automated data analysis. By ensuring that all transactions are processed quickly and uniformly, the company is better able to track and respond to risk resulting from its clients’ and its own investments.

Transportation services: Data quality monitoring
A transportation service provider develops systems that enable its extensive network of independent owner-operators to compete in today's tough market. The owner-operators were exposed to competition because they could not receive data quickly, and executives had little confidence in the data that they did receive. Productivity suffered because of the excessive time spent on manual intervention and on reconciling data from multiple sources.

WebSphere Information Analyzer enables the owner-operators to better understand and analyze their legacy data, quickly improve the accuracy of their business intelligence reports, and restore executive confidence in company data. They have since implemented a data quality solution to cleanse their customer data and spot trends over time, further increasing their confidence in the data.

WebSphere Information Analyzer in a business context

After obtaining project requirements, a project manager initiates the analysis phase of data integration to understand source systems and design target systems. Too often, analysis can be a laborious, manual process that relies on out-of-date (or nonexistent) source documentation or the knowledge of the people who maintain the source systems. But source system analysis is crucial to understanding what data is available and its current state.

Figure 1. WebSphere Information Analyzer: Helping you understand your data

Figure 1 shows the role of analysis in IBM Information Server. WebSphere Information Analyzer plays a key role in preparing data for integration by analyzing business information to assure that it is accurate, consistent, timely, and coherent.

Profiling and analysis
Examines data to understand its frequency, dependency, and redundancy characteristics and to validate defined schemas and definitions (see the profiling sketch after this list).
Data monitoring and trending
Uncovers data quality issues in the source system as data is extracted and loaded into target systems. Validation rules help you create business metrics that you can run and track over time.
Facilitating integration
Uses tables, columns, probable keys, and interrelationships to help with integration design decisions.
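
To make the profiling step concrete, the following Python sketch is purely illustrative. It is not the WebSphere Information Analyzer interface, and its function names and sample data are hypothetical; it only computes the kind of column-level statistics that profiling and analysis report, such as row counts, null counts, cardinality, and frequency distributions.

# Illustrative column-profiling sketch; hypothetical code, not the
# WebSphere Information Analyzer API. It computes the frequency,
# null-count, and cardinality statistics that column analysis reports.
from collections import Counter

def profile_column(name, values):
    """Summarize one column: row count, nulls, cardinality, top values."""
    non_null = [v for v in values if v not in (None, "")]
    freq = Counter(non_null)
    return {
        "column": name,
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "cardinality": len(freq),           # distinct non-null values
        "top_values": freq.most_common(3),  # candidate domain values
    }

# Hypothetical sample rows; the lowercase "us" and the status code "X"
# are the kinds of anomalies that a frequency distribution surfaces.
rows = [
    {"country": "US", "status": "A"},
    {"country": "US", "status": "A"},
    {"country": "DE", "status": None},
    {"country": "us", "status": "X"},
]
for col in ("country", "status"):
    print(profile_column(col, [r[col] for r in rows]))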

Data analysis helps you see the content and structure of data before you start a project and continues to provide useful insight as part of the integration process. The following data management tasks use data analysis:

Data integration or migration
Data integration or migration projects (including data cleansing and matching) move data from one or more source systems to one or more target systems. Data profiling supports these projects in three critical stages:
  1. Assessing sources to support or define business requirements
  2. Designing reference tables and mappings from source to target systems
  3. Developing and running tests to validate successful integration or migration of data into target systems
Data quality assessment and monitoring
Evaluates quality in targeted static data sources along multiple dimensions, including completeness, validity (of values), accuracy, consistency, timeliness, and relevance. Data quality monitoring requires ongoing assessment of data sources. WebSphere Information Analyzer supports these projects by automating many of these dimensions for in-depth snapshots over time (see the metrics sketch after this list).
Asset rationalization
Looks for ways to cut costs that are associated with existing data transformation processes (for example, processor cycles) or data storage. Asset rationalization does not involve moving data, but reviews changes in data over time. WebSphere Information Analyzer supports asset rationalization during the initial assessment of source content and structure and during development and execution of data monitors to understand trends and utilization over time.
Verifying external sources for integration
Validates the arrival of new or periodic external sources to ensure that those sources still support the data integration processes that use them. This process looks at static data sources along multiple dimensions including structural conformity to prior instances, completeness, validity of values, validity of formats, and level of duplication. WebSphere Information Analyzer automates many of these dimensions over time.
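
As an illustration of the quality dimensions named above, the following Python sketch, again hypothetical rather than a product API, computes completeness (the share of populated values) and validity (the share of non-null values that fall in an allowed domain) for one column of an extract. The domain, rule names, and sample data are assumptions for the example; running the same rules against each periodic extract yields the trend data that quality monitoring compares against a benchmark.

# Hypothetical data quality metrics sketch; not a product API.
from datetime import date

VALID_STATUS = {"A", "I", "P"}  # assumed reference domain for the example

def completeness(values):
    """Share of values that are populated (not null or empty)."""
    populated = sum(1 for v in values if v not in (None, ""))
    return populated / len(values)

def validity(values, domain):
    """Share of non-null values that fall inside the allowed domain."""
    non_null = [v for v in values if v not in (None, "")]
    return sum(1 for v in non_null if v in domain) / max(len(non_null), 1)

# One day's extract of a status column; rerunning these rules on each
# new extract produces a time series of metrics to track over time.
snapshot = ["A", "I", "", "X", "A"]
print({
    "run_date": date.today().isoformat(),
    "status_completeness": completeness(snapshot),        # 0.8
    "status_validity": validity(snapshot, VALID_STATUS),  # 0.75
})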
Related concepts
A closer look at WebSphere Information Analyzer
WebSphere Information Analyzer tasks
Introduction to IBM Information Server

This topic is also in the IBM Information Server Introduction PDF.

Last updated: 2008-09-15