Integrate InfoSphere MDM Server for PIM with InfoSphere QualityStage to standardize product data

A web services approach

To ensure data quality, you can implement validation rules for product data at various levels (such as attribute, item, or category) in IBM® InfoSphere® Master Data Management Server for Product Information Management (MDM Server for PIM). However, rules incur processing overhead during large imports or during the data reconciliation process. InfoSphere QualityStage™, on the other hand, is a component of IBM InfoSphere Information Server that can profile and standardize data, eliminate duplicates from data sources, and ensure survival of the best-of-breed records from a duplicate set. This article looks at a real-time integration between MDM Server for PIM and QualityStage to ensure quality of product data through standardization processes implemented in QualityStage.


Amit Malla (amitmalla@in.ibm.com), Technical Lead, MDM Server for PIM (WPC-GDS), IBM

Amit Malla started his career developing software for mobile (GSM) communication in 2000. Having worked with IBM India Pvt. Ltd. for more than six years, he has gained knowledge of various Master Data Management products, primarily InfoSphere MDM Server for PIM. He leads the MDM Server for PIM-GDS development and support team. His knowledge of the worldwide retail industry and the use of GDSN standards for product synchronization is well appreciated in his business unit at IBM. A passion for data quality and standardization has led him to explore ways to integrate Master Data Management products within IBM.



Manasa Rao (kmanasarao@in.ibm.com), Software Engineer, IBM

Manasa K. Rao started her career with IBM India Pvt. Ltd. about two years ago. She has worked on many projects involving IBM InfoSphere QualityStage and is a certified IBM InfoSphere QualityStage Developer. Her deep understanding of the product and its capabilities has helped her identify use cases for integration with other IBM Master Data Management products.



02 June 2011

Introduction

Organizations around the world generate huge amounts of data stored in multiple data sources. Providing a single, unified, and up-to-date view of this data to disparate users is essential for supporting business processes and making informed business decisions. 80 percent of companies are not confident in the quality of their product data, while 73 percent find it difficult or impractical to standardize product data. Some organizations use manual workflows to cleanse data and maintain its quality, but large data volumes cannot be handled manually because of the lag this creates in the system. A systematic way of solving this problem therefore becomes imperative. The standardized data, which is reliable and trustworthy, is then stored in Master Data Management (MDM) software to maintain and deliver master data across an organization.

IBM InfoSphere Master Data Management Server for Product Information Management (MDM Server for PIM) is a master data management software product designed to handle product-related information in the various phases of its life cycle. MDM Server for PIM is an authoritative and collaborative environment that creates and maintains a single view of product, supplier, and partner data. To ensure data quality, validation rules for product data can be implemented at various levels (attribute, item, category, and so on) in MDM Server for PIM, but these rules incur processing overhead during large imports or during data reconciliation processes. In fact, MDM Server for PIM is not meant to validate product data but to organize and maintain it properly.

InfoSphere QualityStage, on the other hand, is a component of IBM InfoSphere Information Server (IIS) that can profile and standardize data, eliminate duplicates from data sources, and ensure survival of the best-of-breed records from a duplicate set. The flexibility provided by QualityStage to add standardization rules, as and when needed, makes it the most suitable product to handle data quality woes in MDM Server for PIM.

A real-time integration between MDM Server for PIM and QualityStage aims to ensure quality of product data using standardization processes implemented in QualityStage. Data quality can be achieved at the desired level by feeding the data through QualityStage prior to its storage in MDM Server for PIM, thereby reducing the validation burden on MDM Server for PIM. This article looks at a real-time integration scenario between MDM Server for PIM and QualityStage for a single record sent from MDM Server for PIM. The scenario makes a web service call to QualityStage and receives the standardized data back into MDM Server for PIM for storage.


Process flow

Figure 1 illustrates a simplified process flow on which this integration is based.

Figure 1. Process flow
  1. The user adds or updates data in an MDM Server for PIM screen.
  2. An on-demand script is called to cleanse and standardize the product data.
  3. A web services call is made to an IIS job exposed as a web service.
  4. IIS processes the payload and returns the processed product data for write-back into MDM Server for PIM.
  5. The MDM Server for PIM catalog is updated with the cleansed and standardized data.

Prerequisites

The scenario described in this article requires the following:

  • A standard working InfoSphere QualityStage 8.5 and Information Services Director installation.
  • Basic knowledge of writing standardization rules, developing Information Services-enabled jobs that can use these standardization rules, and deploying these jobs as services.
  • A standard working MDM Server for PIM installation.
  • Familiarity with MDM Server for PIM metadata and data creation and storage.
  • Basic knowledge of MDM Server for PIM scripting and workflows.
  • Basic knowledge of web services concepts.

Need for integration

Product data residing in MDM Server for PIM is supposed to be of high quality in order to be labelled a "single version of the truth" in all senses. But this is not always the case, as data can get into MDM Server for PIM from many data sources, compromising data quality. Differences in data sets from various sources can be due to typing errors, cultural habits or behaviors, abbreviations, counting errors, and so on. For instance, an attribute "weight," associated with a product item, can have the unit of measure as "lbs" or "pounds." This causes inconsistency and possible duplication in the system that becomes difficult to control as the data sets grow over time. Standardizing these data sets prior to storage in the system eliminates such issues.
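As a toy illustration of the inconsistency described above, a unit-of-measure normalization can be sketched in Python. The alias table and function name are invented for this example; in the actual integration, this kind of cleanup is performed by QualityStage standardization rules:

```python
# Hypothetical unit-of-measure normalization for the "weight" attribute,
# illustrating the kind of cleanup a standardization rule performs.
UOM_ALIASES = {
    "lb": "LB", "lbs": "LB", "pound": "LB", "pounds": "LB",
    "kg": "KG", "kgs": "KG", "kilogram": "KG", "kilograms": "KG",
}

def standardize_weight(value):
    """Split a free-form weight into (amount, standardized unit)."""
    parts = value.strip().split()
    if len(parts) != 2:
        return value, None               # unrecognized shape; leave as-is
    amount, unit = parts
    return amount, UOM_ALIASES.get(unit.lower(), unit.upper())
```

With this mapping, "2.5 pounds" and "2.5 lbs" both standardize to ("2.5", "LB"), removing the inconsistency before the value is stored.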

Here are some of the advantages of integrating MDM Server for PIM and QualityStage:

  1. QualityStage functions as the data quality management (DQM) environment for product data. Its services are also available enterprise-wide.
  2. QualityStage functions as a quality rules hub and isolates the data quality subroutines in a high-performance environment, eliminating the overhead on MDM Server for PIM. The rules hub can be managed largely independently, with little or no impact on MDM Server for PIM customizations.
  3. Implementing the quality rules hub in QualityStage eases the change management issues that may arise from the introduction of new rules.
  4. As a best practice, matching algorithms for de-duplication should run on standardized data to better capture match candidates. Enabling data standardization helps meet this prerequisite for duplicate identification.
  5. Behavioral extensibility is provided by the flexibility in QualityStage to override rules based on context.
  6. While priming the MDM Server for PIM system, certain sources of product data may be decommissioned, and the allied metadata in MDM Server for PIM for the attributes from these systems may be governed by a specific list of values. Initial load routines can leverage the rules in QualityStage to validate adherence to the list of values for these attributes.
  7. This solution construct, which provides a DQM environment, can be extended to manage de-duplication requirements as well.

Scenarios addressed by this integration

This integration addresses two scenarios:

  1. When item data is saved in MDM Server for PIM
  2. At a step in an MDM Server for PIM workflow

For the purposes of this article, the following commonly used attributes associated with product data in MDM Server for PIM were chosen to showcase the integration:

  1. Global Trade Item Number (GTIN) Name: The name of the trade item as it appears on the package.
  2. Target Market: The country where the item or product is available for sale.
  3. Net Content: The amount of the trade item contained by a package, usually as claimed on the label.
  4. Net Weight: The net weight of the trade item. Net weight excludes all packaging material, including the packaging material of all lower-level GTINs.
  5. Functional Name: Information about the use of the product or service by the consumer. It helps clarify the product classification associated with the GTIN.
  6. Brand Name: The name of the product as recognized by the consumer. The brand name must be identical for each target market of a GTIN.
  7. Sub Brand: The second level of brand, possibly a trademark. It is the primary differentiating factor that a brand owner wants to communicate to the consumer or buyer, and it can be populated in multiple languages.
  8. Variant: The distinguishing characteristics that differentiate products with the same brand and size, such as a particular flavor, fragrance, or taste.
  9. Color: A free-text description of the color of the trade item.
  10. Trade Item Description: The description of a particular product. According to the Global Standards Management Process (GSMP), when a Trade Item Description is absent, it is derived by concatenating the brand name, sub brand, functional name, and variant.

The product data consisting of the above attributes is sent to QualityStage one record at a time. The standardized output returned by QualityStage can be stored with or without user intervention, as described below.
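The GSMP derivation rule for Trade Item Description mentioned above can be sketched as follows. This is a minimal illustration and the function name is hypothetical:

```python
def derive_trade_item_description(brand, sub_brand, functional_name, variant):
    """Per GSMP, derive a missing Trade Item Description by concatenating
    brand name, sub brand, functional name, and variant, skipping blanks."""
    parts = [brand, sub_brand, functional_name, variant]
    return " ".join(p.strip() for p in parts if p and p.strip())
```

For example, derive_trade_item_description("Acme", "Gold", "Shampoo", "Lavender") yields "Acme Gold Shampoo Lavender", and blank components are simply omitted.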

Note: In both scenarios detailed below, the data governance practice of displaying the existing input values alongside the standardized values for the attributes is not depicted. This can, however, be achieved in MDM Server for PIM fairly easily. The intent here is to focus on the integration.

Using Single Edit screen in MDM Server for PIM — Standardization of single record on item save

In MDM Server for PIM's Single Edit screen, you can fill in the product data for the defined attributes. After data entry, clicking Save on this screen triggers the data to be sent to QualityStage over a web service call. QualityStage standardizes the data according to the rules defined there. The response sent by QualityStage is parsed by MDM Server for PIM and stored as is, without any confirmation or handling from the user.

An MDM Server for PIM Single Edit screen showing the attributes defined earlier is pictured below.

Figure 2. MDM Server for PIM Single Edit screen
This is a single edit screen of MDM Server for PIM containing attributes and corresponding values for an item in the product catalog

To enable the web service call, an MDM Server for PIM script is used as a pre-processing script on the catalog where the item resides ("Product Catalog" in this example). A screenshot showing catalog attributes in MDM Server for PIM follows.

Figure 3. Catalog attributes in MDM Server for PIM
Image shows MDM Server for PIM screen containing catalog attributes. The script that makes call to the exposed web service is mentioned as the value for pre-processing script attribute.

A sample MDM Server for PIM script that can be used as a pre-processing script is as follows.

Listing 1. Pre-processing script code
//Product Item Standardization - sample script

//Create a global logger
var logFile = "/PIM_QS_Integration.log";
var logger = createOtherOut("log");

//var ctg = getCtgByName("Product Catalog");
var attrNames = item.getCtgItemAttribNamesList();
var attrSize = attrNames.size();

var reqVal = [];
var resVal = [];
var check = [];
var j = 0;
var attrpath = [];
var xml_node = 0;
var attrPath = [];

//This function is used to process the response XML from QS and
//set standardized values in item spec attributes.
function processResponseXML(my_xml_root_node) {

  forEachXMLNode(my_xml_root_node, "", xml_node) {
    logger.writeln("Start Processing XML nodes: "+xml_node.getXMLNodeName());

    forEachXMLNode(xml_node, "ProdAttrStanOperReturn", xml_item_node) {
      logger.writeln("---> Processing XML node: "+xml_item_node.getXMLNodeName());

      check = toUpperCase(xml_item_node.getXMLNodeValue("valid",true));
      if(check=="YES") {

        for(j=0;j<attrSize;j++) {

          attrpath[j] = lookup("AttrPath_XmlTag_Lkp",attrNames[j]);
          resVal[j]=xml_item_node.getXMLNodeValue(attrNames[j],true);
          logger.writeln(attrNames[j] + " is [" + checkString(resVal[j], "") + "]");
          item.setEntryAttrib(attrpath[j],checkString(resVal[j], ""));
        }
      }
    }
  }
}//~processResponseXML()

//Global variables
var xmlRequestMsg = "";
var responseXML = "";
var sUrl = "http://<SERVER_NAME_OR_IP>:<PORT>/wisd/ProdAttrStanAppl/ProdAttrStanServ";

//Create an XML message to send over to QS
var xmlHeader = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>";
var soapUrl = "http://ProdAttrStanServ.ProdAttrStanAppl.isd.ibm.com/soapoverhttp";
var xmlEnvelope = "<q0:ProdAttrStanOper xmlns:q0=\""+soapUrl+"/\">";

var xmlMsgBody = "";

for(j=0; j<attrSize; j++) {

  attrPath = lookup("AttrPath_XmlTag_Lkp",attrNames[j]);
  reqVal[j] = item.getEntryAttrib(attrPath);
  xmlMsgBody = checkString(xmlMsgBody, "") + "<" + attrNames[j] + 
    ">" + checkString(reqVal[j], "") + "</" + attrNames[j] + ">";
}

var xmlFooter = "</q0:ProdAttrStanOper>";

xmlRequestMsg = xmlHeader + xmlEnvelope + xmlMsgBody + xmlFooter;
logger.writeln("Request: "+xmlRequestMsg);

//Invoke webservice call
catchError(err) {
  responseXML = invokeSoapServerForDocLit(sUrl, xmlRequestMsg);
  logger.writeln("Response: "+responseXML);

  if(responseXML == null || responseXML == "") {
    logger.writeln("FATAL ERROR: No response received from Web Service");
  } else {
    var xml_root_node = new XmlDocument(responseXML);
    processResponseXML(xml_root_node);
  }
}
if(err != null) {
  logger.writeln("ERROR: Failed reading from response xml");
}

//Save and close the log file
logger.save(logFile);
logger.close(logFile);
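For readers less familiar with MDM Server for PIM scripting, the payload handling in Listing 1 can be approximated in plain Python using only the standard library. The operation and element names come from the listing; everything else (function names, the parsing approach) is an illustrative sketch, not MDM Server for PIM API code:

```python
import xml.etree.ElementTree as ET

# Illustrative re-creation of Listing 1's payload handling.
# Operation and element names are taken from the listing.
NS = "http://ProdAttrStanServ.ProdAttrStanAppl.isd.ibm.com/soapoverhttp/"

def build_request(attrs):
    """Build the doc/literal payload sent to the ProdAttrStanOper operation."""
    body = "".join("<%s>%s</%s>" % (k, v, k) for k, v in attrs.items())
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<q0:ProdAttrStanOper xmlns:q0="%s">%s</q0:ProdAttrStanOper>'
            % (NS, body))

def parse_response(xml_text):
    """Collect standardized values from ProdAttrStanOperReturn nodes."""
    root = ET.fromstring(xml_text)
    out = {}
    for node in root.iter():
        if node.tag.endswith("ProdAttrStanOperReturn"):
            for child in node:
                out[child.tag] = child.text or ""
    return out
```

As in Listing 1, standardized values should be written back only when the returned valid flag is YES; that check is left to the caller in this sketch.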

The sequence diagram for this scenario is as shown below.

Figure 4. Sequence diagram for standardization of single record on item save
Image shows process when user creates new item and saves

Using MDM Server for PIM workflow — Standardization of single record and storage after review

There could be cases where data is imported from external sources into MDM Server for PIM. Since MDM Server for PIM is a collaborative system, this data can be checked out into a particular step in a workflow that needs a user's or catalog manager's review or approval before the data is actually standardized and stored. The user can be given the flexibility to update only those standardized attributes the user deems appropriate and retain the original, non-standardized values for the other attributes.

To achieve this integration, you need to create a workflow in MDM Server for PIM and have the web service called, and its response processed, in the OUT script of one of the workflow steps.

A sample workflow is shown below, with the edit step chosen for the web service call to QualityStage. The processing script is written in the OUT step.

Figure 5. Sample workflow
Image shows MDM Server for PIM sample workflow screen where edit step is chosen to make a web services call to QualityStage

A sample OUT script for this workflow step is as follows.

Listing 2. Sample OUT script code
//Workflow script to handle Webservice call for a single item using a lookup table
//to handle identities
function OUT(entrySet, colArea, workflow, step, stepPath) {
  var logFile = "/WorkflowProdAttrStan.log";
  var logger = createOtherOut("log");
  var xmlRequestMsg = "";
  var responseXML = "";
  var sUrl = "http://<SERVER_NAME_OR_IP>:<PORT>/wisd/ProdAttrCleansingWithMsgIdAppl"+
    "/ProdAttrCleansingWithMsgIdService";

  var j = 0;
  var attrPath = [];
  var xml_node = 0;
  var itemPk = [];
  var reqVal = [];
  var resVal = [];

  //A lookup table is used to keep track of item identities
  var Lkpctg = getCtgByName("Item_Identifier_Lkp");
  var LkpItem = [];
  var msgId = [];
  var check = [];
  var attrpath = [];

  forEachEntrySetElement(entrySet, OENTRY) {

    var item=OENTRY;
    var attrNames = item.getCtgItemAttribNamesList();
    var attrSize = attrNames.size();

    //Create an XML message to send over to QS
    var xmlHeader = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>";
    var xmlEnvelope = "<q0:ProdAttrCleansingWithMsgIdOper xmlns:q0=\""+
      "http://ProdAttrCleansingWithMsgIdService.ProdAttrCleansingWithMsgIdAppl"+
      ".isd.ibm.com/soapoverhttp/\">";
    var DateTime = today().formatDate("yyyyMMddHHmmss")+"_"+rand(10000);

    var xmlMsgBody = "<msgIdentifier>" + DateTime + "</msgIdentifier>";

    for(j=0; j<attrSize; j++) {

      attrPath = lookup("AttrPath_XmlTag_Lkp",attrNames[j]);
      reqVal[j] = item.getEntryAttrib(attrPath);
      xmlMsgBody = checkString(xmlMsgBody, "") + "<" + attrNames[j] + ">" 
        + checkString(reqVal[j], "") + "</" + attrNames[j] + ">";
    }

    var xmlFooter = "</q0:ProdAttrCleansingWithMsgIdOper>";

    xmlRequestMsg = xmlHeader + xmlEnvelope + xmlMsgBody + xmlFooter;
    logger.writeln("Request: "+xmlRequestMsg);

    //Add timestamp for item in lookup table
    put("Item_Identifier_Lkp",DateTime,item.getEntryAttrib("Item_Attrs/gtin"));

    //Process over a Web Service call
    responseXML = invokeSoapServerForDocLit(sUrl, xmlRequestMsg);
    logger.writeln("Response: "+responseXML);

    //Process response XML
    if(responseXML == null || responseXML == "") {
      logger.writeln("FATAL ERROR: No response received from Web Service");
    } else {
      var my_xml_root_node = new XmlDocument(responseXML);

      forEachXMLNode(my_xml_root_node, "", xml_node) {

        logger.writeln("Start Processing XML nodes: "+xml_node.getXMLNodeName());

        forEachXMLNode(xml_node,"ProdAttrCleansingWithMsgIdOperReturn",xml_item_node){

          logger.writeln("---> Processing XML node: "+xml_item_node.getXMLNodeName());

          check = toUpperCase(xml_item_node.getXMLNodeValue("valid",true));

          if(check=="YES") {

            msgId = xml_item_node.getXMLNodeValue("msgIdentifier",true);
            itemPk = lookup("Item_Identifier_Lkp",msgId);

            LkpItem = Lkpctg.getCtgItemByPrimaryKey(msgId);

            for(j=0;j<attrSize;j++)  {

              attrpath[j] = lookup("AttrPath_XmlTag_Lkp",attrNames[j]);
              resVal[j]=xml_item_node.getXMLNodeValue(attrNames[j],true);
              logger.writeln(attrNames[j]+ " is ["+checkString(resVal[j],"")+"]");
              item.setEntryAttrib(attrpath[j],checkString(resVal[j], ""));

            }
          }
        }
      }
      deleteCtgItem(LkpItem);
    }
    var validationErrors = item.saveCtgItem();
  }

  //Save and close the log file
  logger.save(logFile);
  logger.close(logFile);
}
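The message-correlation pattern in Listing 2 (generate a unique msgIdentifier per request, record which item it belongs to, and match the response back by that identifier) can be sketched in isolation as follows. A plain Python dict stands in for the Item_Identifier_Lkp lookup table:

```python
import random
import time

# The Item_Identifier_Lkp lookup table from Listing 2, modeled as a dict
# mapping msgIdentifier -> item key (the GTIN in the listing).
item_identifier_lkp = {}

def make_msg_identifier():
    """Timestamp plus a random suffix, as generated in the OUT script."""
    return time.strftime("%Y%m%d%H%M%S") + "_" + str(random.randrange(10000))

def register_request(item_key):
    """Record which item a request belongs to before sending it."""
    msg_id = make_msg_identifier()
    item_identifier_lkp[msg_id] = item_key
    return msg_id

def resolve_response(msg_id):
    """Look up (and remove) the item key a response's msgIdentifier maps to."""
    return item_identifier_lkp.pop(msg_id, None)
```

Removing the entry once the response is matched mirrors the deleteCtgItem() cleanup in Listing 2 and keeps the lookup table from growing without bound.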

In the edit step of the workflow, the item data is sent to QualityStage, and standardized attribute values are presented to the user for review in the review step, as shown below.

Figure 6. Window containing standardized values
MDM Server for PIM screen is shown where the attributes of the item have standardized values sent by QualityStage
Figure 7. Review step of the workflow
MDM Server for PIM Collaboration Area console shows an item in the review step awaiting action by the user

The sequence diagram for this scenario is shown below.

Figure 8. Sequence diagram for standardization of single record and storage after review
User creates an item and checks it out in the MDM Server for PIM workflow

Achieving data standardization in QualityStage

You need to prepare QualityStage to handle and standardize the incoming product data sent by the MDM Server for PIM system. Follow these steps:

  1. Investigate, identify, and create standardization rules. Refer to "Creating standardization rulesets" for details.
  2. Develop QualityStage job(s) to understand the input from a web services call from MDM Server for PIM, identify and standardize the attributes of interest, and send the response back to MDM Server for PIM. Refer to "Developing information services-enabled job for product data standardization."
  3. Deploy the QualityStage job(s) as a web service. Refer to "Deploying the product data standardization service using Information Services Director."

Creating standardization rule sets

Standardization is a step in the data quality process where data occurring in free-form fields is moved to pre-defined fixed columns and molded to follow a pre-defined convention. Spelling variations and abbreviations for values in a column are mapped to a single standard form through standardization, and standardization rule sets achieve this goal. Standardization rules in QualityStage are written in Pattern Action Language (PAT). QualityStage ships with multiple rule sets that can be leveraged according to varying requirements: email standardization rules, and rules for normalizing names and standardizing addresses, are part of QualityStage. Since name normalization and address standardization rules are country-specific, different standardization rule sets are delivered for different countries.

For standardizing product data attributes, new rule sets were built as part of this exercise, as these are not provided by default in QualityStage. Rule sets were written for all attributes mentioned in the "Scenarios addressed by this integration" section. Building a new rule set mainly involves the following activities:

  • Identification of the output columns: The output columns that the standardization process generates need to be identified for the particular rule set.
  • Classification of data: Each token appearing in the input data needs to be identified as belonging to a particular class, and that class needs to be assigned to the token.
  • Identification of relevant patterns: Patterns can be considered sentences formed from classes. The tokens of the input data, replaced by their specific classes and separated by the pipe (|) delimiter, form a pattern. A condition can optionally be associated with a pattern. A set of actions is written following each pattern, specifying what is executed on the tokens of inputs that map to that pattern.
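To make the three activities concrete, here is a deliberately simplified Python sketch of token classification and pattern matching for a net-content value. The class codes and the pattern are invented for this example; real QualityStage rules are written in Pattern Action Language:

```python
# Simplified sketch of rule-set mechanics for a net-content value.
# Class codes ("^" numeric, "U" unit, "?" unclassified) and the pattern
# are invented for this example; real rules are written in PAT.
UNIT_CLASS = {"lb", "lbs", "kg", "kgs", "oz", "g"}

def classify(token):
    if token.replace(".", "", 1).isdigit():
        return "^"                       # numeric token
    if token.lower() in UNIT_CLASS:
        return "U"                       # unit-of-measure token
    return "?"                           # unclassified token

def standardize_net_content(text):
    tokens = text.split()
    pattern = "|".join(classify(t) for t in tokens)
    if pattern == "^|U":                 # pattern: number followed by a unit
        # "action" for this pattern: move tokens to fixed output columns
        return {"Amount": tokens[0], "UOM": tokens[1].lower()}
    return {}                            # no pattern matched; nothing standardized
```

An input such as "12 oz" classifies to the pattern "^|U" and is split into fixed Amount and UOM columns, which is exactly the free-form-to-fixed-column movement described above.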

Developing information services-enabled job for product data standardization

The web services-based communication between MDM Server for PIM and QualityStage facilitates near real-time integration and is the essence of this approach. Real-time extraction, manipulation, and loading of information is enabled in QualityStage through information services stages. There are two such stages: the ISD Input stage, which reads the input parameters of the web services call, and the ISD Output stage, which sends the standardized, transformed output back to the invoker of the service. In this integration, both stages are used to enable invoking, standardizing, and loading product data in near real-time, keeping the job in an always-running state. The topology of the job is shown below.

Figure 9. QualityStage job topology
Standardized job contains an ISD input stage with link to standardize stage that provides data to ISD output stage

The Standardize stage applies the relevant standardization rule sets to the applicable columns in the input request. Multiple columns can be standardized using different rule sets through a single Standardize stage, as shown below.

Figure 10. Standardize stage window
Standardize stage window shows the rules and the columns on which these rules are applied

The standardize stage refers to the rule sets for the columns and standardizes these columns to produce standardized output columns as shown below.

Figure 11. Output tab of standardize stage
Standardize stage output tab shows some output columns of the standardize stage

The job needs to be enabled to service information services requests. Because multiple requests can arrive at the same time, multiple instances of the job must be allowed to execute. This can be set through the General tab of the job properties window, as shown below.

Figure 12. Job properties window
Job properties window is shown with Allow Multiple Instances and Enabled for Information Services checked

Once the job is enabled for information services, it needs to be compiled to make it available for deployment.

Deploying the product data standardization service using Information Services Director

The information services-enabled job needs to be deployed so it can be consumed by the invoking party. This is facilitated by InfoSphere Information Services Director (ISD), which resides in the services tier of IIS and can be accessed through the IIS console. To deploy information services-enabled QualityStage jobs through ISD, create a new information services project, then:

  1. Create an information services connection, if one does not exist already.
  2. Create a new information services application.
  3. Associate a new service with the information services application. Refer to the service-binding pane shown below:
    Figure 13. Service-binding pane
    Service-binding pane is shown with SOAP style as DOCLIT and SOAP action as Operation
  4. Associate a new operation with the service. For easy reference, the screen with the input column tab selected (default) is as shown below:
    Figure 14. Operations pane
    Operations pane with the input column tab selected is shown

    The minimum and maximum numbers of job instances that need to be active at any given point can be controlled in the provider properties tab. The minimum and maximum times for which a job must be idle before it can be stopped can also be provided. In addition, a maximum runtime can be specified, after which the job will not take any more service requests; for an always-running job, this is set to infinite. The provider properties pane is shown below.

    Figure 15. Provider properties pane
    The operations pane with provider properties tab selected containing default values is shown
  5. Deploy the information services application.
  6. Verify the deployment and view the WSDL document. A list of all deployed information services applications can be retrieved by selecting Deployed Information Services Applications from the drop-down menu of the Operate tab. Select the service that was created and click View Service in Catalog to view it in the catalog. In the resulting web page, expand SOAP Over HTTP in the bindings view and select Open WSDL Document to open the WSDL document containing the contract for this service. MDM Server for PIM uses this WSDL document to communicate with QualityStage.
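Once deployed, the service can be invoked by any SOAP-capable client. The following standard-library Python sketch builds the doc/literal envelope and posts it to the endpoint used earlier in this article. The host and port are placeholders, and the SOAPAction header value is an assumption that depends on the binding configuration:

```python
import urllib.request

# Sketch of a standard-library SOAP client for the deployed service.
# The endpoint placeholder matches the one used in Listing 1.
ENDPOINT = ("http://<SERVER_NAME_OR_IP>:<PORT>"
            "/wisd/ProdAttrStanAppl/ProdAttrStanServ")

def build_envelope(payload_xml):
    """Wrap a doc/literal payload in a SOAP 1.1 envelope."""
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<soapenv:Envelope xmlns:soapenv='
            '"http://schemas.xmlsoap.org/soap/envelope/">'
            '<soapenv:Body>%s</soapenv:Body></soapenv:Envelope>' % payload_xml)

def call_service(payload_xml):
    """POST the enveloped payload to the deployed service endpoint."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_envelope(payload_xml).encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": '""'},  # assumption; check the WSDL binding
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8")
```

In the article's scenarios, the equivalent of this call is performed inside MDM Server for PIM by invokeSoapServerForDocLit(), which handles the envelope itself.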

Note: More information on the steps detailed above is available in the IBM InfoSphere Information Server documentation.


Conclusion

The purpose of this article was to showcase the synergy you can achieve by integrating an MDM Server for PIM implementation with QualityStage. The two use cases detailed above show how QualityStage can be leveraged for product data standardization subroutines. The overarching logical architecture is that QualityStage provides the DQM environment in which all product data quality services are hosted; MDM Server for PIM is the primary consumer of these services through various extension points, and other systems in the enterprise can be secondary consumers. The article also shows that product data quality rules need not be specific to MDM Server for PIM, given that MDM Server for PIM extension points can leverage an external system's subroutines for important sequences like DQM.
