MDM Developers


ABOUT THIS BLOG

The MDM Developers Community is open for any developers using IBM's Master Data Management software to connect with each other, stay posted on the latest MDM technical resources, learn how to get the most out of MDM tools, ask questions, and share tips and answers to technical problems.



ALL POSTS

Installing MDM Workbench v11

jtonline 110000B6Y8 | | Comments (9) | Visits (18969)


If you've used previous versions of the workbench, one of the first changes you'll hit is that you no longer need to run the developer environment setup tool when you create a new workspace. In version 11, no projects need to be imported into the workspace, and you use the same installer to set up a local test server on your development machine as you would to install a production system.

 

Full development environment install

If you have a completely clean machine, the simplest way to get started is to use the workbench typical install. This will install DB2, Rational Application Developer, and WebSphere Application Server, along with MDM Server and the workbench, i.e. everything you need for a full development and test environment in one go. Here's how to get everything ready to run a typical install...

Firstly, you'll need to download all the typical install images. The following part numbers are required for a full MDM Workbench v11 typical install:

CIM6NEN, CIM6PEN, CIR9NML, CIR9PML, CIR9QML, CIR9RML, CIR9SML, CIR9TML, CIR9UML, CIR9VML, CIE5FML, CIE5GML, CIE5HML, CIE5IML, CI6XNML, CI6XPML, CI6XQML

You'll also need to download WebSphere Application Server Fix Pack 8.5.0.2 from Fix Central.

Important: If you are about to install MDM but downloaded the install images before 17th October 2013, you must download the product refresh first.

Once you have all the install images downloaded, the contents must be extracted into a specific directory structure required by the typical install. You'll require a third party tool to extract the .tar.gz images on Windows if you don't already have one. For example, I use 7-Zip. Alternatively, Download Director includes an Unpack option:

 

[Screenshot: Download Director unpack option]

 

After extracting all the install images, open the install launchpad, which you can find in the MDM\disk1 directory (there are 32 and 64 bit versions). The typical workbench install link is right at the bottom of the launchpad:

[Screenshot: install launchpad with the typical workbench install link at the bottom]

 

 

When the install starts, you should be able to click through all the panels without changing anything:

  • Install package selection
    Make sure that you leave all packages selected, otherwise the typical install will switch to a custom install.
  • Prerequisites
  • Licenses
  • Location (shared resources)
  • Location (package groups)
  • Features (languages)
  • Features (selection)
    The required features are selected.
  • Features (configuration)
    There should just be Rational Application Developer help system configuration in the list. If you see any MDM Server configuration panels, the install has switched to a custom install.
  • Summary
    Click install!

Make sure you confirm that the IVT tests pass at the end of the install and, if they do, you're ready to start developing for MDM v11!

Note: you should change the default passwords for user accounts created by the typical install.

 

Workbench only install

If you don't want a local server to test changes on, installing the workbench is much quicker, since DB2, WebSphere Application Server and the server install are not required. In this case, you'll only need the following part numbers:

CIM7CML, CIR9TML, CIR9UML, CIR9VML, CIE5FML, CIE5GML, CIE5HML, CIE5IML

The launchpad doesn't support this scenario, so you have to install Installation Manager manually, add the necessary repositories, then install Rational Application Developer and the workbench. The MDM Workbench v11 Installation video demonstrates this type of install.

Alternatively, you can use the Installation Manager command line to install Rational Application Developer and the workbench in one step. For example, assuming you extract the install images in the same structure as for a typical install:

imcl install com.ibm.rational.application.developer.v85 com.ibm.im.mdm.workbench -repositories "C:\Temp\RAD","C:\Temp\MDMWB" -acceptLicense -installationDirectory "C:\IBM\SDP" -properties cic.selector.arch=x86_64 -showProgress

Where imcl.exe can be found in the eclipse\tools directory under the Installation Manager install location.

 

Related information

A typical install is ideal for demos or for evaluating MDM, but for setting up developer environments I would recommend installing manually. You'll also need to do this if the typical install does not support your environment. The following blog post describes the manual install process:

  • Manually installing a version 11 development and test environment

There is also a wiki page with an up-to-date* list of install related information.

  • Installing MDM Version 11

(* Please update it if it's not up-to-date!)

 

Updates:

  • added headings and link to new YouTube video (14th October 2013)
  • new part numbers due to product refresh in October (28th November 2013)
  • included Download Director information (6th December 2013)
  • added 'Related information' section (20th January 2014)


Tags:  install techtip prereqs mdm dest installation mdm-workbench mdm11

Manually installing a version 11 development and test environment

jtonline 110000B6Y8 | | Comments (22) | Visits (17076)


There is a typical workstation install that automatically sets up a full development and test environment, described in the 'Full development environment install' section of the Installing MDM Workbench v11 post; however, this approach has some limitations. For example, it will only work if you do not already have any trace of the products it installs on your machine, there are restrictions on which features can be installed, it is not possible to provide passwords, and so on.

The alternative is to manually install the MDM Workbench for MDM configuration and development, and an Operational Server for test purposes. In this example I will install a full development and test environment on Windows, using a DB2 database. The instructions below assume that you do not have any of the prerequisite software installed but, if you do, just skip the relevant steps.

To avoid problems with path lengths, special characters, or Windows virtualised directories, I installed all the software under a C:\IBM directory.

This blog post is accompanied by a series of videos on YouTube.

 

Downloading and extracting install images

These are all the install images I downloaded. See the Download IBM InfoSphere Master Data Management version 11.0 document for more part numbers and information about downloading from Passport Advantage. I extracted the downloads into the folder structure described in the Setting up the installation media topic, with a couple of additional directories as required. If you don't already have anything that can extract .tar.gz files, you can use the unpack option in download director:

[Screenshot: Download Director unpack option]

 

Important: If you are about to install MDM but downloaded the install images before 17th October 2013, you must download the product refresh first.

Important: The workbench install will fail if the .tar.gz install images are extracted using WinZip. So far it looks like the Download Director unpack option, WinRAR, and 7-Zip all work but please leave a comment if you have problems with any unzip tools and I'll update the list.

 

IBM Installation Manager V1.6.0

This is required to install everything except DB2.

Part number: CIM7CML

 

DB2 Enterprise Server Edition V10.1

I used fix pack 2 to install DB2, available via the DB2 fix pack download page, rather than installing the GA version and upgrading. Alternatively, you could use the following part.

Part number: CI6WEML

 

Installation Startup Toolkit

This provides the scripts required to create an MDM database.

Part number: CIR9WML

 

Master Data Management Standard & Advanced Edition

This is the actual MDM Operational Server install.

Part numbers: CIR9NML, CIR9PML, CIR9QML, CIR9RML, CIR9SML

 

Master Data Management Workbench Standard & Advanced Edition

This is the Rational based workbench used to configure and develop MDM solutions.

Part numbers: CIR9TML, CIR9UML, CIR9VML

 

Rational Application Developer for WebSphere Software V8.5.1

I installed the workbench into Rational Application Developer but you could use Rational Software Architect for WebSphere Software instead. In either case you need at least version 8.5.1; however, there is a known problem with version 8.5.5.

Part numbers: CIE5FML, CIE5GML, CIE5HML, CIE5IML

 

WebSphere Application Server V8.5.0.2

The minimum version required is 8.5.0 fix pack 2, otherwise the install verification tests will fail. I downloaded WebSphere Application Server Fix Pack 8.5.0.2 from Fix Central. There are known problems if you are planning to use version 8.5.5.

Part numbers: CI6XNML, CI6XPML, CI6XQML

 

Installing Installation Manager

I ran install.exe to install Installation Manager in GUI mode. After installing Installation Manager you can add the required repositories individually before you run each install, as I did in the video series, or you can add all the repositories in one go as follows.

Create a repository.config file in the directory where you extracted the install images. Copy and paste in this content:

LayoutPolicy=Composite
LayoutPolicyVersion=0.0.0.1
repository.url.mdm=./MDM/disk1
repository.url.mdmst=./MDMST/disk1
repository.url.mdmwb=./MDMWB/disk1
repository.url.rad=./RAD/disk1
repository.url.was=./WAS
repository.url.wasfp=./WASFP

Edit any paths based on the directories you used before saving the file. Now you can add this single repository using the Installation Manager repository preferences and all the packages will show up on the install page.
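To sanity-check the composite repository before installing anything, you can ask Installation Manager's command line to list the packages it can see. This is only a sketch: adjust the paths to wherever Installation Manager and your repository.config actually live.

"C:\Program Files\IBM\Installation Manager\eclipse\tools\imcl" listAvailablePackages -repositories "C:\Temp\repository.config"

If all the expected package IDs appear in the output, the repository paths are correct.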

For more information about Installation Manager, see the Installation Manager 1.6 documentation.

Note: you may have seen a suggestion to alter Installation Manager's agent data location using the cic.appDataLocation configuration setting; however, it is not typically necessary, or a good idea, to change this setting.

 

Installing the workbench

Installing the workbench is straightforward once you've added the Rational Application Developer and workbench repositories to Installation Manager. Pick a suitable install location, for example C:\IBM\SDP, and you can accept the defaults for everything else.

In the MDM Workbench v11 Installation video I chose to install a few additional features from Rational Application Developer, for example 'JSF' for customising UIs. Depending on what work you'll be doing, you may want to do the same, and you can always use Installation Manager to add extra features later on if required.

 

Installing Operational Server prereqs

Before installing the MDM operational server, you need to install DB2 Enterprise Server Edition version 10.1 and WebSphere Application Server 8.5.0.2. In addition, the Installation Startup Toolkit provides the database scripts you'll need to create an MDM database.

You can watch how to run these installs in the MDM v11 Test Environment: Installing Prereq Software video.

 

DB2

Important: you must install DB2 in a directory called SQLLIB, otherwise the operational server install will not work. For example, I installed DB2 to C:\IBM\SQLLIB

I accepted most of the defaults in the DB2 install wizard, except that I chose not to enable email notifications or operating system security since this is for a development environment.

 

WebSphere Application Server and Installation Startup Toolkit

Both of these are installed using Installation Manager so I installed them at the same time. (You could even install them at the same time as the workbench to save time.)

Important: you must install fix pack 2 for WebSphere Application Server 8.5 otherwise the MDM install verification tests will fail.

I changed the install locations, to C:\IBM\AppServer and C:\IBM\MDMStartupKit respectively, but I accepted the defaults for everything else.

 

Preparing to install the Operational Server

There are several advantages to manually installing a development and test environment; however, the biggest disadvantage compared to a typical install is that the installer does not create an MDM database or WebSphere profile for you. Instead, you have to prepare the database and prepare the application server before starting the install.

These are the steps I followed, which are covered in the MDM v11 Test Environment: Preparing to Install video.

 

Edit SQL files

There are a couple of SQL files provided in the startup toolkit for creating an MDM database on DB2:

  • CoreData\Full\DB2\Standard\ddl\CreateDB.sql
  • CoreData\Full\DB2\Standard\ddl\CreateTS.sql

Both these files contain placeholders which need to be replaced with suitable values before use. These are the values I used:

 

Placeholder Value
<DBNAME> MDMDB
<TERRITORY> US
<DBUSER> db2admin
<TABLE_MDS4K> TBS4K
<TABLE_SPACE> TBS8K
<TABLE_SPMDS> TBS16K
<INDEX_SPACE> INDEXSPACE1
<LONG_SPACE> LONGSPACE1
<TABLE_SPPMD> EMESPACE1
<TABLE_SPPMI> EMESPACE2

 

Notes: Authority will be granted to the user specified by the <DBUSER> value, so this should be different from the user running the scripts. The database name is easy to specify in the installer but here I used the default. The tablespace names need to match the settings used by the installer, and the easiest way to do that for a development environment is to use the values shown above.

The following PowerShell command will fill in the placeholders and I ran it for CreateDB.sql and CreateTS.sql rather than editing the files by hand:

powershell -command "(Get-Content C:\IBM\MDMStartupKit\CoreData\Full\DB2\Standard\ddl\CreateDB.sql) | Foreach-Object {$_ -replace '<DBNAME>','MDMDB' -replace '<TERRITORY>','US' -replace '<DBUSER>','db2admin' -replace '<TABLE_MDS4K>','TBS4K' -replace '<TABLE_SPACE>','TBS8K' -replace '<TABLE_SPMDS>','TBS16K' -replace '<INDEX_SPACE>','INDEXSPACE1' -replace '<LONG_SPACE>','LONGSPACE1' -replace '<TABLE_SPPMD>','EMESPACE1' -replace '<TABLE_SPPMI>','EMESPACE2' } | Set-Content C:\temp\CreateDB.sql"

 

Create database

After editing the SQL files, I ran them using this command in a DB2 Command Window:

db2 -v -td; -f C:\temp\CreateDB.sql

And the same for CreateTS.sql.
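For reference, the second command follows the same pattern (assuming you also wrote the edited copy of CreateTS.sql to C:\temp):

db2 -v -td; -f C:\temp\CreateTS.sql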

 

Create application server profile

I used the advanced option when creating an application server profile using the Profile Management Tool. I chose not to install the default application, gave the profile a meaningful name and picked the Development tuning setting. Administrative security must be enabled for MDM, and the advantage of creating the profile yourself is that you get to choose the username and password. If you run the Profile Management Tool as administrator, you will also be given the option to run the server process as a Windows process, which isn't necessary for a development environment.

Important: When creating a profile for use with the MDM Workbench, make sure you create it in the default location with a directory name that matches the profile name.

 

Set DB2_JDBC_DRIVER_PATH WebSphere variable

After creating the application server profile you need to set the DB2_JDBC_DRIVER_PATH variable. The value needs to be the path containing the directory where you installed DB2. I installed DB2 in C:\IBM\SQLLIB, so on my system the DB2_JDBC_DRIVER_PATH variable should be set to C:\IBM.

Important: Do not follow the instructions in the description for this WebSphere variable: the operational server install requires a non-standard setting.
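If you prefer scripting the change rather than using the administrative console, something along these lines should work with wsadmin. This is a sketch only: on a standalone profile the plain Node scope used elsewhere in this post should resolve to the local node, but you may need to qualify it (for example Node=<nodeName>) in other topologies.

wsadmin -lang jython -c "AdminTask.setVariable('[-scope Node -variableName DB2_JDBC_DRIVER_PATH -variableValue C:/IBM]'); AdminConfig.save()"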

 

Installing the Operational Server

Now everything should be ready for a successful MDM install; however, the operational server installer does not do many checks, so there is still a chance that you could encounter problems. When problems do occur, the install will either roll back, requiring a reinstall, or keep going without reporting any issues until the install verification tests fail. This makes it difficult to track down problems, so it's worth taking time to make sure everything is configured correctly before starting the install.

In the MDM v11 Test Environment: Installing the Operational Server video I checked the following prereqs, which are prone to error.

 

Manual pre-install checks

First I checked that DB2 was installed in a directory called SQLLIB using the db2level command. Next I checked that the required table spaces had all been created using the command below:

db2 -td; CONNECT TO MDMDB && db2 -td; SELECT VARCHAR(TS.TBSPACE,20) AS TABLESPACE, TS.PAGESIZE, VARCHAR(BP.BPNAME,20) AS BUFFERPOOL FROM SYSCAT.TABLESPACES TS, SYSCAT.BUFFERPOOLS BP WHERE TS.BUFFERPOOLID = BP.BUFFERPOOLID

Finally I checked the DB2_JDBC_DRIVER_PATH variable using wsadmin:

wsadmin -lang jython -c "AdminTask.showVariables('[-scope Node -variableName DB2_JDBC_DRIVER_PATH]')"

This should be the parent directory of SQLLIB, which I checked first.

 

Running the install

The MDM Operational Server install is another Installation Manager based install, so add the MDM repository if you haven't already and choose the install option. I changed the installation directory to C:\IBM\MDM to avoid any confusion caused by the missing space in the default 'ProgramFiles' path.

In the video I chose to install the Business Administration UI feature, and there are other features you may want to select. The remaining panels cover the configuration settings required to set up MDM.

Note: There are several buttons to test connections and retrieve details from the application server during the configuration process which do not report progress. It can look like the install has stopped responding, but if you click once and wait a few minutes, you should see the status message change when the processing has completed.

 

Configuration settings

On the Database Configuration page, the Database home setting must match the value of the DB2_JDBC_DRIVER_PATH WebSphere variable, for example C:\IBM on my machine. The rest of the settings on this page should match the MDM database you created earlier.

Pick suitable values for your requirements on the History Configuration page.

Select the Base Edition option on the WebSphere Application Server Configuration page. The settings should match the application server profile you created earlier. You'll need to specify the correct SOAP port setting since the default value is wrong. This is likely to be 8880 if the MDM profile was the first profile you created, but you can check using the AboutThisProfile.txt file in the profile logs directory.

Pick suitable values for your requirements on the Application Configuration page.

There may be further configuration panels depending on the features you selected, for example I needed to complete the Business Administration UI settings for my install. There are a collection of worksheets describing all the configuration settings in the Information Center.

When you've completed all the configuration panels you can start the install. This will take some time to complete.

 

Troubleshooting

If the install finishes with no errors and there are no errors reported for the install verification tests, the install was successful. If the install fails for some reason, the installation troubleshooting topic will help you debug known install problems. There are several logs which may help with this process, including Installation Manager logs, application server logs, MDM install logs and install verification logs. For example:

  • C:\ProgramData\IBM\Installation Manager\logs
  • C:\IBM\AppServer\profiles\MdmDev01\logs\server1\SystemOut.log
  • C:\IBM\AppServer\profiles\MdmDev01\logs\server1\SystemErr.log
  • C:\IBM\MDM\mds\logs
  • C:\IBM\MDM\logs\database
  • C:\IBM\MDM\IVT\testCases\xml\response
  • C:\IBM\MDM\IVT\testCases\xml_virtual\response

The exact paths may be different on your system if you chose to install in different locations.

 

Feedback

I hope you find this post useful but if you do spot any errors or omissions, please leave a comment below. Any hints and tips based on your own install experiences would also be great. If you're having problems installing, the best place to ask questions is on the MDM forum.

 

Related information

For an up-to-date* list of install related information, see the following wiki page:

  • Installing MDM Version 11

(* Please update it if it's not up-to-date!)

 

Updates:

  • added 'Related information' section (20th January 2014)
  • added warning regarding WinZip issue (11th February 2014)


Tags:  mdm11 install dest techtip installation mdm-workbench mdm-server

Testing MDM web services

tgarrard 120000GH07 | | Comments (2) | Visits (11591)


As of version 10.1, MDM is secured by default. This means that using the Web Service Explorer to test your web service will not be possible. Whilst there are many web service testing tools out there like SOAPUI, there is one included within the MDM Workbench that you can use: the Generic Service Client. The following steps detail how to invoke the required MDM web service:

  1. To test the web service you need to select the wsdl that contains the operation you want to test (N.B. all MDM web services can be found in the project CustomerResources in 10.1 and MDMSharedResources from v11 onwards).
  2. Right click the wsdl and select Web Services -> Test with Generic Service Client

This will then open the following editor :

[Screenshot: Generic Service Client editor]

 

  3. Select your operation and fill in the required fields for the message (requestID, requesterName and requesterLocale, plus your transaction specifics)
  4. Modify the service binding, i.e. your host/port: select the Transport tab and specify the new binding for the URL

[Screenshot: Transport tab with the new URL binding]

 

  5. Specify your security details. To do this, select the Request Stack tab and select the "Override Stack" radio button. Then click the Add button and select "User Name Token" from the drop down. You should be presented with the following:

[Screenshot: User Name Token security settings]

 

  6. Fill in the Name and Password settings
  7. Finally, click the Invoke button next to the Edit Request

The responses are saved and can be rerun, but if you want more functionality you'll probably need to look at Rational Performance Tester.



Tags:  mdm-workbench techtip security wsdl webservices mdm

Failed to connect to the JMX port on server

S Eggleston 2700002CDU | | Comment (1) | Visits (10875)



When you first connect from MDM Workbench to WebSphere Application Server (AppServer) where MDM Server is installed, for example to deploy a configuration project or to run a virtual job, you might see this error:

Job Manager Error - Failed to connect to the JMX port on server

There can be several reasons why the connection might fail. For background, here is the stack you are relying on when you connect to the JMX port:

[Diagram: the stack involved in a JMX connection, from the workbench client to the blueprint virtual bridge and, to its right, MPIJNI, JMS, databases, and filesystems]

In order for the JMX port connection to be successful, you need every component in this diagram to be in a fully functioning healthy state. And yes, that means there are a lot of places you can check! As a result, it's not practical here to explain every possible area to review, but this should give you some idea of where to start investigating.

To begin, cut the problem in half: there is a message associated with the blueprint virtual bridge. Look for this, and it will help you decide whether the problem is more likely to be a runtime issue (below and to the right of the blueprint virtual bridge component) or a configuration issue.

1. Look for virtual bridge messages

On the Application Server where MDM is hosted, open SystemOut.log or the HPEL logs; if possible, restart the AppServer first to make sure you have startup messages.

Success scenario

When the MBean starts successfully, you will see messages like these:

  • RMIConnectorC A ADMC0026I: The RMI Connector is available at port <xxxx>
  • JMXConnectors I ADMC0058I: The JMX JSR160RMI connector is available at port <xxxx>
  • BlueprintCont I org.apache.aries.blueprint.container.BlueprintContainerImpl doRun Bundle com.ibm.mdm.server.virtualbridge is waiting for dependencies [(objectClass=com.ibm.mdm.server.config.api.ConfigManager), (objectClass=com.initiatesystems.hub.mpinet.MPINetProtocolLogic)]

Note that these messages will only appear on startup, so they may not be visible if the logs have wrapped.

If you have these success messages, the Blueprint virtual bridge is available for JMX requests, and everything to the right of the diagram (MPIJNI, JMS, databases, filesystems) is healthy.

In this case the likely cause of the problem is to the left of the diagram, and probably relates to a configuration issue. More information is available in section '3. When the virtual bridge has started successfully'.

Failure scenario

When the MBean has not started, you see messages like this:

  • BlueprintCont E org.apache.aries.blueprint.container.BlueprintContainerImpl$1 run Unable to start blueprint container for bundle com.ibm.mdm.server.virtualbridge due to unresolved dependencies [(objectClass=com.ibm.mdm.server.config.api.ConfigManager), (objectClass=com.initiatesystems.hub.mpinet.MPINetProtocolLogic)]
  • This message identifies a problem with the JMX MBean listener, but does not in itself identify the root cause: to find the source, look for other messages in the logs with earlier timestamps.

If you have these failure messages, the Blueprint virtual bridge is not available. More information is available in the next section, '2. When the virtual bridge has not started'.

No messages found

If you don't find any messages relating to com.ibm.mdm.server.virtualbridge, the most likely reason is that there were messages when the server started, but the logs have wrapped and the virtual bridge messages are no longer current. The recommended action is to restart the server and collect the new logs.

2. When the virtual bridge has not started

When the blueprint virtual bridge has not started, the next step is to investigate potential runtime issues in one or more of the components on the right side of the diagram.

  1. Look for database errors: search the Application Server logs for
    • SQL
    • DSRA
  2. Look for JMS errors (the JMX MBean for virtual relies on the SIBus messaging engine): search the Application Server logs for
    • CWSIS

Note that you can choose whether you use a datastore or filestore for the messaging engine data store: the default is datastore (database).

There may also be file system errors; these will usually be reported by the component that depends on the file system, for example the database or the JMS filestore.
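On Windows, a quick way to scan for these error prefixes is findstr. This is only a sketch: point it at the SystemOut.log for your own profile (the path below reuses the example profile from the install post earlier in this blog), and expect some false positives for the broad "SQL" string.

findstr /C:"SQL" /C:"DSRA" /C:"CWSIS" C:\IBM\AppServer\profiles\MdmDev01\logs\server1\SystemOut.log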

In many cases you will be able to find technotes or other links on the internet with information about how to resolve the errors, or if not, contact IBM support and provide the logs that show the errors.

These related links have information about resolving blueprint errors:

https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/W2c3ebf603f05_4460_8e8b_a26780b35b45/page/Troubleshooting%20guidance%20for%20InfoSphere%20MDM%20installations

http://www-01.ibm.com/support/docview.wss?uid=swg21509668

3. When the virtual bridge has started successfully

Once you have found the success message (Bundle com.ibm.mdm.server.virtualbridge is waiting for dependencies), the next step is to investigate the configuration in both WebSphere Application Server and MDM Workbench.

Review the server logs for authorization errors

On the Application Server where MDM is hosted, open SystemOut.log or HPEL logs. Look for errors that reference one or more of:

  • LDAP
  • LTPA
  • SECJ

Errors with any of these codes suggest that you need to revisit the security configuration in the WebSphere Application Server administrative console, and check the userid and password settings in the workbench client. Review the error messages; in many cases you will be able to find technotes or other links on the internet with information about how to resolve them, or if not, contact IBM support and provide the logs that show them.

Review the firewall settings

Verify that you can ping from the Workbench machine to the machine that hosts WebSphere Application Server and MDM Server, using your preferred ping tool.

Optionally you can use "Test Connection" from MDM Workbench, although note that in an ND configuration this tool only checks the dmgr, so it may not reflect the status of the actual server where MDM is hosted.

If you cannot connect to the target MDM server, the JMX connection will not work and you need to contact your networking support team to make sure the network is available and, if necessary, that the appropriate firewall ports are opened.

Review the port and host configuration

  1. In the WebSphere administrative console
    1. Go to Servers -> Server Types -> WebSphere application servers
    2. Select the server where the MDM runtime is installed
    3. Scroll down to the ports section and open it (see the screenshot below)
    4. Make a note of the ports for
      • BOOTSTRAP_ADDRESS
      • SOAP_CONNECTOR_ADDRESS
      • ORB_LISTENER_ADDRESS (if not 0)
  2. In the Workbench
    1. Go to the Servers tab (you may need to add it from Window -> Show View)
    2. In the Overview panel, under Server select Manually provide connection settings
    3. Set the RMI port to BOOTSTRAP_ADDRESS
    4. Set SOAP port to SOAP_CONNECTOR_ADDRESS
    5. Test these settings initially without any IPC port configuration
    6. If you are still not able to connect, also configure the IPC port with the ORB_LISTENER_ADDRESS and retest

Screen shot of the WebSphere Application Server ports in the Administration Console
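Alternatively, the same port values can be retrieved with wsadmin. This is a sketch only; substitute your own node and server names.

wsadmin -lang jython -c "print AdminTask.listServerPorts('server1', '[-nodeName <nodeName>]')"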

 

 

 

 



Tags:  jmx virtual-mdm techtip virtual

Automated Builds of Physical MDM Customisations

bakleks 270007PVJ3 | | Comments (2) | Visits (10526)


This document outlines how Physical MDM customisations can be built from source artefacts in an automated build and test system. This document does not aim to be a complete guide on this topic, but rather to point the way to how some detailed steps can be implemented using examples.

 

1. Overview

The MDM Advanced or Standard editions both include the MDM Workbench. In version 11.0 and beyond the MDM Workbench is used by solution developers to create artefacts which customise the MDM solution for the physical, virtual and hybrid implementation styles. These source code artefacts are typically built into a Composite Bundle Archive (CBA) and deployed to WebSphere where they augment the functionality already available in the MDM Server Enterprise Business Application (EBA).

A good practice amongst MDM solution developers is to create an automated build process such that customisation source code is checked in to a code control repository, and the build process takes those source files and builds the CBA ready for deployment onto post-build test systems, placing built artefacts into a second repository or shared file system.

Some automated systems take this “build” concept further, by automating the deployment of such built artefacts to test systems, which in turn report back on the “health” of the build, how many tests passed and failed, and generally quickly provide valuable feedback to developers on whether recent changes broke the solution or not. Project managers overseeing such projects are able to reduce project risk by adopting this continuous delivery process, and changes to MDM solutions become more reliable and safer as a result.

To add MDM solutions to such a continuous build environment it is necessary to: 

  1. Identify the pieces of the solution which represent the “source code” for the solution.
  2. Create a source code repository into which the identified “source code” can be checked. Git is a popular choice here, though there are many alternatives such as Rational Team Concert from IBM, CVS, SVN, etc.
  3. Recognize when a consistent set of code has been checked-in, at which point a “build” is started.
  4. Create a build environment.
  5. The build environment often “boot-straps” itself via a simple initial script which checks-out the rest of the build scripts which in turn build the artefacts from solution developers.
  6. The build scripts check out the artefacts from code control to the local file system.
  7. The source artefacts are processed, transforming them into built artefacts
  8. The build process often executes “unit tests” to further validate that the solution artefacts are healthy and do what they are expected to do.
  9. Built artefacts are published to a repository which versions every build and against which build metadata can be gathered and reported. Such build logs, unit tests results, and results of other tests indicate the “health” of the build.
  10. If the build is considered “good” then further automation can be added to deploy the built solution to a test environment, with higher-level tests (functional and end-to-end system tests) exercising the solution further. Such tests can also report back to the build repository on the health of each build.

 

This article is mostly concerned with step #7 – building source artefacts.

 

2. Materials and prerequisites

This article is accompanied by a collection of example scripts. We do not intend that these are used directly, but as an example of how you may wish to implement your own automated build process.

The current solution consists of four main files: 

  • mdm_wb_build.xml

  • mdm_wb_build.properties

  • mdm_wb_build_report.xsl

  • mdm_wb_build_inside_eclipse.xml

 

In order for the scripts to work, the machine running the scripts needs to have the following products installed:

  • Rational Application Developer (RAD) or Rational Software Architect (RSA)

  • WebSphere Application Server (WAS) with a profile set up

  • MDM Workbench (the scripts accommodate version 11.0 and above)

 

mdm_wb_build.properties contains a list of properties that need to be defined before running the scripts:

  • “InputFolder” defines the folder containing the code to be imported/built.

  • “OutputFolderPrefix” property defines the folder to which the projects will be copied and CBAs exported. Currently the script generates a time-stamp that gets added to the defined output folder prefix.

  • “AntContribHome” should be set to the folder containing “ant-contrib.jar” which provides some of the functionality used within the scripts.

  • “EclipseHome” should be set to the installation folder of RAD or RSA.

  • “WASHome” should be set to the installation folder of WAS, and both “WASRuntimeTypeId” and “WASRuntimeId” should be specified as well.
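A minimal mdm_wb_build.properties might therefore look like the following sketch. All of the values are illustrative placeholders; in particular, the WAS runtime IDs depend on your RAD/RSA and WAS installation.

InputFolder=C:/build/input
OutputFolderPrefix=C:/build/output/mdm_build_
AntContribHome=C:/build/tools/ant-contrib
EclipseHome=C:/IBM/SDP
WASHome=C:/IBM/AppServer
WASRuntimeTypeId=<your WAS runtime type id>
WASRuntimeId=<your WAS runtime id>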

 

To run the Ant scripts the user needs to run mdm_wb_build.xml as a build file.

The script contains only the “runBuild” target.
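Assuming Apache Ant is installed and on the PATH, the invocation could be as simple as the line below; if the build file does not load mdm_wb_build.properties itself, the properties can be passed with Ant's -propertyfile option instead.

ant -f mdm_wb_build.xml runBuild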

The target checks that necessary properties, such as Eclipse Home, date and time stamps and output folder prefix are set. Provided these properties do exist, it creates a folder based on OutputFolderPrefix and date and time, within which “logs”, “CBAExport” and “workspace” folders are created.

The logs folder contains “MDM_WB_BUILD_log.html” and “MDM_WB_BUILD_log.xml”. The XML file stores debug information. The HTML file contains general information on whether the main steps were successful, for example: 

checkPropertiesExist: BUILD SUCCESSFUL

createServerRuntimeAndTargetPlatform: BUILD SUCCESSFUL

generateDevProject: BUILD SUCCESSFUL

workspaceBuild: BUILD SUCCESSFUL

exportCBA: BUILD SUCCESSFUL

End of report.

 

The CBAExport folder contains all of the exported CBAs.

The workspace folder contains a local copy of build artefacts.

After the directories have been created, the script checks which operating system it is running on, sets the isLinux or isWindows property to “true” as appropriate, and calls either runAnt.sh or runAnt.bat to run a headless Eclipse process. The relevant file (either the batch or the shell script) should be available by default in the bin directory in the Eclipse installation directory.

The runAnt script then sets up the log files, environment variables and runs a second script “mdm_wb_build_inside_eclipse.xml” inside a headless RAD/RSA environment.

 

3. Step breakdown of the automated build and test system

Given that automated building and testing of MDM solutions is a worthwhile goal, the following sections provide some guidance in some of these areas where actions specific to the MDM tools and development/build environment are necessary, and some points of discussion are presented where choices exist.

 

3.1 Identify the pieces of the solution which represent the “source code” for the solution.

The source code for an MDM solution will be made up of a collection of Eclipse projects and their contents. MDM development, MDM configuration, MDM hybrid mapping, MDM service tailoring, MDM custom interface, MDM metadata and other MDM-specific projects types. CBA projects will add to the list.

MDM Development projects contain a “module.mdmxmi” file, which contains a model of the customizations which the project aims to create. This file should always be considered to be source code.

At some point the mdmxmi file will be used to generate Java, XML, SQL and other file artefacts, and there are a few different approaches you can take for these files:

The current solution is to only consider files which have been manually changed as “source code”, and “generate artefacts” from the mdmxmi model as part of the automated build process itself. This approach demands that the MDM workbench tools are installed as part of the build environment, because the “generate artefacts” process that turns .mdmxmi files into other artefacts will be a necessary part of the build process.

A project “MDMSharedResources” in the workspace can be considered “source code”, and it is recommended that this project is checked-into the code control system, together with its .project file and .mdmxmi file. The other contents of this project can generally be ignored from check-in, as the artefacts in this project often will be re-created by the automated build system.

 

3.2 Create a source code repository.

There are many choices regarding which product to use as a source code repository and covering them is not the aim of this document.

 

3.3 Recognize when a consistent set of code has been checked-in, at which point a “build” is started.

This event may be triggered manually, automated overnight, or whenever a change-set is delivered to the code stream. The capturing of this event is often specific to the code control system being used, though some solution teams augment this by adding a web page that enables build requests to be manually requested.

 

3.4 Create a build environment.

A build environment should include RAD (or RSA) which can be called in a “headless” manner such that functionality within RAD can be used without a user-interface being present.

MDM Workbench will be required in addition to RAD to perform a complete build of “module.mdmxmi” files.

For the list of platforms that MDM Workbench v11.0 and onwards support – refer to the product release documentation.

 

3.5. The build environment “boot-straps” itself.

A small script is responsible for “boot-strapping” the process by it checking-out the other build scripts which in turn build the artefacts from solution developers.

 

3.6 The build scripts check out the artefacts from code control to the local file system.

These actions are specific to the code control system so will not be discussed further here.

 

3.7. The source artefacts are processed, transforming them into built artefacts.

This phase of the automated system typically consists of a hierarchy of Ant files which decomposes the overall build process into many smaller steps and “Ant targets”. The Maven framework is a common choice of technology to oversee this phase.

These Ant files can be categorized into two types:

  1. Run outside of the RAD environment;
  2. Run inside a “headless” RAD instance.

 

For a detailed walkthrough of specific implementations of the build process refer to the Ant scripts provided with this blog entry.

 

3.8. The build process often executes “unit tests” to further validate that the solution artefacts are healthy and do what they are expected to do.

The tools and approaches used to execute unit tests vary widely dependent on technology choice. Simple Java JUnit tests offer one simple solution, which can be invoked with scripts once the tests and tested code are built.

 

3.9 Built artefacts are published to a repository.

The publish process versions every build, so that build metadata can be gathered and reported against it. Build logs, unit test results, and the results of other tests indicating the “health” of the build are gathered and published to the repository as well.

Products such as Rational Asset Manager can be used here, or for a really basic solution a simple shared folder on a network drive may suffice.

 

3.10 Automated deploy and test health of overall build.

If the build is considered “good” then further automation can be added to deploy the built solution to a test environment, with higher-level tests (functional and end-to-end system tests) exercising the solution further. Such tests can also report back to the build repository on the health of each build.

The automated deployment of entire systems for testing is often one of the most complex areas of the whole continuous development process. Products such as IBM UrbanCode Deploy (UCD) can be used for this stage of the process, though for some environments a set of (reasonably complex) scripts might be sufficient.

For the MDM pieces, we are mostly concerned with deploying and un-deploying SQL scripts, deploying/un-deploying CBA files, and starting and stopping the MDM solution and WebSphere server(s).

Prior to deploying extensions to the server, it is often necessary to modify the database. This is possible using the SQL scripts found in the MDMSharedResources project in the built workspace. Rollback scripts in the same location should be applied once testing is complete to reset the database back to a known good state.
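As an illustration only (the script name and workspace path are placeholders that depend on your solution and build output), applying such a script from a DB2 Command Window follows the same pattern used to create the database earlier in this blog:

db2 connect to MDMDB
db2 -v -td; -f C:\build\workspace\MDMSharedResources\<extension-script>.sql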

For CBA deployment, Jython scripts can be used to manipulate the WebSphere server. Detailed documentation of these steps can be found in the WebSphere documentation.

Additional documentation on adding the CBAs to the WebSphere local bundle repository, extending the MDM server EBAs, starting and stopping the WAS server is available on the MDM Developer Site.

 

Sample scripts and projects are available for download.



Tags:  cba osgi build mdm-workbench ant mdm rad techtip

Customising BPM Process Applications using CSS files

Nic Townsend 2700051ED4 | | Visits (9244)


MDM v11.3 leverages IBM BPM technology to provide a data stewardship framework under a single UI. However, it may be desirable to modify this UI to match the look and feel of your existing solutions. While BPM Process Designer allows you to restructure page layout with the Coach designer, the best way to modify the look and feel of a Coach is with CSS.

BPM provides three ways to alter CSS out of the box from within Process Designer:

  1. Supplying a CSS file as an attached script, or editing the inline CSS section of a Coach View
  2. Adding a <style> HTML attribute on a Coach View inside a Coach
  3. Supplying CSS as a Custom HTML fragment inside a Coach

However, there are occasions where these approaches are not suitable:

  1. You may not own the Coach View so cannot directly update the attached scripts or inline CSS
  2. You want to use the CSS across multiple Coach Views rather than scoping it to the single instance of the Coach View. Or you want to customise multiple Coach Views without manually adding <style> attributes to each Coach View.
  3. You do not want to have to maintain multiple copies of the same CSS file inside each of the Custom HTML fragments.

The ideal way to insert CSS into a Coach would be to load the CSS as a managed file in BPM - that way you only need to edit the managed file and all Coaches that reference the CSS would use the latest version (pending updated snapshots). Unfortunately, BPM does not offer this mechanism out of the box.

Update 05/11 - If you upload an HTML file to BPM that consists of <style> tags wrapping the CSS, you can use the "Managed File" option of the Custom HTML component to load the "HTML" CSS into the Coach. However, this does not work if the HTML file is inside a .zip archive, or if the CSS needs to reference local resources.

A documented JavaScript function in BPM holds the key: com_ibm_bpm_coach.getManagedAssetUrl. This function allows you to get a URL for a managed file. The function call is: com_ibm_bpm_coach.getManagedAssetUrl(filePath, assetType, projectAcronym);

 

filePath

This is the name of the file you want a URL for. This can take two forms:

  • Simplest case, the name of the managed file directly - "styling.css"
  • If the managed file is an archive, you can supply the archive name with the internal path appended - "styling.zip/css/styling.css"
assetType

Either web, server, or design - this is the asset type for the managed file. BPM has utility statics for this: 

  • com_ibm_bpm_coach.assetType_WEB
  • com_ibm_bpm_coach.assetType_SERVER
  • com_ibm_bpm_coach.assetType_DESIGN
projectAcronym

This is the short name for the process app/toolkit where the managed file is stored.

 

By using this method call, you can get a relative URL to the managed file in BPM - eg /teamworks/webasset/2064.146ca826-2525-45e7-bcf5-b68b4c46eadc/J/styling.zip/css/coachstyling.css.

Using the above technique, I created an HTML file containing a <script> that would create a link to the URL for a CSS file in the document's <head> element. I then used the "Managed File" option on a Custom HTML component to load the script into a Coach. This meant that my CSS file was referenced inside the <head> element.
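A minimal sketch of such an HTML file is shown below. The managed file name, internal CSS path, and process app acronym are all placeholders; only the getManagedAssetUrl call and the assetType statics come from the BPM behaviour described above.

<script>
// Build a URL for the managed CSS file (archive form: zip name + internal path).
var cssUrl = com_ibm_bpm_coach.getManagedAssetUrl(
    "styling.zip/css/coachstyling.css",
    com_ibm_bpm_coach.assetType_WEB,
    "MYAPP");
// Create a <link> element and append it to the document's <head>.
var link = document.createElement("link");
link.rel = "stylesheet";
link.type = "text/css";
link.href = cssUrl;
document.getElementsByTagName("head")[0].appendChild(link);
</script>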

You can use this principle to load your own custom CSS files into a Coach. Custom CSS files can be used to override BPM Coach Views or Coaches, or alternatively CSS files can be used to override the MDM Coach Views supplied with MDM V11 onwards.

As an example, I present an updated solution to creating a custom coach view to load CSS tree icons. Rather than producing a custom Coach View to wrap the MDM Tree and load tree icons, the HTML script can be used to the same effect. You simply write an HTML script to link to the URL for the CSS file that references the MDM Tree icons, and insert the HTML as a Custom HTML component on the same Coach as the MDM Tree. The advantage of this solution is that you do not have to create a new Coach View for every set of icons; you just use a new HTML script to link to each CSS file required for the page.

Want to know about MDM Workflow? We have the answers

jaylimburn 2700028UUJ | | Visits (9108)


Rarely do I talk to a customer about MDM without the role of Workflow being discussed within the first 10 minutes. Correct usage of Workflow when interacting with Master Data is important both for an effective Data Stewardship strategy and to ensure that Master Data is served up to lines of business in an organized manner. IBM InfoSphere MDM provides robust Workflow capabilities out of the box that can be utilized across many domains, addressing workflow requirements for master data stewardship and governance as well as enterprise wide consumption.

Recently IBM has published some excellent information to help you understand how you can utilize MDM Workflow to improve your master data quality and enhance your enterprise processes. Check out the links below to understand how MDM Workflow can help you and your business....

 

IBM developerWorks explains the MDM Workflow capabilities of the InfoSphere MDM platform to Enterprise Architects:

InfoSphere MDM for master data governance with MDM workflow

 

IBM DataMag article explains how workflow with MDM can enhance your business processes:

No More Excuses: Improving Business Processes and Decisions



Tags:  data-stewardship data-magazine governance workflow mdm

Deploying an MDM Virtual project with sample data using MDM Workbench

Nic Townsend 2700051ED4 | | Comment (1) | Visits (9055)

This blog post details how to use the template models provided in MDM Workbench to create and deploy a new Virtual Configuration Project to an Operational Server, and then deploy the sample data supplied in the template model.

Creating a new Virtual Configuration Project

  1. Open MDM Workbench.
  2. Click File > New > Other > Master Data Management > Configuration Project.
  3. Click Next.
  4. Enter a name for the new Configuration Project. For this blog post I have named mine "Party".
  5. Click Next.
  6. Select a template to use for the project. For this blog post I selected the Virtual > Party template.
  7. Click Finish. Workbench will create a new project using the name you supplied. It can be viewed in Package Explorer.

Creating a connection to the Operational Server

  1. Switch to the MDM Configuration perspective in Workbench.
  2. Click the Servers tab in the bottom pane of Workbench, right click on the white space in the Servers window and select New > Server.
  3. Select WebSphere Application Server 8.5 as the server type.
  4. Enter the server's hostname/IP address in Server's host name.
  5. Enter an identifiable name for the server in Server name.
  6. Make sure the Server runtime environment is set to WebSphere Application Server v8.5.
  7. Click Next.
  8. If the Operational Server is on the same machine as Workbench, ensure the correct profile name is selected and click Automatically determine connection settings.
  9. If the Operational Server is a remote machine, click Manually provide connection settings.
    1. You need to supply the correct port numbers for the Operational Server.
      1. IPC is the IPC_CONNECTOR_ADDRESS.
      2. RMI is the BOOTSTRAP_ADDRESS.
      3. SOAP is the SOAP_CONNECTOR_ADDRESS.
  10. If security is enabled on the Operational Server, tick Security is enabled on this server.
    1. Enter the username and password to connect to the Operational Server.
  11. Enter the Application server name for the Operational Server.
  12. Click Finish.    

Deploy the new Configuration Project:

  1. At the top of Workbench, click Master Data Management > Deploy Configuration Project.
  2. Select the "Party" project you have just created.
  3. Select the Server you have just defined the connection for.
  4. Leave the check boxes unchanged.
  5. Click Finish and wait for the wizard to deploy the project to the Operational Server.

Processing and loading the sample data:

  1. Inside the Configuration Project there is a DemoData_config folder. This needs to be copied into the work directory on the Operational Server. This will be located under <WAS_install>/profiles/<AppSvr01 or Node01>/installedApps/<Cell name>/MDM-native*.ear/native.war/work/<Workbench Configuration Project Name>/work
  2. In the DemoData_config folder there will be two types of files: *-mpxdata-input.cfg and *.unl. Make a note of all the file names. The "Party" project I created has four files - org-mpxdata-input.cfg, orgdata.unl, per-mpx-data-input.cfg, persondata.unl.
  3. In Workbench, click Master Data Management > New Job Set.
  4. Select the "Party" project
  5. Leave Job Set Template as None
  6. Select the correct Server
  7. Click Next
  8. Click the add button (green plus sign), and select Bulk Tools > Derive Data and Create UNLs (mpxdata)
  9. Click OK.
  10. Under the Input tab, enter DemoData_Config/orgdata.unl in the Input File field. If your Configuration project does not have orgdata.unl, enter a valid *.unl file from the DemoData_config folder. This field will look in the Operational Server's work directory (which is why we copied the folder into the work directory on the Operational Server)
  11. Leave the rest of the tab unchanged.
  12. Click the Input Config tab.
  13. Click the Import From File button. This will open a File Browser window folder structure at the "Party" project root. Open the DemoData_Config folder and select the org-mpxdata-input.cfg file.
  14. Click the Options tab
  15. Select MEMCOMPUTE under Mode. Leave everything else unchanged.
  16. Click Finish. This should start the job running. Select the Jobs tab in Workbench and wait for the job to finish. Once the job finishes, right click on the job and select Run Jobset Again. This will load the Job Configuration window again.
  17. Click the Options tab and select MEMPUT. Make sure that at the bottom of the window, Mem mode is complete, and Put type is insert_update.
  18. Click Finish. Select the Jobs tab in Workbench and wait for the job to finish.
  19. Repeat steps 3-18 for the remaining *.unl with corresponding *-mpxdata-input.cfg files.


Comparing member records:

  1. Once you have finished running the mpxdata jobs,  click Master Data Management > New Job Set.
  2. Click the add button, and select Bulk Tools > Compare Members in Bulk (mpxcomp).
  3. Under entity type, I selected mdmper - this will differ if you did not choose the Party template.
  4. Click the Options tab, at the bottom of the window select Match, link, and search for the Comparison mode.
  5. Click Finish. Select the Jobs tab in Workbench and wait for the job to finish.
  6. Repeat steps 1-5 for the remaining entity types - I only had mdmorg left to do.


Enabling entity linkage:

  1. Once you have finished running the mpxcomp jobs,  click Master Data Management > New Job Set.
  2. Click the add button, and select Bulk Tools > Link Entities (mpxlink).
  3. Under entity type, I selected mdmper - this will differ if you did not choose the Party template.
  4. On the Inputs and Outputs tab check the Generate BXM output option. Leave the rest of the tab unchanged.
  5. Click Finish. Select the Jobs tab in Workbench and wait for the job to finish.
  6. Repeat steps 1-5 for the remaining entity types - I only had mdmorg left to do.

 

Conclusion

After completing these steps, you will have deployed a new Configuration Project to your Operational Server, and you will have populated the server with the sample data from the project template. To confirm this, perform a search against your Operational Server and you should have entities returned.



Tags:  mdm11 virtual initiate techtip sample configuration mdm mds mdm-workbench

Introduce yourself

jtonline 110000B6Y8 | | Comments (5) | Visits (8938)


From an early MDM Workbench news site, the MDM Developers community has evolved and grown to a group of over 200 members, and it would be great to take a break from the usual posts and forum discussions to find out more about some of you with a quick blog interview. Whether you're a new member or a long term contributor, please say hi and tell us a little about yourself.

Feel free to leave a comment and answer any of the following questions that resonate with you, or add your own questions instead. This is just a casual blog interview and meant to be more like a real world conversation, rather than a formal resume or biography!

For fun, and a bit of encouragement, I have a few limited edition MDM Developer community stickers to give away!

[Photo: limited edition MDM Developers community stickers]

Here are a few questions to get you started:

  • How long have you been working with MDM?
  • What did you do before that?
  • What inspires you in your work?
  • What are you passionate about?
  • If you were stuck on a technology deprived island, what single technology could you not live without?
  • What book are you currently reading?
  • What is the biggest surprise you have witnessed in the technology industry?
  • Is there any technology that you think should get more respect and adoption but does not?
  • What is your favorite technology that fizzled or failed to live up to the hype?
  • Any new technologies that you think are about to break into the big time?
  • What future technology would make your life easier?
  • How do you prefer to find answers to your questions?
  • How are you using social networking today?
  • How could you see yourself using it in 5 years?
  • What are some of your favorite websites/feeds/twitter accounts to follow?
  • What publications / websites do you read / visit?
  • Where can we find you on Twitter, LinkedIn, Facebook, and other social spaces?
  • Are you planning to attend any events where you are likely to meet other MDM developers?

Update: Unfortunately no one replied in time to claim the ticket that prompted this blog post. Luckily if you're one of the first to reply, you could still get one of these, much more exclusive, MDM Developers stickers!

image

Photo © Alexander Henning Drachmann (CC BY-SA 2.0)
 



Marcações:  community

MDM AE pMDM with RESTful web services

Dany Drouin 270004VXKT | | Comments (7) | Visits (8902)

Tweet

 

MDM AE pMDM with RESTful web services



New in MDM v11.4 is the ability to submit REST requests to the service controller.
No more complex SOAP client code or, worse, EJB remote calls!


image

Previously, interactions with the MDM Operational Server were possible via EJB/RMI, JMS, JAX-WS, and JAX-RPC (deprecated). We have now added JAX-RS to the mix.

The accepted payload types are application/xml and application/json. JSON support was added in v11.4 FP1.

It is important to note that all REST interactions use a single RESTful service, “MDMWSRESTful”, which accepts only the PUT method and is accessed via the URI http://server:port/com.ibm.mdm.server.ws.restful/resources/MDMWSRESTful.

The same xml request/response payload used for EJB/RMI is used for REST interactions.
ASI (Adaptive Service Interface) can be used in conjunction with the RESTful service. This is particularly useful if a simplified XML/JSON structure is required for interactions, or if the MDM solution must adapt to a certain XML industry standard such as IFW, ACORD, or NIEM.

For the full list of capabilities and supported request headers consult the following documentation link:

https://www-01.ibm.com/support/knowledgecenter/SSWSR9_11.4.0/com.ibm.mdmhs.dev.platform.doc/concepts/c_rest_web_services.html

 

 

Interacting with MDMRESTful service



Using Apache Wink (or any other REST client API of your choice), you can submit a request to the MDM Hub Operational Server with minimal code.

Here’s a sample client leveraging Apache Wink demonstrating an MDM RESTful call:

// Imports needed for this snippet (add at the top of your class):
// import org.apache.wink.client.ClientConfig;
// import org.apache.wink.client.Resource;
// import org.apache.wink.client.RestClient;
// import org.apache.wink.client.handlers.BasicAuthSecurityHandler;

ClientConfig config = new ClientConfig();
// setup basic authentication
BasicAuthSecurityHandler basicAuthHandler = new BasicAuthSecurityHandler();
basicAuthHandler.setUserName("mdmadmin");
basicAuthHandler.setPassword("mdmadmin");
basicAuthHandler.setSSLRequired(false);
config.handlers(basicAuthHandler);

RestClient rc = new RestClient(config);

Resource r =  rc.resource("http://myserver.com:9080/com.ibm.mdm.server.ws.restful/resources/MDMWSRESTful");

// optional MDM headers
r.header("TargetApplication", "");
r.header("RequestType", "");
r.header("Parser", "");
r.header("ResponseType", "");
r.header("Constructor", "");
r.header("OperationType", "");
r.header("ASI_Request", "");
r.header("ASI_Response", "");        
// submit put request of the xml payload.
String response = r.contentType("application/xml")
                   .accept("application/xml")
                   .put(String.class, requestPayload);

The above code submits an MDM XML payload and expects an XML response back.

This is determined by the ‘Content-type’ and ‘Accept’ http header properties.
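To receive JSON instead, only those two header values change on the same Wink Resource call (a small sketch; jsonPayload is assumed to hold the JSON request string shown later):

String jsonResponse = r.contentType("application/json")
                       .accept("application/json")
                       .put(String.class, jsonPayload);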

Here’s a look at a getParty xml payload and response:

Request XML:

<TCRMService xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xmlns="http://www.ibm.com/mdm/schema" 
              xsi:schemaLocation="http://www.ibm.com/mdm/schema MDMDomains.xsd">
    <RequestControl>
        <requestID>100187</requestID>
        <DWLControl>
            <requesterName>cusadmin</requesterName>
            <requesterLocale>en</requesterLocale>           
        </DWLControl>
    </RequestControl>
    <TCRMInquiry>
        <InquiryType>getParty</InquiryType>
        <InquiryParam>
            <tcrmParam name="PartyId">1</tcrmParam>
            <tcrmParam name="PartyType">P</tcrmParam>
            <tcrmParam name="InquiryLevel">1</tcrmParam>
        </InquiryParam>
    </TCRMInquiry>
</TCRMService>

Response XML (portion of the response has been trimmed):

<TCRMService
    xmlns="http://www.ibm.com/mdm/schema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.ibm.com/mdm/schema MDMDomains.xsd">
    <ResponseControl>
        <ResultCode>SUCCESS</ResultCode>
        <ServiceTime>123</ServiceTime>
        <DWLControl>
            <requesterName>mdmadmin</requesterName>
            <requesterLanguage>100</requesterLanguage>
            <requesterLocale>en</requesterLocale>
            <userRole>mdm_admin</userRole>
            <requestID>100187</requestID>
        </DWLControl>
    </ResponseControl>
    <TxResponse>
        <RequestType>getParty</RequestType>
        <TxResult>
            <ResultCode>SUCCESS</ResultCode>
        </TxResult>
        <ResponseObject>
            <TCRMPersonBObj>
                <PartyId>1</PartyId>
                <DisplayName>Jane Doh</DisplayName>
                …
                <DWLStatus>
                    <Status>0</Status>
                </DWLStatus>
            </TCRMPersonBObj>
        </ResponseObject>
    </TxResponse>
</TCRMService>

The same request/response as JSON, using application/json, as both content-type and accept:

Request JSON:

{
             "TCRMService": {
                          "@schemaLocation": "http:\/\/www.ibm.com\/mdm\/schema MDMDomains.xsd",
                          "RequestControl": {
                                      "requestID": 604157,
                                      "DWLControl": {
                                                   "requesterName": "cusadmin",
                                                   "requesterLocale": "en"
                                      }
                          },
                          "TCRMInquiry": {
                                      "InquiryType": "getParty",
                                      "InquiryParam": {
                                                   "tcrmParam": [{
                                                                "@name": "PartyId",
                                                                "$": "1"
                                                   },
                                                   {
                                                                "@name": "PartyType",
                                                                "$": "P"
                                                   },
                                                   {
                                                                "@name": "InquiryLevel",
                                                                "$": "1"
                                                   }]
                                      }
                          }
             }
}

Response JSON (full response is shown):

{
    "TCRMService": {
        "@schemaLocation": "http://www.ibm.com/mdm/schema MDMDomains.xsd",
        "ResponseControl": {
            "ResultCode": "SUCCESS",
            "ServiceTime": "86",
            "DWLControl": {
                "requesterName": "mdmadmin",
                "requesterLanguage": "100",
                "requesterLocale": "en",
                "userRole": "mdm_admin",
                "requestID": "604157"
            }
        },
        "TxResponse": {
            "RequestType": "getParty",
            "TxResult": {
                "ResultCode": "SUCCESS"
            },
            "ResponseObject": {
                "TCRMPersonBObj": {
                    "PartyId": "1",
                    "DisplayName": "Jane Doh",
                    "PreferredLanguageType": "100",
                    "PreferredLanguageValue": "English",
                    "ComputerAccessType": "1",
                    "ComputerAccessValue": "14.4K Baud",
                    "PartyType": "P",
                    "CreatedDate": "2015-09-10 15:16:47.583",
                    "SinceDate": "2015-09-10 00:00:00.0",
                    "StatementFrequencyType": "1",
                    "StatementFrequencyValue": "Annually",
                    "ClientStatusType": "1",
                    "ClientStatusValue": "Active",
                    "AlertIndicator": "N",
                    "SolicitationIndicator": "N",
                    "ConfidentialIndicator": "N",
                    "ClientPotentialType": "1",
                    "ClientPotentialValue": "Client",
                    "ClientImportanceType": "4",
                    "ClientImportanceValue": "Medium",
                    "DoNotDeleteIndicator": "1",
                    "PartyLastUpdateDate": "2015-09-10 15:16:50.138",
                    "PartyLastUpdateUser": "mdmadmin",
                    "PartyLastUpdateTxId": "616244191260573356",
                    "PersonPartyId": "1",
                    "BirthDate": "1966-08-25 00:00:00.0",
                    "BirthPlaceType": "1",
                    "BirthPlaceValue": "Afghanistan",
                    "GenderType": "M",
                    "UserIndicator": "N",
                    "AgeVerifiedWithType": "2",
                    "AgeVerifiedWithValue": "Passport",
                    "HighestEducationType": "5",
                    "HighestEducationValue": "Master Degree",
                    "CitizenshipType": "1",
                    "CitizenshipValue": "Afghanistan",
                    "NumberOfChildren": "3",
                    "MaritalStatusType": "2",
                    "MaritalStatusValue": "Single",
                    "PartyActiveIndicator": "Y",
                    "PersonLastUpdateDate": "2015-09-10 15:16:50.31",
                    "PersonLastUpdateUser": "mdmadmin",
                    "PersonLastUpdateTxId": "616244191260573356",
                    "TCRMPartyAddressBObj": [
                        {
                            "PartyAddressIdPK": "826144191261145264",
                            "PartyId": "1",
                            "AddressId": "828144191261121794",
                            "AddressUsageType": "1",
                            "AddressUsageValue": "Primary Residence",
                            "StartDate": "2001-06-11 00:00:00.0",
                            "AddressGroupLastUpdateDate": "2015-09-10 15:16:51.591",
                            "AddressGroupLastUpdateUser": "mdmadmin",
                            "AddressGroupLastUpdateTxId": "616244191260573356",
                            "LocationGroupLastUpdateDate": "2015-09-10 15:16:51.451",
                            "LocationGroupLastUpdateUser": "mdmadmin",
                            "LocationGroupLastUpdateTxId": "616244191260573356",
                            "TCRMAddressBObj": {
                                "AddressIdPK": "828144191261121794",
                                "ResidenceType": "2",
                                "ResidenceValue": "Detached House",
                                "AddressLineOne": "120 Richmond St",
                                "City": "Toronto",
                                "ZipPostalCode": "M5A 1P4",
                                "ResidenceNumber": "789",
                                "ProvinceStateType": "108",
                                "ProvinceStateValue": "ON",
                                "CountyCode": "1",
                                "CountryType": "31",
                                "CountryValue": "Canada",
                                "LatitudeDegrees": "180",
                                "LongitudeDegrees": "90",
                                "AddressLastUpdateDate": "2015-09-10 15:16:51.216",
                                "AddressLastUpdateUser": "mdmadmin",
                                "AddressLastUpdateTxId": "616244191260573356",
                                "DWLStatus": {
                                    "Status": "0"
                                }
                            },
                            "DWLStatus": {
                                "Status": "0"
                            }
                        }
                    ],
                    "TCRMPartyIdentificationBObj": [
                        {
                            "IdentificationIdPK": "826744191261084265",
                            "PartyId": "1",
                            "IdentificationType": "1",
                            "IdentificationValue": "Social Security Number",
                            "IdentificationNumber": "1245",
                            "IdentificationStatusType": "2",
                            "IdentificationStatusValue": "Active",
                            "IdentificationExpiryDate": "2005-08-11 23:59:59.0",
                            "StartDate": "2002-02-02 00:00:00.0",
                            "PartyIdentificationLastUpdateDate": "2015-09-10 15:16:50.841",
                            "PartyIdentificationLastUpdateUser": "mdmadmin",
                            "PartyIdentificationLastUpdateTxId": "616244191260573356",
                            "DWLStatus": {
                                "Status": "0"
                            }
                        }
                    ],
                    "TCRMPersonNameBObj": [
                        {
                            "PersonNameIdPK": "822844191261056191",
                            "NameUsageType": "1",
                            "NameUsageValue": "Legal",
                            "PrefixType": "12",
                            "PrefixValue": "Miss",
                            "GivenNameOne": "Jane",
                            "StdGivenNameOne": "JANE",
                            "LastName": "Doh",
                            "StdLastName": "DOH",
                            "PersonPartyId": "1",
                            "StartDate": "2002-02-02 00:00:00.0",
                            "PersonNameLastUpdateDate": "2015-09-10 15:16:50.56",
                            "PersonNameLastUpdateUser": "mdmadmin",
                            "PersonNameLastUpdateTxId": "616244191260573356",
                            "LastUpdatedBy": "mdmadmin",
                            "LastUpdatedDate": "2015-09-10 15:16:50.56",
                            "DWLStatus": {
                                "Status": "0"
                            }
                        }
                    ],
                    "DWLStatus": {
                        "Status": "0"
                    }
                }
            }
        }
    }
}


How does MDM handle the JSON requests/responses?


 

The default MDM JSON model is actually based on the core XML schema model (MDMCommon.xsd and MDMDomains.xsd).  Internally, MDM will validate the JSON using these schemas. 

We use a “mapped notation” API to build the JSON. A couple of things to note about this implementation:

  • XML elements that contain nested elements are mapped to JSON objects (e.g. <TCRMPersonBObj> becomes "TCRMPersonBObj": { … })
  • XML element values are mapped to JSON name/value pairs (e.g. <GivenNameOne>Jane</GivenNameOne> becomes "GivenNameOne": "Jane")
  • XML element attributes become JSON name/value pairs within the JSON object (e.g. <tcrmParam name="PartyId">1</tcrmParam> becomes "tcrmParam": { "@name": "PartyId", "$": "1" }). Note that the attribute name is prefixed with “@”, and the XML element value is mapped to a name/value pair using the name “$”. The “@” and “$” characters are used to distinguish the attribute value from the element’s value.
  • XML array elements: an XML document that contains multiple nested elements of the same name is automatically mapped to a JSON array.  (e.g.
    <TCRMPersonNameBObj>
       <PersonNameIdPK>822844191261056191</PersonNameIdPK>
       <NameUsageType>1</NameUsageType>
    ...
    </TCRMPersonNameBObj>
    <TCRMPersonNameBObj>
       <PersonNameIdPK>821544198626606646</PersonNameIdPK>
       <NameUsageType>2</NameUsageType>
    ...
    </TCRMPersonNameBObj> will be

"TCRMPersonNameBObj": [ {
    "PersonNameIdPK": "822844191261056191",
    "NameUsageType": "1",
  ...
},
{
    "PersonNameIdPK": "821544198626606646",
    "NameUsageType": "2",
  ...
}]
)

There are instances where we need to force the return of a JSON array, for example list-type elements in MDM or results that include multiple elements. To control this behavior, you can list the elements that must be forced into JSON arrays using the Configuration Management element “/IBM/DWLCommonServices/Restful/JsonResponse/ArrayObjects” (a comma-separated value list).
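For illustration only, the value is a plain comma-separated list of element names; the names below are taken from the sample response above, and you would list whichever elements your solution needs returned as arrays:

/IBM/DWLCommonServices/Restful/JsonResponse/ArrayObjects = TCRMPartyAddressBObj,TCRMPartyIdentificationBObj,TCRMPersonNameBObj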

 

Don’t want to write any code to test your MDM services?



Using Firefox/Chrome POSTMAN REST client extension, you can submit your requests directly to the backend server securely.
image


  • Enter the REST URL: http://localhost:9080/com.ibm.mdm.server.ws.restful/resources/MDMWSRESTful
  • Choose “PUT” as the HTTP method
  • Set the Content-Type and Accept headers to either application/xml or application/json
  • Set authorization by supplying credentials on the Basic Auth tab


Another great tool is SoapUI, which can be used to fully automate your functional and regression testing by submitting REST requests. For more info on this tool visit http://www.soapui.org/.


Another option is cURL (http://curl.haxx.se/), a command line tool for transferring data using URL syntax.

image

curl --user "mdmadmin:mdmadmin" -X PUT \
            -H "Content-Type: application/xml" \
            -H "Accept: application/xml" \
            -d @getParty.xml "http://localhost:9080/com.ibm.mdm.server.ws.restful/resources/MDMWSRESTful"
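The JSON variant follows the same pattern; getParty.json here is assumed to be a file containing the JSON request shown earlier:

curl --user "mdmadmin:mdmadmin" -X PUT \
            -H "Content-Type: application/json" \
            -H "Accept: application/json" \
            -d @getParty.json "http://localhost:9080/com.ibm.mdm.server.ws.restful/resources/MDMWSRESTful"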

 

 



Marcações:  jax-rs pmdm webservices restful mdm rest

Developing behavior extensions for InfoSphere MDM v11

Dmitry Drinfeld 270005DFVF | | Comment (1) | Visits (8576)

Tweet

Developing behavior extensions for InfoSphere MDM v11

Special thanks to Stephanie Hazlewood for providing guidance as well as content for some of the sections of this article!

Executive Summary

Many established organizations end up having unmanaged master data. It may be the result of mergers and acquisitions, or of the independent maintenance of information repositories siloed by line of business (LOB). In either situation the result is the same: useful information that could be shared and consistently maintained is not. Unmanaged master data leads to data inconsistency and inaccuracy. IBM Master Data Management (MDM), specifically the physical MDM set of capabilities, allows the enterprise to create a single, trusted record for a party. Similar capabilities exist for mastering product and account information. It can be integrated with content management systems and can also support a co-existence style of master data management, sometimes referred to as hybrid MDM, where linkages between matched records mastered and maintained in various source systems are created in virtual MDM and then persisted in physical MDM.

One of the most fundamental extension mechanisms of InfoSphere MDM allows for the modification of service behavior. These extensions are commonly referred to as behavior extensions, and the flexibility they provide allows an organization to add its own “secret sauce” to the over 700 business services provided out of the box with InfoSphere MDM. The purpose of this tutorial is to introduce you to behavior extensions and guide you through the implementation, testing, packaging, and deployment of these extensions. You will be introduced to the Open Service Gateway initiative (OSGi)-based extension approach introduced in the InfoSphere MDM Workbench as of version 11.

 

Introduction

With the release of InfoSphere MDM v11, we adopt the OSGi specification, which allows, amongst many other things, extensions to be deployed in a more flexible and modular way. This document describes a real client behavior extension scenario and steps you through all of the following required steps:

-        Extension scenario outline.

-        Creation of the extension project.

-        Development of the extension code.

-        Deployment of the extension onto the MDM server.

-        Testing deployed code using remote debugging.

We will then conclude this document with the summary of what you have learned.

 

Extension scenario

It is often necessary to customize an MDM implementation in order to meet your solution requirements. One of the extension capabilities InfoSphere MDM provides is the ability to add business rules or logic to a particular out-of-the-box service. These types of extensions are referred to as behavior extensions, as they ultimately change the behavior of a service. In this tutorial we will create a behavior extension to the searchPerson transaction.

The searchPerson transaction is used to retrieve information about a person when provided with a set of search criteria. You can filter the result set to active, inactive, or all records retrieved by these criteria. It is important to note that this particular search transaction uses exact match and wildcard characters to retrieve the result set. There are separate APIs available for probabilistic searching; this service is not one of them.

Sometimes the searchPerson transaction response may contain duplicate parties. For example, if a party has identical legal and business names and the searchPerson transaction uses last name as a criterion, the parent object will be returned twice in the response because it is matched by both names. While this behavior is acceptable in some circumstances, other cases require more filtering before the response is returned. To do so, we will create a behavior extension that processes the transaction output and removes any duplicate records from the result set. The InfoSphere MDM Workbench provides exactly the right tools to quickly create and deploy such an extension.

 

Creating extension project

First, create the extension project structure using the wizards provided by MDM Workbench. Go to File -> New -> Other… and search for Development Project wizard:

image

If you cannot find the Development Project wizard in the list, chances are the Workbench has not been installed; verify the installation using IBM Installation Manager.

When creating your project, make sure to specify unique project and package names in order to avoid conflicts with existing ones:

image

Make sure to choose the correct server runtime for your projects, as well as a unique name for the CBA project:

image

Note: You are allowed to choose from the existing CBAs. A single CBA can contain multiple development project bundles.

Click Finish and wait for the wizard to generate the required assets.

At this point, what we have is a skeletal InfoSphere MDM Development project that contains all of the basic facilities to help us create the desired extension. The next step is to create the extension assets and there are two ways of doing so: either by using the behavior extension wizard, or by using the model editor.

Creating a behavior extension using the extension wizard

You can create an extension using a wizard in the MDM Workbench, much like the one used to create a development project:

1.   Open Behavior Extension wizard by going to File -> New -> Other… -> Behavior Extension, located under Master Data Management -> Extension folders

image

2.   Once in the wizard, select the development project to place the extension under:  image  

Note: A development project can contain multiple extensions of various types underneath it. You might choose to use development projects to logically group extensions that have a similar purpose or type, or to facilitate parallel development activities.

3.   In the next window, choose a name and a description for your behavior extension, and a Java class name for the extension. This is the class that we will populate with custom logic in order to achieve the desired behavior. Alternatively, if you need to use an IBM Operational Decision Manager (ODM, previously known as ILOG) rule, specify the associated parameter. ILOG/ODM rule creation is not covered as part of this tutorial, as we will implement the extension as a Java class.

image

4.     Within the “Specify details of the trigger” pane, you need to specify the following parameters:

a. Trigger type:

                                    i.  ‘Action’ will cause the behavior extension to trigger whenever the chosen transaction is run, either by itself or as part of another transaction. ‘Actions’ are executed at the component level.

                                    ii.  On the other hand, if you are looking to trigger the extension only on a specific standalone transaction event (otherwise known as a controller-level transaction), select the ‘Transaction’ trigger type.

                                    iii. The ‘Action Category’ trigger type executes the behavior extension on various data actions (add, update, view, or all) for extensions executed at the component level.

                                    iv. The ‘Transaction Category’ trigger type will kick off the behavior extension when a transaction of the specified category is executed, namely inquiry, persistence, or all.

          b. When to trigger:

                                    i. ‘Trigger before’ will cause the behavior extension to fire before the work of the transaction is carried out. Sometimes you will hear this referred to as a preExecute extension. It is typically used when some sort of preparation has to be performed before the rest of the transaction is carried out, for example preparing data within the business object that is being persisted.

                                    ii. ‘Trigger after’ will cause the behavior extension to run after the transaction work has been carried out. Sometimes you will hear this referred to as a postExecute extension. It is typically used in scenarios where the logic implemented in the behavior extension depends on the result of the transaction. Normally, any sort of asynchronous notification would be placed in a post behavior extension, because there would be no way to roll it back on transaction failure if it were sent before the transaction executed.

          c.  The ‘Priority’ parameter indicates the order in which this behavior extension will be triggered. The lower the priority number, the higher the priority: a behavior extension with priority 1 executes first, followed by behavior extensions with priority 2, 3, or 4, in that order.

In our scenario we are looking to filter the response of a specific transaction, namely searchPerson. Therefore we set the trigger type to ‘Transaction’ with a value of searchPerson. Since we are filtering the response of the transaction, we have to trigger our behavior extension after the transaction has run and its response has become available. Lastly, in this particular example priority does not play a special role, so we leave it at the default of ‘1’.

image

5.   After the above configuration is done, click Next and review the chosen parameters. Note that there is a checkbox at the top of the dialog, allowing you to generate the code based on the specified parameters immediately. For the purposes of this tutorial leave it checked and click Finish.  The workbench will generate all of the required assets for you.

 

Creating a behavior extension using the model editor

If you have used the wizard approach above to create the behavior extension already, feel free to skip ahead to the section titled “Review your generated extension code” that follows.

This section describes how to generate a behavior extension using the model editor.  To do so, the following steps will guide you through this process:

1.   Go to the development project you created earlier and open the module.mdmxmi file under the root folder of the project. Select the model tab within the opened view.

2.   Right click PartySearchBehaviorExtension folder, then select New -> Behavior Extension:

image

 

3.   Name the newly created behavior extension ‘PartySearchDuplicateFilter’ and provide appropriate documentation:
image

4.   Now we will create a transaction event definition under the behavior extension. Right-click the behavior extension, then select New -> Transaction Event.

 

5.   Once the transaction event has been created, specify the appropriate properties:

 

a.   Because this event is triggered on the searchPerson transaction, PersonSearchEvent is an appropriate name.

b.   The ‘Pre’ checkbox stands for preExecute, meaning the behavior extension would get executed before the rest of the transaction (recall that this is sometimes called a “trigger before” extension). Since we want to filter the response after the transaction runs, leave it unchecked. As in the wizard configuration, leave the priority as ‘1’, since the order of execution does not affect this behavior extension.

c.   Finally, select searchPerson as the transaction of choice by clicking Edit… -> Party -> CoreParty -> searchPerson.

image

After all of the above configuration is done and reviewed, click Generate Code under the Model Actions section of the view to tell the workbench to generate the configured assets.

 

Review your generated extension code

Whichever of the above methods you used, let us review the generated assets:

  • Under bestpractices.demo.behavior, we find PersonSearchDuplicateFilter, the behavior extension Java class that, once configured, will be executed at runtime. Shortly, we will provide the implementation that filters duplicate person objects out of the searchPerson response.
  • In OSGI-INF/blueprint/blueprint-generated.xml we can see the OSGi service definition, listing the bestpractices.demo.behaviour.PersonSearchDuplicateFilter class as the extension service.
  • Finally, in the resources/sql/<db_type>/PartySearchBehaviorExtesion_MetaData_DB2.sql file, we find the configuration we specified for the behavior extension execution:

o    The EXTENSIONSET table record defines the behavior extension, its associated class bestpractices.demo.behaviour.PersonSearchDuplicateFilter, and its priority of ‘1’:

image

 

o    CDCONDITIONVALTP defines a new condition of transaction name being equal to searchPerson.

o    EXTSETCONDVAL connects the above CDCONDITIONVALTP record to the behavior extension record from EXTENSIONSET. Additionally, another EXTSETCONDVAL record connects the CDCONDITIONVALTP record with id ‘9’, which stands for executing the behavior extension after the transaction.

 

Let us now move on to developing the extension code required to filter out duplicate person records from the result set returned by the searchPerson transaction.

 

Develop the extension code

The behavior extension skeleton and supporting configuration assets have now been generated. You add your custom logic, or behavior change, in the execute method of the PersonSearchDuplicateFilter class. The objective is simple: go through all of the party search result objects returned by the service and, if the same person object appears multiple times in the result vector, remove the duplicates. The following code achieves just that:

 

    public void execute(ExtensionParameters params)
    {
        // Only work with vectors in the response
        if (params.getWorkingObjectHierarchy() instanceof Vector)
        {
            // Get the response object hierarchy
            Vector partySearchResultList =
                (Vector) params.getWorkingObjectHierarchy();

            // Iterate through the party search result
            // objects to find duplicates
            Iterator listIterator = partySearchResultList.iterator();

            // We will keep the party ids of objects we've already
            // processed to identify the duplicates
            Vector partyIdList = new Vector();
            while (listIterator.hasNext())
            {
                Object o = listIterator.next();
                if (o instanceof TCRMPersonSearchResultBObj)
                {
                    TCRMPersonSearchResultBObj personSearchResultBObj =
                        (TCRMPersonSearchResultBObj) o;
                    String partyId = personSearchResultBObj.getPartyId();

                    // If the party id has not been seen yet, this person
                    // object is not a duplicate, otherwise - remove it from
                    // the response
                    if (partyIdList.contains(partyId))
                        listIterator.remove();
                    else
                        partyIdList.add(partyId);
                }
            }
        }

        System.out.println("PartySearchBehaviorExtension has finished executing.");
    }
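As a side note on the design (a variation, not the generated code): the duplicate check above performs a linear Vector.contains scan for every result. If result sets can be large, a java.util.HashSet gives constant-time lookups, and Set.add conveniently returns false when the id has already been seen, so the same loop body could collapse to:

Set<String> seenPartyIds = new HashSet<String>();
...
String partyId = personSearchResultBObj.getPartyId();
// add() returns false if the id was already present, i.e. this result is a duplicate
if (!seenPartyIds.add(partyId))
    listIterator.remove();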

 

Note: The above implementation is not pagination friendly and pagination will not be covered as a part of this tutorial.

 

Once you try to compile the code above, you will notice that some of the classes are not found and have to be imported. You cannot simply import TCRMPersonSearchResultBObj, because the package containing this business object is not imported by the bundle we’re working with. To fix that, open BundleContent -> META-INF -> MANIFEST.MF and go to the Dependencies tab:

image

Add com.dwl.tcrm.coreParty.component to the list of imported packages. You should now be able to import TCRMPersonSearchResultBObj in the behavior extension class and resolve the dependency error.
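For reference, after the change the bundle’s MANIFEST.MF will list the package in its Import-Package header, roughly like the line below (other imported packages and any version constraints are omitted from this sketch):

Import-Package: com.dwl.tcrm.coreParty.component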

After recompiling the projects again, you will notice that the PartySearchBehaviorExtensionCBA now contains an error:

image

This error occurs because the composite bundle that contains the PartySearchBehaviorExtension bundle does not import com.dwl.tcrm.coreParty.component. To resolve the error, go to the manifest file of the CBA, which should be located under the root folder of the project, and add com.dwl.tcrm.coreParty.component to its list of imported packages. Once you recompile, the error should be resolved.

Now that all compilation problems have been resolved, we are ready to deploy our extension onto the server.

 

Deploying your new behavior extension to MDM

Once the implementation of the behavior extension has been developed, we are ready to deploy it onto the server. There are two steps involved in the deployment:

-        Deploying code to the server.

-        Executing generated SQLs to insert required metadata.

 

Deploying code to the server

Our customized behavior extension can be deployed to the server as a Composite Bundle Archive (CBA) as follows:

1.   Make sure that the customized code has been built and then export the CBA containing the behavior extension by right clicking the CBA project and selecting ‘Export… -> OSGi Composite Bundle (CBA)’.

2.   In the opened view, select PartySearchBehaviorExtension as the bundle to include. Click ‘Browse…’ and navigate to a selected export location and click ‘Save’. If you do not explicitly provide the file name, the wizard will generate the appropriate name automatically.

3.   Click ‘Finish’. The CBA containing the behavior extension has now been exported to selected location.

4.   At this point, we assume that the MDM instance is up and running. Open the WebSphere Administrative Console; we need to import our new CBA into the internal bundle repository. To do so, go to Environment -> OSGi bundle repositories -> Internal bundle repository. In the view that opens, click New…, choose Local file system, and specify the location of the CBA we exported above. Save your progress.

5.   Once the CBA has been imported, attach the new bundle to the MDM application. Go to Applications -> Application Types -> Business-level applications. Choose the MDM application from the view that opens, then open the MDM .eba file.

6.   We are now looking at the properties of the MDM Enterprise Bundle Archive (EBA). In order to attach our CBA, go to Additional Properties section and select Extensions for this composition unit. 

7.   If this is the first extension that you’ve deployed on your instance, the list of attached extensions will be empty. Let’s now click Add…, and check the CBA we’ve imported above, then click Add. Wait for the addition to complete. Save your changes.

8.   You may think that we are done here, but not quite. We’ve only updated the definition of the EBA deployment by adding our extension. The MDM OSGi application itself has not been updated, and even if you restart the server, your new behavior extension will not be picked up. You must update the MDM application to the latest deployment by returning to the EBA properties view.

image

Before we attached our extension, the button shown above was grayed out and the comment stated that the application was up to date. Since we’ve updated the application with a new extension bundle, we need to update it to the latest deployment. Go ahead and click the Update to latest deployment… button.

9.   In the next view, you can see that the PartySearchBehaviorExtensionCBA that we’ve attached to the MDM EBA, will now be deployed:

image

At this point, scroll down and click Ok to proceed. It may take several minutes depending on your system hardware.

10.  WebSphere will now take you through three views offering summaries of the deployments and several customization options. There is no need to customize anything; click Next three times, followed by Finish. At this point the application will update; allow 5 to 10 minutes to complete depending on the underlying hardware. Once it is complete, save your changes. The MDM application has now been updated to the latest deployment, which includes our extension.

Now we need to deploy our custom metadata to the database.  This metadata will govern the behavior of our extension in ways discussed above.

 

Deploy metadata onto the MDM database

As mentioned earlier in this tutorial, the Workbench generates database scripts that insert the required configuration into the metadata tables of the MDM repository. This metadata is generated from the parameters we provided for our behavior extension in the Creating extension project section. To deploy this metadata to the database, run the database scripts under the resources -> sql folder that are appropriate for your database type. Conversely, if you need to remove the extension from the server, run the rollback scripts provided in the same folder.
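For example, on DB2 the generated script can be run from the DB2 command line processor. In this sketch the database name MDMDB and the credentials are placeholders for your own environment, and the script is the one generated for this tutorial's project:

db2 connect to MDMDB user mdmadmin using <password>
db2 -tvf PartySearchBehaviorExtesion_MetaData_DB2.sql
db2 connect reset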

Note: If some portion of the script fails, investigate the error, because it may render the extension useless. Potential reasons for an error include residual data from a previous extension (rollback was not run when the extension was removed), an incorrect database schema, and so on.

Once the scripts have run successfully, your behavior extension has been deployed. Restart your WebSphere server so that the new metadata is picked up the next time the application runs.

 

Testing deployed code using remote debugging

Now that all aspects of the behavior extension have been deployed, we are ready to test it! To do that, run a searchPerson transaction. You must have at least one person in the database so that the search yields a successful result and triggers your new extension. This test shows that the extension has been deployed successfully. Once the transaction returns successfully, open the SystemOut.log of the WebSphere server, located under the logs folder of the WebSphere profile where the MDM application is deployed. If the extension has been deployed correctly, the following line in our custom code:

System.out.println("PartySearchBehaviorExtension has finished executing.");

will produce this message in the logs:

[6/17/14 13:24:59:816 EDT] 000001b3 SystemOut     O PartySearchBehaviorExtension has finished executing.

Note: The log message is there for testing purposes only and, depending on how often the behavior extension runs, can significantly impede performance. For that reason, make sure to remove such debugging messages or move them to a fine logging level before going into production, for example:

logger.finest("PartySearchBehaviorExtension has finished executing.");
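The logger referenced here is not part of the generated skeleton shown earlier; as a minimal sketch using standard java.util.logging (or substitute whatever logging facility the rest of your extension code uses), it could be declared in the extension class like this:

private static final java.util.logging.Logger logger =
        java.util.logging.Logger.getLogger(PersonSearchDuplicateFilter.class.getName());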

 

Configuring WebSphere Application Server debug mode

To observe the behavior of our extension more closely, put the WebSphere server into debug mode and connect the MDM Workbench to that server in order to step through our code. To put your server in debug mode:

1.   Go to WebSphere Application Server administrative console, and navigate to Servers -> Server Types -> WebSphere application server -> <Name of your instance>.

2.   Once in the server configuration view, take a look at Server Infrastructure section and navigate to Java and Process Management -> Process definition.

3.   In the Additional Properties section, select Java Virtual Machine.

4.   Once we are in the Java Virtual Machine view, navigate down to the Debug Mode checkbox, check it and provide the following settings in the Debug arguments textbox:

-Dcom.ibm.ws.classloader.j9enabled=true -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=7777

Note that ‘7777’ is the debug port to which the MDM Workbench will connect. Make sure this port does not conflict with any other assigned ports on the server, and set it accordingly.

5.   Save the configuration and restart your server. It is now running in debug mode. Note: If you later observe unexpected performance degradation and no longer require debug mode, take the server out of debug mode using the same steps.

 

Configuring the MDM Workbench for remote debugging

Once the server is running in debug mode, we can go back to the MDM Workbench and configure it for debugging:

1.   In MDM Workbench, go to Run -> Debug Configurations.

2.   Within the Debug Configurations window, double click Remote Java Application. This will create a new Remote Java Application profile.

image

3.   When configuring the Remote Java Application, name the configuration ‘MDM Local Instance Debug’. The Project setting does not play a role; you may leave it empty or keep the default value. Connection Type should remain ‘Standard (Socket Attach)’. Lastly, Connection Properties should reflect the host of the MDM instance and the debug port we chose above.

image

          We will not cover other tabs because the configuration we’ve done so far is sufficient.

4.   Once configuration is complete, hit Apply followed by Debug in order to attach to the MDM instance. The attach process may take a little bit of time depending on the environment. Once it is complete, go to the Debug perspective of your environment. In the debug view, you should observe the connected MDM instance if the attach was successful:

image

You can see above that the instance is available along with all of the threads.

5.   Finally, set a breakpoint at the beginning of the behavior extension’s execute method and observe the breakpoint being hit when you run a searchPerson transaction:

image

 

6.   If multiple TCRMPersonSearchResultBObj objects come back in the response, step through and observe the duplicates being removed.

As a last point, note that we can debug both local and remote instances as described above, using Eclipse’s Remote Java Application debug capabilities. 

 

Conclusion

In this tutorial we’ve gone through the steps of creating, configuring, deploying and testing a basic yet realistic behavior extension scenario for InfoSphere MDM.

We’ve covered two ways in which an extension template can be created: while the wizard option is straightforward and is preferable for a novice or a simple extension scenario, the model editor allows for more flexibility.

We’ve taken a look at the various configurations that apply to a behavior extension and outlined their effects on its execution. Additionally, we’ve covered the assets that get generated as a result of the configuration.

For the development step, we’ve created and analyzed the implementation of our behavior extension.

And finally, we’ve deployed, tested and debugged our behavior extension to make sure it performs as expected.

All of the above steps constitute a complete development process of an MDM Server behavior extension.

 

Related materials

InfoSphere Master Data Management operational server v11.x OSGi best practices and troubleshooting guide

How to deploy Composite Business Archives (CBA) to WebSphere



Marcações:  mdm-server mdm preview mdm-workbench osgi article behavior-extension mdm11

InfoSphere Master Data Management operational server v11.x OSGi best practices and troubleshooting guide

Dany Drouin 270004VXKT | | Visits (8363)

Tweet

InfoSphere Master Data Management operational server v11.x OSGi best practices and troubleshooting guide

 

Note: This preview only covers the initial set up of the MDM Workbench. The full developerWorks article has now been published and contains these additional topics:

  • Bundle project build path
  • Exporting services from composite bundles
  • Deploying a CBA composition unit extension
  • Class loaders explained… cyclical dependencies
  • Troubleshooting

 

Introduction

 

The goal of this article is to show best practices for developing with the InfoSphere MDM operational server. We will discuss common OSGi patterns and troubleshooting, including failures and their resolution, as well as how to best deploy MDM composite bundle (CBA) extensions.

 

The InfoSphere MDM v11 operational server is based on an enterprise OSGi architecture, which is modular in nature. The benefits of a modular application design include reduced complexity, reduced time to delivery, and increased serviceability. The Java EE infrastructure leveraged in previous versions of InfoSphere MDM had limited ability to enforce or encourage a modular design.

The advantage of a modular MDM application is that customizations can be deployed without altering the core MDM application. Instead, customizations are attached to the core MDM application as extensions, using composite bundles (CBA files).

image

 

Optimal workspace operational server configurations

 

The InfoSphere MDM Workbench is a tool that supports development of customizations and extensions to MDM operational server. The MDM Workbench allows you to define the desired data model and transactions and then generates the code required to implement the MDM Server extensions.

When using workbench to build and deploy your MDM customizations and extensions, there are a few workspace configurations to consider for achieving the best performance and development experience.

 

  1. Workbench workspace server definition settings

    Publishing CBAs from the MDM Workbench using the "Run server with resources within the workspace" option, sometimes called the "loose config" option, is not recommended because it can cause the workspace and the server deployment to get out of sync. Use of this option often results in CBA deployment errors where the MDM EBA application fails to start. To recover, you must manually remove the CBA deployment using the WebSphere Application Server administrative console to return the application to its original state.

    The optimal configuration is to use the "Run server with resources on Server" publishing option. Although this option will be slower, it is more stable. This option is a true reflection of a deployment to the production environment because the CBA is physically built, packaged, and deployed in the internal bundle repository of WebSphere Application Server.

    The second server definition option is to avoid publishing changes to the server automatically. Since publishing can be a time-consuming operation, we want to avoid doing it for every small change we make in the workspace. Publishing only when we have accumulated sufficient changes is ideal.

    Double-click the server (or right-click and select Open) to open the server’s definition.

    image

    image
    Note: Un-deploy any previously deployed CBA assets from the server prior to switching the publishing settings.


    Finally, we recommend leaving “Start server with a generated script” unchecked to allow proper MDM logging to occur.
    image

     

  2. Run server in debug mode

    Avoid re-publishing the CBA and bundle changes to the server for simple Java class changes. Instead, run the server in debug mode and leverage the hot swap of Java code (also known as “Hot Method Replace”). When the application server is running in debug mode, hot method replace allows most application code changes to be picked up automatically without requiring a republish, an application restart, or a server restart. There are cases where hot method replace is not enough: application structure changes such as OSGi blueprint changes, bundle manifest changes, CBA manifest changes, and code refactoring still require a republish.
     
  3. Remove legacy applications and services

    If your MDM implementation doesn’t require the legacy UIs and/or the legacy JAX-RPC web services, you can uninstall these components from the application server. This improves the startup performance of the application server and also reduces memory consumption.

    The MDM JAX-RPC web services are deployed as a CBA extension. Note that JAX-RPC web services were deprecated in InfoSphere MDM v10 and replaced by the JAX-WS specification. It is recommended to migrate your existing JAX-RPC web services implementation to the new specification, but this may not always be possible.

    Uninstalling JAX-RPC web services is as simple as removing the CBA extension from the composition unit of the InfoSphere MDM operational server. 

    1. In WebSphere Application Server Administrative Console, Navigate to Applications --> Application Types --> Business Level Applications, and select the MDM-Operational-server-EBA-E001. Then select com.ibm.mdm.hub.server.app_E001-0001.eba. See below:

    image

    2. Select ‘Extensions for this composition unit’.
    3. Select the ‘com.ibm.mdm.server.jaxrpcws.cba’ CBA currently attached to the EBA and click Remove.
    4. Save changes to master configuration file.
    5. Return to the previous screen and click ‘Update to latest deployment…’.
    6. Save changes to master configuration file.

     
  4. Apply WebSphere interim fixes for performance improvements

    See the following InfoSphere MDM technote for more info.
    http://www-01.ibm.com/support/docview.wss?uid=swg21651235

 

 

Read more...



Marcações:  classloader mdm11 cba best-practices mdm-workbench osgi troubleshooting mdmserver error deployment mdm

MDM Workbench v11 is here !

Mike Cobbett 11000061J8 | | Visits (8177)

Tweet

 

MDM Workbench v11 is here !
 
With huge pride, we have just shipped version eleven of MDM, including the new unified MDM Workbench v11. It has been nearly two years in the making and represents the biggest change to the MDM tooling in recent years. In this article we outline these changes and give readers familiar with previous MDM tools a gentle introduction to what they can expect when they get their hands on the new tools.
 
The main changes made for v11 workbench can broadly be categorized under the titles unification, simplification and integration.
 
Unification
Unification is a drive to combine the tools from the v10.1 standard edition (formerly the Initiate tools) and the tools from the v10.1 advanced edition into a single set of tools that run in the same Rational Application Developer (RAD) environment. Where the tools were inconsistent or overlapped, we adopted a common approach to make sure both sets of tools work together in a unified tooling environment.

 

In Summary: 
  • All tools run on a common tooling platform (Rational Application Developer 8.5.1 or Rational Software Architect for WebSphere v8.5.1)
  • A unified product installer, based on Installation Manager offers a common approach to installing workbench, server and all pre-requisite software, with "typical" and custom install options, and a launchpad to get things started.
  • Adoption of some common terminology, changing terms where necessary to avoid ambiguity where the same term meant different things in the "physical" and "virtual" arenas.
  • Handlers, additions, extensions, mappings, transforms, and any Java logic are all deployed using the same "cba" packaging and deployment mechanism.
  • The "servers" view in RAD is used by everything which communicates with a  server rather than our own list of servers
  • A combined project migration wizard for development projects, configuration projects and other project types.
  • Consistent use of eclipse perspectives (MDM Configuration and MDM Development for example)
  • Support of the "hybrid" capabilities of the MDM operational server which unify the server by having the tools create and customize a pre-built mapping, to create the transform which converts data from "virtual" to the "physical" parts of the MDM operational server.
An example of unification: Consistent use of the perspectives, showing the new MDM development perspective.

image

 

Simplification 
We want all the tools to be simpler. We aim to cut the time it takes to get value out of the MDM platform, automating where possible to relieve solution developers of repetitive tasks and reducing the amount of knowledge needed to get something working.

 

Toward this goal the workbench has made  these changes :
  • Easier set-up of the workspace: there is no longer a need to import an .EAR file into your workspace, and use of the Development Environment Setup Tool is no longer needed. Just create a fresh workspace and off you go!
  • Adoption of OSGi modularity in the server means workbench users don't mix content from IBM and content they create in the workspace. All IBM-provided content stays out of the RAD workspace.
  • Fewer projects in the workspace, containing only user-created content, mean the tools run faster and are more responsive, and less time is spent waiting for builders or wizards to complete.
  • There is no need to turn off auto-build options, which means users can realize all the benefits of the RAD IDE validation framework and built artifacts stay in-sync with their source artifacts.
  • Added tools to create a handler. A new wizard generates handler skeleton code complete with all the Java and OSGi packaging required to deploy the handler project.
  • Removed the need to use the Probabilistic Matching Engine (PME) console web application. All of the function this separate web application contained to configure the matching engine is now present in the workbench (including local weight generation), so there is no need to install this separate web tool or learn how to use it.
  • A multitude of UI improvements so that dialogs ask fewer questions and processes have fewer steps.
  • The ability to reset a test server and database instance back to what it was before you started deploying customizations to it, so developers can quickly return to a "clean starting state" when required.

An example of the way version 11 is simpler can be seen by comparing a version 10.1 workspace against a version 11 workspace:

image

 

 

Integrations
"No man is an island" as the saying goes, and the same is true for products. MDM tools now play a wider role in enabling the ingestion and distribution of information in an MDM solution.
Enhancements in this area include: 
  • The custom web services mapping tools are enhanced to understand the format of comma-delimited, pipe-delimited or other non-XML formats, using the Data Format Definition Language (DFDL) standard. A transform from that format to the MDM transaction payload formats can be defined, and the MDM batch processor can then be used to ingest files in the non-XML format. The same feature now also supports mapping to XML spec attributes where they are used in the operational server.
  • A new wizard has been added to facilitate the population of metadata information in Information Server, so InfoSphere Metadata Workbench can be used to perform lineage analysis, with an understanding of the MDM transactions, the transaction payload structures, the physical database tables used in the server, and how each of these pieces of information relate to each other.
  • A new wizard to populate the palette of the BPM designer tool. Export the "shape" of MDM data objects from either the virtual or physical parts of the operational server, and import that information into the Business Process Management (BPM) designer tool, after which point you can drag-drop such objects onto BPM flows as you design them, speeding up process flow creation.
  • Upgraded Operational Decision Management (ODM) rule support, replacing the ILOG support we had in previous releases. You can now refer to an ODM rule, associate it with events in the workbench, and deploy that configuration to the server as a behavior extension.
  • New tools to help configure an MDM policy monitoring solution. Create a monitoring configuration, deploy it, and the policy monitoring part of MDM gathers statistics and reports on quality metrics of the data held within the operational server. 
For example: Our list of export wizards has been expanded to help push MDM metadata to more remote systems.

image

In summary, we hope you like the changes we've made to the tools, and hope you find that creating, configuring and developing an MDM solution is now quicker and easier than ever before.

For more detailed information, and a complete treatment of the MDM version 11 release, please refer to the information center for version 11.
 
 


Tags:  mdm11 bpm dest osgi odm hybrid mdm dfdl mdm-workbench

MDM Application Toolkit and Product domain soft specs - Tips!

jaylimburn 2700028UUJ | | Comments (2) | Visits (7793)


MDM Application Toolkit for Product Domain

I recently had to build a product bundling process for a demo using BPM and the MDM Application Toolkit (MDMAT). Having built many business processes over the past 2 years using data from InfoSphere MDM, I realized this was going to be the first one that I would build against the product domain of the physical engine. Using the MDMAT against the Party domain is pretty darn easy, and very quickly a rich process can be built that interacts with MDM's library of web services for many different types of processes. How useful would it be for me when operating against the Product domain, especially when a good chunk of my data was stored in Product domain XML soft specs? Well, I'm pleased to say it was also very straightforward. I've written some notes below that will hopefully allow others to find it just as easy to use the MDMAT against the product domain.

The Challenge

The process was to execute a search against the MDM product domain using some predefined criteria that would allow me to pull back all products that met certain criteria. In this case it was to retrieve a list of products that were within the 'Mobile Phone' category of the 'Channels' hierarchy, were aimed at a 'Market Segment' of 'Affluent', had an 'Effective Date' before today's date, and had an 'Expiry Date' after today's date. This would allow me to show currently active offers on the mobile channel for Affluent customers. The 'Market Segment', 'Effective Date' and 'Expiry Date' attributes were all stored as attributes within an XML spec called 'Offer Attributes'. In the search results that come back from MDM I also needed to pull out some additional attributes that were stored within another XML spec called 'Channel Mobile Phone'; these attributes were named 'MobilePhoneHeadline' and 'MobilePhoneSalesText'. All of this had to be built in a small amount of time AND display on a mobile device. Fortunately, I knew that with the MDMAT and BPM I at least had a chance of pulling this off pretty easily.

The Solution

Whenever I build a business process I first start by defining the variables that I will need. Since BPM applications are data driven, I find it helpful to define the data upfront and then worry about wiring it into a process at a later stage. Using the MDM Workbench I exported my MDM WSDL and imported it into Process Designer. This gives me access to my MDM Product business objects within BPM, allowing me to easily construct a ProductSearchBObj object with the criteria I need to execute my search, and also create a ProductSearchResultsBObj object to hold the results that are returned from the search. In total I decided I needed 4 business objects:

 

Object Name          | Object Type              | Type imported from      | Purpose
ProductSearch        | ProductSearchBObj        | MDM Workbench           | Hold product search criteria
ProductSearchResults | ProductSearchResultsBObj | MDM Workbench           | Hold product search result data
MDMConnection        | MDM_Connection           | MDM Application Toolkit | Hold MDM server connection credentials
displayObject        | DisplayObject            | Manually created        | Used by the UI controls to display data

With the objects defined I could move on to define my process flow. I created a very simple flow to suit the requirements as seen below:

image

I would first use the 'Configure Spec Search Criteria' node to execute a script that populates the ProductSearch object with the criteria I needed. I would then configure the 'Retrieve all Offers' node to use the MDM Application Toolkit's Physical MDM Txn service to execute a search and return a list of ProductSearchResults objects. I figured that there would be some simple scripting required to extract the information I would need from the search results' XML specs, so I created a 'Populate Display Object' node to define a script that would allow me to extract the spec values from the XML and pass them into the displayObject. The 'Display All Offers' coach then displays the list of displayObjects in a table, and once a user has made a selection, the 'View Offer Details' coach displays the details for that offer.

With my objects and my process defined, all I had to do was a little bit of scripting: first to populate my search, and then to extract my search results into my displayObject. (I had already populated my MDMConnection object with my MDM server's credentials and configured the 'Retrieve all Offers' node to use the MDM Application Toolkit's Physical MDM Txn service to call the MDM 'searchProductInstance' service, passing in my ProductSearch object.)
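For anyone new to the toolkit, here is a minimal sketch of the kind of server script that could set up that connection object before the 'Retrieve all Offers' node runs. The MDM_Connection property names shown here (hostname, port, username, password) are assumptions on my part; check the business object definition shipped with your version of the MDM Application Toolkit for the real field names.

// Hedged sketch only: populate the MDMConnection variable in a server script.
// The MDM_Connection property names below are assumptions and may differ in your toolkit version.
tw.local.MDMConnection = new tw.object.MDM_Connection();
tw.local.MDMConnection.hostname = "mdm.example.com";  // hypothetical MDM operational server host
tw.local.MDMConnection.port = "9080";                 // hypothetical port
tw.local.MDMConnection.username = "mdmadmin";         // hypothetical credentials -- in a real process,
tw.local.MDMConnection.password = "changeit";         // pull these from configuration rather than hard-coding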

Populating the Search

I wrote a simple script in my 'Configure Spec Search Criteria' node to pass in the search criteria. I won't include the full script here, but all I had to do was create an instance of a ProductSearch object and set the following attributes (a rough sketch of this kind of script follows the table):

 

Attribute                                                               | Value
ProductSearch/EntityCategorySearchBObj/CategoryName                     | 'Mobile Phone'
ProductSearch/EntityCategorySearchBObj/CategoryHierarchyName            | 'Channels'
Spec Value instance 1
ProductSearch/SpecValueSearchBObj[0]/Path                               | '/OfferAttributes/MarketSegment'
ProductSearch/SpecValueSearchBObj[0]/SpecId                             | '1000100'
ProductSearch/SpecValueSearchBObj[0]/SpecValueSearchCriteriaBObj/Value  | 'Affluent'
Spec Value instance 2
ProductSearch/SpecValueSearchBObj[1]/Path                               | '/OfferAttributes/EffectiveDate'
ProductSearch/SpecValueSearchBObj[1]/SpecId                             | '1000100'
ProductSearch/SpecValueSearchBObj[1]/SpecValueSearchCriteriaBObj/Value  | currentTime
Spec Value instance 3
ProductSearch/SpecValueSearchBObj[2]/Path                               | '/OfferAttributes/EndDate'
ProductSearch/SpecValueSearchBObj[2]/SpecId                             | '1000100'
ProductSearch/SpecValueSearchBObj[2]/SpecValueSearchCriteriaBObj/Value  | currentTime
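To make that concrete, here is a hedged sketch of what such a script could look like, assuming the WSDL import produced types and properties with the names used in the table above (your generated business objects may differ slightly, and the date value would need to be formatted however your spec expects it):

// Hedged sketch only: populate the search criteria from the table above.
// Type and property names mirror the table and are assumptions based on the
// business objects generated from the MDM WSDL import.
tw.local.ProductSearch = new tw.object.ProductSearchBObj();

// Category criteria: the 'Mobile Phone' category in the 'Channels' hierarchy
tw.local.ProductSearch.EntityCategorySearchBObj = new tw.object.EntityCategorySearchBObj();
tw.local.ProductSearch.EntityCategorySearchBObj.CategoryName = "Mobile Phone";
tw.local.ProductSearch.EntityCategorySearchBObj.CategoryHierarchyName = "Channels";

// Three spec value criteria, all against spec id 1000100
tw.local.ProductSearch.SpecValueSearchBObj = new tw.object.listOf.SpecValueSearchBObj();
var currentTime = String(new Date());  // placeholder -- format this the way your spec expects dates

function specCriterion(path, value) {
    // Helper to build one SpecValueSearchBObj criterion; purely illustrative
    var c = new tw.object.SpecValueSearchBObj();
    c.Path = path;
    c.SpecId = "1000100";
    c.SpecValueSearchCriteriaBObj = new tw.object.SpecValueSearchCriteriaBObj();
    c.SpecValueSearchCriteriaBObj.Value = value;
    return c;
}

tw.local.ProductSearch.SpecValueSearchBObj[0] = specCriterion("/OfferAttributes/MarketSegment", "Affluent");
tw.local.ProductSearch.SpecValueSearchBObj[1] = specCriterion("/OfferAttributes/EffectiveDate", currentTime);
tw.local.ProductSearch.SpecValueSearchBObj[2] = specCriterion("/OfferAttributes/EndDate", currentTime);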

When passed into the 'Retrieve all Offers' node, my search criteria successfully return the products I am interested in as a list of ProductSearchResultBObj objects within my ProductSearchResults object. The final stage is to extract the attributes I need from the results to populate my display object.

Extracting the spec values and populating the display object

Up until now everything I had done was pretty similar to other processes I had built; this final piece was the most challenging, in that I had never extracted values from an XML spec within a business process before. Looking at my ProductSearchResults object, I drilled down into the XMLSpec attribute and noticed that there were no attributes defined within it to store the spec values that were returned from my search. That is because the import of the WSDL into Process Designer cannot recognize the type as XML. To solve this problem I manually created objects that represent the XML spec contents. I added a 'ChannelMobilePhone' object that contained two String attributes, and I also added an 'OfferAttributes' object that contained three String attributes, as per my MDM XML spec. I knew that the MDM Application Toolkit uses the name of the attribute to work out how to populate the object inside the process, so by creating attributes that match the XML spec they are populated with the values from the XML spec. When I was done my XML spec object looked like this: image

With my spec values now populated inside my ProductSearchResults object, all that was left was to use a script to iterate through the results and populate my displayObject. This was pretty standard stuff; I just had to ensure I included plenty of null checks in the script, but that was as complex as it got. On running my process, the results were successfully returned from MDM and the spec values were added to the extra attributes I had defined within my XMLSpec attribute. Finally, the values from the XML spec were extracted and displayed in my coach screens.
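For completeness, here is a hedged sketch of what that final 'Populate Display Object' script could look like, assuming displayObject is declared as a list of DisplayObject for the coach table to bind to. The exact path from the search results down to the manually created 'ChannelMobilePhone' object depends on how you modelled the XMLSpec attribute, so treat the intermediate property names, and the headline/salesText fields on DisplayObject, as assumptions:

// Hedged sketch only: walk the search results and copy the spec values we care
// about into the display list. Property names below the results list are
// assumptions about how the XMLSpec attribute was modelled.
tw.local.displayObject = new tw.object.listOf.DisplayObject();

var results = tw.local.ProductSearchResults.ProductSearchResultBObj;  // assumed list property
if (results != null) {
    for (var i = 0; i < results.listLength; i++) {
        var result = results[i];
        if (result == null) {
            continue;  // defensive null check, as mentioned above
        }
        var row = new tw.object.DisplayObject();

        // Pull the two display attributes out of the 'Channel Mobile Phone' spec,
        // guarding against products that don't carry that spec.
        var spec = result.XMLSpec;  // assumed attribute name
        if (spec != null && spec.ChannelMobilePhone != null) {
            row.headline = spec.ChannelMobilePhone.MobilePhoneHeadline;    // hypothetical DisplayObject fields
            row.salesText = spec.ChannelMobilePhone.MobilePhoneSalesText;
        }
        tw.local.displayObject[tw.local.displayObject.listLength] = row;   // append to the list
    }
}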

This ended up being a bit of a longer blog post than I intended (sorry JT), but hopefully it will provide you with a good starting point for using the MDMAT with the product domain. I really enjoyed building this process (and writing this article) as it showed me how cool the MDMAT is for helping me to build MDM-centric business processes. The ability to build processes against MDM without worrying about the connection or any complexity in calling MDM web services saves a huge amount of time, and with a little bit of script I was able to leverage the value of MDM's XML specs. If you want more information, drop me an email. I'd love to hear what you are doing.



Tags:  mdm specs mdm-application-toolkit pim techtip bpm product mdmat

MDM Webservices Security Enablement

SantoshKumarDubey 270003J3J0 | | Comment (1) | Visits (7319)


MDM web services security enablement and validating requests against a backend LDAP on WAS

 

This document provides step-by-step instructions to set up and turn on global security for InfoSphere MDM:

1. MDM server using LDAP on WAS: enabling global security.

Enabling global security for the WAS Base edition:

Log into the WebSphere admin console:

http://<hostname>:<port>/ibm/console/

Enabling global security for the WAS ND edition:

Log into the WebSphere admin console:

http://<hostname>:<port>/ibm/console/

http://localhost:9061/ibm/console/navigatorCmd.do?forwardName=AdminSecurity.config.view

The port number is the port for that specific profile; server1 for that profile needs to be started in order to access the admin console.

2. Start the server and right-click on it, select "Administration", and then click on "Run administrative console".

 

3. This will start the administrative console.

 
 

4. Click on the Security tab and then click on Global security.

 
 

5. In WAS 7.x, click on the Security tab on the left-hand side and then select Global security under it. On the right-hand side, click on "Enable administrative security". By default all three security options are selected; deselect the other two options, leaving only "Enable administrative security" selected.

 
 

6. In WAS 6.x, click on "Security -> Secure administration, applications, and infrastructure", then on the right-hand side select "Standalone LDAP registry".

 
 

7. Select "Advanced Lightweight Directory Access Protocol (LDAP) user registry settings" under the Additional Properties group.

 
 

8. Configure the LDAP details by filling in the required fields; you can get these values from your LDAP administrators.

 
 

9.       Save the configuration by clicking on Save

 
 

10. Configure the contents, taking input from the administrators, as per your client setup.

 
 

11.    Save the configurations by clicking on the save button

 
 

12. Once the details are filled in, first check the connection by clicking on Test connection.

 

13.    Save the configurations by clicking on the save button

 
 

14. If the connection test is successful, we can enable the security, but make sure to uncheck 'Use Java 2 security'; we don't need this in our configuration.

15.    Save the configurations.

16. Save the changes to the master configuration and restart the server. This will enable global security in your WAS, and it will start expecting user authentication data (name/password).

17. The next step is to create the WAS security-enabled MDM EAR.

By default security is enabled in the MDM EAR; in case it is disabled, we can enable it by following the step below.

In the RAD console, press Ctrl+R; this will open a window listing all the files matching *.xmi. The list will also include files with the security-enabled and security-disabled contents. To enable security, copy the content of the .xmi_SecurityEnabled file and paste it into the corresponding .xmi file.

18. Once security is enabled, MDM.ear can be published so that we can test our connection with a proper user ID and password from SoapUI.

19. The next step is to change our SOAP request to carry authentication data (username/password). I am using the tool SoapUI, which can be downloaded from http://www.soapui.org/.

20. Download SoapUI and install it.

21. Start SoapUI, click on the File menu, and select the "New SoapUI Project" option.

22. Now select the appropriate WSDL, depending on the service. For example, for party-related services I selected PartyService.wsdl at "C:\workspace\PartyWSEJB\ejbModule\META-INF\wsdl\PartyService.wsdl".

23. Open the appropriate service in SoapUI and select the Auth tab at the bottom of the request:

24. This will pop up a window where we can enter the user name and password as configured for your LDAP.

25. Right-click on the SOAP request and select "Add WSS Username Token". This will pop up a window; select the "Password Text" option, and the SOAP header with the security information in it will be generated.

26. Fill in the remaining fields, and it will generate a SOAP request like the one below.

<soapenv:Envelope xmlns:port="http://www.ibm.com/xmlns/prod/websphere/wcc/party/port" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
   <soapenv:Header>
      <wsse:Security soapenv:mustUnderstand="1" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
         <wsse:UsernameToken wsu:Id="UsernameToken-1">
            <wsse:Username>db2admin</wsse:Username>
            <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">db2admin</wsse:Password>
            <wsse:Nonce EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary">2KxzVrABGesKvf2npKrzRQ==</wsse:Nonce>
            <wsu:Created>2012-04-15T05:40:42.187Z</wsu:Created>
         </wsse:UsernameToken>
      </wsse:Security>
   </soapenv:Header>
   <soapenv:Body>
      <port:GetPerson>
         <control>
            <requestId>100</requestId>
            <requesterName>Santosh</requesterName>
            <requesterLanguage>100</requesterLanguage>
         </control>
         <partyId>2342342</partyId>
         <inquiryLevel>4</inquiryLevel>
      </port:GetPerson>
   </soapenv:Body>
</soapenv:Envelope>

27. Test the service with the SOAP request containing the authentication data.



Tags:  ibm techtip infosphere security mdm