In the first article in this series, "Getting connected - An introduction to connecting with adapters," you learned the basics about using each type of resource adapter with WebSphere Integration Developer, including the difference between inbound and outbound processing and how resource adapters fit into your WebSphere Integration Developer application. This article introduces you to the Java Database Connectivity (JDBC™) resource adapter. We begin with an overview of the JDBC resource adapter and then guide you through the steps necessary to build an application that uses both the inbound and outbound capabilities of the JDBC adapter.
JDBC provides a standard way to connect to any database using the Java programming language. Without JDBC, you would need to write custom code specific to each particular database. To provide uniform programmatic database access, database vendors provide a JDBC driver for their database. This is convenient for Java developers, but as a WebSphere Integration Developer, you may not want to concern yourself with Java database programming. What you really want is to have the database act as a service provider and/or consumer, and integrate it into your applications using imports and exports.
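To make the idea of uniform programmatic access concrete, here is a minimal sketch. Python's built-in sqlite3 module stands in for a JDBC driver and DB2, and the table and data are invented for illustration, but the point carries over: only the driver and connection details vary per vendor, while the query code stays the same.

```python
# Uniform programmatic database access, sketched with stdlib sqlite3.
# With JDBC, only the driver class and connection URL would differ
# per vendor; the SQL and the result handling stay the same.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE CUSTOMER (CUSTKEY INTEGER PRIMARY KEY, FNAME TEXT, LNAME TEXT)")
conn.execute("INSERT INTO CUSTOMER VALUES (1, 'Ada', 'Lovelace')")

# The same SQL would run unchanged against DB2, Oracle, or SQL Server.
row = conn.execute(
    "SELECT FNAME, LNAME FROM CUSTOMER WHERE CUSTKEY = 1").fetchone()
print(row)  # ('Ada', 'Lovelace')
```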
The JDBC resource adapter takes advantage of the standard database access provided by JDBC, hides those details from you, and lets you work with a database in the same manner as any other enterprise information system (EIS). As an integration developer, you only need to deal with an import or export. And as you saw in the previous articles, the enterprise discovery wizard guides you through the necessary configuration for the JDBC resource adapter.
Let's first take a quick look at the details of the JDBC adapter before we get started working with it.
The WebSphere Adapter for JDBC provides bidirectional connectivity between applications and databases, such as IBM DB2®, IBM Informix®, Oracle, and Microsoft® SQL Server. The JDBC Adapter lets you integrate your applications with any database having a JDBC driver that supports JDBC 2.0 or later.
Figure 1 illustrates a typical configuration of the JDBC resource adapter, database components, and WebSphere Process Server components.
Figure 1. JDBC Adapter Architecture
EISExport and EISImport components are created using the Enterprise Service Discovery wizard. Recall from the previous articles that an export allows service calls into a module and an import allows a module to call other services. The wizard uses the JDBC resource adapter to determine what operations are available for the database; these are the operations that pertain to each table. The wizard also creates business objects that hold the row data for each table. So, if a table in a Customer database, for example, contains the columns CustKey, FName, and LName, then the business object will contain the attributes CustKey, FName, and LName. The wizard discovers the operations that pertain to an inbound service (export) and an outbound service (import). As we will show, for a database, the operations include create, update, and delete. Through these endpoints, you exchange application information with the different enterprise information systems. Figure 2 shows the EIS export and EIS import components defined in a module.
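The wizard's table-to-business-object mapping can be sketched as follows. This is not the adapter's implementation, just an illustration using Python's stdlib sqlite3 and an invented discover_business_object helper: one business object attribute per table column.

```python
# Sketch of discovery: derive a business object shape from table metadata.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE CUSTOMER (CUSTKEY INTEGER PRIMARY KEY, FNAME TEXT, LNAME TEXT)")

def discover_business_object(conn, table):
    # PRAGMA table_info returns one row per column: (cid, name, type, ...).
    # The discovery wizard does the analogous lookup through the JDBC driver.
    cols = [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]
    return {c: None for c in cols}

bo = discover_business_object(conn, "CUSTOMER")
print(bo)  # {'CUSTKEY': None, 'FNAME': None, 'LNAME': None}
```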
Figure 2. EIS export and EIS import components defined in WebSphere Integration Developer
As you might recall from the previous articles in this series, outbound processing is when a module makes requests to send or retrieve information to or from an EIS using an import. An example of outbound processing would be when your application looks up customer information from a database as part of a credit check, or when your application has received new information that needs to be stored in the database.
Since it is using an import, your module accesses the EIS as it would any other service. The JDBC resource adapter handles requests to the EIS from the import, in this case a database.
When your module needs to access or store information in a database, the main part of Figure 1 that you are interested in is the application tables. Application tables contain the enterprise information that your application will use or modify. Typically applications that use the tables will manipulate, update or synchronize the information contained within them. So, for outbound processing, when you configure the JDBC adapter, you specify which tables you want to access from your module.
Outbound processing for the JDBC adapter supports two business object styles when sending or retrieving data from the database: after-image and delta. An after-image business object is the complete business object after all changes have been made. For example, a business object might undergo several changes during processing in an application. The final version that is to be stored in the database is the after-image. Operations that use an after-image business object style are create, update and delete.
A delta, on the other hand, is a business object used in an ApplyChanges operation that contains only key values along with the changed data for the given key. An example of a delta is when an application has changed a customer's address. All that would exist in the business object would be the customer ID (if that is the key) and the new address. For further details on the business object styles, refer to the Technical overview of the Adapter for JDBC in the references section.
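The two business object styles can be sketched with plain dictionaries. The helper names (apply_after_image, apply_delta) are invented for illustration:

```python
# After-image vs. delta, sketched with plain dicts.
stored = {"CUSTKEY": 1, "FNAME": "Ada", "LNAME": "Lovelace", "CITY": "London"}

def apply_after_image(stored, image):
    # The after-image is the complete object; it replaces the stored row.
    return dict(image)

def apply_delta(stored, delta):
    # A delta carries only the key plus the changed attributes.
    updated = dict(stored)
    updated.update(delta)
    return updated

# Customer 1 moved: the delta holds just the key and the new address.
after = apply_delta(stored, {"CUSTKEY": 1, "CITY": "Paris"})
print(after["CITY"], after["FNAME"])  # Paris Ada
```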
Inbound processing is when a module receives information from an EIS using an export. This occurs when there is an update to the EIS. The JDBC resource adapter forwards changed information in the EIS, in this case a create, update, or delete in a database, to an export in your module. The module can then perform whatever service it provides based on the changed information.
As you might recall from the previous article, the flat file resource adapter polled the event directory to determine when new files were added and then forwarded the contents to the module. With the JDBC resource adapter, the trick to getting database changes to a module is using a trigger. Let's look at how that works.
A database trigger is a program residing in the database that runs when an event, such as an update, occurs on a table in the database. This trigger allows changes in a database to be sent to an export in a WebSphere Integration Developer module. Figure 3 shows an example IBM DB2 database trigger.
Figure 3. IBM DB2 database trigger sample
In human terms, the database trigger in Figure 3 says, "When there is a new row created in the CUSTOMER database table, insert values into the event store table". The values inserted include the primary key from the new row and the name of the business object that will be populated with the row data. To handle all of the database table events, you need to define three triggers, namely CREATE, UPDATE and DELETE, for each table of interest to your module.
An event table stores the asynchronous events generated by the database triggers in a staging table. The event table does not need to be defined in the same database as the database triggers and application database tables, but it must be defined and deployed as part of an XA-compliant (two-phase commit) database. The table stores the events until they are delivered to an export. When the module containing the export consumes an event, the event is removed from the event table. For the details of the event table definition, refer to the Inbound processing section of the Technical overview of the Adapter for JDBC in the references section.
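The trigger-plus-event-store pattern can be demonstrated end to end with SQLite, whose trigger syntax differs from DB2's and whose event table here is simplified to three columns (a real adapter event table carries more, such as an event ID, status, and timestamp):

```python
# A trigger stages an event whenever a row is created in CUSTOMER.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE CUSTOMER (CUSTKEY INTEGER PRIMARY KEY, FNAME TEXT, LNAME TEXT);
-- Simplified event store (real adapter tables have more columns).
CREATE TABLE EVENT_STORE (OBJECT_KEY TEXT, OBJECT_NAME TEXT, VERB TEXT);

-- Same idea as Figure 3: on INSERT into CUSTOMER, record the new
-- row's primary key and the business object name in the event store.
CREATE TRIGGER customer_create AFTER INSERT ON CUSTOMER
BEGIN
  INSERT INTO EVENT_STORE VALUES (NEW.CUSTKEY, 'Customer', 'Create');
END;
""")

conn.execute("INSERT INTO CUSTOMER VALUES (1, 'Ada', 'Lovelace')")
events = conn.execute("SELECT * FROM EVENT_STORE").fetchall()
print(events)  # one Create event staged for key 1
```

A complete setup would add UPDATE and DELETE triggers in the same style, one set per table of interest.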
For inbound processing, when data manipulation (create, update or delete) has occurred in an application table, the database trigger captures the changes and inserts a row into the event table. The JDBC resource adapter continuously monitors the event table by polling the event database to detect when there is a new row in the event table. When there is a new row, the JDBC resource adapter converts the data in the row into a business graph and calls an interface operation on the EIS export in the WebSphere Integration Developer module. Just as you do with any other resource adapter, you create the business graph and interface by running the Enterprise Service Discovery wizard.
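A single polling cycle can be sketched like this, again with SQLite standing in for the event database and an invented deliver callback standing in for the interface call to the export:

```python
# Sketch of the adapter's polling cycle over the event store.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE CUSTOMER (CUSTKEY INTEGER PRIMARY KEY, FNAME TEXT, LNAME TEXT);
CREATE TABLE EVENT_STORE (OBJECT_KEY INTEGER, OBJECT_NAME TEXT, VERB TEXT);
CREATE TRIGGER customer_create AFTER INSERT ON CUSTOMER
BEGIN
  INSERT INTO EVENT_STORE VALUES (NEW.CUSTKEY, 'Customer', 'Create');
END;
""")

def poll_once(conn, deliver):
    # One cycle: for each staged event, look up the changed row by its
    # key, hand it to the export (deliver), then remove the consumed event.
    for key, name, verb in conn.execute(
            "SELECT OBJECT_KEY, OBJECT_NAME, VERB FROM EVENT_STORE").fetchall():
        row = conn.execute(
            "SELECT * FROM CUSTOMER WHERE CUSTKEY = ?", (key,)).fetchone()
        deliver({"verb": verb, "object": name, "row": row})
        conn.execute("DELETE FROM EVENT_STORE WHERE OBJECT_KEY = ?", (key,))

delivered = []
conn.execute("INSERT INTO CUSTOMER VALUES (1, 'Ada', 'Lovelace')")
poll_once(conn, delivered.append)
print(delivered)  # one Create event carrying the new row
```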
Let's put the concepts that you have just learned to work by building an application that uses the JDBC adapter. The database synchronization scenario is a good example where you can put both the inbound and outbound capabilities of the JDBC adapter to use. Quite often, modifying data in one database table requires synchronizing that data with other databases in different locations. For example, a company might have a database that mirrors another database, and both need to be kept synchronized; or two companies that have merged might need to synchronize information kept in two different types of databases.
To simulate a real-world database synchronization scenario, you'll create two databases, ABCFIN and ABCINS, that you want to keep synchronized. The tables represent finance and insurance tables from an imaginary company called ABC. Each database will contain the values in Table 1:
Table 1. Customer Table
- Customer's first name
- Customer's last name
You'll also develop the data synchronization logic using an interface map component, and use the enterprise discovery wizard to create the EIS export and EIS import. You'll define database triggers for the JDBC adapter inbound processing. Figure 4 illustrates an overview of the database synchronization scenario.
Figure 4. Database synchronization scenario
When you've finished developing the scenario, you'll run the application on the server to synchronize the data between the two databases.
The following sections describe the steps to build the database synchronization scenario:
- Set up the event store database
- Set up the application database (in practice, this might already exist)
- Import the resource adapter
- Create the imports and exports using the Enterprise Service Discovery wizard
- Implement the business logic
Before you begin building the application, you'll need to set up the databases and tables that the application will use. In the Downloads section, the jdbcscripts.zip file contains a database creation script for the ABCFIN database. This database includes the Customer table, which is the table that the application will synchronize, and the event table that the JDBC resource adapter will poll to create inbound events.
- Download the database creation scripts from the Downloads section at the end of this article to a temporary directory.
- Launch the DB2 command line processor (on Windows) by selecting Start - Program Files - IBM DB2 - Command Line Tools - Command Line Processor.
- Enter quit at the db2 => prompt in the DB2 CLP window. This exits DB2 command line processor mode so that you can change directories.
- Change the directory to the temporary directory where you downloaded the database scripts in step 1. Your DB2 CLP window should look similar to Figure 5.
- Execute both database creation scripts by entering db2 -tf jdbc_source_script_db2.sql and then db2 -tf jdbc_target_script_db2.sql.
Figure 5 shows execution of the first command. For each script, you are prompted for the db2admin password and you see a message saying that the SQL command completed when the scripts finish.
Figure 5. Running the database creation scripts
Figure 6 shows the event store table after running the database creation scripts. You can see the tables by selecting Start - Program Files - IBM DB2 - General Administration Tools - Control Center , and then selecting Control Center - All Databases - ABCFIN - Tables.
The WebSphere Adapter for JDBC User Guide (see the Resources section) explains the fields in the event store table, but let's take a quick look at how this table will be used. The event store table gets populated when one of the database triggers runs. If you refer back to Figure 3, the trigger sets the event's key value to the primary key value of the changed Customer table row. The JDBC adapter uses this key as an index into the Customer table to retrieve the changed row. Then it sends the data contained in that row as a business graph to an export in a module.
Figure 6. Event store database contents after running database creation script
You have completed the database setup and are ready to discover services from your databases.
Before you can run the Enterprise Service Discovery wizard, you need to import the JDBC resource adapter into your workspace.
- In the Business Integration view, right-click and select Import.
- In the Import dialog box that opens, select RAR file, and then click Next.
- Next to Connector file, click Browse and browse to <WIDInstallDIR>/Resource Adapters/JDBC/deploy, as Figure 7 shows:
Figure 7. Importing the JDBC resource adapter
- Select the file CWYBC_JDBC.rar, click Open, and then click Finish. If a dialog box opens asking whether you want to switch to the J2EE perspective, click No, because you are going to continue working in the Business Integration view.
In addition to importing the JDBC resource adapter, you also need to add two jars from your DB2 installation to your workspace, and put them on the resource adapter project build path. Different databases require different jars to be copied; refer to the adapter documentation in the Resources section for more information.
- In your file system, locate the two db2jcc jars (the DB2 Universal JDBC driver jar and its license jar) in your DB2 installation.
- With both files selected in Windows Explorer, right-click and select Copy.
- To display the connector modules, select Window - Show View - Physical Resources (connector modules do not display in the Business Integration view.)
- In the Physical Resources view, right-click CWYBC_JDBC - connector module and select Paste.
- The CWYBC_JDBC project should appear as in Figure 8.
Figure 8. Adding the database jars to the workspace
- Right-click the CWYBC_JDBC folder and select Properties.
- On the left side of the property page, select Java Build Path, and then select the Libraries tab.
- Click Add JARs, and then under CWYBC_JDBC - connector module, select both db2jcc jars so that it looks like Figure 9. Click OK in the JAR Selection dialog, and then click OK in the connector project properties dialog.
Figure 9. Adding the database jars to the resource adapter project build path
Now that you have imported and set up the JDBC resource adapter, you are ready to discover services using the Enterprise Service Discovery wizard. In this section, you create the export from which the adapter sends updates when the database table is modified. You will create the export with an interface and business objects corresponding to the discovered services.
In this first set of steps you configure the Enterprise Service Discovery wizard to connect to your database:
- Switch back to the Business Integration view, right-click, and select New - Enterprise Service Discovery.
- As Figure 10 shows, in the list of resource adapters you see the JDBC EMD Adapter from the CWYBC_JDBC Connector Project, which you imported in the previous section. Select it, and then click Next.
Figure 10. Discovering services using the JDBC resource adapter
- On the Configure Settings for Discovery Agent page, under UserCredentials, set the username and password for your database. These values let the wizard connect to the database to discover services and data.
- Under Machine Credentials, for Database URL, enter jdbc:db2:ABCFIN, as Figure 11 shows. The prefix jdbc:db2 indicates that the connection is to a DB2 UDB server, and ABCFIN refers to the DB2 database catalog entry on the DB2 client.
- For Jdbc Driver Class, type com.ibm.db2.jcc.DB2Driver, as shown in Figure 11. This loads the DB2 Universal JDBC Driver that the discovery wizard needs to connect to the database. This value is specific to each JDBC vendor; for other databases, consult the adapter documentation and the JDBC vendor.
Figure 11. Connection configuration for the JDBC resource adapter
In the following steps, you use the Enterprise Service Discovery wizard to discover business objects used to retrieve and update data in your database. These business objects will be used as the inputs and outputs to operations of your module's imports and exports.
- Click Next. This brings you to the next page in the wizard, Find and Discover Enterprise Services.
- Click ExecuteQuery, and then expand DB2ADMIN - Tables. You then see the CUSTOMER table, as Figure 12 shows.
Figure 12. Querying the database to discover business objects
- Select CUSTOMER and then click Add to import list. CUSTOMER is added to the Objects to be imported list.
- Click Next.
On the next pages you create the operations that are available for your database export, then create the module to contain the export and to configure the properties for the adapter.
- On the Configure Objects page, for Service Type, ensure Inbound is selected, as Figure 13 shows. For the rest of the values, you can accept the defaults, which means that the Create, Update, and Delete operations will be available for your database service.
Figure 13. Selecting the inbound operations
- On the Generate Artifacts page, next to Module, click New, as Figure 14 shows:
Figure 14. Generating the artifacts for the JDBC adapter module
- In the New Integration Dialog that opens, check Create a module project and then click Next.
- In the New Integration Project wizard, enter ABCFIN for the module name, and then click Finish. The wizard closes and you will be back at the Enterprise Service Discovery wizard. A default export name is shown next to Name.
- Set J2C Authentication Data Entry to widNode/db2alias. This value is used when the application runs to determine whether a user is authorized to connect to the database. You'll configure the user IDs that will use this alias on the server later.
- Ensure Use discovered connection properties is selected. You would select Use connection properties specified on server if you will configure (or have already configured) the resource adapter on the server.
- Ensure the DatabaseVendor is set to DB2. Accept the default values for the other fields and then click Finish.
Figure 15 shows what is in the workspace after you have created the export using the Enterprise Service Discovery wizard. Figure 16 shows the JDBCInboundInterface interface. It contains operations called by the JDBC resource adapter when a create, update, or delete event occurs on a row in the CUSTOMER table. These operations are one-way, since they are only used to notify your application of a database event. The account name used to access the database, in this case Db2admin, is prepended to each business object name.
Figure 15. The workspace after creating the JDBC adapter export
Each operation in the interface takes the Db2adminCustomerBG business graph, shown in Figure 17, as input. Therefore, when an event occurs, the data in the row added or changed will be contained within the business object in the business graph.
Figure 16. The JDBCInboundInterface interface
Figure 17. The Db2adminCustomerBG business graph
In this section, you create the import that will let your module access the ABCINS database. You could create an import so that your application can access the ABCFIN database also, but in this scenario we are just interested in being notified of ABCFIN updates and then synchronizing.
- In the Business Integration view, right-click and select New - Enterprise Service Discovery.
- Select the JDBC EMD Adapter from the CWYBC_JDBC Connector Project, and then click Next.
- On the Configure Settings for Discovery Agent page, set the username and password for your database, set the Database URL to jdbc:db2:ABCINS, and set the Jdbc Driver Class to com.ibm.db2.jcc.DB2Driver. Note that this step is the same as the configuration step in the export section, except that the database name is ABCINS rather than ABCFIN.
- Click Next. Click ExecuteQuery, and then select DB2ADMIN - Tables- CUSTOMER.
- Click Add to import list. CUSTOMER is added to the Objects to be imported list.
- Click Next, and on the Configure Objects page, for Service Type, select Outbound.
- Click Next and on the Generate Artifacts page, select the ABCFIN module. This is the module you created in the inbound steps.
- Set the J2C Authentication Data Entry to widNode/db2alias, as you did for the inbound service.
- Ensure Use discovered connection properties is selected and set the DatabaseVendor to DB2. Accept the default values for the other fields and then click Finish.
Figure 18 shows the ABCFIN module in the Business Integration view after completing the Enterprise Service Discovery wizard for the outbound service. Notice that there is now a JDBCOutboundInterface interface and a Db2adminCustomerContainer business object.
Figure 18. The ABCFIN workspace after creating the outbound service
The JDBCOutboundInterface shown in Figure 19 contains two-way operations that allow you to create, update, retrieve, and delete rows from the database. The operations use the same Db2adminCustomerBG business graph that the inbound operations use. The Db2adminCustomerContainer shown in Figure 20 is a business object that contains an array of Db2adminCustomerBG business graphs. This allows your JDBC service to work with multiple rows from the database with the retrieveAll operation.
Figure 19. The JDBCOutboundInterface interface
Figure 20. The Db2adminCustomerContainer business object
You're finished using the Enterprise Service Discovery wizard to create the import and export. If you made a mistake or want to change any of the values you set, you can modify the EIS binding values on the Property view for the import or export. However, you cannot change the interface. If you need to add, remove, or modify operations, you must run the wizard again.
In this section, you create the business logic that uses the JDBC resource adapter export and import that you created in the previous sections. Depending on your line of business, your logic might do many things as database changes come through the export, before you send or fetch data to or from the database via the import. You might even want different modules to use the import and export separately. In our example, you will just keep your two databases in sync with a single module by sending updates from ABCINS to ABCFIN. Therefore, the business logic just consists of an interface map, which maps the outbound interface to the inbound interface. This business logic means that any service calls to the export get passed to the import.
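What the interface map does can be sketched as a simple dispatch table: each inbound operation (an event arriving from ABCFIN) is routed to the matching outbound operation (a call against ABCINS). The operation and helper names here are illustrative, not the generated artifact names:

```python
# Sketch of an interface map: route inbound operations to outbound ones.
def make_mediation(outbound):
    # Each inbound operation maps one-to-one onto an outbound operation,
    # just as the three connections in the Interface Map editor do.
    return {
        "createCustomer": outbound["create"],
        "updateCustomer": outbound["update"],
        "deleteCustomer": outbound["delete"],
    }

target = {}  # stand-in for the ABCINS CUSTOMER table, keyed by CUSTKEY
outbound = {
    "create": lambda bo: target.__setitem__(bo["CUSTKEY"], bo),
    "update": lambda bo: target.__setitem__(bo["CUSTKEY"], bo),
    "delete": lambda bo: target.pop(bo["CUSTKEY"], None),
}
mediation = make_mediation(outbound)

# A create event from ABCFIN flows straight through to the target.
mediation["createCustomer"]({"CUSTKEY": 1, "FNAME": "Ada", "LNAME": "Lovelace"})
print(target)  # {1: {'CUSTKEY': 1, 'FNAME': 'Ada', 'LNAME': 'Lovelace'}}
```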
To create the interface map, follow these steps:
- Right-click Mapping under ABCFIN and select New - Interface Map.
- In the New Interface Map wizard, for the Module, select ABCFIN, and then for the Name, enter DataSyncMediation, as Figure 21 shows:
Figure 21. The New Interface Map wizard
- Click Next.
- For the Source interface, select JDBCInboundInterface, and for the Target interface, select JDBCOutboundInterface, as Figure 22 shows:
Figure 22. Selecting the source and target interfaces
- Click Finish. The Interface Map editor opens.
- In the Interface Map editor, connect the three operations of the JDBCInboundInterface to the first three operations of the JDBCOutboundInterface, as Figure 23 shows:
Figure 23. Implementing the interface map
- For each of the connections that you created in the previous step, click on the connection, then connect the inbound operation parameter to the outbound operation input. Since you are just copying the data by a one-way operation, the response does not need to be mapped.
- Save the Interface Map editor contents.
- Open the assembly diagram by double-clicking ABCFIN - Assembly Diagram. You see the import and export that were created by the Enterprise Service Discovery wizard.
- Expand ABCFIN - Mapping and then drag the DataSyncMediation interface map to the assembly diagram.
- Right-click DataSyncMediation on the assembly diagram and then select Wire to existing. Since only one interface matches the JDBCInboundInterface export (which is the interface you chose for DataSyncMediation), and only one interface matches the JDBCOutboundInterface import, the wires are created as Figure 24 shows:
Figure 24. The ABCFIN module
You have completed the implementation that uses the JDBC resource adapter import and export. In the next section, we'll run the application.
Before you run the JDBC resource adapter application, you need to configure your server so that the resource adapter can authenticate with the database. The first step is to map the alias that we mentioned earlier to the user ID and password for the database.
- Right-click the WebSphere Process Server in the Servers view and select Start.
- After the server has started, right-click the server and select Run administrative console.
- When the Welcome page of the administrative console opens, click Log in. Enter your user name and password if you have enabled server security.
- Expand Security, click Global security, and then click J2C Authentication data, as Figure 25 shows:
Figure 25. Creating a new authentication alias
- In the top left corner of the Global security page, click New.
- For Alias, enter db2alias, and for User ID and Password, enter a valid user ID and password for your DB2 installation, as Figure 26 shows:
Figure 26. Creating the DB2 authentication alias
- Click OK, click the save link, and then click the Save button.
Next you need to add the module to the server.
- Right-click the server in the Servers view, and select Add and remove projects.
- Under Available projects, select ABCFINApp, and then click Add, as Figure 27 shows:
Figure 27. Adding the module to the server
- Click Finish.
When everything has successfully published, you see the message "Polling has started" in the console. This message means that your server is listening for an event to occur in the database.
Now, let's watch the database synchronization application do its work. To do so, you can add a row to the CUSTOMER table in ABCFIN (which is the database with which we want to keep ABCINS in sync). This causes the trigger you created earlier to run, and places an event in the event database.
- Open the DB2 Control Center, and under All Databases - ABCFIN - Tables, double-click the CUSTOMER table shown in Figure 28 to open it.
Figure 28. Selecting the CUSTOMER table
- Click Add Row, and enter values in the new row, such as those shown in Figure 29.
Figure 29. Adding a row to the database table
- Click Commit. You will see deliverEvent and sendEvent messages printed in the console as the JDBC adapter polls the event table and sees new entries.
- Now, in the ABCINS database, open the CUSTOMER table. You will see the row that Figure 30 shows.
Figure 30. The updated CUSTOMER table
As you can see, the ABCINS database now contains the values that you entered in a new row in the ABCFIN database.
In this article you learned the basics of the JDBC resource adapter and how it supports inbound and outbound processing. You set up two example databases and then built an application that used the resource adapter to copy any changes to one database into the other.
- Database creation scripts 1 and 2: jdbcscripts.zip (2 KB)
- Completed application: completedmodule.zip (9 KB)
WebSphere Adapter for JDBC User Guide
Technical overview of the Adapter for JDBC
Getting connected with WebSphere Integration Developer adapters, Part 1: An introduction to connecting with adapters (developerWorks, Apr. 2007)
Getting connected with WebSphere Integration Developer adapters, Part 2: An introduction to the WebSphere Adapter for Flat Files (developerWorks, May 2007)
Overview of all WebSphere adapters
Enterprise Metadata Discovery
WebSphere Adapter Development Redbook
Introduction to creating business processes with WebSphere Integration Developer
J2EE Connector Architecture 1.5 specification
A guided tour of WebSphere Integration Developer
WebSphere Integration Developer product information
developerWorks: WebSphere Process Server and WebSphere Integration Developer resources
developerWorks: WebSphere Business Integration zone
developerWorks: WebSphere development tools zone
Meet the experts: WebSphere Integration Developer
Richard Gregory is a software developer at the IBM Toronto Lab on the WebSphere Integration Developer team. His responsibilities include working on the evolution and delivery of test tools for WebSphere Integration Developer.
Greg Adams was the lead architect for the user interface in the award-winning Eclipse platform, and more recently, has been the lead architect and development lead in the core WebSphere Business Integration Tools, including WebSphere Studio Application Developer Integration Edition and WebSphere Integration Developer. Greg led the delivery of IBM's first complete service-oriented architecture (SOA) tools stack and the first BPEL4WS standards-supporting Business Process Editor; both were critical deliverables in support of IBM's On Demand strategy.
Randy Giffen is a senior software developer and the usability lead for WebSphere Integration Developer and WebSphere Message Broker Toolkit. He was responsible for WebSphere Integration Developer's business state machine tools and the visual snippet editor. Prior to this he was a member of user interface teams for WebSphere Studio Application Developer Integration Edition, Eclipse, and VisualAge for Java.