Database trace for a target with -UPDATE off
The following example illustrates the database trace that can result from a common error encountered when developing maps that have database targets.
Database type is ODBC
Status returned to engine: (0) Success
No existing connection was found.
Connection to datasource OracleProd has been established.
Interface library version 6.0(140)
Loading data for output card 1.
Database adapter version 6.0(140)
Starting database load...
Datasource: OracleProd
Userid : EVENTMGT
Password : ********
Update mode is off.
The columns are of the following types:
Column 1 (MEMBERID) type is VARCHAR, precision is 10.
Column 2 (ATTENDEEFIRSTNAME) type is VARCHAR, precision is 30.
Column 3 (ATTENDEELASTNAME) type is VARCHAR, precision is 50.
Column 4 (TITLE) type is VARCHAR, precision is 50.
Column 5 (COMPANYNAME) type is VARCHAR, precision is 50.
Column 6 (ADDRESS) type is VARCHAR, precision is 255.
Column 7 (CITY) type is VARCHAR, precision is 50.
Column 8 (STATEORPROVINCE) type is VARCHAR, precision is 20.
Column 9 (POSTALCODE) type is VARCHAR, precision is 20.
Column 10 (COUNTRY) type is VARCHAR, precision is 50.
Column 11 (PHONENUMBER) type is VARCHAR, precision is 30.
Column 12 (FAXNUMBER) type is VARCHAR, precision is 30.
Column 13 (EMAILADDR) type is VARCHAR, precision is 50.
Column 14 (MEMBERSINCE) type is TIMESTAMP, precision is 19.
The insert statement to be executed is:
INSERT INTO Membership VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?)
Error in SQLExecute
Message: ORA-00001: unique constraint (SYSTEM.UNIQUE_MEMBER_ID) violated
SQL State: 23000
The following values were being inserted:
Column 1 MEMBERID : I978-1964
Column 2 ATTENDEEFIRSTNAME : Jean
Column 3 ATTENDEELASTNAME : Fresnière
Column 4 TITLE : Marketing Assistant
Column 5 COMPANYNAME : Mère Paillarde
Column 6 ADDRESS : 43 rue St. Laurent
Column 7 CITY : Montréal
Column 8 STATEORPROVINCE : Québec
Column 9 POSTALCODE : H1J 1C3
Column 10 COUNTRY : Canada
Column 11 PHONENUMBER : (514) 555-8054
Column 12 FAXNUMBER : (514) 555-8055
Column 13 EMAILADDR : fresnierej@paillarde.org
Column 14 MEMBERSINCE : {ts '1999-07-09 23:18:04'}
Failed to insert a row (rc = -9).
Failed after 2 rows inserted.
Database load complete.
Error returned to engine: (-9) Failed to execute the SQL statement
Cleaning up and closing the transaction...
Transaction rollback was successful.
Status returned to engine: (0) Success
Commit was successful.
Database disconnect succeeded.
If you forget to enable the update setting (using -UPDATE, either as a setting in the Map Designer or Integration Flow Designer, or on the command line), you may receive a database error resulting from an attempt to insert a row with a duplicate index. The Update mode is off entry near the beginning of this example indicates that update mode is off. Because update mode is off, the database adapter attempts to insert a row into the database table for each row produced by the map. The Message entry contains the message returned by the database driver to the database adapter, describing the cause of the error: in this example, the row in error would violate the UNIQUE_MEMBER_ID constraint defined for the table. The lines that follow show the column values for the row in which the error occurred and the final result of the database operation.
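The failure mode described above can be reproduced outside the adapter. The following is a minimal sketch using Python's sqlite3 module as a stand-in for the ODBC driver and Oracle table in the trace; the table and column names are carried over from the example, and a SQLite UNIQUE constraint plays the role of the UNIQUE_MEMBER_ID index. The adapter's own behavior is exactly what the trace shows; this sketch only illustrates why a plain INSERT of a duplicate key fails.

```python
import sqlite3

# Stand-in for the Membership table from the trace (first two columns only).
# The UNIQUE constraint plays the role of the UNIQUE_MEMBER_ID index.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Membership ("
    "MEMBERID VARCHAR(10) UNIQUE, "
    "ATTENDEEFIRSTNAME VARCHAR(30))"
)

row = ("I978-1964", "Jean")
conn.execute("INSERT INTO Membership VALUES (?,?)", row)

# With update mode off, the adapter issues another plain INSERT for the
# same key. The driver rejects it with a constraint violation, analogous
# to ORA-00001 / SQLSTATE 23000 in the trace.
try:
    conn.execute("INSERT INTO Membership VALUES (?,?)", row)
except sqlite3.IntegrityError as e:
    print("insert failed:", e)
```

Only the first row reaches the table; the duplicate insert raises an error rather than updating the existing row, which is the behavior the trace records.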
There are several possible methods for resolving this problem. Depending upon the desired behavior, you might:
- Enable update mode when executing the map by using the -UPDATE ON or -UPDATE ONLY adapter command.
- Use the Delete adapter command (-DELETE) to remove all rows from the output database table before inserting the rows resulting from map execution.
- Build logic into your map to ensure that there is no existing row in the table prior to generating an output row to be inserted. This might be accomplished by either defining a query for the table being used as an input against which a LOOKUP or SEARCHUP function is performed or by using the DBLOOKUP function to check for an existing row.
- Use the BadData adapter command (-BADDATA) so that all rows with a unique MEMBERID are inserted. Rows that would result in duplicates are then saved to a specified file.
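The "check before insert" approach from the last lookup option can be sketched as follows. This is an illustration only, again using Python's sqlite3 module as a stand-in: DBLOOKUP, LOOKUP, and SEARCHUP are map functions, and here a plain SELECT plays their role. The table and column names are carried over from the trace example; the helper function name is invented for this sketch.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Membership ("
    "MEMBERID VARCHAR(10) UNIQUE, "
    "ATTENDEEFIRSTNAME VARCHAR(30))"
)

def insert_if_new(conn, memberid, firstname):
    """Insert a row only when no row with this MEMBERID exists yet.

    The SELECT stands in for the DBLOOKUP (or LOOKUP/SEARCHUP against a
    query input) check the map would perform before generating the row.
    """
    exists = conn.execute(
        "SELECT 1 FROM Membership WHERE MEMBERID = ?", (memberid,)
    ).fetchone()
    if exists is None:
        conn.execute(
            "INSERT INTO Membership VALUES (?,?)", (memberid, firstname)
        )
        return True
    return False  # duplicate key: skip the row instead of failing the load

insert_if_new(conn, "I978-1964", "Jean")
insert_if_new(conn, "I978-1964", "Jean")  # second call finds the row and skips
```

Because the duplicate is detected before the INSERT is attempted, the load completes without a constraint violation and without rolling back the rows already inserted.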