Import overview
The import utility populates a table, typed table, or view with data using an SQL INSERT statement. If the table or view receiving the imported data already contains data, the input data can either replace the existing data or be appended to it.
Like export, import is a relatively simple data movement utility. It can be activated by issuing CLP commands, by calling the ADMIN_CMD stored procedure, or by calling its API, db2Import, through a user application.
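For example, the same operation can be issued from the CLP or through the ADMIN_CMD stored procedure. This is a minimal sketch: the schema, table, and file names are hypothetical, and in the ADMIN_CMD case the input file path is resolved on the database server.

```
-- From a CLP script (run with db2 -tvf): import an IXF file into a table.
IMPORT FROM employee.ixf OF IXF MESSAGES import.msg INSERT INTO hr.employee;

-- The same operation through the ADMIN_CMD stored procedure; here the
-- input file is read on the database server, not the client.
CALL SYSPROC.ADMIN_CMD('IMPORT FROM /data/employee.ixf OF IXF INSERT INTO hr.employee');
```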
- Import supports IXF, ASC, and DEL data formats.
- Import can be used with file type modifiers to customize the import operation.
- Import can be used to move hierarchical data and typed tables.
- Import logs all activity, updates indexes, verifies constraints, and fires triggers.
- Import allows you to specify the names of the columns within the table or view into which the data is to be inserted.
- Import can be used with Db2 Connect.
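As a sketch of the format and column-name support noted in this list (the file layout and all names are hypothetical), a non-delimited ASCII file can be imported by byte positions:

```
-- Each METHOD L pair gives the start and end byte positions of one column
-- in the ASC input file; the column list names the target table columns.
IMPORT FROM staff.asc OF ASC
  METHOD L (1 6, 8 32, 34 42)
  INSERT INTO hr.staff (id, name, salary);
```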
Import modes
| Mode | Best practice usage |
|---|---|
| INSERT | Inserts input data into target table without changing existing data |
| INSERT_UPDATE | Updates rows with matching primary key values with the values of input rows; where there is no matching row, inserts the imported row into the table |
| REPLACE | Deletes all existing data and inserts imported data, while keeping table and index definitions |
| REPLACE_CREATE | Deletes all existing data and inserts imported data, while keeping table and index definitions; creates the target table and index if they don't exist |
| CREATE | Creates the target table and index; can specify the name of the table space where the new table is created |
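To make the modes in this table concrete, here are hedged CLP sketches. All names are hypothetical; note that CREATE and REPLACE_CREATE require PC/IXF input, because the table definition is read from the file.

```
-- Update rows that match on the primary key, insert the rest (INSERT_UPDATE):
IMPORT FROM stock.del OF DEL INSERT_UPDATE INTO inventory.stock;

-- Discard existing rows but keep table and index definitions (REPLACE):
IMPORT FROM stock.ixf OF IXF REPLACE INTO inventory.stock;

-- Create the table, optionally in a named table space, if needed (CREATE):
IMPORT FROM stock.ixf OF IXF CREATE INTO inventory.stock IN userspace1;
```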
In IBM® Data Studio Version 3.1 or later, you can use the task assistant for importing data. Task assistants can guide you through the process of setting options, reviewing the automatically generated commands to perform the task, and running these commands. For more details, see Administering databases with task assistants.
How import works
- Locking tables
Import acquires either an exclusive (X) lock or a nonexclusive (IX) lock on existing target tables, depending on whether you allow concurrent access to the table.
- Locating and retrieving data
Import uses the FROM clause to locate the input data. If your command indicates that XML or LOB data is present, import will locate this data.
- Inserting data
Import either replaces existing data or adds new rows of data to the table.
- Checking constraints and firing triggers
As the data is written, import ensures that each inserted row complies with the constraints defined on the target table. Information about rejected rows is written to the messages file. Import also fires existing triggers.
- Committing the operation
Import saves the changes made and releases the locks on the target table. You can also specify that periodic commits take place during the import.
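The locking behavior described in this list can be influenced from the command line. For example, an INSERT-mode import can request concurrent access with the ALLOW WRITE ACCESS clause; this is a sketch with hypothetical names, and ALLOW WRITE ACCESS cannot be combined with REPLACE, CREATE, or REPLACE_CREATE.

```
-- ALLOW WRITE ACCESS requests a nonexclusive (IX) lock, so other
-- applications can read and update the table while rows are imported.
IMPORT FROM orders.del OF DEL ALLOW WRITE ACCESS COMMITCOUNT 1000
  INSERT INTO sales.orders;
```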
The following information is required when importing data:
- The path and the name of the input file
- The name or alias of the target table or view
- The format of the data in the input file
- The method by which the data is to be imported
- The traverse order, when importing hierarchical data
- The subtable list, when importing typed tables
- Additional options
There are a number of options that allow you to customize an import operation. You can specify file type modifiers in the MODIFIED BY clause to change the format of the data, to tell the import utility what to do with the data, and to improve performance.
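For instance, file type modifiers, a column-selection method, and target column names can be combined in a single command. This is a sketch; the delimiter choice and all names are hypothetical.

```
-- MODIFIED BY coldel| declares the pipe character as the column delimiter;
-- METHOD P picks input columns 1, 2, and 4 by position and maps them to
-- the named target columns.
IMPORT FROM staff.del OF DEL
  MODIFIED BY coldel|
  METHOD P (1, 2, 4)
  INSERT INTO hr.staff (id, name, salary);
```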
The import utility, by default, does not perform commits until the end of a successful import, except in the case of some ALLOW WRITE ACCESS imports. This improves the speed of an import, but for the sake of concurrency, restartability, and active log space considerations, it might be preferable to specify that commits take place during the import. One way of doing so is to set the COMMITCOUNT parameter to "automatic," which instructs import to internally determine when it should perform a commit. Alternatively, you can set COMMITCOUNT to a specific number, which instructs import to perform a commit once that specified number of records has been imported.
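For example, with hypothetical names:

```
-- Let the utility decide when to commit:
IMPORT FROM orders.del OF DEL COMMITCOUNT AUTOMATIC INSERT INTO sales.orders;

-- Commit every 5000 rows; after a failure, RESTARTCOUNT skips the rows
-- that the messages file shows were already committed:
IMPORT FROM orders.del OF DEL COMMITCOUNT 5000 RESTARTCOUNT 20000
  INSERT INTO sales.orders;
```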
There are a few ways to improve import's performance. Because the import utility is an embedded SQL application and does SQL fetches internally, optimizations that apply to SQL operations apply to import as well. You can use the compound file type modifier to insert a specified number of rows at a time, rather than the default row-by-row insertion. If you anticipate that a large number of warnings will be generated during the import (and, therefore, slow down the operation), you can also specify the norowwarnings file type modifier to suppress warnings about rejected rows.
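Both modifiers can be combined in a single MODIFIED BY clause (hypothetical names):

```
-- compound=100 inserts 100 rows per statement instead of one at a time;
-- norowwarnings suppresses warnings about rejected rows.
IMPORT FROM orders.del OF DEL
  MODIFIED BY compound=100 norowwarnings
  INSERT INTO sales.orders;
```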
- Messages file
During an import, standard ASCII text message files are written to contain the error, warning, and informational messages associated with that operation. If the utility is invoked through the application programming interface (API) db2Import, you must specify the name of this file in advance with the MESSAGES parameter; otherwise, specifying it is optional. The messages file is a convenient way of monitoring the progress of an import, as you can access it while the import is in progress. In the event of a failed import operation, message files can be used to determine a restarting point by indicating the last row that was successfully imported.
Note: If the volume of output messages generated by an import operation against a remote database exceeds 60 KB, the utility will keep the first 30 KB and the last 30 KB.
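For example, a CLP import that writes its diagnostics to a messages file might look like this (hypothetical names):

```
-- Errors, warnings, and progress messages are written to import.msg,
-- which can be read while the import is still running.
IMPORT FROM orders.del OF DEL COMMITCOUNT 1000 MESSAGES import.msg
  INSERT INTO sales.orders;
```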