IBM Support

Choosing options for collecting optimizer db2support data

Question & Answer


Which options for data collection are appropriate for collecting optimizer data using the db2support tool?


When collecting optimizer data with the db2support tool, there are a number of options available, and the most appropriate ones to use depend on the situation that needs to be diagnosed.

This article discusses the more common options and what should be considered when choosing each one. Several of them can impact the success or failure of the data collection.

Collection Level Option

The first option to consider is which Collection Level to use. The -cl (or -collect) option specifies how much optimizer data to collect. However, more than just the amount of data needs to be considered when using this option. Choosing the correct level can determine whether the needed information is collected and can even prevent a database or instance outage.

In general, collection level 1 is the most used and requested option and, in the absence of other direction, a good choice. See the table below for the specifics of each collection level and for some reasons to choose a different one.
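As a point of reference, a basic level-1 collection for a single statement can be invoked as shown below. The database name, output directory and statement file are placeholders for this sketch; adjust them for the environment.

```shell
# Collection level 1: compiles the query and collects db2exfmt output plus
# catalog exports and db2look limited to the objects involved in the query.
# SAMPLE, /tmp/db2support_out and query.sql are placeholder names.
db2support /tmp/db2support_out -d SAMPLE -cl 1 -sf query.sql
```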

Collection Level 0

What is collected:
  • Export of catalog data for all tables and views in the database.
  • Complete db2look output.
  • Basic system configuration related to the optimizer.
  • The query is not compiled by the optimizer.
  • Does not require an SQL statement, but will collect it if provided.

When to use:
  • This level provides the most catalog and db2look data, which makes it useful when dealing with multiple queries that involve different sets of tables.
  • For MQT matching issues. If the MQT is not considered, the higher collection levels, which limit the objects to only those involved in the query, will not include data for the MQT in question.
  • If the preparation / compilation of the access plan results in a failure (of the query or the database).

When to avoid:
  • This level does not collect output from the db2exfmt tool, which limits the amount of analysis that can be performed.
  • If there are a large number of objects in the database (not the size of the objects, but the number of them), the catalog exports and db2look may take a long time to run. (The higher collection levels rely on the EXPLAIN tables being populated in order to determine which tables / views to collect data for.)

Collection Level 1

What is collected:
  • Export of catalog data and db2look output limited to the tables / views involved in the query.
  • Basic system configuration related to the optimizer.
  • The query is compiled by the optimizer.
  • Output from the db2exfmt tool.
  • Requires the SQL statement.

When to use:
  • This is the preferred and most requested level because it balances the data collected against the impact of the collection.
  • When db2exfmt output is needed, which is normally the case.
  • If there are a large number of objects in the database, this level can result in a smaller file and may reduce the time needed to collect the data.

When to avoid:
  • If the query will not compile correctly, this Collection Level (or higher) will fail to collect the needed db2look output and catalog exports.
  • If compiling the query causes a database or instance failure, this Collection Level (or higher) can cause an outage.

Collection Level 2

What is collected:
  • The same data as Collection Level 1.
  • Plus, ".db2service" data dumps.

When to use:
  • If additional dumps from the use of ".db2service" are needed. Otherwise, use Collection Level 1.

When to avoid:
  • The same reasons to avoid Collection Level 1 apply to this level.

Collection Level 3

What is collected:
  • The same data as Collection Level 2.
  • Plus, db2batch utility output.

When to use:
  • This level collects db2batch data as well as the other data, which means that in addition to the access plan, it is possible to see snapshot information captured during execution.

When to avoid:
  • The same reasons to avoid Collection Level 1 apply to this level.
  • The query is executed, so queries that modify data would cause changes in the database. For this reason, some types of queries are blocked.
  • Because the query is executed, queries with parameter markers will fail to run.
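Because the higher levels actually run the statement, a level-3 collection (again with placeholder names) should only be used with read-only queries:

```shell
# Collection level 3 runs the statement through db2batch, so the query is
# actually executed. Avoid this for statements that modify data.
# SAMPLE and query.sql are placeholder names for this sketch.
db2support /tmp/db2support_out -d SAMPLE -cl 3 -sf query.sql
```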

Additional db2support options

Below are some additional db2support options that should be considered as they can impact the success or failure of the data collection. Not all options may be available on all releases and fix packs. If in doubt, consult the help for the db2support tool for the version of DB2 that is being used.

-cs or -curschema

What it does:
  • Sets the CURRENT SCHEMA value for the query.

When to use:
  • If there are unqualified tables in the SQL statement. Often a database administrator is the one collecting the data, and by default the query will attempt to use the database administrator's user ID as the CURRENT SCHEMA for any unqualified tables. This can result in an SQL0204N error and a failure to collect the needed data.

-td or -delimiter

What it does:
  • Sets the SQL statement terminator.

When to use:
  • If the SQL statement is terminated by something other than a semicolon (";"), this option instructs the db2support tool to use the provided terminator.

-co or -complete

What it does:
  • Collects the full catalog and db2look data.

When to use:
  • If the issue is with MQT matching, or there are other reasons to require a full set of catalog information while also collecting db2exfmt output, this option can be added to Collection Levels 1 - 3 to override the limiting of the catalog and db2look details and collect a full set of information.

-B or -basic

What it does:
  • Limits the data collected to just the optimizer data.

When to use:
  • If the DIAGPATH contains a lot of data, this option instructs the db2support tool to skip the normal baseline data it gathers (such as the db2diag.log and FODC directories).
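Putting these together, a level-1 collection that overrides the default schema, collects the full catalog, and skips the baseline diagnostics might look like the following sketch (the database, schema and file names are placeholders):

```shell
#   -cs APPSCHEMA  -> resolve unqualified tables against APPSCHEMA
#   -co            -> collect the complete catalog export and db2look output
#   -B             -> skip baseline diagnostics such as db2diag.log and FODC
db2support /tmp/db2support_out -d SAMPLE -cl 1 -sf query.sql -cs APPSCHEMA -co -B

# If the statement in query.sql ends with "@" instead of ";", add: -td @
```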

Common reasons for a failure of data collection

The following are the most common issues that cause the data collection to fail.

  • SQL statements without a statement terminator or a non-default terminator. The default is to have the SQL statement terminated with a semicolon (";"). If there is no terminator in the file, the SQL will fail to be explained. If a non-default terminator is used, the -td (or -delimiter) option needs to be specified on the db2support command.
  • The SQL file provided to the db2support tool includes multiple SQL statements or directives such as connect / terminate. There should be only one SQL statement in the file submitted to the db2support tool. The db2support tool handles connecting to and disconnecting from the database, and including other directives will prevent the collection from completing successfully. Any additional directives (such as current schema) should be passed to the tool through the command line options.
  • The SQL statement contains unqualified tables. This can lead to SQL0204N errors where objects are not found when running the collection. To compensate for this, the -cs (or -curschema) option should be used to specify the schema that the query normally executes under.
  • Explain tables from prior releases. If there are existing explain tables for the user running the db2support command, those tables will be used. However, if those tables were created based on the definitions from a prior release, the explain of the query may fail with an SQL0220N error. If that is the case, use the db2exmig tool to migrate the tables to the current release. (Note, if the explain tables do not exist prior to running the db2support tool with options to collect explain data, the db2support tool will create and then later drop the explain tables automatically. It will not drop explain tables that exist before it is run.)
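To avoid the first three issues above, the statement file should contain exactly one terminated SQL statement and nothing else, with the schema passed on the command line. A minimal sketch, using placeholder object names:

```shell
# Create a statement file containing a single query ending with the default
# ";" terminator -- no CONNECT, TERMINATE or SET CURRENT SCHEMA directives.
cat > query.sql <<'EOF'
SELECT custid, custname
FROM customers
WHERE region = 'EAST';
EOF

# Supply the schema for the unqualified table via -cs instead of a directive.
db2support /tmp/db2support_out -d SAMPLE -cl 1 -sf query.sql -cs APPSCHEMA
```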

Applies to: DB2 for Linux, UNIX and Windows (DB2 Tools - db2support), versions 9.5, 9.7, 10.1 and 10.5, on AIX, HP-UX, Linux, Solaris and Windows.

Document Information

Modified date:
16 June 2018