Best practices when using Data Studio and Optim Development Studio with DB2 for z/OS

IBM Data Studio and Optim™ Development Studio are related products in the Optim portfolio of integrated data management tools. Although there has been much published on the usage of these tools, little information focuses on using these two products when accessing DB2® for z/OS® databases. This article answers commonly asked questions and describes best practices that one of our authors developed as a result of her work with customer questions and problem records.

Marichu Scanlon (marichu@us.ibm.com), Advisory Software Engineer, IBM

Marichu Scanlon is the Data Studio Team Lead for the Continuing Engineering department at Silicon Valley Laboratory in San Jose, California. This group is responsible for maintaining the code bases for Data Studio and InfoSphere Data Architect.



Kathryn Zeidenstein (krzeide@us.ibm.com), InfoSphere Guardium Evangelist, IBM

Kathy Zeidenstein has worked at IBM for a bazillion years. Currently, she is working as a technology evangelist for InfoSphere Guardium data activity monitoring, based out of the Silicon Valley Lab. Previously, she was an Information Development Manager for InfoSphere Optim data lifecycle tools. She has had roles in technical enablement, product management, and product marketing within the Information Management and ECM organizations at IBM.



10 June 2010

Introduction

This section briefly reviews the history of the products described in this article.

Data Studio

IBM Data Studio is a no-charge, downloadable offering that provides basic database administration and database development capabilities, including the following:

  • Connecting to the subsystem
  • Viewing data server objects
  • Creating objects
  • Developing and debugging stored procedures and UDFs
  • Running Visual Explain
  • Tuning single queries

IBM Data Studio is the replacement product for the Developer Workbench, which previously replaced the Development Center.

For more information on the capabilities in Data Studio for DB2 for z/OS, see the Appendix.

Full support is offered for Data Studio based on your entitlement to one of the following IBM database servers:

  • DB2 for Linux®, UNIX®, and Windows®
  • DB2 for i®
  • DB2 for z/OS
  • Informix®

Currently, Data Studio is available in the following two packaging options:

Stand-alone package
The stand-alone package is a small-footprint, stand-alone offering ideally suited for database administrators. The package is sufficient for most users who perform standard database administration or database routine development. It includes basic, single-query tuning capability that was previously included in Optimization Service Center. The stand-alone package does not include SQLJ, Data Web Services, or XML development capability.
Integrated development environment (IDE) package
The IDE package is better suited for developers who want to shell share with other Eclipse offerings. The package includes most administrative capabilities, as well as an integrated Eclipse development environment for Java, XML (for DB2 for Linux, UNIX, and Windows), and Web services development. Single query tuning capability is currently not included.

Unless otherwise stated, the best practices in this document apply to both packages.

Optim Development Studio

Optim Development Studio is a natural upgrade from Data Studio, adding functionality for DBAs, database developers, and data access developers. It includes many of the same capabilities as Data Studio plus a rich set of pureQuery capabilities.

Table 1. Additional capabilities in Optim Development Studio and pureQuery
Function | Data Studio | Optim Development Studio
Integrated Query Editor (SQL and XQuery) | X | X
SQL Builder | X | X
Visual Explain for IBM databases | X | X
Visual Explain for Oracle databases |   | X
Basic single-query tuning (access plan graph, query formatter, statistics advisor, environment capture) | X |
SQL Routine Debugger | X | X
PL/SQL Routine Debugger against DB2 for Linux, UNIX, and Windows databases | X | X
PL/SQL Routine Debugger against Oracle databases |   | X
Java Routine Debugger* |   | X
XML Editor* |   | X
Data Web Services* |   | X
Create, alter, drop IBM database objects | X | X
Create, alter, drop Oracle database objects |   | X
Manage security privileges for IBM database objects | X | X
Manage security privileges for Oracle database objects |   | X
Deploy Web Services on DataPower appliances and JMS |   | X
Tooling to build pureQuery code and bind pureQuery packages |   | X
Correlate SQL to Java source code |   | X
Impact Analysis for Java (object, SQL, source code) |   | X
SQL execution statistics |   | X
Execute pureQuery code |   | X

* Requires IDE package of Data Studio

See the Resources section of the article for more information about pureQuery capabilities.

Because Optim Development Studio and Data Studio share many components, this article covers both products. Unless stated otherwise, the best practices described here hold true for both.

Ensuring DB2 for z/OS is set up properly

Even though Data Studio and Optim Development Studio (the tools) are easy to install and run, if your server is not correctly configured to handle the calls that come from one of the tools, you cannot successfully use them. For example, some jobs must be run on the server side to catalog the stored procedures that enable the retrieval of metadata used in populating the Data Source Explorer. Other jobs set up the environment to ensure the success of stored procedure deployment and debugging. In many cases, improper environment setup is the cause of problems in using the tools.

Although many of the required jobs have most likely been run in your environment when DB2 was installed, there are cases, such as migration from an older release, when the proper stored procedures might not have been deployed. This section serves as a checklist you can use with your system programmer to ensure your server is correctly configured to work with the tools.

See the stored procedure IBM Redbooks® publication listed in the Resources for details about the following sections.

Ensure DB2-supplied stored procedures are installed

Verify that the required DB2-supplied stored procedures are installed and that they are using valid WLM application environments. The following DB2 for z/OS V8 and V9 customization jobs create and bind the various DB2-supplied stored procedures that the tools use to populate the folders in the Data Source Explorer.

DSNTIJRX
Adds REXX support for the DB2-supplied stored procedure DSNTPSMP, which enables creation of external SQL stored procedures.
DSNTIJTM
Sets up the DSNTWR program executed when the DB2-supplied stored procedure WLM_REFRESH is called.
DSNTIJSG
Catalogs and binds the following where needed:
  • The DB2-supplied metadata stored procedures
  • The DSNTPSMP, DSNTBIND, and WLM_REFRESH stored procedures
  • The Java processing stored procedures
  • The ADMIN stored procedures
DSNTIJMS
Catalogs and binds, where needed, the DB2-supplied metadata stored procedures that the SQL tooling uses.

If these stored procedures are not present, the user sees an exception, and an SQLCODE -440 is entered in the error log when a stored procedure is deployed or executed. The code is also added to the error log when you populate some of the pages in the SQL builder within the New Stored Procedure wizard.
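
If you prefer to verify the installation from an SQL interface rather than waiting for an error, a catalog query can show which DB2-supplied routines are registered and which WLM environments they use. The following query is an illustrative sketch against the SYSIBM.SYSROUTINES catalog table; adjust the schema list for your site:

    -- List DB2-supplied routines and their WLM environment assignments
    SELECT SCHEMA, NAME, ROUTINETYPE, WLM_ENVIRONMENT
      FROM SYSIBM.SYSROUTINES
     WHERE SCHEMA IN ('SYSPROC', 'SYSIBM')
     ORDER BY SCHEMA, NAME;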

Verify that you have appropriate authorization

The tools perform queries and updates to catalog tables on your behalf when developing and executing SQL statements, stored procedures, and, if you are using Optim Development Studio, DB2 packages. As you might expect, you need the appropriate DB2 authorizations in order to do these operations. In addition to these DB2 authorizations, you might also need RACF and USS authorizations to access certain resources.

What do Extract and Load mean when the target is DB2 for z/OS?

The Data Source Explorer options to extract and load data from the tools do not invoke any DB2 for z/OS utilities. Extract can be used to put a relatively small amount of data into a file. Load takes data in a Windows-delimited file as input.

General authorizations
Table 2 lists the general authorizations required for users to create or modify objects in the tools. The Data Object Editor used in the Data Source Explorer enables users to create and modify objects from most of the nodes in the Data Source Explorer, so appropriate authorizations must exist to support that work. Example tasks include sampling contents and inserting, updating, deleting, extracting, and loading table data.
Table 2. General authorities and privileges
Task | Authority and privileges
Access target database | CONNECT
Create a view or table using the Data Object Editor or embedded SQL within an application | CREATEIN for the schema
Retrieve rows from a table (Sample Contents, Return All Rows, and Extract) | SELECT
Insert or update an entry in a table or view using the Data Object Editor or embedded SQL within an application | INSERT or UPDATE
Delete rows from a table or view using the Data Object Editor or embedded SQL within an application | DELETE
Drop any object from the Data Source Explorer or embedded SQL within an application | Ownership of the object and at least one of the following: DROPIN privilege for the schema or all schemas; SYSADM or SYSCTRL authority
ALTER any object from the Data Source Explorer or embedded SQL within an application | Ownership of the object and at least one of the following: ALTERIN privilege for the schema or all schemas; SYSADM or SYSCTRL authority
Register stored procedures with a database server | Any of the following CREATE PROCEDURE privileges: SYSADM or DBADM; CREATEIN for the schema; IMPLICIT_SCHEMA; CREATEIN privilege on the collection ID
Debug a stored procedure in the tool | DEBUGSESSION
Run or call a stored procedure | SYSADM or DBADM, or EXECUTE or CONTROL for the package associated with the stored procedure (for SQL stored procedures or Java stored procedures with embedded SQL)
Catalog authorizations
The tools access a number of DB2 system catalog tables when you browse an object's properties or if you create or update (alter) many of the objects in the Data Source Explorer. You need at least SELECT privilege to gather information from these catalogs. Usually, SELECT privilege to the catalogs is granted to PUBLIC.
Execute privileges on DB2-supplied stored procedures
The authorization ID that logs in to the connection should have the required execute privilege on the DB2-supplied stored procedures that the tools use. When a system programmer or administrator runs the jobs that install the DB2-supplied stored procedures, the last step in the job is to grant execute privileges on those stored procedures to PUBLIC. However, some administrators might have customized that step such that users do not have the correct privileges by default.
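
If users are missing these privileges, an administrator can grant them explicitly. The statements below are an illustrative sketch: WLM_REFRESH is one of the DB2-supplied stored procedures discussed here, and TEAMDEV is a hypothetical authorization ID.

    -- Illustrative grants; substitute your own routines and authorization IDs
    GRANT EXECUTE ON PROCEDURE SYSPROC.WLM_REFRESH TO PUBLIC;
    GRANT DEBUGSESSION TO TEAMDEV;  -- needed to debug stored procedures (see Table 2)
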
RACF authorizations
When deploying DB2 for z/OS stored procedures, certain RACF authorizations are needed in order for the tool to refresh the WLM environment remotely and to interface with JES2. The system programmer can customize and execute the following members from SDSNSAMP to achieve this:
  • DSNTEJ6W, step PH06WS01: Customize this step to create and populate a RACF resource profile in resource class DSNR to allow users to call WLM_REFRESH. If this step is not done, the tools display a warning when WLM_REFRESH is called while deploying a stored procedure using the tools.
  • DSNTIJRA, step DSNADSR: Make sure to register to Program Control the DB2 administration enablement modules that implement the following stored procedures:
    • SYSPROC.ADMIN_JOB_FETCH
    • SYSPROC.ADMIN_JOB_CANCEL
    • SYSPROC.ADMIN_JOB_QUERY
    • SYSPROC.ADMIN_JOB_SUBMIT
    • SYSPROC.ADMIN_COMMAND_UNIX
  • For the Unified Debugger setup, when the Session Manager is set up as a started task in z/OS, a RACF resource profile named DB2UDSMD must be defined. Refer to the stored procedures IBM Redbooks publication in the Resources section for the details on creating this profile.
UNIX System Services authorizations
While it might seem obvious that users require read/write (RW) authorization to a particular file structure, there are situations where this requirement is not as obvious. The system programmer must ensure that users have RW authorization in the following situations.
  • When the Session Manager (for Unified Debugger) is launched as a started task in z/OS, an HFS file is created in USS to hold the DB2UDSMD profile environment settings. The user invoking the Session Manager needs RW authority on this HFS file and directory. Refer to the IBM Redbooks publication in the Resources section for details on creating this HFS file.
  • When there are multiple JDK versions available in the z/OS server, the user needs RW authority on all the versions that the tools support. This is because it is possible for the client JDK to be at a different level than the server. The location of the JDK to be used when executing a Java stored procedure can be controlled through the JAVAENV dataset in the WLM application environment procedure corresponding to this JCL.

Verify the Unified Debugger setup (for routine debugging only)

The previous section mentioned some required authorizations for using unified debugging. This section describes some additional setup steps.

First, here is a quick review of the Unified Debugger. The Unified Debugger is the latest debugger technology. The Unified Debugger supports both DB2 for z/OS V8 (with PTF UK25860 from the same release date as DB2 for Linux, UNIX, and Windows V9 FP2) and DB2 for z/OS V9. For DB2 for z/OS V9, the Unified Debugger enables debugging for Java stored procedures, native SQL procedures, and external SQL stored procedures. You can read more about using the Unified Debugger and options for setup in the developerWorks article series in the Resources section.

Complete the following steps to ensure that the server is correctly configured for debugging.

  1. Ensure that the sample job DSNTIJSD has been customized to run successfully on DB2 for z/OS by querying the SYSIBM.SYSROUTINES catalog table and locating the following stored procedures (a sample catalog query follows Figure 1):
    • SYSPROC.DBG_INITIALIZECLIENT
    • SYSPROC.DBG_TERMINATECLIENT
    • SYSPROC.DBG_SENDCLIENTREQUESTS
    • SYSPROC.DBG_SENDCLIENTCOMMANDS
    • SYSPROC.DBG_RECVCLIENTREPORTS
    • SYSPROC.DBG_ENDSESSIONMANAGER
    • SYSPROC.DBG_PINGSESSIONMANAGER
    • SYSPROC.DBG_LOOKUPSESSIONMANAGER
    • SYSPROC.DBG_RUNSESSIONMANAGER
  2. Ensure that for each of those procedures, the WLM application environments are set up with the correct NUMTCB values. The recommended NUMTCB values for the DB2-supplied stored procedures are listed in the IBM Redbooks publication (see Resources).
  3. If you already have Data Studio or Optim Development Studio set up, you can use the Data Source Explorer to verify these NUMTCBs. Open the DB2 for z/OS connection folder, and click Schemas > SYSPROC > Stored Procedures, as shown in Figure 1.
Figure 1. Debugger stored procedures shown in Data Source Explorer
Screen cap: stored procedures shown in Data Source Explorer tooling interface, including DBG_ENDSESSIONMANAGER
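
As an alternative to browsing the Data Source Explorer, the same check can be expressed as a catalog query. This is a sketch that assumes the SYSIBM.SYSROUTINES catalog table; the ESCAPE clause keeps the underscore in DBG_ from acting as a wildcard:

    -- Locate the Unified Debugger routines and their WLM environments
    SELECT NAME, WLM_ENVIRONMENT
      FROM SYSIBM.SYSROUTINES
     WHERE SCHEMA = 'SYSPROC'
       AND NAME LIKE 'DBG!_%' ESCAPE '!'
     ORDER BY NAME;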

As described in the stored procedure IBM Redbooks publication and in the developerWorks article series (see Resources), a key component of stored procedure debugging is the Session Manager, which can be run on the z/OS server, or on the client. Complete the following steps to start the Session Manager.

  1. Before you start the Session Manager on the z/OS server, verify the following:
    • The RACF authorizations are set up as described in Verify that you have appropriate authorization.
    • The DB2UDSMD environment profile file is allocated in a directory that you have RW access to.
    • The started task DB2UDSMD JCL is registered in SYS1.PROCLIB. Test this by issuing the S DB2UDSMD command in the z/OS console.
  2. To start the Session Manager on the client, run the db2dbgm.bat script.
  3. Capture the IP address and port.
  4. Verify that the Preferences settings for the Unified Debugger are set to those values by going to Window > Preferences > Run/Debug > Unified Debugger in the tools.
  5. Click Already running, and enter the IP address and port in the appropriate text boxes.
  6. If your environment has a firewall between the DB2 for z/OS server and the workstation where the tools are installed, and if your Session Manager is on the client, put an exception in your firewall to allow the client Session Manager to ping and communicate with the DB2 for z/OS server.
  7. Start the Session Manager on the z/OS server.

Setting up Visual Explain and query tuning (Data Studio stand-alone only)

Starting with Data Studio v2.2.0.2, some single-query tuning capabilities are available in the stand-alone package, along with Visual Explain. However, in order to use these functions, the DB2 for z/OS server must be configured correctly.

To configure your DB2 for z/OS subsystem to work with Data Studio 2.2.0.2 stand-alone, the system programmer completes the following steps.

  1. Install PTF PK58941 (DSN5OFM).
  2. Modify DSNTESC and DSNTIJOS for your DB2 for z/OS system, and run them. These JCL files are in the DB2 sample library.
  3. Install the DBRMs that are included in the dbrm_dsrcp.zip file, as shown in Figure 2. See Resources for where this file is available for download.
Figure 2. Data Studio stand-alone download site provides updated DBRMs for query tuning
Screen cap: Download using Download Director tab with Contains DBRMs to config query tuner DBRM_DSRCP.zip highlighted

When you run the above JCL files, notice that, in addition to the tables and packages that are required to tune single queries that run on DB2 for z/OS, these JCL files create similar objects for tuning query workloads. Tuning workloads is not supported in Data Studio. Therefore, you do not have to grant privileges on any of the objects for tuning workloads.

After applying the PTF, use Data Studio to bind the packages for single query tuning by completing the following steps.

  1. From the Data Source Explorer, right-click the database, and click Query Tuner > Configure Database for Tuning. A wizard opens. Because this wizard is also used by Optim Query Tuner and Optim Query Workload Tuner, some UI items are not relevant for binding these packages. For example, you can skip the advisor-related and workload-related pages because these do not apply to the stand-alone Data Studio.

    Data Studio checks all the prerequisites for configuring and reports them in the first page. If your system programmer did not bind the DBRMs on your behalf after applying the PTF, then the Packages status field shows Package Not Found, as shown in Figure 3.

Figure 3. Configuration wizard checks for query tuning prerequisites
Screen cap: An error indicates that the ADMF001 package is not found
  2. Click Next. The next page enables you to bind the DBRMs that your system programmer installed.
  3. Click Bind to bind these DBRMs, as shown in Figure 4.
Figure 4. Bind packages for basic, single query tuning
Screen cap: Bind button is highlighted with some packages selected
  4. After binding, create your Explain tables, as shown in Figure 5. You can skip this step if you are reusing tables, such as when teams share existing Explain tables. If your Explain tables are in V8 format and you have migrated to V9, you can migrate the Explain tables and preserve the information in these tables. Customize the SDSNSAMP member DSNTESC to perform the migration for you.
Figure 5. Option to create Explain tables
Screen cap: Enable EXPLAIN tables window with Create Table Spaces and Tables highlighted
  5. If the DB2 for z/OS server was previously configured for use with Visual Explain for DB2 for z/OS or with Optimization Service Center (OSC), install updated DBRMs into your server, and rebind these packages. The package versions of OSC are different from those of stand-alone Data Studio, even though they share the same set of tables. For OSC and Data Studio to coexist, you need to bind a version for each product. Note that only OSC FP7 and later are compatible with Data Studio. See Resources for links to Technotes that contain more details.

Optimizing data source use for integrated data management

This section describes some productivity and performance tips when using the Data Source Explorer.

Use filtering on the Data Source Explorer

In Verify the Unified Debugger setup (for routine debugging only), you used filtering to limit the number of objects that are displayed, which can help tremendously with performance and productivity when you administer a large number of objects. To activate a filter, complete the following steps:

  1. Right-click the folder (such as Schemas), and select Filter.
  2. In the Filter window, uncheck Disable Filter.

There are two ways to filter:

  • By specifying a filter predicate, such as Starts with the characters DBG
  • By selecting from a list of values for the object being filtered. For example, in Figure 6, the filtering is applied on the Schemas folder, and only objects within the SYSPROC schema are to be displayed. This coordinates with the filtering you saw in Figure 1.
Figure 6. Connection filter properties
Screen cap: Connection Filter Properties window with the SYSPROC object selected and highlighted

Use configuration repository (only available in Optim Development Studio and other Optim tools)

If you have team members with a common set of data servers that they connect to, your team might benefit from Optim Development Studio's capability to specify configuration repositories, which avoids the need to manually pass connection information to all members of a team. The configuration repository is available only in the priced, Eclipse-based Optim products, including Optim Development Studio, Optim Database Administrator, and InfoSphere™ Data Architect.

A configuration repository is a set of tables created in a database server that members of a team or project can access. The configuration repository, which requires a one-time setup, stores the information for the database connections that the team members need to use.

For example, assume that a team working on a project requires connectivity to the following data servers:

  • DB2LUW1: a DB2 on LUW data server
  • DB2Z_V81: a DB2 on z/OS V8 server in New Function Mode
  • DB2Z_V91: a DB2 on z/OS V9 server in New Function Mode

As the administrator, select a database that is accessible by everyone on the team, and configure a repository there. This repository holds connection information, including database name, database location, host, and JDBC driver location of the servers above. The connections to the servers do not need to be already created or active when the repository is created or accessed.

When a member of the team launches Optim Development Studio, he simply creates a connection to the database server that contains this repository, and he will be able to access the connection information for all three servers, even if the connection is not active. You can find a detailed discussion in the article, "Using common connections with Optim solutions” (see Resources).

Share profiles using export/import

If you are using IBM Data Studio, the configuration repository is not an option for you. Another way to share common connections is to export specific connection profiles and have your team members import them into their workspaces. There are limitations to using the connection profiles when compared to the configuration repository, including the following:

  • Connection must be in a connected state in order to be exported.
  • Unless you originally created the connection without saving the password, the user ID and password will be exported. It is recommended that any connections you want to share with others not include this information.
  • Exported connection profiles cannot be grouped logically.
  • Imported connection profiles based on a DB2 alias will require the same alias names to exist in the tools into which they are imported.
  • You will need to tell your team members where on the file system the exported connection file resides.

To export one or more connection profiles, complete the following steps.

  1. Click the Export icon at the top right corner of the Data Source Explorer.
  2. Select the connection profiles you want to export, and specify a file to which the profiles will be saved. In the example, the profiles for EC184V9 and EC218V9 are selected into a file named ConnProfile, as shown in Figure 7. This location could be part of a shared file repository in a well-defined location that is shared with other members of the team. Notice also that the specified information is encrypted.
Figure 7. Choosing connection profiles to export
Screen cap: profiles EC184V9 and EC218V9 are selected, and c:\Export\ConnProfile is the file name

Now, when other users launch their workspaces, they can import the connection profiles.

  1. Click the Import icon, as shown in Figure 8.
Figure 8. Click the marked icon to import connection profiles from the file system
Screen cap: import is located in upper right of data source explorer
  2. In the Import Connection Profiles dialog, specify the location of the file containing the exported connection profiles.

Subset your DB2 Connect aliases

Some organizations maintain a DB2 Connect connection profile, which contains the DB2 aliases for all of the DB2 servers available in their systems. This profile is usually imported into each team's or project’s DB2 Connect. Many teams really need to access only a subset of those server aliases. Using the tools, a team can filter the entire list of aliases used for DB2 Connect.

To filter your DB2 Connect aliases, complete the following steps.

  1. Launch the tool with the default setting, which loads all DB2 aliases.
  2. Delete all the DB2 aliases that you don’t need. For example, you might want only those DB2 aliases that target DB2 for z/OS V9. When you restart the tool, the deleted DB2 aliases are no longer listed in the Database Connections folder, as shown in Figure 9.
Figure 9. Before and after DB2 aliases are deleted
Screen cap: Explorer with 21 DB2 connections listed; after deleting, 5 are listed

If you need to restore your deleted DB2 aliases, complete the following steps.

  1. Click Window > Preferences > Data Management > DB2 Options.
  2. Check Restore deleted DB2 aliases on startup, as shown in Figure 10.
Figure 10. Restore aliases to view
Screen cap: DB2 Options shows Restore deleted DB2 aliases on startup checked and highlighted

If you don’t want to use DB2 aliases at all, and you prefer to use your own set of JDBC drivers, complete the following steps.

  1. Click Window > Preferences > Data Management > DB2 Options.
  2. Uncheck Load DB2 alias connections on startup.
  3. Restart the workspace by clicking File > Restart.

DB2 aliases no longer appear in the Database Connections folder. Note that if a DB2 alias is a target server for a data development project in the workspace, unchecking Load DB2 alias connections on startup does not remove this DB2 alias from the next startup. This is because the tools detect that this connection is required by one of the projects.

Work offline

The tools let you work offline, which means that you are unconnected to the database. This is useful when you are developing SQL statements and routine logic and you do not need to examine the output returned from executing the statement or routine, such as in the case of result sets and output parameters.

To work offline, complete the following steps.

  1. Save the metadata model by right-clicking the connection profile in the Data Source Explorer and selecting Save Offline, as shown in Figure 11.
Figure 11. Options to save metadata for current work and to work offline
Screen cap: right click EC184V9 shows Save offline and Work offline highlighted
  2. Select Work Offline. This action disconnects you from the database server.

Use a flat view of objects

If you want to see a view of objects similar to how the DB2 Control Center displays them, you can switch the Data Source Explorer from a hierarchical (tree) view to a flat view by clicking the view toggle icon, as shown in Figure 12.

Figure 12. Highlighted icon toggles between flat and hierarchical views
Screen cap: icon is third icon from the left in Data Source Explorer

When you request a flat view, the tools refresh your Database Connections folders and display the folders by type. When you click a node, the elements of that node are displayed in the editor in a tabular format. For example, Figure 13 shows that the Table Spaces node is selected, and all the table spaces are listed.

Figure 13. Flat view lists objects by type
Screen cap: folders listed as databases, schemas, tables; Table Spaces is highlighted; all table spaces in EC184V9 listed

Use a common version for the SQLJ/JDBC driver and JRE/JDK

The tools ship with a set of JDBC driver jar files and a set of Java Runtime Environment files. Optim Development Studio also ships with a Java Development Kit. Because the tools can coexist or shell-share with other Eclipse-based products, issues sometimes arise about which JDBC or JRE/JDK version to use.

Some customers prefer to use one version of the IBM Data Server Driver for JDBC and SQLJ (commonly known as the JCC driver) and JRE/JDK files for all their products. Other customers prefer to use the JCC driver version supplied by DB2 Connect and a specific Java level for all their Java application development. Doing so prevents incompatibilities in the compiled or deployed code once they are in production.

To change the default jars, modify their location by completing the following steps.

  1. Right-click Connection Profile > Properties > Driver Properties.
  2. Click the Edit Driver Definition icon (the triangle shape to the right of the driver definition field).
  3. In the Driver Definition dialog, click the Jar List tab.
  4. Select one of the jar files, and select Edit Jar/Zip, as shown in Figure 14.
Figure 14. Select this button to change location of SQLJ/JDBC driver jars
Screen cap: Edit JAR/Zip button is highlighted in Jar List
  5. Select the location of your common JDBC jar files.

Similarly, you can set a common JRE/JDK version in Optim Development Studio by completing the following steps.

  1. Click Window > Preferences > Java > Installed JREs.
  2. Click Add to add another version, as shown in Figure 15.
Figure 15. Add a new version of the JRE/JDK
Screen cap: a JDK is shown, and the Add button is highlighted to the right of the selected JDK.
  3. Click Remove to remove the current version, or disable a version by unchecking it.

To set the JDK, complete the following steps.

  1. From the Preferences window, click Data Management > SQL Development > Stored Procedures and User Defined Functions > Deploy Options.
  2. In the Java Home section, click Browse to bring up the file browser, as shown in Figure 16.
Figure 16. Set the JDK from Deploy Options under Preferences
Screen cap: The Java home section of the Deploy preferences page is highlighted
  3. Go to the directory where the common JDK is located.

Optimizing productivity with SQL tools

This section describes some useful tips for SQL development.

Set preferences to limit result sets and size of data

Queries can return hundreds of thousands of rows and very large amounts of data, in which case execution can take a long time and slow down performance during development. Application developers usually do not need to see all the rows, so limiting the rows fetched by using the FETCH FIRST x ROWS ONLY clause helps the performance of result-set retrieval.
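
For example, the following query returns at most 500 rows no matter how large the table is. DSN8910.EMP is the DB2 9 sample table; substitute one of your own:

    SELECT EMPNO, LASTNAME, SALARY
      FROM DSN8910.EMP
     ORDER BY SALARY DESC
     FETCH FIRST 500 ROWS ONLY;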

The tools provide you with the capability to limit the number of rows returned from a query or stored procedure result set, as well as the number of rows displayed in the SQL Results view. To set these limits, click Window > Preferences > Data Management > SQL Development > SQL Results View Options.

For example, Figure 17 shows the number of rows returned from a query limited to 500 and the number of rows displayed in the tool (in the SQL Results view) limited to 500 as well.

Figure 17. Set SQL results limits in the SQL development area of Preferences
Screen cap: Max row count and Max display row count are highlighted with a value 500 in each field

Use SQL templates

SQL templates are a great productivity enhancement in any of the following circumstances:

  • If you develop SQL statements that access the same tables over and over again
  • If the SQL statements follow some pattern
  • If you want to facilitate use of certain coding standards for SQL

With SQL templates, part of the statement has been pre-coded. Although the tools provide several common templates for SQL and XQuery, you can modify those templates and create your own templates that can be shared with other members of the team.

Complete the following steps to create your own templates using the Template Editor.

  1. Click Window > Preferences > Data Management > SQL Development > SQL and XQuery Editor > Templates to launch the Templates dialog, as shown in Figure 18.
Figure 18. Templates dialog lets you edit existing templates or create new ones
Screen cap: INSERT template highlighted; INSERT INTO ${table_name} (${col1}, ${col2}) VALUES (${val1}, ${val2})
  2. Select one of the templates from the list to see a preview of that template.
  3. Edit the template to suit your needs by clicking Edit, or create a new one by clicking New.

You can export templates you have created to a file and share the templates with other team members. The article “Building efficiencies into a DBA’s day” (see Resources) describes in more detail the use of command templates in Optim Database Administrator. However, the principle is the same for SQL and for other tools as well.
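
As an illustration, a team that standardizes on row limits and an isolation clause for ad hoc browsing might define a template along the following lines. The ${...} variables are template placeholders that the editor prompts you to replace; the names shown are hypothetical:

    SELECT ${columns}
      FROM ${schema}.${table}
     WHERE ${predicate}
     FETCH FIRST ${rows} ROWS ONLY
     WITH UR;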

Import predefined SQL statements

Some organizations have a dedicated developer or group of developers who ensure that the SQL statements that are embedded in the applications are properly tuned. These statements are then handed to the application developers. Stored procedure developers can import predefined SQL statements into the scripts folder from a file system and later import them into their stored procedures.

To import predefined SQL statements, right-click the scripts folder of your project, and select Import. Direct the file browser to the location of your predefined SQL statements. You can use these predefined SQL statements in your stored procedures or pureQuery applications.
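
As a sketch of what such a hand-off might look like, a predefined script is simply a file of tuned statements, one per statement terminator. The file name, tables (from the DB2 sample database), and parameter markers below are illustrative:

    -- tuned_queries.sql (hypothetical file from the SQL tuning team)
    SELECT EMPNO, LASTNAME FROM DSN8910.EMP WHERE WORKDEPT = ?;
    SELECT DEPTNO, DEPTNAME FROM DSN8910.DEPT WHERE ADMRDEPT = ?;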

Use content assist

Along with templates, you can use content assist when you are creating or modifying your SQL statements. This feature enables you to validate the following:

  • The schemas within the database
  • The tables within a schema
  • The fields or columns within a table

This is an especially useful feature when modifying existing SQL statements.

If you have a filter applied, you do not see the objects that are filtered out, and the parser reports these objects as errors (that is, not found). You don’t have to disable your filter to get rid of the errors. Right-click the statement, and select Validate Table References. This performs a quick query on the tables used in your SQL statement and clears the errors if those tables are found in the connected database server.

Turn off statement validation until you are ready

Some DBAs or developers prefer having statement validation in operation while editing, while others prefer to wait until the statement is fully coded before having the tool validate the syntax. If you are a person who is annoyed by those red marks in the left margin while coding SQL, you can turn off validation by right-clicking in the SQL editor and unselecting the menu option to Validate Statement Syntax.

When you are done editing the statement, right-click in the editor again and turn validation back on, as shown in Figure 19.

Figure 19. Right-click in editor to select or deselect statement validation
Screen cap: in sql script editing, right-click menu shows Validate Statement Syntax option is checked

Optimizing routine development

This section highlights some best practices for developing stored procedures or user-defined functions.

Set preferences for deploying routines

As with SQL statements, you can set default values for your routine’s deployment options in Deploy Options preferences for stored procedures. To set these options, click Window > Preferences > Data Management > SQL Development > Stored Procedures and User Defined Functions > Deploy Options, and set the default routine options for Java and SQL routines.

Many organizations set the bind options and WLM Environment in the Preferences so that each new stored procedure has these values.

Another trick is to create a mask for some of the fields, like the WLM Environment. This is useful when a specific WLM application environment is assigned to a developer but the first 6 or 7 characters of the WLM Environment are the same. At stored procedure creation time, the developer simply adds the last character to the WLM Environment name.

For example, assume that all the WLM Environment names start with DB2WLMD, and each developer is assigned to a different WLM environment, starting with DB2WLMD1 through DB2WLMD9. Each developer can set the Preference to DB2WLMD, and during new stored procedure development, the WLM ENVIRONMENT field is prepopulated with the first characters of the WLM environment name, as shown in Figure 20.

Figure 20. Set deployment in preferences (left side) to prepopulate option at deploy time (right side)
Screen cap: DB2WLMD is WLM environment; 1 is appended to get DB2WLMD1 at deploy time

Specify a build owner, package owner, and qualifier

For stored procedures, the build owner is the value of the CURRENT SQLID in effect when the CREATE PROCEDURE is invoked. For external SQL stored procedures, the CREATE PROCEDURE is invoked by the build utility, DSNTPSMP, rather than by Data Studio itself. So, to correctly specify this, the Build Owner field in the Deploy wizard must be set. It is good practice to explicitly set these values rather than assume that the CURRENT SQLID is correctly set at deploy time. When you set this explicitly, the tools pass the value in Build Owner and Package Owner to DSNTPSMP, regardless of the value of the CURRENT SQLID at the time DSNTPSMP was invoked.

Also, some organizations give the bind authorization to secondary authorization IDs rather than to the primary authorization ID. By explicitly setting the build owner and the package owner to a secondary authorization ID, access to the objects manipulated by the stored procedure can be given to the secondary authorization ID, rather than the primary authorization ID used to log in to the server.
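
For context, the special register involved can be set and inspected directly in SQL. The authorization ID below is hypothetical; setting the Build Owner field explicitly spares you from depending on whatever this register happens to contain at deploy time:

    SET CURRENT SQLID = 'TEAMDBA';               -- hypothetical secondary authorization ID
    SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1;  -- verify the register value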

Limit the number of Routine Editor windows that are open

As with SQL statement development, you can turn off statement validation during stored procedure development. If you leave it turned on, the tools continuously validate the syntax for each modification made in any Routine Editor's Source tab. Usually this validation completes very quickly. However, issues occur when many Routine Editor windows (tabs) are open or when the stored procedure being modified contains thousands of lines of code. So, it is a good practice to keep open only the routine that you are modifying and close all other editors.

Explicitly specify applicable options for external and native SQL stored procedures

This best practice applies to the case when there are existing V8 stored procedures that you are modifying in the tools to use against V9. In V8, only external SQL stored procedures were available. However, as of V9, with the addition of native SQL stored procedures, there can be ambiguity around which type of stored procedure is being specified.

To avoid problems, make it a practice to always code the FENCED and EXTERNAL keywords for your external SQL stored procedures, and code the VERSION keyword for your native SQL stored procedures. The tools automatically add the FENCED keyword when the New Stored Procedure wizard is used to create an external SQL stored procedure. The tools also add the VERSION keyword when used to create a native SQL stored procedure. However, if an external SQL stored procedure created in DB2 for z/OS V8 is migrated to DB2 for z/OS V9, the stored procedure does not have the FENCED keyword. Therefore, the tools assume that this stored procedure is a native SQL stored procedure, because the VERSION keyword is optional for native SQL stored procedures. This can cause errors in both re-deployment and execution.
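
The following skeletons illustrate the distinction. Procedure names, the WLM environment, and the bodies are placeholders, and most options are omitted; this is a sketch of the keywords to code, not a complete deployable definition:

    -- External SQL procedure: FENCED and EXTERNAL make the type unambiguous
    CREATE PROCEDURE DEVSCHEMA.SET_SALARY (IN P_EMPNO CHAR(6), IN P_SALARY DECIMAL(9,2))
      FENCED
      EXTERNAL NAME SETSAL
      LANGUAGE SQL
      WLM ENVIRONMENT DB2WLMD1
    BEGIN
      UPDATE DSN8910.EMP SET SALARY = P_SALARY WHERE EMPNO = P_EMPNO;
    END

    -- Native SQL procedure (V9 and later): VERSION marks it as native
    CREATE PROCEDURE DEVSCHEMA.SET_SALARY_N (IN P_EMPNO CHAR(6), IN P_SALARY DECIMAL(9,2))
      VERSION V1
      LANGUAGE SQL
    BEGIN
      UPDATE DSN8910.EMP SET SALARY = P_SALARY WHERE EMPNO = P_EMPNO;
    END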

Use code fragments for code reuse

Organizations that want to standardize some of the code in their stored procedures can benefit from using this feature in the tools. Examples of common code include the following:

  • Common data declarations in SQL routines
  • Copyright prologs
  • Common exception handlers
  • Common imports, methods, and catch-try blocks for Java stored procedures.

In the New Stored Procedure wizard, on the Code Fragments page, click the ellipsis next to the code fragment to launch the file browser. You can then point it to the location of your common code, as shown in Figure 21.

Figure 21. Specifying a fragment
Screen caps: Ellipsis button highlighted; select file MYSP_SQL_HDR.FRAGMENT; fragment details in text editor

This page is customized to the language type of your stored procedure, and the tools can insert the common code in the proper area, as shown in Figure 22.

Figure 22. Inserted fragment
Screen cap: SPFRAG1 shows SQL procedure editor window with fragments inserted into the in-progress stored procedure

Connect to your target server before creating SQLJ routines

The tools rely on information obtained from the target database server to create the correct iterators for an SQLJ stored procedure. (Note that SQLJ procedure development is not supported in the Data Studio stand-alone package.) The New Stored Procedure wizard generates the iterator by connecting to the database and querying the data types of the columns participating in the query. If the table has not been created or the database server is not connected, the tools still generate an iterator. However, you need to edit the stored procedure with the table column information before deploying it.

Import SQL statements into your stored procedure

Some organizations prefer to hand the SQL statements to use in an application directly to their application developers. In some cases, the application developer is not knowledgeable in SQL. The section Optimizing productivity with SQL tools described how developers can import these statements into their projects. You can use the New Stored Procedure wizard to import or embed those statements into your stored procedure by completing the following steps.

  1. On the SQL Statements page of the wizard, click Remove to remove the default SQL statement in the Statement Details area.
  2. Click Import, as shown in Figure 23.
Figure 23. Remove existing default and then import desired SQL
Screen cap: SQL statements window showing Remove and Import buttons highlighted

The Import SQL Statements dialog is launched.

  3. Select the SQL statements that you want to use in your stored procedure, and click Import, which brings up the dialog shown in Figure 24.
Figure 24. Select the scripts you want to import
Screen cap: Script 1, Script 2, and Script 3 selected showing syntax; Import button highlighted

The selected statements are shown in the Statements window of the SQL Statements page.

  4. Click each statement to display it in the Statement Details window.
  5. If you have more than one SQL statement on the SQL Statements page of the wizard, be sure to change the Result set field to Multiple, as shown in Figure 25. Otherwise, an input parameter called whichQuery is generated, and the statements are coded in a case expression.
Figure 25. For multiple SQL statements, specify multiple result sets (unless you want all statements in a case expression)
Screen cap: Result set field is highlighted and contains the value Multiple

Optimizing query tuning and Visual Explain

Refer to the section Setting up Visual Explain and query tuning to ensure you are correctly configured to use these tools.

Visual Explain provides a familiar access plan graph, and you can continue to use that if you are comfortable with it. With query tuning, you get much more capability. With Data Studio stand-alone single query tuning, you can do the following:

  • Capture queries from all data sources that the Optimization Service Center for DB2 for z/OS supports and from XML files that are exported from DB2 Query Monitor for z/OS.
  • View formatted queries by collapsing and expanding parts of the query and by seeing associated statistics for each part of the query.
  • View access plan graphs. The access plan graph provided with query tuning is enhanced over what you get with Visual Explain. For example, you can display the PLAN_TABLE record for the explained query (DB2 for z/OS only), whereas Visual Explain cannot. For DB2 for z/OS, the tool also automatically checks that the Explain tables are correct and gives you the option to migrate them if needed. However, the query tuning access plan graph is currently included only in Data Studio stand-alone and in the Query Tuner set of products. You can use Visual Explain in products such as Optim Development Studio. (Visual Explain also supports Informix Dynamic Server and Oracle through Optim Development Studio.)
  • Capture information about the data server that queries run against. This capability is called query environment capture. It is also known in Optimization Service Center for DB2 for z/OS as service SQL.
  • Generate reports on the performance of queries.
  • Run the Query Statistics Advisor to analyze the statistics that are available for the data that a query accesses. You can also analyze conflicting statistics. And you can look for additional statistics that you might capture to improve how the data server processes the query.

Recommendation: If you are using Data Studio stand-alone with DB2 for z/OS or DB2 for Linux, UNIX, and Windows, use the no-charge query tuning capability in Data Studio.

Launching query tuner or Visual Explain

People often are not familiar with how to launch query tuning from Data Studio.

In Data Studio V2.2.0.2 stand-alone, you launch either the basic single query tuning capability or Visual Explain from the same launch points using the following methods.

  • In the SQL Editor, highlight and right-click the SQL statement, and select Open Visual Explain (for Visual Explain) or Start Tuning (for single-query tuning, including the access plan graph). Note: If your statement is an XPath or XQuery statement, you can use only Visual Explain.
  • In the Data Project Explorer, right-click an SQL statement that contains an INSERT, an UPDATE, a SELECT, an SQL stored procedure, or an SQL user-defined function. Select Open Visual Explain or Start Tuning.
  • In the Data Source Explorer, right-click an SQL stored procedure, a view, or an SQL user-defined function, and select Open Visual Explain or Start Tuning.

Note: If the tools find more than one INSERT, UPDATE, SELECT, or XQUERY statement inside an SQL stored procedure or SQL user-defined function, only the first statement is used for either Visual Explain or query tuning.

When you use basic single-query tuning, you are prompted to use an existing project or to create a new Query Tuner Project. The editor opens, as shown in Figure 26.

Figure 26. Basic single query tuning editor provides many capabilities
Screen cap: Choose Tuning Activities link is highlighted in the Optim Query Tuner editor

To create an access plan graph in the query tuner, select Choose Tuning Activities, and check the Access Plan Graph box in the Choose Query Tuning Activities window. Alternatively, if you select Tune Query, the statistics advisor runs, the query is formatted, and an access plan graph is generated.

Sharing Explain tables

Some organizations use one set of Explain tables that they also migrate from one version to another. You can specify the schema for the Explain tables you want to use when you launch Visual Explain by specifying the CURRENT SCHEMA in the Collect Explain Data wizard, as shown in Figure 27.

Figure 27. Sharing a common set of Explain tables by specifying the schema
Screen cap: Collect Explain Data window with CURRENT SCHEMA highlighted with value ADMF001

The section Setting up Visual Explain and query tuning described how you can skip creating the Explain tables, which enables various projects to use the same tables.
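
The same sharing idea can be expressed directly in SQL. This sketch assumes ADMF001 (the schema from Figure 27) qualifies the shared Explain tables and that it is one of your authorization IDs; the query number is arbitrary:

    SET CURRENT SQLID = 'ADMF001';     -- qualify the shared Explain tables

    EXPLAIN PLAN SET QUERYNO = 100 FOR
      SELECT LASTNAME FROM DSN8910.EMP WHERE WORKDEPT = 'D11';

    SELECT QUERYNO, QBLOCKNO, PLANNO, METHOD, TNAME, ACCESSTYPE
      FROM ADMF001.PLAN_TABLE          -- read back the access plan rows
     WHERE QUERYNO = 100
     ORDER BY QBLOCKNO, PLANNO;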


Conclusion

The information in this article should help you get up and running successfully with Data Studio and Optim Development Studio with your DB2 for z/OS system. Note that there are active discussion forums for both products on developerWorks. It’s a good idea to subscribe to these forums and take advantage of both IBM and community knowledge about the products.


Acknowledgments

The authors would like to thank Kendrick Ren and Paul Wirth for their review of this article.


Appendix. Features in Data Studio that support DB2 for z/OS

Here are the features in Data Studio that support DB2 for z/OS.

Table 3. Summary of Data Studio support for DB2 for z/OS
Functionality grouping | Details
Database management | Generate DDL; Manage and alter storage groups; Analyze impact; Add to overview diagram
Table management | Create, open (edit data), query, alter, drop; Create index, alias, trigger; Manage privileges; Generate DDL; Value distributions; Visual Explain; Add to overview diagram
View management | Create, open, query, alter, drop; Manage privileges; Extract data; Analyze impact; Value distributions
Alias management | Create, open (edit data), query, alter, drop; Analyze impact; Value distributions
Trigger management | Create, alter, drop; Analyze impact
Schema management | Create, alter, drop; Manage privileges; Generate DDL
Index management | Create and drop (alter only by drop and recreate); Run statistics
Table space management | Create, alter, drop; Manage privileges
User-defined distinct types | Create, alter, drop; Analyze impact
User-defined functions | Create and alter using the routine editor/SQL editor; Drop; Manage privileges; Analyze impact
Stored procedures | Create and alter using the routine editor/SQL editor; Drop; Debug SQL and Java (Java requires the IDE package); Manage privileges
Database users | Add or change; Analyze impact; Generate DDL
Roles | Create, alter, drop; Analyze impact; View membership
XML management | Generate DDL; Register with XSR (requires the IDE package); Drop (requires the IDE package); Annotated schema editor; XML editor (requires the IDE package); XML Schema editor (requires the IDE package)
Data Web Services | Create and deploy Data Web Services (requires the IDE package)
Connection management | Integration with Kerberos and LDAP; Export connection profiles; Connection working sets
Query usability and diagnostics | SQL and XQuery editor with content assist and query formatting; Explain query; Advanced query formatting; Single-query statistics advisor; Advanced Visual Explain (access plan graph); z/OS environment capture for serviceability

Resources

Learn

Get products and technologies

Discuss
