This section briefly reviews the history of the products described in this article.
IBM Data Studio is a no-charge, downloadable offering that provides basic database administration and database development capabilities, including the following:
- Connecting to the subsystem
- Viewing data server objects
- Creating objects
- Developing and debugging stored procedures and UDFs
- Running Visual Explain
- Tuning single queries
IBM Data Studio is the replacement product for the Developer Workbench, which previously replaced the Development Center.
For more information on the capabilities in Data Studio for DB2 for z/OS, see the Appendix.
Full support is offered for Data Studio based on your entitlement to one of the following IBM database servers:
- DB2 for Linux®, UNIX®, and Windows®
- DB2 for i®
- DB2 for z/OS
Currently, Data Studio is available in the following two packaging options:
- Stand-alone package
- The stand-alone package is a small-footprint, stand-alone offering ideally suited for database administrators. The package is sufficient for most users who perform standard database administration or database routine development. It includes basic, single-query tuning capability that was previously included in Optimization Service Center. The stand-alone package does not include SQLJ, Data Web Services, or XML development capability.
- Integrated development environment (IDE) package
- The IDE package is better suited for developers who want to shell share with other Eclipse offerings. The package includes most administrative capabilities, as well as an integrated Eclipse development environment for Java, XML (for DB2 for Linux, UNIX, and Windows), and Web services development. Single query tuning capability is currently not included.
Unless otherwise stated, the best practices in this document apply to both packages.
Optim Development Studio
Optim Development Studio is a natural upgrade product for Data Studio in terms of its added functionality for DBAs, database developers, and data access developers. It includes many of the same capabilities as Data Studio plus a rich set of pureQuery capabilities.
Table 1. Additional capabilities in Optim Development Studio and pureQuery
| Function | Data Studio | Optim Development Studio |
| --- | --- | --- |
| Integrated Query Editor (SQL and XQuery) | X | X |
| Visual Explain for IBM databases | X | X |
| Visual Explain for Oracle databases |  | X |
| Basic single-query tuning (access plan graph, query formatter, statistics advisor, environment capture) | X |  |
| SQL Routine Debugger | X | X |
| PL/SQL Routine Debugger against DB2 for Linux, UNIX, and Windows databases | X | X |
| PL/SQL Routine Debugger against Oracle databases |  | X |
| Java Routine Debugger | * | X |
| Data Web Services | * | X |
| Create, alter, drop IBM database objects | X | X |
| Create, alter, drop Oracle database objects |  | X |
| Manage security privileges for IBM database objects | X | X |
| Manage security privileges for Oracle database objects |  | X |
| Deploy Web Services on DataPower appliances & JMS |  | X |
| Tooling to build pureQuery code and bind pureQuery packages |  | X |
| Correlate SQL to Java source code |  | X |
| Impact Analysis for Java (object, SQL, source code) |  | X |
| SQL execution statistics |  | X |
| Execute pureQuery code |  | X |
* Requires IDE package of Data Studio
See the Resources section of the article for more information about pureQuery capabilities.
Because Optim Development Studio and Data Studio share some components, and some best practices hold true for both products, this article covers both. Unless stated otherwise, the best practices described in this article hold true for both products.
Ensuring DB2 for z/OS is set up properly
Even though Data Studio and Optim Development Studio (the tools) are easy to install and run, if your server is not correctly configured to handle the calls that come from one of the tools, you cannot successfully use them. For example, some jobs must be run on the server side to catalog the stored procedures that enable the retrieval of metadata used in populating the Data Source Explorer. Other jobs set up the environment to ensure the success of stored procedure deployment and debugging. In many cases, improper environment setup is the cause of problems in using the tools.
Although many of the required jobs have most likely been run in your environment when DB2 was installed, there are cases, such as migration from an older release, when the proper stored procedures might not have been deployed. This section serves as a checklist you can use with your system programmer to ensure your server is correctly configured to work with the tools.
See the stored procedure IBM Redbooks® publication listed in the Resources for details about the following sections.
Ensure DB2-supplied stored procedures are installed
Verify that the required DB2-supplied stored procedures are installed and that they are using valid WLM application environments. The following DB2 for z/OS V8 and V9 customization jobs create and bind the various DB2-supplied stored procedures that the tools use to populate the folders in the Data Source Explorer.
- Adds REXX support for the DB2-supplied stored procedure DSNTPSMP, which enables creation of external stored procedures.
- Sets up the DSNTWR program executed when the DB2-supplied stored procedure WLM_REFRESH is called.
- Catalogs and binds the following where needed:
- The DB2-supplied metadata stored procedures
- The DSNTPSMP, DSNTBIND, and WLM_REFRESH stored procedures
- The Java processing stored procedures
- The ADMIN stored procedures
- Catalogs and binds, where needed, the DB2-supplied metadata stored procedures the SQL tooling uses
If these stored procedures are not present, the user sees an exception, and SQLCODE -440 is entered in the error log when a stored procedure is deployed or executed. The code is also added to the error log when you populate some of the pages in the SQL builder within the New Stored Procedure wizard.
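If you hit SQLCODE -440, a quick way to confirm which DB2-supplied procedures are actually cataloged is to query the catalog directly. The following query is a sketch; it lists the procedures in the SYSPROC schema along with their WLM environments:

```sql
-- List DB2-supplied stored procedures and their WLM environments
SELECT NAME, WLM_ENVIRONMENT, EXTERNAL_NAME
  FROM SYSIBM.SYSROUTINES
 WHERE SCHEMA = 'SYSPROC'
   AND ROUTINETYPE = 'P'
 ORDER BY NAME;
```

If a procedure the tools require is missing from the result set, rerun the corresponding customization job with your system programmer.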
Verify that you have appropriate authorization
The tools perform queries and updates to catalog tables on your behalf when developing and executing SQL statements, stored procedures, and, if you are using Optim Development Studio, DB2 packages. As you might expect, you need the appropriate DB2 authorizations in order to do these operations. In addition to these DB2 authorizations, you might also need RACF and USS authorizations to access certain resources.
- General authorizations
- Table 2 lists the general authorizations required for users to create or modify objects in the tools. The Data Object Editor used in the Data Source Explorer enables users to create and modify objects from most of the nodes in the Data Source Explorer, so appropriate authorizations must exist to support that work. Example authorizations include sample contents, insert, update, delete, extract, and load table data.
Table 2. General authorities and privileges
| Task | Authority and privileges |
| --- | --- |
| Access target database | CONNECT |
| Create a view or table using the Data Object Editor or embedded SQL within an application | CREATEIN for the schema |
| Retrieve rows from a table (Sample Contents, Return All Rows, and Extract) | SELECT |
| Insert or update an entry into a table or view using the Data Object Editor or embedded SQL within an application | INSERT or UPDATE |
| Delete rows from a table or view using the Data Object Editor or embedded SQL within an application | DELETE |
| Drop any object from the Data Source Explorer or embedded SQL within an application | Ownership of the object and at least one of the applicable drop authorities |
| ALTER any object from the Data Source Explorer or embedded SQL within an application | Ownership of the object and at least one of the applicable alter authorities |
| Register stored procedures with a database server | Any of the CREATE PROCEDURE authorities or privileges |
| Debug a stored procedure in the tool | DEBUGSESSION |
| Run or call a stored procedure | SYSADM or DBADM, or EXECUTE or CONTROL for the package associated with the stored procedure (for SQL stored procedures or Java stored procedures with embedded SQL) |
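The privileges in Table 2 are granted with ordinary GRANT statements. The sketch below assumes a hypothetical authorization ID DEVUSER, a hypothetical schema DEVSCHEMA, and the DB2 9 sample table DSN8910.EMP:

```sql
-- Allow a developer to read and modify a table
GRANT SELECT, INSERT, UPDATE, DELETE ON DSN8910.EMP TO DEVUSER;

-- Allow creation of objects (tables, views) in a schema
GRANT CREATEIN ON SCHEMA DEVSCHEMA TO DEVUSER;

-- Allow the developer to debug routines with the Unified Debugger (DB2 9)
GRANT DEBUGSESSION TO DEVUSER;
```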
- Catalog authorizations
- The tools access a number of DB2 system catalog tables when you browse an object's properties or if you create or update (alter) many of the objects in the Data Source Explorer. You need at least SELECT privilege to gather information from these catalogs. Usually, SELECT privilege to the catalogs is granted to PUBLIC.
- Execute privileges on DB2-supplied stored procedures
- The authorization ID that logs in to the connection should have the required execute privilege on the DB2-supplied stored procedures that the tools use. When a system programmer or administrator runs the jobs that install the DB2-supplied stored procedures, the last step in the job is to grant execute privileges on those stored procedures to PUBLIC. However, some administrators might have customized that step such that users do not have the correct privileges by default.
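If the PUBLIC grant step was removed, execute access can be granted individually. The statement below is a sketch using DSNWZP (one of the DB2-supplied procedures) and a hypothetical authorization ID DEVUSER:

```sql
-- Re-grant execute access on a DB2-supplied stored procedure
GRANT EXECUTE ON PROCEDURE SYSPROC.DSNWZP TO DEVUSER;
```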
- RACF authorizations
- When deploying DB2 for z/OS stored procedures, certain RACF authorizations are needed in order for the tool to refresh the WLM environment remotely and to interface with JES2. The system programmer can customize and execute the following members from SDSNSAMP to achieve this:
- DSNTEJ6W, step PH06WS01: Customize this step to create and populate a RACF resource profile in resource class DSNR to allow users to call WLM_REFRESH. If this step is not done, the tools display a warning when the WLM_REFRESH is called while deploying a stored procedure using the tools.
- DSNTIJRA, step DSNADSR: Make sure to register to Program Control the DB2 administration enablement modules that implement the following stored procedures:
- For the Unified Debugger setup, when the Session Manager is set up as a started task in z/OS, a RACF resource profile named DB2UDSMD must be defined. Refer to the stored procedures IBM Redbooks publication in the Resources section for the details on creating this profile.
- UNIX System Services authorizations
- While it might seem obvious that users require read/write (RW) authorization to a particular file structure, there are situations where this requirement is not as obvious. The system programmer must ensure that users have RW authorization in the following situations.
- When the Session Manager (for Unified Debugger) is launched as a started task in z/OS, an HFS file is created in USS to hold the DB2UDSMD profile environment settings. The user invoking the Session Manager needs RW authority on this HFS file and directory. Refer to the Redbook in the Resources section for details on creating this HFS file.
- When there are multiple JDK versions available in the z/OS server, the user needs RW authority on all the versions that the tools support. This is because it is possible for the client JDK to be at a different level than the server. The location of the JDK to be used when executing a Java stored procedure can be controlled through the JAVAENV dataset in the WLM application environment procedure corresponding to this JCL.
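For reference, the JDK that a Java stored procedure address space uses is typically pointed to by the JAVA_HOME environment variable in the JAVAENV data set. The fragment below is illustrative only; the path is an assumption and will differ on your system:

```
ENVAR("JAVA_HOME=/usr/lpp/java/J6.0"),
MSGFILE(JSPDEBUG)
```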
Verify the Unified Debugger setup (for routine debugging only)
The previous section mentioned some required authorizations for using unified debugging. This section describes some additional setup steps.
First, here is a quick review of the Unified Debugger. The Unified Debugger is the latest debugger technology. The Unified Debugger supports both DB2 for z/OS V8 (with PTF UK25860 from the same release date as DB2 for Linux, UNIX, and Windows V9 FP2) and DB2 for z/OS V9. For DB2 for z/OS V9, the Unified Debugger enables debugging for Java stored procedures, native SQL procedures, and external SQL stored procedures. You can read more about using the Unified Debugger and options for setup in the developerWorks article series in the Resources section.
Complete the following steps to ensure that the server is correctly configured for debugging.
- Ensure that the sample job DSNTIJSD has been customized to run successfully on DB2 for z/OS by querying the SYSIBM.SYSROUTINES catalog table and locating the following stored procedures:
- Ensure that for each of those procedures, the WLM application environments are set up with the correct NUMTCBs. The values of the recommended NUMTCBs for the DB2-supplied stored procedures are listed in the IBM Redbooks publication (see Resources).
- If you already have Data Studio or Optim Development Studio set up, you can use the Data Source Explorer to verify these NUMTCBs. Open the DB2 for z/OS connection folder, and click Schemas > SYSPROC > Stored Procedures, as shown in Figure 1.
Figure 1. Debugger stored procedures shown in Data Source Explorer
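As an alternative to browsing the Data Source Explorer, the same check can be scripted with a catalog query. This sketch relies on the DBG prefix used by the Unified Debugger support procedures; it lists each procedure's WLM environment so you can verify the corresponding NUMTCB in the WLM procedure JCL:

```sql
-- List Unified Debugger stored procedures and their WLM environments
SELECT NAME, WLM_ENVIRONMENT
  FROM SYSIBM.SYSROUTINES
 WHERE SCHEMA = 'SYSPROC'
   AND NAME LIKE 'DBG%'
 ORDER BY NAME;
```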
As described in the stored procedure IBM Redbooks publication and in the developerWorks article series (see Resources), a key component of stored procedure debugging is the Session Manager, which can be run on the z/OS server, or on the client. Complete the following steps to start the Session Manager.
- Before you start the Session Manager on the z/OS server, verify the following:
- The RACF authorizations are set up as described in Verify that you have appropriate authorization.
- The DB2UDSMD environment profile file is allocated in a directory that you have RW access to.
- The started task DB2UDSMD JCL is registered in SYS1.PROCLIB. Test this by issuing the S DB2UDSMD command in the z/OS console.
- To start the Session Manager on the client, run the db2dbgm.bat script.
- Capture the IP address and port.
- Verify that the Preferences settings for the Unified Debugger are set to those values by going to Window > Preferences > Run/Debug > Unified Debugger in the tools.
- Click Already running, and enter the IP address and port in the appropriate text boxes.
- If your environment has a firewall between the DB2 for z/OS server and the workstation where the tools are installed, and if your Session Manager is on the client, put an exception in your firewall to allow the client Session Manager to ping and communicate with the DB2 for z/OS server.
- Start the Session Manager on the z/OS server.
Setting up Visual Explain and query tuning (Data Studio stand-alone only)
Starting with later releases of Data Studio, some single-query tuning capabilities are available in the stand-alone package, along with Visual Explain. However, to use these functions, the DB2 for z/OS server must be configured correctly.
To configure your DB2 for z/OS subsystem to work with stand-alone Data Studio, the system programmer completes the following steps.
- Install PTF PK58941 (DSN5OFM).
- Modify DSNTESC and DSNTIJOS for your DB2 for z/OS system, and run them. These JCL files are in the DB2 sample library.
- Install the DBRMs that are included in the dbrm_dsrcp.zip file, as shown in Figure 2. See Resources for where this file is available for download.
Figure 2. Data Studio stand-alone download site provides updated DBRMs for query tuning
When you run the above JCL files, notice that, in addition to the tables and packages that are required to tune single queries that run on DB2 for z/OS, these JCL files create similar objects for tuning query workloads. Tuning workloads is not supported in Data Studio. Therefore, you do not have to grant privileges on any of the objects for tuning workloads.
After applying the PTF, use Data Studio to bind the packages for single query tuning by completing the following steps.
- From the Data Source Explorer, right-click the database, and click Query Tuner > Configure Database for Tuning. A wizard opens. Because this wizard is also used by Optim Query Tuner and Optim Query Workload Tuner, some UI items are not relevant for binding these packages. For example, you can skip the advisor-related and workload-related pages because these do not apply to the stand-alone Data Studio.
Data Studio checks all the prerequisites for configuring and reports them in the first page. If your system programmer did not bind the DBRMs on your behalf after applying the PTF, then the Packages status field shows Package Not Found, as shown in Figure 3.
Figure 3. Configuration wizard checks for query tuning prerequisites
- Click Next. The next page enables you to bind the DBRMs that your system programmer installed.
- Click Bind to bind these DBRMs, as shown in Figure 4.
Figure 4. Bind packages for basic, single query tuning
- After binding, create your Explain tables, as shown in Figure 5. You can skip this step if you are reusing tables, such as when teams share existing Explain tables. If your Explain tables are in V8 format and you have migrated to V9, you can migrate the Explain tables and preserve the information in these tables. Customize the SDSNSAMP member DSNTESC to perform the migration for you.
Figure 5. Option to create Explain tables
- If the DB2 for z/OS server was previously configured for use with Visual Explain for DB2 for z/OS or with Optimization Service Center (OSC), install updated DBRMs into your server and rebind these packages. The package versions of OSC are different from those of stand-alone Data Studio, even though they share the same set of tables. For OSC and Data Studio to coexist, you need to bind a version for each product. Note that only OSC FP7 and later are compatible with Data Studio. See Resources for links to Technotes that contain more details.
Optimizing productivity with the Data Source Explorer
This section describes some productivity and performance tips when using the Data Source Explorer.
Use filtering on the Data Source Explorer
In Verify the Unified Debugger setup (for routine debugging only), you used filtering to limit the number of objects that are displayed, which can help tremendously with performance and productivity when you administer a large number of objects. To activate a filter, complete the following steps:
- Right-click the folder (such as Schemas), and select Filter.
- In the Filter window, uncheck Disable Filter.
There are two ways to filter:
- By specifying a filter predicate, such as "Starts with the characters DBG"
- By selecting from a list of values for the object being filtered. For example, in Figure 6, the filtering is applied on the Schemas folder, and only objects within the SYSPROC schema are to be displayed. This coordinates with the filtering you saw in Figure 1.
Figure 6. Connection filter properties
Use configuration repository (only available in Optim Development Studio and other Optim tools)
If you have team members with a common set of data servers that they connect to, your team might benefit from the Optim Development Studio's capability to specify configuration repositories, which avoids the need to manually pass connection information around to all members of a team. The configuration repository is available only in the priced, Eclipse-based Optim products, including Optim Development Studio, Optim Database Administrator, and InfoSphere™ Data Architect.
A configuration repository is a set of tables created in a database server that members of a team or project can access. The configuration repository, which requires a one-time setup, stores the information for the database connections that the team members need to use.
For example, assume that a team working on a project requires connectivity to the following data servers:
- DB2LUW1: a DB2 on LUW data server
- DB2Z_V81: a DB2 on z/OS V8 server in New Function Mode
- DB2Z_V91: a DB2 on z/OS V9 server in New Function Mode
As the administrator, select a database that is accessible by everyone on the team, and configure a repository there. This repository holds connection information, including database name, database location, host, and JDBC driver location of the servers above. The connections to the servers do not need to be already created or active when the repository is created or accessed.
When a member of the team launches Optim Development Studio, they simply create a connection to the database server that contains this repository and can then access the connection information for all three servers, even if the connection is not active. You can find a detailed discussion in the article "Using common connections with Optim solutions" (see Resources).
Share profiles using export/import
If you are using IBM Data Studio, the configuration repository is not an option for you. Another way to share common connections is to export specific connection profiles and have your team members import them into their workspaces. There are limitations to using the connection profiles when compared to the configuration repository, including the following:
- A connection must be in a connected state to be exported.
- Unless you originally created the connection without saving it, the user ID and password will be exported. It is recommended that any connections you want to share with others not include this information.
- Exported connection profiles cannot be grouped logically.
- Imported connection profiles based on a DB2 alias will require the same alias names to exist in the tools into which they are imported.
- You will need to tell your team members where on the file system the exported connection file resides.
To export one or more connection profiles, complete the following steps.
- Click the Export icon at the top right corner of the Data Source Explorer.
- Select the connection profiles you want to export, and specify a file to which the profiles will be saved. In the example, the profiles for EC184V9 and EC218V9 are selected and saved to a file named ConnProfile, as shown in Figure 7. This location could be part of a shared file repository in a well-defined location that is shared with other members of the team. Notice also that the specified information is encrypted.
Figure 7. Choosing connection profiles to export
Now, when other users launch their workspaces, they can import the connection profiles.
- Click the Import icon, as shown in Figure 8.
Figure 8. Click the marked icon to import connection profiles from the file system
- In the Import Connection Profiles dialog, specify the location of the file containing the exported connection profiles.
Subset your DB2 Connect aliases
Some organizations maintain a DB2 Connect connection profile, which contains the DB2 aliases for all of the DB2 servers available in their systems. This profile is usually imported into each team's or project's DB2 Connect installation. Many teams need access to only a subset of those server aliases. Using the tools, a team can filter the entire list of aliases used for DB2 Connect.
To filter your DB2 Connect aliases, complete the following steps.
- Launch the tool with the default setting, which loads all DB2 aliases.
- Delete all the DB2 aliases that you don’t need. For example, you might want only those DB2 aliases that target DB2 for z/OS V9. When you restart the tool, the deleted DB2 aliases are no longer listed in the Database Connections folder, as shown in Figure 9.
Figure 9. Before and after DB2 aliases are deleted
If you need to restore your deleted DB2 aliases, complete the following steps.
- Click Window > Preferences > Data Management > DB2 Options.
- Check Restore deleted DB2 aliases on startup, as shown in Figure 10.
Figure 10. Restore aliases to view
If you don’t want to use DB2 aliases at all, and you prefer to use your own set of JDBC drivers, complete the following steps.
- Click Window > Preferences > Data Management > DB2 Options.
- Uncheck Load DB2 alias connections on startup.
- Restart the workspace by clicking File > Restart.
DB2 aliases no longer appear in the Database Connections folder. Note that if a DB2 alias is a target server for a data development project in the workspace, unchecking Load DB2 alias connections on startup does not remove this DB2 alias from the next startup. This is because the tools detect that this connection is required by one of the projects.
Work offline
The tools let you work offline, which means that you are not connected to the database. This is useful when you are developing SQL statements and routine logic and you do not need to examine the output returned from executing the statement or routine, such as result sets and output parameters.
To work offline, complete the following steps.
- Save the metadata model by right-clicking the connection profile in the Data Source Explorer and selecting Save Offline, as shown in Figure 11.
Figure 11. Options to save metadata for current work and to work offline
- Select Work Offline. This action disconnects you from the database server.
Use a flat view of objects
If you want to see a view of objects similar to how the DB2 Control Center displays them, you can switch the Data Source Explorer from a hierarchical (tree) view to a flat view by clicking the view toggle icon, as shown in Figure 12.
Figure 12. Highlighted icon toggles between flat and hierarchical views
When you request a flat view, the tools refresh your Database Connections folders and display the folders by type. When you click a node, the elements of that node are displayed in the editor in a tabular format. For example, Figure 13 shows that the Table Spaces node is selected, and all the table spaces are listed.
Figure 13. Flat view lists objects by type
Use a common version for the SQLJ/JDBC driver and JRE/JDK
The tools ship with a set of JDBC driver jar files and a set of Java Runtime Environment files. Optim Development Studio also ships with a Java Development Kit. Because the tools can coexist or shell-share with other Eclipse-based products, issues sometimes arise about which JDBC or JRE/JDK version to use.
Some customers prefer to use one version of the IBM Database Server Driver for JDBC and SQLJ (known commonly as the JCC driver) and JRE/JDK files for all their products. Other customers prefer to use the JCC driver version supplied by DB2 Connect, along with a specific Java level for all their Java application development. Doing so prevents incompatibilities in the compiled or deployed code once it is in production.
To change the default jars, modify their location by completing the following steps.
- Right-click Connection Profile > Properties > Driver Properties.
- Click the Edit Driver Definition icon.
- In the Driver Definition dialog, click the Jar List tab.
- Select one of the jar files, and select Edit Jar/Zip, as shown in Figure 14.
Figure 14. Select this button to change location of SQLJ/JDBC driver jars
- Select the location of your common JDBC jar files.
Similarly, you can set a common JRE/JDK version in Optim Development Studio by completing the following steps.
- Click Window > Preferences > Java > Installed JREs.
- Click Add to add another version, as shown in Figure 15.
Figure 15. Add a new version of the JRE/JDK
- Click Remove to remove the current version, or disable a version by unchecking it.
To set the JDK, complete the following steps.
- From the Preferences window, click Data Management > SQL Development > Stored Procedures and User Defined Functions > Deploy Options.
- In the Java Home section, click Browse to bring up the file browser, as shown in Figure 16.
Figure 16. Set the JDK from Deploy Options under Preferences
- Go to the directory where the common JDK is located.
Optimizing productivity with SQL tools
This section describes some useful tips for SQL development.
Set preferences to limit result sets and size of data
Queries can return hundreds of thousands of rows and very large amounts of data, in which case execution can take a long time and slow down performance during development. Application developers usually do not need to see all the rows, so limiting the rows fetched by using the FETCH FIRST x ROWS ONLY clause improves the performance of result-set retrieval.
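The clause can also be coded directly in the statement. This sketch assumes the DB2 9 sample table DSN8910.EMP:

```sql
-- Fetch at most 500 rows; DB2 can stop processing once the limit is reached
SELECT EMPNO, LASTNAME, SALARY
  FROM DSN8910.EMP
 ORDER BY SALARY DESC
 FETCH FIRST 500 ROWS ONLY;
```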
The tools provide you with the capability to limit the number of rows returned from a query or stored procedure result set. To limit the number of displayed rows returned from a query or stored procedure result set, click Window > Preferences > Data Management > SQL Development > SQL Results View Options. This also limits the number of rows displayed in the SQL Results view.
For example, Figure 17 shows the number of rows returned from a query limited to 500 and the number of rows displayed in the tool (in the SQL Results view) limited to 500 as well.
Figure 17. Set SQL results limits in the SQL development area of Preferences
Use SQL templates
SQL templates are a great productivity enhancement in any of the following circumstances:
- If you develop SQL statements that access the same tables over and over again
- If the SQL statements follow some pattern
- If you want to facilitate use of certain coding standards for SQL
With SQL templates, part of the statement has been pre-coded. Although the tools provide several common templates for SQL and XQuery, you can modify those templates and create your own templates that can be shared with other members of the team.
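As an illustration, a team template might look like the following. The `${...}` placeholders are Eclipse-style template variables that the editor prompts for when the template is inserted; the specific variable names here are hypothetical:

```sql
-- Hypothetical team template: bounded SELECT skeleton
SELECT *
  FROM ${schema}.${table}
 WHERE ${column} = ${value}
 FETCH FIRST 100 ROWS ONLY;
```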
Complete the following steps to create your own templates using the Template Editor.
- Click Window > Preferences > Data Management > SQL Development > SQL and XQuery Editor > Templates to launch the Templates dialog, as shown in Figure 18.
Figure 18. Templates dialog lets you edit existing templates or create new ones
- Select one of the templates from the list to see a preview of that template.
- Edit the template to suit your needs by clicking Edit, or create a new one by clicking New.
You can export templates you have created to a file and share the templates with other team members. The article “Building efficiencies into a DBA’s day” (see Resources) describes in more detail the use of command templates in Optim Database Administrator. However, the principle is the same for SQL and for other tools as well.
Import pre-defined SQL statements
Some organizations have a dedicated developer or group of developers who ensure that the SQL statements that are embedded in applications are properly tuned. These statements are then handed to the application developers. Stored procedure developers can import predefined SQL statements into the scripts folder from a file system and later import them into their stored procedures.
To import predefined SQL statements, right-click the scripts folder of your project, and select Import. Direct the file browser to the location of your pre-defined SQL statements. You can use these pre-defined SQL statements in your stored procedures or pureQuery applications.
Use content assist
Along with templates, you can use content assist when you are creating or modifying your SQL statements. This feature enables you to validate the following:
- The schemas within the database
- The tables within a schema
- The fields or columns within a table
This is an especially useful feature when modifying existing SQL statements.
If you have a filter applied, you do not see the objects that are filtered out, and the parser reports these objects as errors (that is, not found). You don’t have to disable your filter to get rid of the errors. Right-click the statement, and select Validate Table References. This performs a quick query on the tables used in your SQL statement and clears the errors if those tables are found in the connected database server.
Turn off statement validation until you are ready
Some DBAs or developers prefer having statement validation in operation while editing, while others prefer to wait until the statement is fully coded before having the tool validate the syntax. If you are a person who is annoyed by those red marks in the left margin while coding SQL, you can turn off validation by right-clicking in the SQL editor and unselecting the menu option to Validate Statement Syntax.
When you are done editing the statement, right-click in the editor again and turn validation back on, as shown in Figure 19.
Figure 19. Right-click in editor to select or deselect statement validation
Optimizing routine development
This section highlights some best practices for developing stored procedures or user-defined functions.
Set preferences for deploying routines
As with SQL statements, you can set default values for your routine’s deployment options in Deploy Options preferences for stored procedures. To set these options, click Window > Preferences > Data Management > SQL Development > Stored Procedures and User Defined Functions > Deploy Options, and set the default routine options for Java and SQL routines.
Many organizations set the bind options and WLM Environment in the Preferences so that each new stored procedure has these values.
Another trick is to create a mask for some of the fields, like the WLM Environment. This is useful when a specific WLM application environment is assigned to a developer but the first 6 or 7 characters of the WLM Environment are the same. At stored procedure creation time, the developer simply adds the last character to the WLM Environment name.
For example, assume that all the WLM Environment names start with DB2WLMD, and each developer is assigned to a different WLM environment, DB2WLMD1 through DB2WLMD9. Each developer sets the mask DB2WLMD in the preferences, and during new stored procedure development, the WLM ENVIRONMENT field is prepopulated with those first characters of the WLM environment name, as shown in Figure 20.
Figure 20. Set deployment in preferences (left side) to prepopulate option at deploy time (right side)
Specify a build owner, package owner, and qualifier
For stored procedures, the build owner is the value of the CURRENT SQLID in effect when the CREATE PROCEDURE statement is invoked. For external SQL stored procedures, the CREATE PROCEDURE is invoked by the build utility, DSNTPSMP, rather than by Data Studio itself. So, to specify the build owner correctly, set the Build Owner field in the Deploy wizard. It is good practice to set these values explicitly rather than assume that the CURRENT SQLID is correctly set at deploy time. When you set them explicitly, the tools pass the values in Build Owner and Package Owner to DSNTPSMP, regardless of the value of the CURRENT SQLID at the time DSNTPSMP is invoked.
Also, some organizations give the bind authorization to secondary authorization IDs rather than to the primary authorization ID. By explicitly setting the build owner and the package owner to a secondary authorization ID, access to the objects manipulated by the stored procedure can be given to the secondary authorization ID, rather than the primary authorization ID used to log in to the server.
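As a rough sketch, setting these fields has the same effect as building under the secondary authorization ID yourself. The IDs below are assumed examples, not values from the tools:

```sql
-- Assumed ID: DEVGRP is a secondary authorization ID that holds bind
-- authority. Setting Build Owner and Package Owner to DEVGRP in the
-- Deploy wizard is roughly equivalent to building the routine under it:
SET CURRENT SQLID = 'DEVGRP';

-- Table privileges can then be granted once to DEVGRP instead of to
-- each developer's primary (login) authorization ID:
GRANT SELECT, UPDATE ON DSN8910.EMP TO DEVGRP;
```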
Limit the number of Routine Editor windows that are open
As with SQL statement development, you can turn off statement validation during stored procedure development. If you leave it turned on, the tools continuously validate the syntax for each modification done in any Routine Editor’s Source tab. Usually this validation completes very quickly. However, issues occur when there are many Routine Editor windows (tabs) opened or when the stored procedure being modified contains thousands of lines of code. So, it is a good practice to keep open only the Routine that you are modifying and close all other editors.
Explicitly specify applicable options for external and native SQL stored procedures
This best practice applies to the case when there are existing V8 stored procedures that you are modifying in the tools to use against V9. In V8, only external SQL stored procedures were available. However, as of V9, with the addition of native SQL stored procedures, there can be ambiguity around which type of stored procedure is being specified.
To avoid problems, make it a practice to always code the FENCED and EXTERNAL keywords for your external SQL stored procedures, and code the VERSION keyword for your native SQL stored procedures. The tools automatically add the FENCED keyword when the New Stored Procedure wizard is used to create an external SQL stored procedure, and the VERSION keyword when it is used to create a native SQL stored procedure. However, if an external SQL stored procedure created in DB2 for z/OS V8 is migrated to DB2 for z/OS V9, the stored procedure does not have the FENCED keyword. Therefore, the tools assume that this stored procedure is a native SQL stored procedure, because the VERSION keyword is optional for native SQL stored procedures. This can cause errors in both redeployment and execution.
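A minimal sketch of the two declarations follows. The schema, WLM environment, and procedure names are illustrative only; the sample table is from the DB2 9 sample database.

```sql
-- External SQL procedure: code FENCED explicitly so the tools (and V9)
-- never mistake it for a native SQL procedure.
CREATE PROCEDURE DEVSCHEM.GET_DEPT_EXT (IN P_DEPTNO CHAR(3))
  RESULT SETS 1
  LANGUAGE SQL
  FENCED
  WLM ENVIRONMENT DB2WLMD1
P1: BEGIN
  DECLARE C1 CURSOR WITH RETURN FOR
    SELECT DEPTNAME FROM DSN8910.DEPT WHERE DEPTNO = P_DEPTNO;
  OPEN C1;
END P1

-- Native SQL procedure (V9): code the VERSION keyword explicitly.
CREATE PROCEDURE DEVSCHEM.GET_DEPT_NAT (IN P_DEPTNO CHAR(3))
  VERSION V1
  RESULT SETS 1
  LANGUAGE SQL
P1: BEGIN
  DECLARE C1 CURSOR WITH RETURN FOR
    SELECT DEPTNAME FROM DSN8910.DEPT WHERE DEPTNO = P_DEPTNO;
  OPEN C1;
END P1
```

With the keywords coded this way, the routine type is unambiguous in both the tools and the catalog, regardless of which server version the source is deployed to.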
Use code fragments for code reuse
Organizations that want to standardize some of the code in their stored procedures can benefit from using this feature in the tools. Examples of common code include the following:
- Common data declarations in SQL routines
- Copyright prologs
- Common exception handlers
- Common imports, methods, and catch-try blocks for Java stored procedures
In the New Stored Procedure wizard, on the Code Fragments page, click the ellipsis next to the code fragment to launch the file browser. You can then point it to the location of your common code, as shown in Figure 21.
Figure 21. Specifying a fragment
This page is customized to the language type of your stored procedure, and the tools can insert the common code in the proper area, as shown in Figure 22.
Figure 22. Inserted fragment
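For instance, a fragment file for an SQL routine might hold a common set of declarations and an exception handler. The file name and variable names below are hypothetical:

```sql
-- handler-fragment.sql: common declarations and exception handler that
-- the wizard inserts into the declaration section of each new SQL
-- stored procedure.
DECLARE V_SQLSTATE CHAR(5) DEFAULT '00000';
DECLARE V_FAILED   CHAR(1) DEFAULT 'N';
DECLARE CONTINUE HANDLER FOR SQLEXCEPTION
  SET V_FAILED = 'Y';
```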
Connect to your target server before creating SQLJ routines
The tools rely on information obtained from the target database server to create the correct iterators for a SQLJ stored procedure. (Note that SQLJ procedure development is not supported in the Data Studio stand-alone package.) The New Stored Procedure wizard generates the iterator by connecting to the database and querying the data types of the columns participating in the query. If either the table has not been created, or the database server is not connected, the tools still generate an iterator. However, you need to edit the stored procedure with the table column information before deploying it.
Import SQL statements into your stored procedure
Some organizations prefer to hand the SQL statements to use in an application directly to their application developers. In some cases, the application developer is not knowledgeable in SQL. The section Optimizing productivity with SQL tools described how developers can import these statements into their projects. You can use the New Stored Procedure wizard to import or embed those statements into your stored procedure by completing the following steps.
- On the SQL Statements page of the wizard, click Remove to remove the default SQL statement in the Statement Details area.
- Click Import, as shown in Figure 23.
Figure 23. Remove existing default and then import desired SQL
The Import SQL Statements dialog is launched.
- Select the SQL statements that you want to use in your stored procedure, as shown in Figure 24, and click Import.
Figure 24. Select the scripts you want to import
The selected statements are shown in the Statements window of the SQL Statements page.
- Click on each statement to display it in the Statement Details window.
- If you have more than one SQL statement on the SQL Statements page of the wizard, be sure to change the Result set field to Multiple, as shown in Figure 25. Otherwise, an input parameter called whichQuery is generated, and the statements are coded in a case expression.
Figure 25. For multiple SQL statements, specify multiple result sets (unless you want all statements in a case expression)
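As a rough sketch (not the exact generated code), leaving Result set at Single with two imported queries produces a body along these lines. The schema and procedure name are illustrative:

```sql
-- Illustrative only: the wizard adds a whichQuery parameter and wraps
-- the imported statements in a case expression so only one runs per call.
CREATE PROCEDURE DEVSCHEM.MULTI_QUERY (IN whichQuery INTEGER)
  RESULT SETS 1
  LANGUAGE SQL
P1: BEGIN
  DECLARE C1 CURSOR WITH RETURN FOR
    SELECT DEPTNO, DEPTNAME FROM DSN8910.DEPT;
  DECLARE C2 CURSOR WITH RETURN FOR
    SELECT EMPNO, LASTNAME FROM DSN8910.EMP;
  CASE whichQuery
    WHEN 1 THEN OPEN C1;
    WHEN 2 THEN OPEN C2;
  END CASE;
END P1
```

Choosing Multiple instead generates one WITH RETURN cursor per statement and opens them all, so the caller receives multiple result sets rather than having to pick one by parameter.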
Optimizing query tuning and Visual Explain
Refer to the section Setting up Visual Explain and query tuning to ensure you are correctly configured to use these tools.
Visual Explain provides a familiar access plan graph, and you can continue to use that if you are comfortable with it. With query tuning, you get much more capability. With Data Studio stand-alone single query tuning, you can do the following:
- Capture queries from all data sources that the Optimization Service Center for DB2 for z/OS supports and from XML files that are exported from DB2 Query Monitor for z/OS.
- View formatted queries by collapsing and expanding parts of the query and by seeing associated statistics for each part of the query.
- View access plan graphs. The access plan graph provided with query tuning is enhanced over the one in Visual Explain. For example, it can display the PLAN_TABLE record for the explained query (DB2 for z/OS only), which Visual Explain cannot. For DB2 for z/OS, the tool also automatically checks that the Explain tables are correct and gives you the option to migrate them if needed. However, the query tuning access plan graph is currently included only in Data Studio stand-alone and in the Query Tuner set of products. You can use Visual Explain in products such as Optim Development Studio, where it also supports Informix Dynamic Server (IDS) and Oracle.
- Capture information about the data server that queries run against. This capability is called query environment capture. It is also known in Optimization Service Center for DB2 for z/OS as service SQL.
- Generate reports on the performance of queries.
- Run the Query Statistics Advisor to analyze the statistics that are available for the data that a query accesses, to identify conflicting statistics, and to find additional statistics that you might capture to improve how the data server processes the query.
Recommendation: If you are using Data Studio stand-alone with DB2 for z/OS or Linux, UNIX, and Windows, use the no-charge query tuning capability in Data Studio.
Launching query tuner or Visual Explain
People often are not familiar with how to launch query tuning from Data Studio.
In the Data Studio stand-alone package, you launch either the basic single-query tuning capability or Visual Explain from the same launch points using the following methods.
- In the SQL Editor, highlight and right-click the SQL statement and select Open Visual Explain (for Visual Explain) or Start Tuning (for single-query tuning, including the access plan graph). Note: if your statement is an XPATH or XQUERY statement, you can use only Visual Explain.
- In the Data Project Explorer, right-click an SQL statement that contains an INSERT, an UPDATE, a SELECT, an SQL stored procedure, or an SQL user-defined function. Select Open Visual Explain or Start Tuning.
- In the Data Source Explorer, right-click an SQL stored procedure, a view, or an SQL user-defined function, and select Open Visual Explain or Start Tuning.
Note: If the tools find more than one INSERT, UPDATE, SELECT, or XQUERY statement inside an SQL stored procedure or SQL user-defined function, only the first statement is used for either Visual Explain or query tuning.
When you use basic single-query tuning, you are prompted to use an existing project or to create a new Query Tuner Project. The editor opens, as shown in Figure 26.
Figure 26. Basic single query tuning editor provides many capabilities
To create an access plan graph in query tuner, select Choose Tuning Activities, and check the Access Plan Graph box. Alternatively, if you select Tune Query, the statistics advisor runs, the query is formatted nicely, and an access plan graph is generated.
Sharing Explain tables
Some organizations use one set of Explain tables that they also migrate from one version to another. You can specify the schema for the Explain tables you want to use when you launch Visual Explain by specifying the CURRENT SCHEMA in the Collect Explain Data wizard, as shown in Figure 27.
Figure 27. Sharing a common set of Explain tables by specifying the schema
The section Setting up Visual Explain and query tuning described how you can skip creating the Explain tables, which enables various projects to use the same tables.
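In SQL terms, the wizard's CURRENT SCHEMA setting corresponds to something like the following, where EXPLSHR is an assumed shared schema that owns the Explain tables:

```sql
-- Qualify the Explain tables with the shared schema before explaining.
SET CURRENT SCHEMA = 'EXPLSHR';

EXPLAIN PLAN SET QUERYNO = 100 FOR
  SELECT DEPTNO, DEPTNAME
    FROM DSN8910.DEPT
   WHERE ADMRDEPT = 'A00';

-- The access path rows land in the shared EXPLSHR.PLAN_TABLE.
SELECT QUERYNO, PLANNO, METHOD, ACCESSTYPE
  FROM EXPLSHR.PLAN_TABLE
 WHERE QUERYNO = 100;
```

Because every project points at the same qualified tables, access paths captured by different users can be compared and migrated as one set.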
Conclusion
The information in this article should help you get up and running successfully with Data Studio and Optim Development Studio on your DB2 for z/OS system. Note that there are active discussion forums for both products on developerWorks. It's a good idea to subscribe to these forums and take advantage of both IBM and community knowledge about the products.
Acknowledgments
The authors would like to thank Kendrick Ren and Paul Wirth for their review of this article.
Appendix. Features in Data Studio that support DB2 for z/OS
Here are the features in Data Studio that support DB2 for z/OS.
Table 3. Summary of Data Studio support for DB2 for z/OS
| Feature area | Support |
| --- | --- |
| Database management | Generate DDL; manage and alter storage groups; add to overview diagram |
| Table management | Create, open (edit data), query, alter, drop; create index, alias, trigger; add to overview diagram |
| View management | Create, open, query, alter, drop |
| Alias management | Create, open (edit data), query, alter, drop |
| Trigger management | Create, alter, drop |
| Schema management | Create, alter, drop |
| Index management | Create and drop (alter only using drop or recreate) |
| Table space management | Create, alter, drop |
| User-defined distinct types | Create, alter, drop |
| User-defined functions | Create and alter using routine editor/SQL editor |
| Stored procedures | Create, alter using routine editor/SQL editor; debugging SQL and Java (Java requires IDE package) |
| Database users | Add or change |
| Roles | Create, alter, drop |
| XML management | Generate DDL; register with XSR (requires IDE); drop (requires IDE); annotated schema editor; XML editor (requires IDE); XML Schema editor (requires IDE) |
| Data Web Services | Create and deploy Data Web Services (requires IDE) |
| Connection management | Integration with Kerberos and LDAP; exporting connection profiles; connection working sets |
| Query usability and diagnostics | SQL and XQuery editor with content assist; advanced query formatting; single query stats advisor; advanced Visual Explain (access plan graph); z/OS environment capture for serviceability |
- Read the IBM Redbooks publication "DB2 9 for z/OS Stored Procedures, Through the Call and Beyond" for detailed setup instruction on using Data Studio for stored procedures.
- Refer to the developerWorks series "Debugging stored procedures on DB2 z/OS with Data Studio Developer" for detailed instructions.
- Check out the IBM Technotes for information about installing updated DBRMs into your server and rebinding Visual Explain and Optimization Service Center (OSC) packages.
- Read the developerWorks article "Debugging stored procedures on DB2 z/OS with Data Studio Developer, Part 2: Configure the stored procedure debug session manager on z/OS " for instructions on setting up session manager on the server.
- See "Understanding the packaging of Integrated Data Management solutions for database development, management, and performance" (developerWorks, updated April 2010) to learn more about packaging and which features are part of which product.
- Read "What's new and cool in Optim Development Studio 2.2" (developerWorks, Jun 2009) to learn about the support for Oracle and other pureQuery features that help developers and DBAs work better together.
- Check out the pureQuery platform page, which includes an overview of pureQuery and a list of FAQs.
- Read the no-charge e-book entitled "Getting started with Data Studio for DB2" for step-by-step instructions on using Data Studio. Although focused on DB2 for Linux, UNIX, and Windows, some capabilities apply also to DB2 for z/OS.
- Read the developerWorks article "Using common connections with Optim solutions" for more information on using the common connections repository, including use cases.
- Read the developerWorks article "Building efficiencies into a DBA's day" for more information on using templates. Although based on command templates, the same concepts apply to using SQL templates, which can be used with DB2 for z/OS.
- Refer to the product web site "DB2 Administration Tool for DB2 for z/OS".
- Explore the developerWorks Optim family page to learn more about Optim solutions. Find technical documentation, how-to articles, education, downloads, product information, and more.
- See Virtual Tech Briefings on Optim Solutions for Integrated Data Management for a schedule of virtual technical briefings around the Optim integrated data management portfolio.
- Learn more about Information Management at the developerWorks Information Management zone. Find technical documentation, how-to articles, education, downloads, product information, and more.
- Stay current with developerWorks technical events and webcasts.
Get products and technologies
- Download no-charge Data Studio software.
- Download the Optim pureQuery Runtime as part of the Optim Development Studio trial download.
- Build your next development project with IBM trial software, available for download directly from developerWorks.
- Participate in the discussion forum.
- Get answers to questions on the Optim Development Studio and pureQuery forum
- Check out the Integrated Data Management Experts blog and get involved in the Integrated Data Management community space, which contains a comprehensive list of resources and downloads.
- Check out the developerWorks blogs and get involved in the developerWorks community.