InfoSphere Workload Replay can capture local and remote DB2 for z/OS production workloads plus all the information that is needed to replay them:
- The original application timing
- Order of execution
- Transaction boundaries
- Isolation levels
- Other SQL and application characteristics
Built-in reporting tools identify accuracy and performance differences between replayed workloads, which reduces the effort that is required to analyze how changes in the z/OS environment might impact critical workloads.
Figure 1. Capturing and replaying DB2 for z/OS workloads
Key themes of this release include improved performance, greater deployment flexibility, and more fine-grained control of the capture and replay process.
Version 2.1 of InfoSphere Workload Replay takes advantage of the Guardium V9 GPU 50 64-bit support, which increases scalability and workload processing capacity. When you deploy InfoSphere Workload Replay for the first time, the 64-bit version is installed automatically. If you plan to upgrade from version 1.1 to version 2.1, you can continue to run on the 32-bit version of Guardium or upgrade to 64-bit.
Export and import workloads
In an enterprise environment, production and test environments are likely isolated from each other for security reasons. Likewise, environments might be geographically distributed with only limited network connectivity between servers. Previously, with the centralized server deployment of Workload Replay v1.1, it was not possible or practical to capture and replay workloads in a decentralized environment.
Figure 2. Centralized server deployment cannot be used in decentralized environments
To allow for capturing and replaying the same workload in separated environments, Workload Replay v2.1 introduces workload export and import. You can use this new feature to securely transfer captured workloads between servers in decentralized deployments.
Limited or no network connectivity might exist between the source and the target Workload Replay servers. In those cases, the workload is temporarily stored in intermediary storage areas on external FTP or secure copy (SCP) servers that are accessible to either the source or the target server. You can move workloads between these storage areas manually by using physical media, such as hard disks or DVDs.
Figure 3. Transferring captured workloads in decentralized deployments
Export a workload
Assume that you captured a production workload. You review it by using the capture report and now want to replay the workload in a test environment that is not connected to the production environment. In the Workload Replay web console, the captured workloads are displayed.
Figure 4. Listing captured workloads in the source server
Before a workload can be moved to intermediary storage, Workload Replay packages the workload with the required information from the source environment. The user ID that you use must have the Can Export Workload privilege on the source subsystem to invoke the export task in the Workload Replay web console.
Figure 5. Creating a workload file on the source Workload Replay server
During workload export, the required information is assembled into a self-contained, encrypted workload file. You can then transfer the encrypted file from the source Workload Replay server to any accessible FTP or SCP server.
Figure 6. Identifying workloads to upload to external storage
Use the upload workload file CLI command to upload an encrypted workload file to an FTP or SCP server.
Figure 7. Uploading a workload file to an FTP server from the source
When the workload file is uploaded successfully to intermediary storage, remove the encrypted file from the source Workload Replay server by deleting the exported workload in the web console.
Figure 8. Removing an exported workload file from the source Workload Replay server
When network connectivity is nonexistent or slow between the target server and the intermediary storage, move the workload file to an FTP or SCP server that the target Workload Replay server can access.
Import a workload
To import a captured workload into a target Workload Replay server, use the CLI console to download the encrypted workload file to the Workload Replay server, and then import the workload.
Use the download workload file command in the CLI console and provide the required information. Include the connectivity information for the FTP or SCP server where the workload file is located and the workload file name.
Figure 9. Downloading a workload file from an FTP server onto the target server
The download operation stores the encrypted workload file in an internal working directory on the target Workload Replay server.
You can display a list of imported workload files with the show workload files CLI command.
Figure 10. Identifying workload files to import
Import the workload file to the target server with the Import task in the SQL Workloads tab of the target’s Workload Replay web console. You can give the imported workload a different name, for example, if a workload with the original name already exists on the target server.
Figure 11. Importing a workload file
The import operation decrypts and unpacks the encrypted workload file and associates the workload with a database connection. To import workloads, the user ID that you use must have the Can Import Workload privilege for the associated subsystem.
Figure 12. Reviewing workload tasks for an imported workload
After a captured workload is successfully imported, create a capture report or transform the imported workload in preparation for replaying the workload on the target Workload Replay server.
The workload file remains on the server after the workload is imported. To free up storage on the target server, you can delete the downloaded workload file with the delete workload file CLI command.
Figure 13. Deleting an imported workload file
Consider using the export and import feature to temporarily or permanently archive captured workloads in a secure location outside the Workload Replay server. This way, you can establish a workload library that can be shared as needed.
Capture LOB and XML data
As a workload is captured, the Workload Replay server collects information about the host variables and parameters that are used in each executed SQL statement or stored procedure call. If the application manipulates large object (LOB) or XML data, large amounts of data are streamed by S-TAP to the Workload Replay server and then stored there.
In certain application scenarios, the return code and result count (rows that are returned or modified) of the SQL statements (or stored procedure calls) might not depend on the actual input LOB or XML values. Because the Workload Replay report analysis does not compare data values in result sets, you might choose not to capture the actual input values and instead use generated dummy data. Dummy data eliminates the need to transfer and store the original LOB and XML data on the server.
Let’s look at two examples.
- An application periodically inserts 0.5 KB CLOB data that contains item descriptions. The DB2 return code for the INSERT operation is identical, irrespective of whether the CLOB reads "Feel the incredibly soft touch…" or "123456789012345678901234567890…". If the results of subsequent SQL executions do not depend on whether something is incredibly soft or just a generated value, capture the length information of the description and reduce the amount of data to stream, store, and process.
- Another application might call a stored procedure that processes an incoming XML document and, depending on the content, returns one or more rows. The stored procedure's business logic might return different result set sizes if you call the stored procedure with generated data. In this case, replaying the workload with generated data might be problematic because the replay results are potentially impacted and the Workload Replay comparison reports are not accurate.
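As a rough illustration of the length-preserving dummy data in the first example, here is a minimal Python sketch. The function name and the repeating-digit pattern are illustrative assumptions that mirror the "123456789012345678901234567890…" example; the generation scheme that Workload Replay actually uses is internal to the product.

```python
def generate_dummy_clob(original):
    # Preserve only the length of the original CLOB value; the content
    # is a repeating digit pattern like the example above.
    digits = "1234567890"
    repeats = len(original) // len(digits) + 1
    return (digits * repeats)[:len(original)]

description = "Feel the incredibly soft touch of our new fabric."
dummy = generate_dummy_clob(description)
```

Because only the length survives, an INSERT of `dummy` behaves the same as an INSERT of `description` as long as no later SQL depends on the actual content.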
Generally speaking, you must determine whether the following criteria apply to your environment:
- Is the accuracy of the replay results impacted if generated LOB or XML data is used during replay?
- Does the replay environment need to contain the exact same data as the source environment after a workload is replayed?
- Can the existing (hardware and network) infrastructure support the following?
  - Timely streaming of large amounts of data between the capture environment and the Workload Replay server (during capture) and between the Workload Replay server and the target environment (during replay)
  - Storage of large amounts of data on the Workload Replay server
If the use of generated LOB and XML data is unacceptable during workload replay, configure capture to record actual data values (up to a maximum of 32 KB; beyond that value, only length information is collected).
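The capture decision described above can be sketched as follows. This is a conceptual model, not product code: the constant and function names are assumptions, and only the 32 KB cap and the fall-back to length-only information come from the text.

```python
MAX_CAPTURED_BYTES = 32 * 1024  # the 32 KB cap described above

def capture_lob_value(value, capture_actual_data):
    # Record the actual bytes only when data capture is enabled and the
    # value fits under the cap; otherwise record only its length.
    if capture_actual_data and len(value) <= MAX_CAPTURED_BYTES:
        return {"data": value, "length": len(value)}
    return {"data": None, "length": len(value)}
```

Either way the length is always recorded, which is what allows generated dummy data of the correct size to stand in at replay time.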
Figure 14. Defining whether XML and LOB data is captured
LOB and XML data values are currently not displayed in the capture, transform, or comparison reports.
Filter captured SQL when replaying workloads
While a workload replay is in progress, extraneous SQL activity can occur on shared subsystems. For example, a performance monitor might run in the background or other applications might execute. Version 2.1 of InfoSphere Workload Replay introduces filters for you to limit the SQL execution information that is collected during workload replay.
If you anticipate SQL activity on the replay system that is irrelevant to your evaluation, consider defining a replay capture filter.
Because filters do not restrict the SQL that executes while a replay is in progress, this extraneous SQL might still impact the behavior (and performance) of the workload that is replayed.
If no filters are defined (or if you use filters that are less restrictive than those used during workload capture), new SQL might appear in the workload comparison report. Likewise, if more restrictive filters are defined for workload replay, the comparison report might flag SQL as missing.
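The effect of filter restrictiveness on the comparison report reduces to a set difference, sketched below. The function name and the sample statements are hypothetical; only the "new" versus "missing" classification comes from the text.

```python
def classify_for_comparison(captured, replayed):
    # SQL collected only at replay time (for example, a background
    # monitor that a less restrictive replay filter let through) is
    # reported as new; SQL that a more restrictive replay filter
    # excluded is reported as missing.
    return {
        "new": replayed - captured,
        "missing": captured - replayed,
    }

captured = {"SELECT * FROM ORDERS", "UPDATE STOCK SET QTY = QTY - 1"}
replayed = {"SELECT * FROM ORDERS", "SELECT * FROM SYSIBM.SYSDUMMY1"}
report = classify_for_comparison(captured, replayed)
```

Matching the replay capture filter to the original capture filter keeps both sets empty, which is why retrieving and adapting the original filter (as described next) is usually the best starting point.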
Define replay capture filters in the replay wizard of the Workload Replay web console.
Figure 15. Reviewing workload replay options
You can retrieve the original capture filter and modify it as necessary to account for differences in your replay environment.
Figure 16. Defining replay capture filters
Keep in mind that some filters incur higher S-TAP overhead. For example, schema-level filters require more parsing of the SQL statements to determine whether they reference the specified database object.
Map static SQL collection IDs when you transform captured workloads
If your workload takes advantage of static SQL, the associated packages (also called application packages) must be present in your source environment. Starting with version 2.1, InfoSphere Workload Replay supports collection mapping to allow for accurate workload replay of statically executed SQL in environments where packages are in different collections.
You can map captured collection information in the workload transformation wizard of the Workload Replay web console.
Figure 17. Reviewing workload transformation options
When you map collection IDs, keep in mind that the replay user ID (or user IDs) must be authorized to execute packages in the mapped collection.
Figure 18. Defining collection ID mappings
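Conceptually, a collection ID mapping is a lookup table that is consulted at replay time, as in this sketch. The collection names are hypothetical examples; the fall-back behavior for unmapped collections is an assumption for illustration.

```python
def map_collection_id(captured_collid, mapping):
    # Collections without an explicit mapping keep their captured ID.
    return mapping.get(captured_collid, captured_collid)

# Hypothetical IDs: production packages were bound into PRODCOLL, while
# the replay subsystem holds the same packages in TESTCOLL.
collection_mapping = {"PRODCOLL": "TESTCOLL"}
```

Whatever the mapped target is, the replay user ID must hold EXECUTE authority on the packages in that collection, as noted above.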
For more information about these new features in InfoSphere Workload Replay for DB2 for z/OS, see the New features and enhancements document that is listed in Resources.
The key features in version 2.1 of InfoSphere Workload Replay for DB2 for z/OS are a blend of performance improvements, SQL capture and replay enhancements, and support for decentralized server deployments. With these features, you can use an even broader range of workloads to quickly analyze, in a pre-production environment, what impact changes in your DB2 for z/OS environment might have on applications that run in production.
Resources
- IBM InfoSphere Workload Replay, new features and enhancements: Read about new features, enhancements, modifications, and fix packs in Version 1.1.x of InfoSphere Optim Query Capture and Replay.
- IBM InfoSphere Workload Replay Information Roadmap: Explore information on planning, installing, evaluating and using, troubleshooting and support for InfoSphere Workload Replay.