Recent work in a client’s SOA environment surfaced several interesting DB2 performance issues. One that was quite stunning was the large number of connections the .Net and Java applications were making to DB2 and other systems. Further research into the system and applications uncovered a wide disparity in how the many application modules handled connections and in how many connections each was using. Proper connection handling is very important to DB2 performance for three main reasons: acquiring new connections is expensive, application connections hold database locks, and connections carry unit-of-work transactions.
First, getting a database connection is expensive because of all the safeguards a database provides, such as security and integrity. Because the database is important, every connection request must have its security checked and authorized. This security authorization against the database system and the data requested is quick, but it still takes time. Next, the database guarantees integrity through transaction logging of each unit of work. Starting a new database unit of work is also fast, but it must be tracked within the database so that it can be backed out should the transaction processing fail.
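Because every new connection pays these security and logging costs, applications typically amortize them by pooling and reusing connections instead of opening a fresh one per request. The sketch below illustrates the idea with a minimal, hypothetical pool (the `PooledConnection` class, its counter, and the pool size are illustrative, not part of any DB2 driver); in practice you would use your application server’s DataSource or an established JDBC connection pool rather than rolling your own.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal sketch of connection pooling: pay the expensive "open"
// cost once per pooled connection, then reuse it across requests.
public class ConnectionPoolSketch {

    // Hypothetical stand-in for a real database connection.
    static class PooledConnection {
        static int opens = 0;           // counts expensive opens
        PooledConnection() { opens++; } // security check, logging setup, etc.
        void execute(String sql) { /* run SQL on this connection */ }
    }

    private final BlockingQueue<PooledConnection> idle;

    ConnectionPoolSketch(int size) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(new PooledConnection()); // opened once, up front
        }
    }

    PooledConnection acquire() throws InterruptedException {
        return idle.take();               // blocks if every connection is in use
    }

    void release(PooledConnection c) {
        idle.add(c);                      // return for reuse; do not close
    }

    public static void main(String[] args) throws InterruptedException {
        ConnectionPoolSketch pool = new ConnectionPoolSketch(2);
        for (int i = 0; i < 100; i++) {   // 100 requests...
            PooledConnection c = pool.acquire();
            c.execute("SELECT 1 FROM SYSIBM.SYSDUMMY1");
            pool.release(c);
        }
        // ...but only 2 expensive opens.
        System.out.println("opens=" + PooledConnection.opens);
    }
}
```

One hundred simulated requests are served with only two expensive connection opens; the pool size bounds both the open cost and the number of concurrent connections the database must manage.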
Next, within the database connection, the SQL processing selects, inserts, updates, or deletes data. These actions hold locks against the referenced data and prevent other applications from updating the same data or referencing the same deleted data. DB2 applications have several mechanisms to control and handle this locking within the application and system. For the best DB2 performance, bind the application against the database using the Bind parameters for cursor stability, ISOLATION(CS) and CURRENTDATA(NO). This minimizes the locks held and allows other transactions more concurrent access to the data. If the application is read-only and is not concerned with other transactions manipulating the data, use uncommitted read, ISOLATION(UR). The ISOLATION(UR) setting is preferred for applications referencing data that doesn’t change.
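For Java applications that set isolation per connection rather than at Bind time, the usual JDBC equivalences are cursor stability (CS) to `TRANSACTION_READ_COMMITTED` and uncommitted read (UR) to `TRANSACTION_READ_UNCOMMITTED`. The sketch below records that mapping; the correspondence shown is the common one, so confirm it against your driver’s documentation before relying on it.

```java
import java.sql.Connection;

// Sketch: map DB2 isolation names to the JDBC isolation constants a
// Java application would set on its connection. The mapping is the
// usual JDBC equivalence; verify against your driver's documentation.
public class IsolationMapping {

    static int jdbcLevelFor(String db2Isolation) {
        switch (db2Isolation) {
            case "CS": return Connection.TRANSACTION_READ_COMMITTED;   // cursor stability
            case "UR": return Connection.TRANSACTION_READ_UNCOMMITTED; // uncommitted read
            case "RS": return Connection.TRANSACTION_REPEATABLE_READ;  // read stability
            case "RR": return Connection.TRANSACTION_SERIALIZABLE;     // repeatable read
            default:   throw new IllegalArgumentException(db2Isolation);
        }
    }

    public static void main(String[] args) {
        // A read-only reporting connection could then be configured with:
        //   conn.setTransactionIsolation(jdbcLevelFor("UR"));
        System.out.println("CS -> " + jdbcLevelFor("CS"));
        System.out.println("UR -> " + jdbcLevelFor("UR"));
    }
}
```

Centralizing the mapping in one method keeps the DB2-specific isolation vocabulary out of the rest of the application code.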
Next, the application unit of work is maintained on the connection. Large application workloads that perform too many updates, inserts, or deletes within a single unit of work hold too many locks, can cause extended back-out times when an application fails, and impact DB2 performance. It is very important to define the proper transaction commit scope and issue appropriate commits to minimize both the number of locks held and the amount of work the database may have to back out. It is also critical that applications reference the database tables and perform their updates in the same sequence. Referencing the data in the same order means locks are acquired and released in a consistent sequence, avoiding deadlocks and allowing more application concurrency. Since your application wants to minimize the number of locks and the time those locks are held, it is always best to do your data updates and inserts right before your application commits or ends the transaction. This minimizes the time the locks are held and again provides more workload concurrency.
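The "same sequence" rule can be made mechanical: have every transaction sort the resources it needs before locking them, so no two transactions can ever hold each other’s locks in opposite orders. The sketch below demonstrates this with in-process locks standing in for database locks; the table names and the lock map are illustrative, not a DB2 API.

```java
import java.util.Arrays;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of consistent lock ordering: each transaction sorts the
// resources it needs before locking them, so two transactions can
// never wait on each other in opposite orders (no deadlock).
public class LockOrderingSketch {

    static final Map<String, ReentrantLock> tableLocks = new ConcurrentHashMap<>();

    static ReentrantLock lockFor(String table) {
        return tableLocks.computeIfAbsent(table, t -> new ReentrantLock());
    }

    // Acquire all table locks in a single canonical (sorted) order.
    static void updateTables(String... tables) {
        String[] ordered = tables.clone();
        Arrays.sort(ordered);                       // canonical order
        for (String t : ordered) lockFor(t).lock();
        try {
            // ... perform the updates, then commit ...
        } finally {
            for (int i = ordered.length - 1; i >= 0; i--) {
                lockFor(ordered[i]).unlock();       // release in reverse order
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // The threads name the tables in different orders, but the sort
        // means both lock ORDERS before PARTS, so neither can deadlock.
        Thread a = new Thread(() -> updateTables("ORDERS", "PARTS"));
        Thread b = new Thread(() -> updateTables("PARTS", "ORDERS"));
        a.start(); b.start();
        a.join(); b.join();
        System.out.println("both transactions completed");
    }
}
```

Without the sort, the two threads could each grab their first table and wait forever for the other’s; with it, both transactions run to completion in either order.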
While all of this is standard practice for most applications, within the new SOA architectures the services may not know much about their unit-of-work or connection situation. Within one client SOA architecture, recent research showed that a particular module had seven different connections active within its service: DB2 for z/OS, DB2 for LUW, Oracle, MQSeries inbound and outbound queues, and connections to application and web servers for AJAX activities. That is a lot of connections for a single service, and when some minor changes caused this module to fail, many processes could not function. Debugging was also very difficult, because one connection failure caused all the connection participants to back out their transactions, causing more locking and data-integrity issues.
So make sure your application handles connections properly, because they are expensive to acquire and impact DB2 performance. Minimize the number of database activities within a transaction to minimize locks, and understand the number of connections involved in a particular unit of work, so that you can get the best DB2 performance from your applications.
_______________________________________________________
Dave Beulke is an internationally recognized DB2 consultant, DB2 training and DB2 education instructor. Dave helps his clients improve their strategic direction, dramatically improve DB2 performance and reduce their CPU demand, saving millions in their systems, databases and application areas within their mainframe, UNIX and Windows environments. Search for more of Dave Beulke’s DB2 Performance blogs on DeveloperWorks and at www.davebeulke.com.