Keeping your Maximo Tuned Up and Performing
TomRontiris
Maximo is everywhere! It has been around for several decades and is in use across many industries by organizations of all sizes. Client databases have grown in size and complexity over the years, so ongoing monitoring of database health is always recommended to make sure all your i's are dotted and t's are crossed.
Just the other day I was talking to a client experiencing major slowness while generating a handful of work orders from the Preventive Maintenance application. They were attempting to generate about five work orders, which was taking over 20 minutes on their system. My first question was: how long does it take to generate a single work order from the Work Order application? Their answer: it took them two minutes to create a single new work order.
I had them enable SQL logging and retry the steps to generate a work order from the Work Order application. The log showed many long-running queries, each taking over one second. We examined the database to see whether the appropriate indexes existed for the work order table. The indexes appeared to exist and appeared to be accurate.
Why the lag? I had them copy one of the SQL statements that was taking eight seconds to execute into their query tool and run it manually. Sure enough, the statement took eight seconds there too! I then had them run an explain plan, which analyzes a SQL statement and shows how efficiently it executes and which indexes it uses.
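As an illustration, here is one way to generate an explain plan for the slow statement. This sketch assumes an Oracle back end; DB2 and SQL Server have equivalent facilities, and the exact syntax depends on your database platform.

```sql
-- Sketch, assuming Oracle: on DB2 use the EXPLAIN/db2expln tools,
-- on SQL Server use SET SHOWPLAN_ALL ON or the graphical execution plan.
EXPLAIN PLAN FOR
SELECT * FROM woactivity
 WHERE parent = '1234' AND siteid = 'BEDFORD'
 ORDER BY wosequence, taskid;

-- Display the plan, including which index (if any) the optimizer chose
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

The plan output is what reveals whether the optimizer is doing a full table scan or picking an unexpected index.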
We noticed that the query was not using the expected index, but was instead picking up another, unrelated index. That got me thinking: the statistics the optimizer relies on must be out of date. Over time, as databases grow and get altered, users forget to keep up with database maintenance. In this case, database maintenance meant updating database statistics. There is an action in the Database Configuration application that lets users run Update Statistics for Maximo; it executes an update-statistics operation against every table in the database.
It is recommended that Update Statistics be run on a regular basis. In the example above, the client had not run it in a very, very long time. Once they did, they saw a vast improvement in performance: generating a new work order in the Work Order application dropped from the original 2 to 3 minutes to 5 to 8 seconds.
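For reference, the Update Statistics action corresponds to each database platform's native statistics-gathering command, which a DBA can also run directly. A sketch, assuming a schema named MAXIMO (adjust owner, schema, and table names to your environment):

```sql
-- Oracle: gather statistics for the whole Maximo schema
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'MAXIMO');

-- DB2: gather statistics per table, including index statistics
RUNSTATS ON TABLE MAXIMO.WOACTIVITY WITH DISTRIBUTION AND DETAILED INDEXES ALL;

-- SQL Server: update statistics across the database
EXEC sp_updatestats;
```

Fresh statistics give the optimizer an accurate picture of table sizes and data distribution, which is what lets it pick the right index.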
After running Update Statistics, the following statement was able to use the correct existing work order index, and execution time went from 9875 milliseconds to 23 milliseconds.
select * from woactivity where parent = '1234' and siteid = 'BEDFORD' order by wosequence, taskid (execution took 9875 milliseconds)
We were then able to examine the PM work order generation process, which was much faster too. The client also needed to add an index on the WPITEM table, which cut the execution time of another long-running query. Once we created the index and ran Update Statistics again, the system was performing at better-than-expected levels!
The following SQL statement was missing an appropriate index (once an index was created on the WPITEM table for the WONUM and SITEID columns, the query executed in under 1000 ms):
select * from wpitem where wonum = '1235' and siteid = 'BEDFORD' (execution took 2156 milliseconds)
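A minimal sketch of that index follows. The index name here is hypothetical, and in practice Maximo indexes should normally be added through the Database Configuration application rather than directly in the database, so that they are registered in Maximo's metadata and survive future configuration changes.

```sql
-- Hypothetical index name; covers the columns the slow query filters on
CREATE INDEX wpitem_ndx9 ON wpitem (wonum, siteid);
```

After creating an index, run Update Statistics again so the optimizer has current information about it.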
A SQL statement is generally considered long running if it takes more than 1000 milliseconds to execute.
Keep in mind that updating statistics and creating new indexes will not solve every performance problem. Every environment has its own configuration and possible customizations. The steps above are a good starting point when diagnosing performance issues, but they should not be treated as a cure-all.