Fixing performance issues due to outdated database statistics

Queries on the Decision Center database might be too slow if the database statistics are not up-to-date.

Database statistics are automatically updated by the database server. However, they can become outdated when the number of records in the Decision Center repository changes significantly, which degrades query performance until the next statistics update.

For example, the following operations can modify a large number of records:
  • Administrative operations: archiving or migrating the database, copying projects
  • Copying large amounts of elements
  • Deleting branches
You might notice a temporary decrease in performance after these operations.
Some database vendors allow you to manually update the statistics to fix the issue, for example:
  • For DB2, use the RUNSTATS command.
  • For Oracle, use the DBMS_STATS.GATHER_SCHEMA_STATS procedure.
    Note: With Oracle, updating the statistics might not fix the performance issue immediately. Poorly optimized execution plans that were computed from the outdated statistics might still be cached in server memory. To ensure that new execution plans are computed from the updated statistics, flush the cached plans with the following command: ALTER SYSTEM FLUSH SHARED_POOL.
  • For PostgreSQL, use the ANALYZE command.
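As a sketch of what these commands look like in practice, the following statements update statistics on each vendor. The schema name DCREPO and table name RTSPROJECT are placeholders for illustration only; substitute the schema and tables that your Decision Center repository actually uses:

```sql
-- DB2: update statistics for one repository table
-- (run once per table, or script it over the repository schema)
RUNSTATS ON TABLE DCREPO.RTSPROJECT WITH DISTRIBUTION AND INDEXES ALL;

-- Oracle: gather statistics for the whole repository schema
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'DCREPO');
-- Then flush cached execution plans so new plans are computed
-- from the updated statistics
ALTER SYSTEM FLUSH SHARED_POOL;

-- PostgreSQL: update statistics for all tables in the current database
ANALYZE;
```

Note that flushing the Oracle shared pool discards all cached plans server-wide, so run it during a maintenance window if the server hosts other applications.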

For more information about the specific procedure to follow to update database statistics, see the documentation of your database vendor.