Upgrading, backing up, and migrating data

You use the same script to back up, restore, and migrate your data during an upgrade.

Before you begin

  • Read about the limitations that apply to this procedure. For more information, see Backup and migration limitations.
  • If you did not configure key-based Secure Shell (SSH) authentication as part of the installation, you are prompted for a password during the restoration. For more information about setting up SSH, see Setting up Secure Shell to use key-based authentication. A sketch of a typical key-based setup follows this list.
  • Fix Pack 1: Before you migrate to Log Analysis 1.3.3 FP001, complete the following steps for each saved search with an absolute time filter.
    1. To display a list of saved searches, click the Saved Searches icon.
    2. Select the saved search that you want to update.
    3. Right-click the saved search and select Edit.
    4. Edit the time filter to use Coordinated Universal Time (UTC).
    5. To save the updated search filter, click Search.
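The following commands show a minimal sketch of key-based SSH setup from the source server to the target server. The user name loguser and the host name target-server.example.com are placeholders for illustration; use the values for your environment and see Setting up Secure Shell to use key-based authentication for the supported procedure.
# Generate an RSA key pair on the source server and accept the default file location.
ssh-keygen -t rsa
# Copy the public key to the Log Analysis user on the target server.
ssh-copy-id loguser@target-server.example.com
# Verify that you can now log in without a password prompt.
ssh loguser@target-server.example.com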

About this task

Fix Pack 1 Note: To migrate from Log Analysis 1.3.2 to Log Analysis 1.3.3 FP001, you must first migrate to Log Analysis 1.3.3. After you migrate to Log Analysis 1.3.3, you can install Log Analysis 1.3.3 FP001.
These items are backed up and migrated by this procedure:
  • Saved searches, tags, and Data Sources
  • Data Types, including Source Types, Rule Sets, File Sets, and Collections
  • Topology configuration files
  • Usage statistics
  • LDAP configuration files
  • Custom Search Dashboards
  • Insight Packs
  • Log File Agent (LFA) configuration files
  • License configuration files
  • All chart specifications, including custom chart specifications
In addition to these items, a number of files that are not required for a new installation are also backed up and retained for reference purposes. These files include:
  • Log files that are generated by IBM® Operations Analytics - Log Analysis
  • Log files that were uploaded in batch mode
Data other than the types listed previously is not backed up or restored. Any customization that is made outside the files that are listed here must be migrated manually. For example, user information and changes to passwords for default users are not migrated to the target server.

LDAP information for one LDAP server is migrated automatically. If you have more than one LDAP server, you must migrate and configure the information for the other LDAP servers manually. The migrated information is stored in the ldapRegistry.xml file.

Procedure

To back up and restore data in IBM Operations Analytics - Log Analysis 1.3.3:
  1. Back up your data. For more information, see Backing up data.
  2. Restore your data. For more information, see Restoring data.
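For example, assuming that the backup and restore utility is the backup_restore.sh script in the <HOME>/IBM/LogAnalysis/utilities/migration directory and that it takes the backup directory and an action as arguments (confirm the script name and syntax in Backing up data and Restoring data), the commands look similar to the following:
# Back up data to the backup directory.
<HOME>/IBM/LogAnalysis/utilities/migration/backup_restore.sh <Backup_dir> backup
# Restore data from the same backup directory on the target system.
<HOME>/IBM/LogAnalysis/utilities/migration/backup_restore.sh <Backup_dir> restore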
To migrate data from IBM Operations Analytics - Log Analysis 1.3.2 to 1.3.3:
  1. Back up your data in 1.3.2.
  2. Move the backed-up, compressed files to the <Backup_dir> directory on the 1.3.3 server, as shown in the example after these steps.
  3. Restore the data on the 1.3.3 server.
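For example, assuming that the 1.3.3 server is reachable over SSH as loguser@new-server.example.com (a placeholder host name for illustration), you can copy the compressed backup files for step 2 with a command similar to the following:
# Copy the compressed backup files to the backup directory on the 1.3.3 server.
scp <Backup_dir>/* loguser@new-server.example.com:<Backup_dir>/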

What to do next

After you migrate to IBM Operations Analytics - Log Analysis 1.3.3, you must update any existing Logstash plugins from IBM Operations Analytics - Log Analysis 1.3.2 to ensure that Logstash installations function correctly. For more information about updating the Logstash plugin, see Updating the Logstash plugin after migration in the Upgrading, backing up, and migrating data section.

If you manually configured Hadoop in IBM Operations Analytics - Log Analysis 1.3.2, you must run the following command after you migrate to IBM Operations Analytics - Log Analysis 1.3.3:
<HOME>/IBM/LogAnalysis/utilities/migration/hadoop_config_migration.sh
Submit the Name Node and Data Node details to ensure that configuration and status details are updated and visible in the Hadoop Integration screen.
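The following lines show a sketch of such a run. The host names are placeholders for illustration, and the exact prompts depend on your installation:
# Run the Hadoop configuration migration utility as the Log Analysis user.
<HOME>/IBM/LogAnalysis/utilities/migration/hadoop_config_migration.sh
# When prompted, supply the Name Node and Data Node details, for example:
#   Name Node host: namenode.example.com
#   Data Node hosts: datanode1.example.com, datanode2.example.com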
Note: Hadoop configuration migration from 1.3.2 to 1.3.3 is only supported for IBM Open Platform Hadoop installations. The Hadoop configuration migration is required to view details in the Hadoop Integration screen. Other Hadoop configurations continue to function without this feature.
Note: You must use the same Log Analysis user for both backup and restore. This is a Hadoop configuration requirement.
If you migrate data to a new target server, you must complete the following steps to update the data source with the relevant information for the server:
  1. Log in to IBM Operations Analytics - Log Analysis and open the Data sources workspace.
  2. Identify the data sources that use Local file as the location in their configuration. For each of these data sources, open the data source and complete the steps in the wizard without changing any data. This action updates the data sources with the relevant information for the new server.
  3. Identify the data sources that use Custom as the location in their configuration. For each of these data sources, open the data source and complete the steps in the wizard, updating the host name to match the new one.