Backup and restore scripts
The backup and restore script is written in Python and runs Db2 commands inside the Db2u pod in the OpenShift® environment.
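For orientation, a script of this kind typically wraps oc exec calls against the Db2u pod. The following minimal sketch shows only that pattern; the app=db2u label selector, the PROJECT_NAME variable name, and the helper functions are illustrative assumptions, not the actual contents of backup_restore.py.

import os
import subprocess

# Assumed name for the exported project name variable; the default project
# is ibm-data-cataloging, as noted in the prerequisites below.
PROJECT = os.environ.get("PROJECT_NAME", "ibm-data-cataloging")

def db2u_pod(project: str = PROJECT) -> str:
    """Return the name of the first pod that matches an assumed Db2u label."""
    result = subprocess.run(
        ["oc", "get", "pods", "-n", project,
         "-l", "app=db2u",  # assumed label selector; check your deployment
         "-o", "jsonpath={.items[0].metadata.name}"],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()

def run_in_db2u(command: str, project: str = PROJECT) -> str:
    """Run a shell command inside the Db2u pod and return its output."""
    result = subprocess.run(
        ["oc", "exec", "-n", project, db2u_pod(project), "--",
         "bash", "-lc", command],
        check=True, capture_output=True, text=True,
    )
    return result.stdout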
Before you begin
- Install Python 3 and the OC CLI, and log in to the environment.
- Export the project name variable with the name of the environment. The default value is ibm-data-cataloging.
- For restore, check that the dcs-Backup folder contains the tar.gz file that was generated from the previous backup (see the sketch after this list).
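As a sketch only, the prerequisite checks can also be done programmatically, as shown below. The PROJECT_NAME variable name is a hypothetical placeholder for the project name variable; only the ibm-data-cataloging default and the dcs-Backup folder come from this page.

import glob
import os
import shutil
import sys

# Hypothetical name for the exported project name variable.
project = os.environ.get("PROJECT_NAME", "ibm-data-cataloging")

# Confirm that the oc CLI is available before anything else.
if shutil.which("oc") is None:
    sys.exit("The oc CLI was not found on the PATH.")

# For a restore, a backup archive must already exist in dcs-Backup.
archives = glob.glob(os.path.join("dcs-Backup", "*.tar.gz"))
if archives:
    print(f"Found {len(archives)} backup archive(s) for project {project}.")
else:
    print("No tar.gz archive found in dcs-Backup; a restore cannot run without one.")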
About this task
The script performs the following steps (a sketch that illustrates them follows this list):
- Scale pods to 0
- Shut down Db2
- Run the backup or restore with db2move
- Get the tar archive of the backup files
- Clean the files from the Db2 pod
- Start Db2
- Scale pods to 1
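The following sketch maps those steps to plausible commands for the backup case. The deployment names, the /tmp paths, the BLUDB database name, the db2inst1 user, and the wvcli maintenance-mode commands are assumptions about a typical Db2u setup, not the actual logic of backup_restore.py.

import os
import subprocess

PROJECT = "ibm-data-cataloging"
APP_DEPLOYMENTS = ["example-deployment"]  # hypothetical Data Cataloging deployments to scale

def oc(*args: str) -> None:
    subprocess.run(["oc", "-n", PROJECT, *args], check=True)

def backup(db2u_pod: str) -> None:
    # Scale application pods to 0 so that nothing writes during the backup.
    for dep in APP_DEPLOYMENTS:
        oc("scale", "deployment", dep, "--replicas=0")
    # "Shut down" Db2: put the Db2u cluster into maintenance mode (assumed command).
    oc("exec", db2u_pod, "--", "sudo", "wvcli", "system", "disable",
       "-m", "Disable HA before backup")
    # Export the data with db2move (assumed database name and path).
    oc("exec", db2u_pod, "--", "su", "-", "db2inst1", "-c",
       "mkdir -p /tmp/backup && cd /tmp/backup && db2move BLUDB export")
    # Get the tar archive of the exported files and copy it out of the pod.
    oc("exec", db2u_pod, "--", "tar", "czf", "/tmp/dcs-backup.tar.gz", "-C", "/tmp", "backup")
    os.makedirs("dcs-Backup", exist_ok=True)
    oc("cp", f"{db2u_pod}:/tmp/dcs-backup.tar.gz", "dcs-Backup/dcs-backup.tar.gz")
    # Clean the temporary files from the Db2 pod.
    oc("exec", db2u_pod, "--", "rm", "-rf", "/tmp/backup", "/tmp/dcs-backup.tar.gz")
    # Start Db2: take the Db2u cluster out of maintenance mode (assumed command).
    oc("exec", db2u_pod, "--", "sudo", "wvcli", "system", "enable",
       "-m", "Enable HA after backup")
    # Scale application pods back to 1.
    for dep in APP_DEPLOYMENTS:
        oc("scale", "deployment", dep, "--replicas=1")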
Procedure
- Get the script from the utility scripts resources repository, then run it with the following command:
python3 backup_restore.py
- The script contains a basic menu with the following options (a sketch of such a menu follows the list):
- Backup
- Restore
- Shutdown Db2
- Start Db2
- Exit
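A minimal sketch of such a menu loop is shown below. Only the option labels come from this page; the handler functions are stubs that stand in for the sequences described in the overview.

MENU = ["Backup", "Restore", "Shutdown Db2", "Start Db2", "Exit"]

# Stub handlers; in the real script these run the sequences described above.
def backup() -> None:
    print("Running the backup sequence ...")

def restore() -> None:
    print("Listing dcs-Backup archives and running the restore sequence ...")

def shutdown_db2() -> None:
    print("Shutting down Db2 ...")

def start_db2() -> None:
    print("Starting Db2 ...")

def main() -> None:
    handlers = [backup, restore, shutdown_db2, start_db2]
    while True:
        for number, label in enumerate(MENU, start=1):
            print(f"{number}. {label}")
        choice = input("Select an option: ").strip()
        if choice == str(len(MENU)):  # Exit
            break
        if choice.isdigit() and 1 <= int(choice) < len(MENU):
            handlers[int(choice) - 1]()
        else:
            print("Invalid selection.")

if __name__ == "__main__":
    main()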
The first option does not require any further input. It runs the steps that are described in the overview, and at the end the backup tar.gz file is placed in a directory named dcs-Backup in the location where the script runs.
The second option lists the tar.gz files in dcs-Backup. You can select one of them, and the script runs the restore with the steps that are described in the overview. After the restore is done, go to the Data Cataloging user interface and refresh the summary database to see the imported records.
After either of the first two options finishes, the script scales the pods back up. Some pods can fail with a CrashLoopBackOff or Error status, but they recover after the pods that they depend on reach the Running state.
The third and fourth options are used only if the backup fails and leaves the environment in maintenance mode. To restore the state of Data Cataloging, you need to scale the pods manually.
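Under the assumption that each Data Cataloging workload is a Deployment that normally runs one replica, the manual scale-up could look like the following sketch; verify the actual resource types and replica counts in your installation before you run anything similar.

import subprocess

PROJECT = "ibm-data-cataloging"

# List every deployment in the project ...
names = subprocess.run(
    ["oc", "get", "deployments", "-n", PROJECT,
     "-o", "jsonpath={.items[*].metadata.name}"],
    check=True, capture_output=True, text=True,
).stdout.split()

# ... and scale each one back to a single replica.
for name in names:
    subprocess.run(
        ["oc", "scale", "deployment", name, "-n", PROJECT, "--replicas=1"],
        check=True,
    )
    print(f"Scaled deployment {name} to 1 replica.")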