Planning Analytics on Cloud - Backup process using PowerShell script

How To


Summary

A Planning Analytics administrator would like to perform their own backup of the Planning Analytics Data folder by using a Planning Analytics Turbo Integrator process.

For security reasons, third-party utilities (for example, 7-Zip) cannot be installed in the Planning Analytics on Cloud environment.

Environment

For the IBM Planning Analytics cloud service, IBM performs daily backups of the shared folder and retains data backups for the last 28 days:
  • Seven daily backups for the seven most recent days.
  • Four weekly backups for the four most recent weeks.
For the IBM Planning Analytics On-Demand cloud service, IBM retains data backups for the last 28 days as follows:
  • Three daily backups for the three most recent days.
  • Two weekly backups for the two most recent weeks.
  • One backup covering the most recent 28 days.
See the Cloud Services service descriptions for IBM Planning Analytics for the current backup policy.
A Planning Analytics administrator might want to:
  • Perform backups at a higher frequency, or at ad hoc intervals.
  • Have a different retention policy.
  • Be able to recover data without having to log a service request.

Steps

1. Remove Locked Files From Data Folder
To ensure that the backup routine is not interrupted, make sure that there are no locked files in the DataBaseDirectory folder. The following two files are typically locked while the database is running:
  • tm1s.log. This file is the live transaction log. By default it is located in the DataBaseDirectory folder. In Planning Analytics on Cloud environments, the LoggingDirectory TM1s.cfg parameter is set to change the location of log files to a different folder.
  • tm1rawstore.<timestamp>. This file is the temporary, unprocessed log file for audit logging. By default it is located in the DataBaseDirectory folder. The RawStoreDirectory TM1s.cfg parameter can be used to change the location of this file.
For example, the following TM1s.cfg entries relocate both files to a folder named Logs:
LoggingDirectory=Logs
RawStoreDirectory=Logs
2a. Create a PowerShell Script - Compressed Archive
Create a text file with the .ps1 extension (for example Backup.ps1) containing the following code:
param(
	[String]$folderSource = '..\Data',
	[String]$folderTarget = '..\Archive',
	[String]$compression = 'Optimal',
	[Int32]$daysRetention = 30
)


# CREATE NEW BACKUP FILE

# Define the name of the archive zip file
$timestamp = Get-Date -Format FileDateTime
$destination = $folderTarget + '\Backup' + $timestamp + '.zip'

# Folder to be excluded
$folderExclude = '}async'

# Backup the source folder as a zip file into the target folder
Get-ChildItem $folderSource -Directory | Where-Object { $_.Name -ne $folderExclude } | Compress-Archive -CompressionLevel $compression -DestinationPath $destination -Update
Get-ChildItem $folderSource -File | Compress-Archive -CompressionLevel $compression -DestinationPath $destination -Update


# DELETE OLD BACKUP FILES

# Evaluate the cutoff date for deletion
$dateCurrent = Get-Date
$dateOldest = $dateCurrent.AddDays( 0 - $daysRetention )

# Delete old backup files
Get-ChildItem -Path $folderTarget | Where-Object { $_.LastWriteTime -lt $dateOldest } | Remove-Item -Force -Recurse
This script has the following settings:
  • The folder to be backed up is called Data, which is a sibling to the folder containing the PowerShell script.
  • A backup file is created with the name Backup<timestamp>.zip.
  • The }async folder is excluded from the backup.
  • The backup file is placed in a folder called Archive, which is a sibling to the folder containing the PowerShell script.
  • The default backup file retention period is set to 30 days.
Note: Due to a limitation in the underlying Microsoft .NET API, the Compress-Archive cmdlet has a maximum individual file size limit of 2 GB.
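Because every parameter has a default value, the script can also be run by hand for an ad hoc backup, or to test the routine before scheduling it. A minimal sketch, assuming the current directory is the Script folder described in step 4 (the parameter values shown are illustrative):

# Ad hoc backup that keeps 7 days of archives
powershell -ExecutionPolicy Bypass -File .\Backup.ps1 -daysRetention 7

# Override the source and target folders and use faster, lighter compression
powershell -ExecutionPolicy Bypass -File .\Backup.ps1 -folderSource '..\Data' -folderTarget '..\Archive' -compression 'Fastest'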
2b. Create a PowerShell Script - Copied Archive
If the DataBaseDirectory folder contains files larger than 2 GB, an alternative script can be used that creates an uncompressed backup copy of the folder. Because the backups are not compressed, they consume more disk space on the server.
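To check whether the data directory contains any files above this limit, a PowerShell one-liner such as the following can be used (a sketch; the '..\Data' path assumes the folder layout described in step 4):

# List any files larger than 2 GB, with their full paths and sizes
Get-ChildItem '..\Data' -Recurse -File | Where-Object { $_.Length -gt 2GB } | Select-Object FullName, Length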
Create a text file with the .ps1 extension (for example Backup.ps1) containing the following code:
param(
	[String]$folderSource = '..\Data',
	[String]$folderTarget = '..\Archive',
	[Int32]$daysRetention = 30
)


# CREATE NEW BACKUP FOLDER

# Define the name of the backup folder
$timestamp = Get-Date -Format FileDateTime
$destination = $folderTarget + '\Backup' + $timestamp + '\Data'

# Define the name of the log file
$fileLog = $folderTarget + '\Backup' + $timestamp + '\Backup' + $timestamp + '.log'

# Folder to be excluded
$folderExclude = '}async'

# Create the target folder
New-Item -ItemType "directory" -Path $destination

# Backup the source folder as a copy into the target folder
# Copy options
#	/e			Copies subdirectories. This option automatically includes empty directories
#	/z			Copies files in restartable mode
#	/copy:<copyflags>	Specifies which file properties to copy. D - Data, A - Attributes, T - Time stamps
#	/dcopy:<copyflags>	Specifies what to copy in directories. D - Data, A - Attributes, T - Time stamps
#	/mir			Mirrors a directory tree
#	/sl			Don't follow symbolic links and instead create a copy of the link
# File selection options
#	/xd <directory>		Excludes directories that match the specified names and paths
#	/is			Includes the same files. Same files are identical in name, size, times, and all attributes
# Retry options
#	/r:<n>			Specifies the number of retries on failed copies
#	/w:<n>			Specifies the wait time between retries, in seconds
# Logging options
#	/v			Produces verbose output, and shows all skipped files
#	/ts			Includes source file time stamps in the output
#	/fp			Includes the full path names of the files in the output
#	/np			Specifies that the progress of the copying operation will not be displayed
#	/log:<logfile>		Writes the status output to the log file (overwrites the existing log file)
robocopy $folderSource $destination /e /z /copy:DAT /dcopy:DAT /mir /sl /xd $folderExclude /is /r:1000 /w:3 /v /ts /fp /np /log:$fileLog


# DELETE OLD BACKUP FOLDERS

# Evaluate the cutoff date for deletion
$dateCurrent = Get-Date
$dateOldest = $dateCurrent.AddDays( 0 - $daysRetention )

# List old backup folders to be deleted in log file
'------------------------------------------------------------------------------' | Out-File -FilePath $fileLog -Append -Encoding ascii
'   DELETED' | Out-File -FilePath $fileLog -Append -Encoding ascii
'------------------------------------------------------------------------------' | Out-File -FilePath $fileLog -Append -Encoding ascii

# Obtain a list of folders to be deleted
$deleteList = @( Get-ChildItem -Path $folderTarget | Where-Object { $_.LastWriteTime -lt $dateOldest } )

# Determine if no folders are to be deleted
if( $deleteList.Count -eq 0 ){
	
	# Output message
	'    None' | Out-File -FilePath $fileLog -Append -Encoding ascii
	
}else{
	
	# List the folders to be deleted
	$deleteList | Out-File -FilePath $fileLog -Append -Encoding ascii
	
	# Delete old backup folders
	$deleteList | Remove-Item -Force -Recurse
	
}
This script has the following settings:
  • The folder to be backed up is called Data, which is a sibling to the folder containing the PowerShell script.
  • A backup folder is created with the name Backup<timestamp>.
  • The backup folder is placed in a folder called Archive, which is a sibling to the folder containing the PowerShell script.
  • A copy of the Data folder is placed in the Backup<timestamp> folder.
  • The }async folder is excluded from the backup.
  • A log file of the copy routine, called Backup<timestamp>.log, is placed in the Backup<timestamp> folder.
  • The default backup file retention period is set to 30 days.
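Because this backup is an uncompressed copy of the Data folder, an administrator can also restore it without logging a service request. A minimal sketch of a restore, assuming the database has been stopped first and using the folder layout described in step 4 (replace <timestamp> with the timestamp of the backup to restore):

# Stop the database first, then copy the backup back over the Data folder
robocopy '..\Archive\Backup<timestamp>\Data' '..\Data' /e /z /copy:DAT /dcopy:DAT /r:1000 /w:3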
3. Create a Planning Analytics Turbo Integrator Process
Create a Planning Analytics Turbo Integrator Process containing the following code in either the Prolog or the Epilog.
# Relative location of backup script
sScript = '..\Script\Backup.ps1';

# Retention period for backup files (in days)
# The default period is 30 days
nDaysRetention = 7;

# Command line to be executed
sCommand = 'Powershell ' | sScript;
sCommand = sCommand | ' -daysRetention ' | NumberToString( nDaysRetention );

# Do not wait for the command to finish executing, to minimise risk of hanging and locking
nWait = 0;

# Run the PowerShell script
ExecuteCommand( sCommand, nWait );
The Turbo Integrator process:
  • Assumes that the PowerShell script is called Backup.ps1 and is located in a folder called Script, which is a sibling to the DataBaseDirectory folder.
  • Overrides the retention period in the PowerShell script, and uses a period of 7 days instead. (Amend the setting for the nDaysRetention variable as required.)
The process can be scheduled to run regularly by being placed in a Planning Analytics Chore.
Planning Analytics is an in-memory database, so it is recommended that data is saved from memory to disk before a backup is performed; this can be achieved by running the SaveDataAll function in a Turbo Integrator process (for example, as an earlier step in the same chore).
4. Folder structure
An example of the folder structure assumed by the preceding code is as follows:
  • \prod\<TM1Model>\Archive. Location of the backup files.
  • \prod\<TM1Model>\Data. The DataBaseDirectory folder.
  • \prod\<TM1Model>\Script. Location of the PowerShell script.
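If the Archive and Script folders do not exist yet, they can be created from a PowerShell session; a sketch, assuming \prod\<TM1Model> is the current directory:

# Create the sibling folders used by the backup routine
New-Item -ItemType Directory -Path '.\Archive', '.\Script' -Force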

Document Location

Worldwide

[{"Type":"MASTER","Line of Business":{"code":"LOB10","label":"Data and AI"},"Business Unit":{"code":"BU059","label":"IBM Software w\/o TPS"},"Product":{"code":"SSD29G","label":"IBM Planning Analytics"},"ARM Category":[],"Platform":[{"code":"PF025","label":"Platform Independent"}],"Version":"All Versions"}]

Product Synonym

PA;PAoC;TM1

Document Information

Modified date:
05 December 2022

UID

ibm16471179