Technical Blog Post
Abstract
Log Analysis - setting up data retention.
Body
Log Analysis can only store short-term data in Solr. For long-term data, you should store it in Hadoop.
Whether you store data in Solr or in Hadoop, you should set a retention period for your data by deleting older data, so that Log Analysis runs error free and with optimum performance. If you store too much data in Solr, you may see degraded performance or, more severely, Log Analysis may not start.
The procedure to set up a daily data retention process for your data is as follows:
1) Open the <HOME>/IBM/LogAnalysis/utilities/deleteUtility/delete.properties file.
2) Locate the following lines and set the use case that you want to run to 4:
[useCase]
useCaseNumber = 4
3) Change the following value in delete.properties to the retention period that you want to set. For example, to retain 30 days of data:
[useCase_4]
retentionPeriod=30d
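The value shown above keeps 30 days of data. If you want a longer window, for example roughly two months, you can increase the value accordingly. The 60d value below is an assumed illustration based on the day-based format shown above; check the comments in your delete.properties for the exact formats that your version accepts:
[useCase_4]
retentionPeriod=60d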
4) Specify the correct values for the following parameters in the delete.properties file, then save the file:
[mandatoryInput]
hostName = <Log Analysis server's hostname>
port = <https port (by default 9987) on the Log Analysis server>
userName = unityadmin
delayDuration = 1000
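For illustration, a completed [mandatoryInput] section might look like the following. The host name here is a hypothetical example, and the port assumes the default value mentioned above:
[mandatoryInput]
hostName = loganalysis01.example.com
port = 9987
userName = unityadmin
delayDuration = 1000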
5) Update the <HOME>/IBM/LogAnalysis/utilities/deleteUtility/callDeleteUtility.sh script so that it runs the following command:
<Path to Python> deleteUtility.py <password> <target>
where <Path to Python> is the location where Python is installed, <password> is the password for the user name defined in the delete.properties file, and <target> specifies the type of data that you want to delete: -solr or -hadoop. The default value is -solr.
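For example, assuming Python is installed under /usr/bin and you want to clean up Solr data, the updated line in callDeleteUtility.sh might look like this (the password remains a placeholder for the unityadmin password):
/usr/bin/python deleteUtility.py <unityadmin password> -solr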
6) Run the following command to add callDeleteUtility.sh to cron:
sh ./createCron.sh
7) You can verify the crontab entry by running:
crontab -l | grep deleteUtility
An example of the output is:
0 0 */1 * * /home/laadmin/IBM/LogAnalysis/utilities/deleteUtility/callDeleteUtility.sh
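This entry runs callDeleteUtility.sh once a day at midnight. Before you wait for the first scheduled run, you may want to confirm that the utility works end to end by running the wrapper script once by hand and checking that older data is removed. The path below assumes the laadmin home directory shown in the example output above:
sh /home/laadmin/IBM/LogAnalysis/utilities/deleteUtility/callDeleteUtility.sh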