By default, Hadoop Distributed File System (HDFS) auditing creates daily logs that can grow to
multiple gigabytes in size. You can manage these audit logs through the Ambari administrative
console.
Procedure
- Log in to the Ambari console and navigate to the HDFS service's configuration section.
- Expand the Advanced hdfs-log4j section and scroll to the hdfs audit logging section.
- To manage the size and number of HDFS audit logs, add the following lines:
log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.DRFAAUDIT.MaxFileSize=100MB
log4j.appender.DRFAAUDIT.MaxBackupIndex=10
Adjust
the MaxFileSize and MaxBackupIndex values to suit your environment.
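When sizing these values, note that a RollingFileAppender retains the active log file plus up to MaxBackupIndex rolled backups, so the worst-case disk usage is MaxFileSize × (MaxBackupIndex + 1). A quick sketch of the calculation, using the values from the example above:

```python
# Worst-case disk usage for a log4j RollingFileAppender:
# the active file plus MaxBackupIndex rolled backups.
max_file_size_mb = 100   # MaxFileSize from the example above
max_backup_index = 10    # MaxBackupIndex from the example above

total_mb = max_file_size_mb * (max_backup_index + 1)
print(total_mb)  # 1100 MB, roughly 1.1 GB of audit logs at most
```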
- Comment out the following line:
log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender
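After both edits, the audit appender portion of Advanced hdfs-log4j should look roughly like the following sketch (other DRFAAUDIT properties, such as the log file path and layout settings, remain unchanged):

```
# Size-based rolling replaces the daily rolling appender
#log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
log4j.appender.DRFAAUDIT.MaxFileSize=100MB
log4j.appender.DRFAAUDIT.MaxBackupIndex=10
```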
- After you change the audit logging configuration, click Save, and
then select Restart All Affected.