HDFS audit logging

By default, Hadoop Distributed File System (HDFS) auditing creates daily logs that can grow to multiple gigabytes in size. You can manage these audit logs from the Ambari administrative console.

Procedure

  1. Log in to the Ambari console and navigate to the HDFS > Configs > Advanced section.
  2. Expand the Advanced hdfs-log4j section and scroll to the hdfs audit logging section.
  3. To manage the size and number of HDFS audit log files, add the following lines:
    log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
    log4j.appender.DRFAAUDIT.MaxFileSize=100MB
    log4j.appender.DRFAAUDIT.MaxBackupIndex=10
    Adjust the MaxFileSize and MaxBackupIndex values to suit your retention requirements.
  4. Comment out the following line:
    log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender
  5. After making changes to the audit logging configuration, click Save, and then select Restart All Affected.
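
After the steps above, the audit appender portion of the Advanced hdfs-log4j content would look roughly like the following sketch. The File path, layout, and ConversionPattern lines shown here are assumed defaults for illustration; keep whatever values your cluster already uses.

    # Size-based rotation replaces the daily appender (sketch; File path
    # and layout values below are assumptions, not required settings)
    log4j.appender.DRFAAUDIT=org.apache.log4j.RollingFileAppender
    log4j.appender.DRFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
    log4j.appender.DRFAAUDIT.MaxFileSize=100MB
    log4j.appender.DRFAAUDIT.MaxBackupIndex=10
    log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout
    log4j.appender.DRFAAUDIT.layout.ConversionPattern=%d{ISO8601} %m%n

    # Original daily appender, now commented out:
    #log4j.appender.DRFAAUDIT=org.apache.log4j.DailyRollingFileAppender

With MaxFileSize=100MB and MaxBackupIndex=10, log4j keeps the current audit file plus up to 10 rotated backups, so audit history is capped at roughly 1.1 GB rather than growing without bound.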