Security Bulletin
Summary
IBM products for Cloudera Data Platform and Hortonworks Data Platform are affected by the critical Apache Log4j vulnerability (CVE-2021-44228). A malicious user could exploit this vulnerability to run arbitrary code as the user or service account running the affected software. The fix includes Apache Log4j v2.16.
Vulnerability Details
CVEID: CVE-2021-44228
DESCRIPTION: Apache Log4j could allow a remote attacker to execute arbitrary code on the system, caused by the failure of JNDI features to protect against attacker-controlled LDAP and other JNDI-related endpoints. By sending a specially crafted code string, an attacker could exploit this vulnerability to load arbitrary Java code on the server and take complete control of the system.
Note: The vulnerability is also called Log4Shell or LogJam.
CVSS Base score: 10
CVSS Temporal Score: See: https://exchange.xforce.ibmcloud.com/vulnerabilities/214921 for the current score.
CVSS Vector: (CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H)
Affected Products and Versions
| Affected IBM Product(s) | Version(s) |
| Hortonworks Data Platform (HDP) with IBM | 3.0, 3.0.1 |
| Cloudera Data Platform (CDP) with IBM | 7.0 - 7.1.7 |
Remediation/Fixes
Customers are encouraged to act swiftly to resolve this issue.
Apply hotfix
Download all files from this repo: https://github.com/cloudera/cloudera-scripts-for-log4j
Steps
Run the following script on all affected cluster nodes.
NOTE: After applying the Short Term Resolution, any nodes you add later will need the Short Term Resolution re-applied on them.
Script: run_log4j_patcher.sh [cdp|cdh|hdp]
Function: The run_log4j_patcher.sh script scans a directory for jar files and removes JndiLookup.class from the ones it finds. Do not run any other script in the downloaded directory; they are invoked automatically by run_log4j_patcher.sh.
- Stop all running jobs in the production cluster before executing the script
- Navigate to Cloudera Manager > YARN > Configuration and ensure that yarn.nodemanager.delete.debug-delay-sec is set to 0. If the value is not zero, you must restart the YARN service after setting the value to 0
- Navigate to Cloudera Manager > YARN > Configuration and search for yarn.nodemanager.local-dirs to get the configured Node Manager Local Directory path
- Remove the filecache and usercache folders located inside the directories specified in yarn.nodemanager.local-dirs
- Download all files from the GitHub repo and copy to all nodes of your cluster.
- Run the script as root on ALL nodes of your cluster.
a. The script takes 1 mandatory argument: the distribution (cdh|cdp|hdp)
b. The script takes 2 optional arguments: a base directory to scan and a backup directory. The defaults are /opt/cloudera and /opt/cloudera/log4shell-backup, respectively. These defaults work for CM/CDH 6 and CDP 7; a different folder is updated for HDP.
- Ensure that the last line of the script output reads 'Finished' to verify that the job completed successfully. The script will fail if a command exits unsuccessfully.
- Restart Cloudera Manager Server, all clusters, and all running jobs and queries.
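The per-node verification described above can be sketched as a small wrapper. This is a hedged example, not vendor tooling: the log path and the sample output are invented for the demo; on a real node the log would come from running run_log4j_patcher.sh itself.

```shell
# Sketch: confirm a patch run's output ends with "Finished" before moving on.
set -e
check_finished() {
  [ "$(tail -n 1 "$1")" = "Finished" ]
}

# A successful run, e.g. "./run_log4j_patcher.sh cdp > /tmp/patch.log 2>&1",
# leaves a log whose final line is "Finished"; simulate one here.
printf 'Scanning /opt/cloudera for jar files...\nFinished\n' > /tmp/patch.log

if check_finished /tmp/patch.log; then
  echo "node patched OK"
else
  echo "patch incomplete, re-run the script" >&2
fi
```

Because the script fails if any command exits unsuccessfully, checking only the final line is a reliable completion signal for each node.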
Usage: $PROG (subcommand) [options]
Subcommands:
- help Prints this message
- cdh Scan a CDH cluster node
- cdp Scan a CDP cluster node
- hdp Scan a HDP cluster node
Options (cdh and cdp subcommands only):
-t <targetdir> Override target directory (default: distro-specific)
-b <backupdir> Override backup directory (default: /opt/cloudera/log4shell-backup)
Environment Variables (cdh and cdp subcommands only):
The SKIP_* environment variables should only be used if you are running the script again and want to skip phases that have already completed.
SKIP_JAR If non-empty, skips scanning and patching .jar files
SKIP_TGZ If non-empty, skips scanning and patching .tar.gz files
SKIP_HDFS If non-empty, skips scanning and patching .tar.gz files in HDFS
RUN_SCAN If non-empty, runs a final scan for missed vulnerable files.
This can take several hours.
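The "if non-empty" convention above can be illustrated with a minimal sketch. The phase names and helper below are invented for the example; only the variable names (SKIP_JAR, SKIP_TGZ) come from the script's documentation.

```shell
# Minimal sketch of the SKIP_* convention: a phase is skipped when its
# variable is set to any non-empty value, and runs otherwise.
run_phase() {
  phase=$1
  skip_var=$2
  eval "val=\${$skip_var:-}"
  if [ -n "$val" ]; then
    echo "skip $phase"
  else
    echo "run $phase"
  fi
}

SKIP_JAR=1                       # re-run scenario: jar phase already completed
run_phase "jar scan" SKIP_JAR    # prints "skip jar scan"
run_phase "tgz scan" SKIP_TGZ    # prints "run tgz scan"
```

In practice a re-run would look like `SKIP_JAR=1 SKIP_TGZ=1 ./run_log4j_patcher.sh cdp`, skipping the phases that already completed.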
NOTE: CDH/CDP Parcels: The script removes the affected class from all CDH/CDP parcels already installed under /opt/cloudera. This script needs to be re-run after new parcels are installed or after upgrading to versions of CDH/CDP that do not include the long-term fix.
Removing affected classes from Oozie Shared Libraries (CDH & CDP)
The vulnerability affects client libraries uploaded to HDFS by Cloudera Manager. The script takes care of Tez and MapReduce libraries; however, Oozie libraries must be updated manually. The following section only applies to Cloudera Data Hub and Cloudera Data Platform releases.
Follow the instructions below to secure the Oozie shared libraries:
1. Execute the run_log4j_patcher.sh on the affected cluster.
2. Navigate to Cloudera Manager > Oozie > Actions > "Install Oozie ShareLib" to re-upload the Oozie libraries to HDFS from Cloudera Manager.
IMPORTANT: Ensure that the Oozie service is running prior to executing the command.
Removing affected classes from Oozie Shared Libraries (HDP)
Run these commands to update Oozie share lib:
su oozie
kinit oozie
/usr/hdp/current/oozie-server/bin/oozie-setup.sh sharelib create -fs hdfs://ns1
oozie admin -oozie http(s)://<oozie-host/loadbalancer>:11(000|443)/oozie -sharelibupdate
For the latest updates from Cloudera, refer to Resolution for TSB-545 - Private Cloud.
Known Limitations: Cloudera Data Hub clusters using packages rather than parcels are not yet supported with this fix.
Workarounds and Mitigations
None
Get Notified about Future Security Bulletins
References
Acknowledgement
Change History
14 Jan 2022: Initial Publication
*The CVSS Environment Score is customer environment specific and will ultimately impact the Overall CVSS Score. Customers can evaluate the impact of this vulnerability in their environments by accessing the links in the Reference section of this Security Bulletin.
Disclaimer
Review the IBM security bulletin disclaimer and definitions regarding your responsibilities for assessing potential impact of security vulnerabilities to your environment.
Document Location
Worldwide
Document Information
Modified date:
17 January 2022
Initial Publish date:
14 January 2022
UID
ibm16541046