Known issues on FIPS-enabled clusters

You cannot connect to external SMB storage volumes on FIPS-enabled clusters

Applies to: 4.7.0 and later

The SMB CSI Driver for Kubernetes (csi-smb-driver), which is required to connect to external SMB storage volumes, is not supported on FIPS-enabled clusters.

Analytics Engine powered by Apache Spark

You cannot connect to a Cloudant database

Connecting to a Cloudant® database with Analytics Engine powered by Apache Spark is not supported.

Spark 3.3 with R is not supported.

Python 3.9 is not supported.

Common core services

Not all connectors are supported in a FIPS-enabled environment. See the information for the individual connectors at Connectors for projects and catalogs.

Services that use the Flight service

Applies to: 4.7.0 and later

On a FIPS-enabled cluster, the Flight service blocks the connection to any data source that does not support FIPS.

Data Refinery

Data Refinery flow job fails in a FIPS cluster for an SAV target file with encryption

Applies to: 4.7.0 and later

On a FIPS-enabled cluster, if you run a Data Refinery flow job where the target is an SAV file and you enter an encryption key, the job fails.

DataStage

Cannot run a DataStage® job with data from certain connections

Applies to: 4.7.0 and later

DataStage does not support the Elasticsearch connection in a FIPS-enabled environment.

Execution Engine for Apache Hadoop

You cannot connect to a JDBC data source on your CDH cluster without configuring the database to support FIPS encryption

When you install Execution Engine for Apache Hadoop in a FIPS-enabled cluster and want to connect to a JDBC data source (Hive via Execution Engine for Hadoop or Impala via Execution Engine for Hadoop), you must connect to a FIPS-ready CDH cluster whose databases are configured to support FIPS encryption.

You cannot use Livy to connect to a Spark cluster without loading the digest package

Applies to: 4.7.0 and later

If you need to use Livy to connect to a Spark cluster or use any other packages that depend on the digest package, you must load the digest package from a non-FIPS-compliant library. To load the digest package, run the following commands.
# Load digest from the non-FIPS-compliant library location
library(digest, lib.loc='/opt/not-FIPS-compliant/R/library')
# Packages that depend on digest, such as sparklyr, can now be loaded
library(sparklyr)
Note: If you load the digest package, Execution Engine for Apache Hadoop will no longer be FIPS-compliant.

You cannot connect to an HDFS data source

Applies to: 4.7.0 and later

You cannot add an SSL certificate, so connections to an HDFS data source fail.

Workaround
  1. Create a Kubernetes secret that is named connection-ca-certs by using the SSL certificate files.
    oc create secret generic connection-ca-certs --from-file=/tmp/certificate1.pem --from-file=/tmp/certificate2.pem --from-file=cert-other.pem
  2. Restart the connection pod, as shown in the sketch after these steps.
  3. Try to establish the connection again.
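
The following commands are a minimal sketch of step 2, assuming that the connection pod can be found by filtering pod names on connect; the filter string and <connection-pod-name> are placeholders that depend on your deployment. Deleting the pod causes its deployment to re-create it with the new secret mounted.

# Locate the connection pod; the name filter is an assumption for your deployment
oc get pods | grep connect

# Delete the pod; its deployment re-creates it and mounts the connection-ca-certs secret
oc delete pod <connection-pod-name>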

You cannot connect to Impala via Execution Engine for Hadoop or Hive via Execution Engine for Hadoop data sources

Applies to: 4.7.0 and later

In FIPS-enabled clusters, you cannot connect to Impala via Execution Engine for Hadoop or Hive via Execution Engine for Hadoop data sources.

IBM Match 360

IBM® Match 360 cannot bulk load data on FIPS-enabled clusters if the Red Hat® OpenShift® Container Platform version is 4.12.30 or later

Applies to: 4.7.0 and later

When OpenShift Container Platform 4.12.30 or later is FIPS-enabled, IBM Match 360 does not support loading data sets by using bulk load jobs. This issue does not occur on versions of OpenShift Container Platform earlier than 4.12.30.

ocp4-cis-api-server-tls-cipher-suites violations can occur

Applies to: 4.7.0 and later

If you install the OpenShift Compliance Operator to scan for CIS Red Hat OpenShift Container Platform 4 Benchmark violations, you must allow an exception for the following violation (one way to configure the exception is sketched after the list).

  • ocp4-cis-api-server-tls-cipher-suites

    Clients that use IBM Java Semeru to talk to kube-apiserver in FIPS mode must use TLS ciphers that this benchmark check reports as violations.
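
One way to record this exception, sketched below under stated assumptions, is a Compliance Operator TailoredProfile that extends the ocp4-cis profile and disables the rule. The profile name cis-fips-exception is an example, and the Rule object name can differ from the check-result name on your cluster; verify it with oc get rules -n openshift-compliance before you apply the profile.

oc apply -f - <<EOF
apiVersion: compliance.openshift.io/v1alpha1
kind: TailoredProfile
metadata:
  name: cis-fips-exception          # example name
  namespace: openshift-compliance
spec:
  extends: ocp4-cis
  title: CIS benchmark with a FIPS cipher-suite exception
  description: Allows the TLS ciphers that FIPS-mode IBM Java Semeru clients require
  disableRules:
    - name: ocp4-api-server-tls-cipher-suites   # verify the Rule object name on your cluster
      rationale: FIPS-mode clients that use IBM Java Semeru require TLS ciphers that this check disallows
EOF

Reference the tailored profile from your ScanSettingBinding so that later scans skip the disabled check.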

RStudio® Server Runtimes

If you need to use Livy to connect to a Spark cluster or use any other packages that depend on the digest package, such as the sparklyr, Shiny®, arulesViz, or htmltools packages, you must load the digest package from a non-FIPS-compliant library. See Using Livy to connect to a Spark cluster.

Watson™ Machine Learning

For SPSS deployments, the following data sources are not FIPS-compliant:
  • Cloud Object Storage
  • Cloud Object Storage Infrastructure
  • Storage volumes

After you upgrade from Cloud Pak for Data 4.6.x to version 4.7, auto-generated notebooks that are saved from AutoAI experiments might fail when you run them on a FIPS-enabled cluster. If you encounter the following error, manually update the autoai-libs Python library in the notebook to version 1.14.10 to resolve the problem.

[digital envelope routines: EVP_DigestInit_ex] disabled for FIPS
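
For example, you can update the library with a standard pip command in a notebook cell; the version number comes from this known issue:

!pip install autoai-libs==1.14.10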

Watson Knowledge Catalog

Communication with external Kafka does not work in a FIPS-enabled cluster.

Watson Studio

The Visual Studio Code extension does not work on a FIPS-enabled cluster when the Cloud Pak for Data route uses reencrypt termination.

Watson Studio Runtimes

Notebook environments for R are not FIPS-compliant. The Runtime 22.1 on Python 3.9 notebook environment is not FIPS-compliant.